I have different plugins in my Web API project, each with its own XML docs, and one centralized Help page. The problem is that Web API's default Help Page only supports a single documentation file:
new XmlDocumentationProvider(HttpContext.Current.Server.MapPath("~/App_Data/Documentation.xml"))
How can I load documentation from several files? I want to do something like this:
new XmlDocumentationProvider("PluginsFolder/*.xml")
You can modify the XmlDocumentationProvider installed at Areas\HelpPage to do something like the following:
Merge the multiple XML documentation files into a single one. Example code (missing some error checks and validation):
using System.IO;
using System.Xml.Linq;
using System.Xml.XPath;
XDocument finalDoc = null;
foreach (string file in Directory.GetFiles(@"PluginsFolder", "*.xml"))
{
if(finalDoc == null)
{
finalDoc = XDocument.Load(File.OpenRead(file));
}
else
{
XDocument xdocAdditional = XDocument.Load(File.OpenRead(file));
finalDoc.Root.XPathSelectElement("/doc/members")
.Add(xdocAdditional.Root.XPathSelectElement("/doc/members").Elements());
}
}
// Supply the navigator that rest of the XmlDocumentationProvider code looks for
_documentNavigator = finalDoc.CreateNavigator();
Kiran's solution works very well. I ended up using his approach, but created a copy of XmlDocumentationProvider called MultiXmlDocumentationProvider with an altered constructor:
public MultiXmlDocumentationProvider(string xmlDocFilesPath)
{
XDocument finalDoc = null;
foreach (string file in Directory.GetFiles(xmlDocFilesPath, "*.xml"))
{
using (var fileStream = File.OpenRead(file))
{
if (finalDoc == null)
{
finalDoc = XDocument.Load(fileStream);
}
else
{
XDocument xdocAdditional = XDocument.Load(fileStream);
finalDoc.Root.XPathSelectElement("/doc/members")
.Add(xdocAdditional.Root.XPathSelectElement("/doc/members").Elements());
}
}
}
// Supply the navigator that rest of the XmlDocumentationProvider code looks for
_documentNavigator = finalDoc.CreateNavigator();
}
I register the new provider from HelpPageConfig.cs:
config.SetDocumentationProvider(new MultiXmlDocumentationProvider(HttpContext.Current.Server.MapPath("~/App_Data/")));
Creating a new class and leaving the original one unchanged may be more convenient when upgrading, etc.
Rather than create a separate class along the lines of MultiXmlDocumentationProvider, I just added a constructor to the existing XmlDocumentationProvider. Instead of taking a folder name, it takes a list of strings, so you can still specify exactly which files you want to include (if there are other XML files in the same directory as the documentation XML, things might get hairy). Here's my new constructor:
public XmlDocumentationProvider(IEnumerable<string> documentPaths)
{
if (documentPaths == null || !documentPaths.Any())
{
throw new ArgumentNullException(nameof(documentPaths));
}
XDocument fullDocument = null;
foreach (var documentPath in documentPaths)
{
if (documentPath == null)
{
throw new ArgumentNullException(nameof(documentPath));
}
if (fullDocument == null)
{
using (var stream = File.OpenRead(documentPath))
{
fullDocument = XDocument.Load(stream);
}
}
else
{
using (var stream = File.OpenRead(documentPath))
{
var additionalDocument = XDocument.Load(stream);
fullDocument?.Root?.XPathSelectElement("/doc/members")
.Add(additionalDocument?.Root?.XPathSelectElement("/doc/members").Elements());
}
}
}
_documentNavigator = fullDocument?.CreateNavigator();
}
The HelpPageConfig.cs looks like this. (Yes, it can be fewer lines, but I don't have a line limit so I like splitting it up.)
var xmlPaths = new[]
{
HttpContext.Current.Server.MapPath("~/bin/Path.To.FirstNamespace.XML"),
HttpContext.Current.Server.MapPath("~/bin/Path.To.OtherNamespace.XML")
};
var documentationProvider = new XmlDocumentationProvider(xmlPaths);
config.SetDocumentationProvider(documentationProvider);
I agree with gurra777 that creating a new class is a safer upgrade path. I started with that solution, but it involves a fair amount of copy/paste, which could easily get out of date after a few package updates.
Instead, I am keeping a collection of XmlDocumentationProvider children. For each of the implementation methods, I'm calling into the children to grab the first non-empty result.
public class MultiXmlDocumentationProvider : IDocumentationProvider, IModelDocumentationProvider
{
private IList<XmlDocumentationProvider> _documentationProviders;
public MultiXmlDocumentationProvider(string xmlDocFilesPath)
{
_documentationProviders = new List<XmlDocumentationProvider>();
foreach (string file in Directory.GetFiles(xmlDocFilesPath, "*.xml"))
{
_documentationProviders.Add(new XmlDocumentationProvider(file));
}
}
public string GetDocumentation(System.Reflection.MemberInfo member)
{
return _documentationProviders
.Select(x => x.GetDocumentation(member))
.FirstOrDefault(x => !string.IsNullOrWhiteSpace(x));
}
//and so on...
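// A sketch of how the remaining members can delegate in the same way,
// assuming the standard Web API IDocumentationProvider / IModelDocumentationProvider
// signatures (adjust to match the interfaces in your HelpPage version):
public string GetDocumentation(HttpActionDescriptor actionDescriptor)
{
return _documentationProviders
.Select(x => x.GetDocumentation(actionDescriptor))
.FirstOrDefault(x => !string.IsNullOrWhiteSpace(x));
}
public string GetResponseDocumentation(HttpActionDescriptor actionDescriptor)
{
return _documentationProviders
.Select(x => x.GetResponseDocumentation(actionDescriptor))
.FirstOrDefault(x => !string.IsNullOrWhiteSpace(x));
}
}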
The HelpPageConfig registration is the same as in gurra777's answer:
config.SetDocumentationProvider(new MultiXmlDocumentationProvider(HttpContext.Current.Server.MapPath("~/App_Data/")));
Related: my first question is here; however, since I was advised that questions should not change their original subject, I created a new one.
I am saving user settings and I would like to save them in a list. I had a look at Settings by James, but found that it's not possible to save a list with it, so I decided to use Xamarin.Essentials.
First I tried to save only a string value, which after some struggle I managed to get working, and now I am trying to save an object:
static void AddToList(SettingField text)
{
var savedList = new List<SettingField>(Preference.SavedList);
savedList.Add(text);
Preference.SavedList = savedList;
}
private void ExecuteMultiPageCommand(bool value)
{
var recognitionProviderSettings = new RecognitionProviderSettings
{SettingFields = new List<SettingField>()};
var set = new SettingField()
{
ProviderSettingId = "test",
Value = "test"
};
AddToList(set);
NotifyPropertyChanged("IsMultiPage");
}
and here is the serialization and deserialization:
public static class Preference
{
private static SettingField _settingField;
public static List<SettingField> SavedList
{
get
{
//var savedList = Deserialize<List<string>>(Preferences.Get(nameof(SavedList), "tesr"));
var savedList = Newtonsoft.Json.JsonConvert.DeserializeObject<SettingField>(Preferences.Get(nameof(SavedList), _settingField)) ;
SavedList.Add(savedList);
return SavedList ?? new List<SettingField>();
}
set
{
var serializedList = Serialize(value);
Preferences.Set(nameof(SavedList), serializedList);
}
}
static T Deserialize<T>(string serializedObject) => JsonConvert.DeserializeObject<T>(serializedObject);
static string Serialize<T>(T objectToSerialize) => JsonConvert.SerializeObject(objectToSerialize);
}
}
But Preferences.Get doesn't take an object. Is there any other way I can save my settings to an object list? Please advise.
I would recommend you use SecureStorage. You can only save strings into it, so at the point where you have serialized your object as JSON, just save that JSON string into secure storage.
You may continue saving your serialized JSON object as a string in shared preferences, but it is recommended to use SecureStorage instead.
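As a minimal sketch (the SettingFieldStore class name and the "SavedList" key are only illustrative; SettingField and Newtonsoft.Json come from the question), you can serialize the whole list to one JSON string and store that string with Preferences or SecureStorage:
using System.Collections.Generic;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Xamarin.Essentials;
public static class SettingFieldStore
{
const string Key = "SavedList";
// Serialize the whole list to a single JSON string and store it.
public static void Save(List<SettingField> fields) =>
Preferences.Set(Key, JsonConvert.SerializeObject(fields));
// Read the JSON string back and deserialize it; fall back to an empty list.
public static List<SettingField> Load() =>
JsonConvert.DeserializeObject<List<SettingField>>(Preferences.Get(Key, "[]"))
?? new List<SettingField>();
// The same idea with SecureStorage (its API is async).
public static Task SaveSecureAsync(List<SettingField> fields) =>
SecureStorage.SetAsync(Key, JsonConvert.SerializeObject(fields));
public static async Task<List<SettingField>> LoadSecureAsync() =>
JsonConvert.DeserializeObject<List<SettingField>>(await SecureStorage.GetAsync(Key) ?? "[]")
?? new List<SettingField>();
}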
I am building a VSIX package to support a custom language in Visual Studio using MPF. I am in a custom designer and I need to find the files referenced in the project to resolve some dependencies. Where can I access this list?
I assume that you're using MPF to implement the project system for your custom language service. When doing so, you probably have a project root node which is derived from either ProjectNode or HierarchyNode...
If so, you could share the root node's instance with the designer and try to find files by traversing the hierarchy, for instance...
using System;
using System.Collections.Generic;
using System.Linq;
internal class HierarchyVisitor
{
private readonly Func<HierarchyNode, bool> filterCallback;
public HierarchyVisitor(
Func<HierarchyNode, bool> filter)
{
this.filterCallback = filter;
}
public IEnumerable<HierarchyNode> Visit(
HierarchyNode node)
{
var stack = new Stack<HierarchyNode>();
stack.Push(node);
while (stack.Any())
{
HierarchyNode next = stack.Pop();
if (this.filterCallback(next))
{
yield return next;
}
for (
HierarchyNode child = next.FirstChild;
child != null;
child = child.NextSibling)
{
stack.Push(child);
}
}
}
}
To get a list of all nodes in the hierarchy, you could just do...
ProjectNode root = ...
var visitor = new HierarchyVisitor(x => true);
IEnumerable<HierarchyNode> flatList = visitor.Visit(root);
Or to filter for a certain file type, you could try something like this...
ProjectNode root = ...
var visitor = new HierarchyVisitor((HierarchyNode x) =>
{
const string XmlFileExtension = ".xml";
string path = new Uri(x.Url, UriKind.Absolute).LocalPath;
return string.Compare(
XmlFileExtension,
Path.GetExtension(path),
StringComparison.InvariantCultureIgnoreCase) == 0;
});
IEnumerable<HierarchyNode> xmlFiles = visitor.Visit(root);
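If you then need the referenced documents' paths on disk, each node's Url can be converted the same way as in the filter above:
// Project each matching node to its local file path (same Uri conversion as above).
List<string> xmlFilePaths = xmlFiles
.Select(n => new Uri(n.Url, UriKind.Absolute).LocalPath)
.ToList();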
This question is specifically about creating a tree using multithreading and recursion.
I have code running that creates the tree using recursion, but it takes more time than I want to spend.
The reason for the slowness is that I am calling TaxonomyManager in Ektron CMS, which takes a little while to return, and all the calls add up quickly. I was wondering if there is a way to build the tree using multithreading.
(I don't have the code with me at present, but I will add it as soon as I get access to it.)
If I go this route, what are the chances of corrupting the tree, given that the tree has a single root node and multiple threads would be adding child nodes to it?
Thanks for any input anyone may have.
Edit: Added code. TaxonomyNodes is my class; it doesn't have a lot of properties: Id, Name, Description, Path (stores the path in a similar way to Ektron), a HasChildren flag, ParentId, and a public List<TaxonomyNodes> Children.
public List<TaxonomyNodes> CreateTree()
{
try
{
TaxonomyManager tManager = new TaxonomyManager();
TaxonomyCriteria criteria = new TaxonomyCriteria();
criteria.AddFilter(TaxonomyProperty.ParentId, CriteriaFilterOperator.EqualTo, 0);
criteria.OrderByDirection = EkEnumeration.OrderByDirection.Ascending;
criteria.OrderByField = TaxonomyProperty.Id;
List<TaxonomyData> tDataList = tManager.GetList(criteria);
int index = 0;
if (tDataList != null)
{
foreach (TaxonomyData item in tDataList)
{
if (item.Name != "Companies" && item.Name != "Content Information Centers")
root.Insert(index++, new TaxonomyNodes() { ParentId = 0, TaxonomyId = item.Id, TaxonomyDescription = item.Description, TaxonomyName = item.Name, TaxonomyPath = item.Path, HasChildren = item.HasChildren, Children = new List<TaxonomyNodes>() });
}
}
index = 0;
foreach (TaxonomyNodes itemT in root)
{
itemT.Children = CreateNodes(itemT.TaxonomyId, itemT);
}
return root;
}
catch (Exception)
{
throw;
}
}
private List<TaxonomyNodes> CreateNodes(long taxonomyId, TaxonomyNodes itemToAddTo)
{
try
{
TaxonomyManager tManager = new TaxonomyManager();
TaxonomyCriteria criteria = new TaxonomyCriteria();
criteria.AddFilter(TaxonomyProperty.ParentId, CriteriaFilterOperator.EqualTo, taxonomyId);
criteria.OrderByDirection = EkEnumeration.OrderByDirection.Ascending;
criteria.OrderByField = TaxonomyProperty.Id;
List<TaxonomyData> tDataList = tManager.GetList(criteria);
List<TaxonomyNodes> node = new List<TaxonomyNodes>();
if (tDataList != null)
{
foreach (TaxonomyData item in tDataList)
{
node.Add(new TaxonomyNodes() { ParentId = taxonomyId, Children = null, TaxonomyId = item.Id, TaxonomyDescription = item.Description, TaxonomyName = item.Name, TaxonomyPath = item.Path, HasChildren = item.HasChildren });
itemToAddTo.Children = node;
if (item.HasChildren)
{
CreateNodes(item.Id, node[node.Count - 1]);
}
else
{
return node;
}
}
}
return node;
}
catch (Exception)
{
throw;
}
}
Rather than get into multithreading, which, though it may work, is not officially supported by Ektron's APIs and may present other challenges, I would recommend some form of caching or other storage for the data that does not require recursive DB calls.
Options:
1 - Ektron's Taxonomy APIs do include a GetTree method, which can pull the entire tree, up to a specified number of levels, and child items in a single API call rather than recursively. This may perform better and would be easily cached (see the caching sketch after this list).
2 - Ektron provides API-level caching that can be readily enabled in the web.config by changing
<framework defaultContainer="Default" childContainer="BusinessObjects" />
To:
<framework defaultContainer="Cache" childContainer="BusinessObjects" />
3 - Use an eSync Strategy which will output the data you need (better to use your own streamlined objects than to store Ektron's with all the extra data) to something like an XML file. See this sample, http://developer.ektron.com/Templates/CodeLibraryDetail.aspx?id=1989&blogid=116, which I wrote to do this very thing. It hooks into the DB sync complete event and triggers a console application which writes the entire Taxonomy structure out to an XML file.
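For the caching side of option 1, here is a minimal sketch using System.Runtime.Caching.MemoryCache around the CreateTree method from the question (the cache key and the 20-minute expiration are arbitrary assumptions; the GetTree call itself is not shown):
using System;
using System.Collections.Generic;
using System.Runtime.Caching;
public List<TaxonomyNodes> GetCachedTree()
{
const string cacheKey = "TaxonomyTree"; // assumed key name
MemoryCache cache = MemoryCache.Default;
// Return the previously built tree if it is still cached.
var cached = cache.Get(cacheKey) as List<TaxonomyNodes>;
if (cached != null)
{
return cached;
}
// Build the tree once (the expensive, recursive part) and cache it.
List<TaxonomyNodes> tree = CreateTree();
cache.Set(cacheKey, tree, new CacheItemPolicy
{
AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(20)
});
return tree;
}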
I'm planning to implement a multi-language website. My first idea was to use resx files, but I have a requirement that every text must be editable from the administration area.
Can I build such a feature with resx files, or should I store the texts in a database (schemaless), or is there a better way to do this?
You can use XML files or SQL tables.
Prepare a page for the administrator that lists all the words to translate; based on the language the administrator is working with, update the translations in your table or XML file.
Additionally, for best performance, load each language's words into the system cache.
In your .aspx, look a word up with something like this:
<%=PLang.GetString("YourWordInEnglish")%>
The lookup function:
public static string GetString(string word)
{
try
{
if (String.IsNullOrWhiteSpace(word)) return "";
Dictionary<string, string> resourcesDictionary = GetResource(GetLanguageID());
if (resourcesDictionary != null)
{
if (!resourcesDictionary.ContainsKey(word.ToLower()))
{
Expression exp = new Expression();
exp.Word = exp.Translation = word;
exp.LanguageID = GetLanguageID();
exp.SiteID = Globals.GetSiteID();
if (exp.SiteID == 0 && exp.LanguageID == 0)
return word;
if (FLClass.createExpression(exp, ref resourcesDictionary) > 0)
return resourcesDictionary[word];
else
return word;
}
return resourcesDictionary[word.ToLower()];
}
else
return word;
}
catch
{
return word;
}
}
The web service method used for editing a translation:
public class ViewExpressionListEdit : BaseWebService
{
[WebMethod(EnableSession = true)]
public bool updateExpression(ExpressionService expressionService)
{
Expression expression = new Expression();
expression.ExpressionID = expressionService.ExpressionID;
expression.Translation = expressionService.Translation;
expression.LanguageID = expressionService.LanguageID;
expression.SiteID = Globals.GetSiteID();
return FLClass.updateExpression(expression);
}
}
You can use XML files for translations, parse them on application startup and store translations in cache. You can use the FileSystemWatcher class to see when someone updates the files and then invalidate the cache.
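A minimal sketch of that approach (the file layout of one XML file per language with <text key="..."> elements, and the class name, are assumptions rather than part of the original answer): translations are loaded into an in-memory dictionary per language, and a FileSystemWatcher drops the cached dictionary whenever its file is edited.
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Xml.Linq;
public static class TranslationCache
{
// languageCode -> (key -> translated text)
static readonly ConcurrentDictionary<string, Dictionary<string, string>> cache =
new ConcurrentDictionary<string, Dictionary<string, string>>();
static FileSystemWatcher watcher;
static string translationFolder;
public static void Start(string physicalFolder)
{
translationFolder = physicalFolder;
watcher = new FileSystemWatcher(physicalFolder, "*.xml");
// Invalidate the cached dictionary for a language whenever its file changes.
watcher.Changed += (s, e) =>
{
Dictionary<string, string> removed;
cache.TryRemove(Path.GetFileNameWithoutExtension(e.Name), out removed);
};
watcher.EnableRaisingEvents = true;
}
public static string Get(string language, string key)
{
// Load and cache the language file on first use.
var dict = cache.GetOrAdd(language, lang =>
XDocument.Load(Path.Combine(translationFolder, lang + ".xml"))
.Descendants("text")
.ToDictionary(x => (string)x.Attribute("key"), x => x.Value));
string value;
return dict.TryGetValue(key, out value) ? value : key;
}
}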
I have code like this:
public bool Set(IEnumerable<WhiteForest.Common.Entities.Projections.RequestProjection> requests)
{
var documentSession = _documentStore.OpenSession();
//{
try
{
foreach (var request in requests)
{
documentSession.Store(request);
}
//requests.AsParallel().ForAll(x => documentSession.Store(x));
documentSession.SaveChanges();
documentSession.Dispose();
return true;
}
catch (Exception e)
{
_log.LogDebug("Exception in RavenRequstRepository - Set. Exception is [{0}]", e.ToString());
return false;
}
//}
}
This code gets called many times. After around 50,000 documents have passed through it, I get an OutOfMemoryException.
Any idea why? Perhaps after a while I need to declare a new DocumentStore?
Thank you.
UPDATE:
I ended up using the Batch/Patch API to perform the update I needed.
You can see the discussion here: https://groups.google.com/d/topic/ravendb/3wRT9c8Y-YE/discussion
Basically, since I only needed to update one property on my objects, and after considering Ayende's comments about re-serializing all the objects back to JSON, I did something like this:
internal void Patch()
{
List<string> docIds = new List<string>() { "596548a7-61ef-4465-95bc-b651079f4888", "cbbca8d5-be45-4e0d-91cf-f4129e13e65e" };
using (var session = _documentStore.OpenSession())
{
session.Advanced.DatabaseCommands.Batch(GenerateCommands(docIds));
}
}
private List<ICommandData> GenerateCommands(List<string> docIds )
{
List<ICommandData> retList = new List<ICommandData>();
foreach (var item in docIds)
{
retList.Add(new PatchCommandData()
{
Key = item,
Patches = new[] { new Raven.Abstractions.Data.PatchRequest () {
Name = "Processed",
Type = Raven.Abstractions.Data.PatchCommandType.Set,
Value = new RavenJValue(true)
}}});
}
return retList;
}
Hope this helps. Thanks a lot.
I just did this for my current project. I chunked the data into pieces and saved each chunk in a new session. This may work for you, too.
Note, this example shows chunking by 1024 documents at a time, but needing at least 2000 before we decide it's worth chunking. So far, my inserts got the best performance with a chunk size of 4096. I think that's because my documents are relatively small.
internal static void WriteObjectList<T>(List<T> objectList)
{
int numberOfObjectsThatWarrantChunking = 2000; // Don't bother chunking unless we have at least this many objects.
if (objectList.Count < numberOfObjectsThatWarrantChunking)
{
// Just write them all at once.
using (IDocumentSession ravenSession = GetRavenSession())
{
objectList.ForEach(x => ravenSession.Store(x));
ravenSession.SaveChanges();
}
return;
}
int numberOfDocumentsPerSession = 1024; // Chunk size
List<List<T>> objectListInChunks = new List<List<T>>();
for (int i = 0; i < objectList.Count; i += numberOfDocumentsPerSession)
{
objectListInChunks.Add(objectList.Skip(i).Take(numberOfDocumentsPerSession).ToList());
}
Parallel.ForEach(objectListInChunks, listOfObjects =>
{
using (IDocumentSession ravenSession = GetRavenSession())
{
listOfObjects.ForEach(x => ravenSession.Store(x));
ravenSession.SaveChanges();
}
});
}
private static IDocumentSession GetRavenSession()
{
return _ravenDatabase.OpenSession();
}
Are you trying to save it all in one call?
The DocumentSession needs to turn all of the objects you pass it into a single request to the server. That means it may allocate a lot of memory for the write to the server.
Usually we recommend batches of about 1,024 items if you are doing bulk saves.
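Applied to the Set method from the question, a minimal sketch of that advice (the 1,024 batch size and the use of one short-lived session per batch are the only additions) could look like this:
public bool Set(IEnumerable<WhiteForest.Common.Entities.Projections.RequestProjection> requests)
{
const int batchSize = 1024; // recommended size for bulk saves
try
{
var batches = requests
.Select((request, index) => new { request, index })
.GroupBy(x => x.index / batchSize, x => x.request);
foreach (var batch in batches)
{
// One short-lived session per batch keeps each request to the server small
// and lets the session (and the entities it tracks) be collected afterwards.
using (var documentSession = _documentStore.OpenSession())
{
foreach (var request in batch)
{
documentSession.Store(request);
}
documentSession.SaveChanges();
}
}
return true;
}
catch (Exception e)
{
_log.LogDebug("Exception in RavenRequstRepository - Set. Exception is [{0}]", e.ToString());
return false;
}
}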
DocumentStore is a disposable class, so I worked around this problem by disposing the instance after each chunk. I highly doubt this is the most efficient way to run operations, but it will prevent significant memory overhead from happening.
I was running a sort of "delete all" operation like so. You can see the using blocks disposing both the DocumentStore and the IDocumentSession objects after each chunk.
static DocumentStore GetDataStore()
{
DocumentStore ds = new DocumentStore
{
DefaultDatabase = "test",
Url = "http://localhost:8080"
};
ds.Initialize();
return ds;
}
static IDocumentSession GetDbInstance(DocumentStore ds)
{
return ds.OpenSession();
}
static void Main(string[] args)
{
int deleteCount;
int deleteSum = 0;
do
{
using (var ds = GetDataStore())
using (var db = GetDbInstance(ds))
{
//The `Take` operation will cap out at 1,024 by default, per Raven documentation
var list = db.Query<MyClass>().Skip(deleteSum).Take(5000).ToList();
deleteCount = list.Count;
deleteSum += deleteCount;
foreach (var item in list)
{
db.Delete(item);
}
db.SaveChanges();
list.Clear();
}
} while (deleteCount > 0);
}