Using CSOM to update multiple tasks with only 1 project publish - ms-project

I have the following code that updates certain task fields. The problem is that it publishes the entire project after updating each task, one by one. So, for example, if I had 500 tasks to update, the project would be published 500 times. As you can see, this is complete overkill: slow and unnecessary.
using (ProjectContext projectContext = GetClientContext(psCred))
{
    projectContext.Load(projectContext.Projects);
    projectContext.ExecuteQuery();

    foreach (var project in projectContext.Projects)
    {
        var draftProject = project.CheckOut();
        projectContext.Load(draftProject.Tasks);
        projectContext.ExecuteQuery();

        foreach (var task in draftProject.Tasks)
        {
            task["FIELD"] = "VALUE";

            // This queues a save and a publish for every single task -- the bottleneck.
            var job = draftProject.Update();
            projectContext.WaitForQueue(job, int.MaxValue);
            job = draftProject.Publish(true);
            projectContext.WaitForQueue(job, int.MaxValue);
        }
    }
}
I hope there is a way to update all of the project tasks at once, with only one publish at the end, just like the Microsoft Project desktop application does it.

For what it's worth, I was able to create a bunch of tasks and then use this code to update a custom field on all of the new/existing tasks with a single publish:
DraftProject projCheckedOut = proj2Edit.CheckOut();
_projContext.Load(projCheckedOut.Tasks, ts =>
    ts.Include(
        t => t.Id,
        t => t.Name,
        t => t.CustomFields,
        t => t.OutlineLevel,
        t => t.IsSummary));
_projContext.ExecuteQuery();

// Look up the custom field's internal name from our StringDictionary
string intNamePref = _customFields["Custom Field1"];

var tasks = projCheckedOut.Tasks;
foreach (var t in tasks)
{
    t[intNamePref] = 2;
}

projCheckedOut.Publish(true);
_projContext.ExecuteQuery();
Tip: to get the .Include lambda to work, I had to add this using directive:
using Microsoft.SharePoint.Client;

Related

Getting started querying sync data from Realm

I'm attempting to get a list of items from a MongoDB Atlas instance via Realm into my Xamarin.Forms application. I'm pretty sure I've followed the instructions correctly; however, I can't seem to see any values come in.
I'm using Realm and Realm.Fody version: 10.2.0
I've uploaded my data with the following script
mongoimport --uri mongodb+srv://[user]:[pass]@cluster0.[wxyz].mongodb.net/[company] --collection products --type json --file products --jsonArray
1057 document(s) imported successfully. 0 document(s) failed to import.
When I go to MongoDB Atlas, I see this
In my App.xaml.cs constructor, I create a realm app and log in with anonymous credentials
public const string MasterDataPartitionKey = "master_data";

ctor()
{
    var app = Realms.Sync.App.Create("my-app-id");
    app.LogInAsync(Credentials.Anonymous()).ContinueWith(task => User = task.Result);
}
After this, in my first ViewModel constructor (it's Rx, but isn't doing anything fancy)
ctor()
{
    _listProductsCommand = ReactiveCommand.Create<Realm, IEnumerable<Product>>(ExecuteLoadProducts);
    _listProductsCommand.ThrownExceptions.Subscribe(x => { /* checking for errors here */ }).DisposeWith(TrashBin);

    Initialize
        .ObserveOn(RxApp.MainThreadScheduler)
        // note: MasterDataPartitionKey is the value I've set on every single record's `_partitionKey` property.
        .Select(_ => new SyncConfiguration(MasterDataPartitionKey, App.User))
        .SelectMany(Realm.GetInstanceAsync)
        .Do(_ => { }, ex => { /* checking for errors here */ })
        .ObserveOn(RxApp.MainThreadScheduler)
        .InvokeCommand(this, x => x._listProductsCommand)
        .DisposeWith(TrashBin);
}
And later, a simple query
private static IEnumerable<Product> ExecuteLoadProducts(Realm realm)
{
    var data = realm.All<ProductDto>();
    var productDtos = data.ToList();
    var products = productDtos.Select(x => x.Map());
    return products;
}
Expected:
productDtos (and products) should have a count of 1057
Actual:
productDtos (and products) have a count of 0
I've looked in my Realm logs and I can see the connections and sync logs
My anonymous authentication is turned on
and I've made it so that all of my Collections have read access (in fact developer mode is on)
Here's an example of one of the records
Here's a snippet of the DTO I'm trying to pull down
I feel as though I must be missing something simple. Can anyone see anything obvious that could be sending me sideways?

Bulk insert Azure Cosmos DB

I found some samples stating that the following should perform bulk inserts:
var options = new CosmosClientOptions() { AllowBulkExecution = true, MaxRetryAttemptsOnRateLimitedRequests = 1000 };
Client = new CosmosClient(ConnStr, options);

public async Task AddVesselsFromJSON(List<JObject> vessels)
{
    List<Task> concurrentTasks = new List<Task>();
    foreach (var vessel in vessels)
    {
        concurrentTasks.Add(VesselContainer.UpsertItemAsync(vessel));
    }
    await Task.WhenAll(concurrentTasks);
}
I am running the code on an Azure Function (App Plan) with 10 instances. However, I can see it is only doing around 4 inserts per second. With SQL bulk insert I can do thousands a second. It does not seem like the above is bulk inserting; have I missed something?
Check your Cosmos DB Scale settings. I ran into the same issue. When you change from manual to autoscale, the default max RUs are set to 4000. Change it to an appropriate number based on your scenario. You can use https://cosmos.azure.com/capacitycalculator/
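As an aside (not part of the answer above): with AllowBulkExecution the SDK batches whatever operations are in flight, but firing thousands of upserts at once can still overwhelm the client and provisioned throughput. One common way to cap in-flight work is a SemaphoreSlim gate. Below is a minimal, Cosmos-free sketch of that pattern; `operation` is a hypothetical stand-in for a call like the question's `VesselContainer.UpsertItemAsync`:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public static class BoundedUpserts
{
    // Runs the given async operation over all items, with at most
    // maxConcurrency operations in flight at any moment.
    public static async Task RunAsync<T>(IEnumerable<T> items, Func<T, Task> operation, int maxConcurrency)
    {
        using var gate = new SemaphoreSlim(maxConcurrency);
        var tasks = items.Select(async item =>
        {
            await gate.WaitAsync();          // block when the limit is reached
            try { await operation(item); }
            finally { gate.Release(); }      // free a slot for the next item
        }).ToList();
        await Task.WhenAll(tasks);
    }
}
```

With the Cosmos SDK you would pass the container's upsert call as `operation`; the gate then keeps retries and client-side queuing manageable while bulk mode still batches the concurrent requests.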

How to avoid multiple repeat database queries

We are using ASP.NET MVC with Entity Framework.
We have a number of queries that return large, static data sets. I'm wondering what is the best and simplest way to avoid querying the data each time.
We currently use a custom cache, but it seems to have issues. I'm wondering how others achieve this.
For example, I have a locations data set with over 10k records.
It's used throughout the site; it can be queried once when the app starts and doesn't need to be fetched again.
Just use the System.Runtime.Caching.MemoryCache
Maybe this helps:
void SomeAction()
{
    var test1 = LoadIds();
    var test2 = LoadIds();
    // test1 & test2 should be the same
}

int[] LoadIds()
{
    var random = new Random();
    return MemoryCache.Default.GetCached("LoadIds", 60, () => new int[10].Select(x => random.Next()).ToArray());
}

public static class CacheExtensions
{
    public static T GetCached<T>(this ObjectCache cache, string key, int cacheTime, Func<T> acquire)
    {
        if (cache.Contains(key))
            return (T)cache[key];

        var result = acquire();
        if (cacheTime > 0)
            cache.Add(new CacheItem(key, result), new CacheItemPolicy { AbsoluteExpiration = DateTime.Now + TimeSpan.FromMinutes(cacheTime) });
        return result;
    }
}
Now instead of using this random initialization...
MemoryCache.Default.GetCached("LoadIds", 60, () => new int[10].Select(x => random.Next()).ToArray());
... you can use your query. Just make sure to add a .ToList() or .ToArray() at the end, since you want to cache the data and not just the query. e.g.
MemoryCache.Default.GetCached("LoadIds", 60, () => dbContext.Persons.Select(x => x.Id).ToArray());
Or in your case maybe
MemoryCache.Default.GetCached("LoadLocations", 60, () => dbContext.Locations.ToArray());
The extension method above is not perfect, though: if two requests ask for the same key at the same time, it may load the data twice.
I'm using this method in large-scale web apps, while abstracting out the MemoryCache so it can be replaced with other types of caches like Redis, Azure Storage, or even HttpContext.Current.Items for per-request caching.
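If that double-load race matters for your expensive queries, one option is to combine ConcurrentDictionary.GetOrAdd with Lazy<T>, which guarantees the acquire delegate runs at most once per key even under concurrent access. This is a minimal sketch, not the answer's code, and it deliberately omits expiration:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

public static class SingleFlightCache
{
    private static readonly ConcurrentDictionary<string, Lazy<object>> Items =
        new ConcurrentDictionary<string, Lazy<object>>();

    public static T GetOrLoad<T>(string key, Func<T> acquire)
    {
        // GetOrAdd may construct a Lazy that loses the race, but only the
        // winning Lazy is ever stored, and ExecutionAndPublication ensures
        // its value factory executes exactly once.
        var lazy = Items.GetOrAdd(key,
            _ => new Lazy<object>(() => acquire(), LazyThreadSafetyMode.ExecutionAndPublication));
        return (T)lazy.Value;
    }
}
```

The trade-off versus the MemoryCache extension above is that entries never expire here; in practice you would wrap this with an eviction policy or keep using MemoryCache and only gate the acquire call this way.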

Fetching data from cache if available, fetch from database if not

I have a page that need to run a query against a large dataset very often. To ease the burden on the database, I've set up a cache that will refresh itself every 5 minutes.
The logic is:
When a call is made, check whether there is data in the cache. If there is, run the query against the cache. If not, start a task that fetches all rows from the database, while running a query against my repository to get just the data needed for that call. When all rows are fetched, put them in the cache so they can be used on the next call.
The problem is that I sometimes get: "There is already an open DataReader associated with this Command which must be closed first." I guess this is because it runs two queries against the same repository at the same time (one for all rows and one for the call's data). I've got MARS enabled in my connection string.
My code
public IQueryable<TrackDto> TrackDtos([FromUri] int[] Ids)
{
    if (HttpContext.Current.Cache["Tracks"] != null && ((IQueryable<TrackDto>)HttpContext.Current.Cache["Tracks"]).Any())
    {
        var trackDtos = Ids.Length > 0
            ? ((IQueryable<TrackDto>)HttpContext.Current.Cache["Tracks"]).Where(trackDto => Ids.Contains(trackDto.Id)).AsQueryable()
            : ((IQueryable<TrackDto>)HttpContext.Current.Cache["Tracks"]).AsQueryable();
        return trackDtos;
    }
    else
    {
        UpdateTrackDtoCache(DateTime.Today);
        var trackDtos = Ids.Length > 0
            ? WebRepository.TrackDtos.Where(trackDto => trackDto.Date == DateTime.Today && Ids.Contains(trackDto.Id)).AsQueryable()
            : WebRepository.TrackDtos.Where(trackDto => trackDto.Date == DateTime.Today).AsQueryable();
        return trackDtos;
    }
}

private IQueryable<TrackDto> MapTrackDtosFromDb(DateTime date)
{
    return WebRepository.TrackDtos.Where(tdto => tdto.Date == date.Date);
}

private void UpdateTrackDtoCache(DateTime date)
{
    if (CacheIsUpdating)
        return;

    CacheIsUpdating = true;
    var task = Task.Factory.StartNew(
        state =>
        {
            var context = (HttpContext)state;
            context.Cache.Insert("Tracks", MapTrackDtosFromDb(date), null, Cache.NoAbsoluteExpiration,
                new TimeSpan(0, 5, 0));
            CacheIsUpdating = false;
        },
        HttpContext.Current);
}
I believe you are running DML or DDL SQL queries on the same active connection, and MARS does not allow that. You can execute multiple SELECT statements or bulk inserts, but running multiple UPDATE or DELETE statements will throw this kind of error. Even running an UPDATE query while a SELECT statement is executing on the same command will produce it. For more info read this:
http://msdn.microsoft.com/en-us/library/h32h3abf(v=vs.110).aspx
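One more thing worth checking (my observation, separate from the MARS point above): UpdateTrackDtoCache inserts MapTrackDtosFromDb(date) into the cache, which is an unenumerated IQueryable, i.e. a query, not data. Every later read of Cache["Tracks"] then re-executes that query against the database, possibly while another query is active on the same context. Materializing with .ToList() before Cache.Insert stores the rows themselves. Plain LINQ-to-objects demonstrates the same deferred-execution behavior:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class DeferredExecutionDemo
{
    // Counts how many elements have flowed through the query pipeline.
    public static int QueryExecutions;

    // Builds a query over in-memory data. Nothing runs here: the Select
    // pipeline only executes when someone enumerates the result, which is
    // exactly how an EF IQueryable behaves against a database.
    public static IEnumerable<int> BuildQuery(IEnumerable<int> source)
    {
        return source.Select(x => { QueryExecutions++; return x * 2; });
    }
}
```

Caching the return value of BuildQuery caches the pipeline, so every enumeration of the cached object pays the full cost again; calling .ToList() once and caching the list takes a snapshot that later reads can use for free.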

How to configure links with HasNavigationPropertiesLink after latest WebAPI beta (26-Jun-2013) Update

I have a basic POCO (No database) structure implementing an OData Service with the latest WebAPI update. Unfortunately, the latest update broke the HasNavigationPropertiesLink code that I had to generate links which can be used for $expand operations. Here is my old code:
var jobs = modelBuilder.EntitySet<Job>("Jobs");
jobs.HasNavigationPropertiesLink(
    jobs.EntityType.NavigationProperties,
    (entityContext, navigationProperty) => new Uri(entityContext.UrlHelper.Link(ODataRouteNames.PropertyNavigation,
        new
        {
            Controller = "Jobs",
            parentId = entityContext.EntityInstance.ID,
            NavigationProperty = navigationProperty.Name
        })));
And here is my new code (that doesn't work):
var jobs = modelBuilder.EntitySet<Job>("Jobs");
jobs.HasNavigationPropertiesLink(
    jobs.EntityType.NavigationProperties,
    (entityContext, navigationProperty) => new Uri(entityContext.Url.Link(<??WHAT GOES HERE??>,
        new
        {
            Controller = "Jobs",
            parentId = entityContext.EdmObject,
            NavigationProperty = navigationProperty.Name
        })),
    true);
Any help is much appreciated - this doesn't seem to have been documented in the updates.
Looks like the version of the OData bits you are using is very old. In our current version, you can use the ODataConventionModelBuilder to create a model that defines navigation properties and links following conventions, so unless you need to generate custom links, it's a better way to go. However, if you do want to generate a custom navigation link, the link generation code looks something like this:
var jobs = builder.EntitySet<Job>("Jobs");
jobs.HasNavigationPropertiesLink(jobs.EntityType.NavigationProperties,
    (context, navigationProperty) =>
    {
        var result = "http://mydomain.com/prefix/odataPath";
        // In order to generate this link you can use context.Url.ODataLink(new EntityPathSegment("Jobs"), ...);
        return new Uri(result);
    }, followsConventions: true);
It is better to use the ODataConventionModelBuilder, as Javier has suggested. But if you still want to set up your own OData model, you can do it like this:
var jobs = builder.EntitySet<Job>("Jobs");
jobs.HasNavigationPropertiesLink(jobs.EntityType.NavigationProperties,
    (context, navigationProperty) => context.GenerateNavigationPropertyLink(navigationProperty, false),
    followsConventions: true);