Put simply, I have a .NET web app and it needs to record the user's time zone information (in order to send out the correct time inside emails).
using NodaTime.TimeZones;

// tzinfo holds the TZDB/IANA zone ID reported by the client, e.g. "Asia/Kolkata"
var winmap = TzdbDateTimeZoneSource.Default.WindowsMapping.MapZones
    .FirstOrDefault(x => x.TzdbIds.Contains(tzinfo));
if (winmap == null) throw new Exception("Invalid timezone");
NodaTime 2.4.8
https://nodatime.org/2.4.x/api/NodaTime.TimeZones.TzdbDateTimeZoneSource.html#NodaTime_TimeZones_TzdbDateTimeZoneSource_WindowsMapping
The "Asia/Kolkata" timezone doesn't seem to exist, and I'm not sure what is needed to make it work. Is there a better way to achieve this?
The problem is that the Windows/TZDB mapping file doesn't contain "Asia/Kolkata"; it contains "Asia/Calcutta".
Accounting for this in user code is relatively tricky, which is why in NodaTime 3.0 we introduced TzdbDateTimeZoneSource.TzdbToWindowsIds.
After updating to 3.0, you can use:
if (!TzdbDateTimeZoneSource.Default.TzdbToWindowsIds.TryGetValue(tzinfo, out var windowsZoneId))
{
    throw new Exception($"Unmapped time zone ID '{tzinfo}'");
}
// Use windowsZoneId here
If you really need to stick with 2.4.8, you could canonicalize both tzinfo and all the entries in TzdbDateTimeZoneSource.Default.WindowsMapping.MapZones.TzdbIds, but that will generally be worse.
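For completeness, here's a minimal sketch of that 2.4.8 workaround, using TzdbDateTimeZoneSource.CanonicalIdMap to collapse aliases such as "Asia/Calcutta"/"Asia/Kolkata" onto one canonical ID before comparing (assuming tzinfo is a TZDB ID, as above):

var source = TzdbDateTimeZoneSource.Default;
// Canonicalize the incoming ID ("Asia/Kolkata" is already canonical).
if (!source.CanonicalIdMap.TryGetValue(tzinfo, out var canonicalId))
{
    throw new Exception($"Unknown time zone ID '{tzinfo}'");
}
// Canonicalize each mapped ID too ("Asia/Calcutta" becomes "Asia/Kolkata").
var winmap = source.WindowsMapping.MapZones
    .FirstOrDefault(m => m.TzdbIds.Any(id => source.CanonicalIdMap[id] == canonicalId));
if (winmap == null) throw new Exception($"Unmapped time zone ID '{tzinfo}'");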
(The update from 2.4.8 to 3.0.0 should be seamless for most users. It's a breaking change primarily due to removing binary serialization, which I hope you're not using...)
Using the older WindowsAzure.ServiceBus library, I am able to set up a SqlFilter that takes a TimeSpan as one of its parameters, but when I try the same using the Microsoft.Azure.ServiceBus library, it fails with the following error:
Object is not of supported type: TimeSpan. Only following types are supported through HTTP: string, int, long, bool, double, DateTime
What I am trying to do:
I want to have 2 subscriptions on my topic (highPriority, normalPriority)
Messages have a user property called "StartDate"
If StartDate is within 1 day of enqueueing, the message should go to the highPriority subscription; otherwise it should go to normalPriority [i.e. (StartDate - sys.EnqueuedTimeUtc) <= 24 hours].
Code that works (when using the older .NET Framework WindowsAzure.ServiceBus package):
SqlFilter highMessagesFilter =
    new SqlFilter("(StartDate - sys.EnqueuedTimeUtc) <= @TimeSpanImmediateWindow");
highMessagesFilter.Parameters.Add("@TimeSpanImmediateWindow", TimeSpan.FromDays(1));

var subscription = SubscriptionClient.CreateFromConnectionString(connectionString, topicName, subName1);
subscription.RemoveRule(RuleDescription.DefaultRuleName);
subscription.AddRule(new RuleDescription()
{
    Name = RuleDescription.DefaultRuleName,
    Filter = highMessagesFilter,
    Action = new SqlRuleAction("set priorityCalc = (StartDate - sys.EnqueuedTimeUtc)")
});
Whereas this code (using Microsoft.Azure.ServiceBus) doesn't work:
var filter = new SqlFilter("(StartDate - sys.EnqueuedTimeUtc) <= @TimeSpanHoursImmediateWindow");
filter.Parameters.Add("@TimeSpanHoursImmediateWindow", TimeSpan.FromDays(1));

var ruleDescription = new RuleDescription
{
    Filter = filter,
    Action = new SqlRuleAction(@"
        SET HighPriority = TRUE;
        SET Window = StartDate - sys.EnqueuedTimeUtc
    "),
    Name = RuleDescription.DefaultRuleName,
};

await managementClient.UpdateRuleAsync(topicPath, subscriptionName, ruleDescription);
The above code throws the following error:
Object is not of supported type: TimeSpan. Only following types are supported through HTTP: string, int, long, bool, double, DateTime
If instead of managementClient.UpdateRuleAsync, I try using the following code:
var subClient = new SubscriptionClient(connectionString, topicPath, subscriptionName);
await subClient.RemoveRuleAsync(RuleDescription.DefaultRuleName);
await subClient.AddRuleAsync(ruleDescription);
It fails with the following error (ServiceBusException):
Message: The service was unable to process the request; please retry the operation. For more information on exception types and proper exception handling, please refer to http://go.microsoft.com/fwlink/?LinkId=761101
The Microsoft link is to a list of exception types, including FilterException, but this isn't one of them!
Here is the stack trace of the 2nd exception:
at Microsoft.Azure.ServiceBus.Amqp.AmqpSubscriptionClient.OnAddRuleAsync(RuleDescription description) in C:\source\azure-service-bus-dotnet\src\Microsoft.Azure.ServiceBus\Amqp\AmqpSubscriptionClient.cs:line 132
at Microsoft.Azure.ServiceBus.SubscriptionClient.AddRuleAsync(RuleDescription description) in C:\source\azure-service-bus-dotnet\src\Microsoft.Azure.ServiceBus\SubscriptionClient.cs:line 499
at UserQuery.Main() in C:\Users\XXXX\AppData\Local\Temp\LINQPad6_quhgasgl\niqvie\LINQPadQuery.cs:line 82
So my questions are:
Can I use TimeSpan parameters with the .NET Standard library, or will I have to use the older .NET Framework library if I want to use TimeSpans?
Is there a better way to implement what I am trying to do, a way that would work with the newer .NET Standard library? (FYI: I thought about sending the calculation as a numeric message property, so the parameter would be a double instead of a TimeSpan.) And in fact, that's what I might end up doing.
If the library is throwing an exception stating that TimeSpan is not a supported type, then that's the limitation you have to work with. Note that the .NET Standard client has two implementations: the ManagementClient, and some operations exposed via entity clients such as SubscriptionClient. The latter is implemented using AMQP; the ManagementClient is entirely based on HTTP. While it would be ideal to use the AMQP implementation, it's incomplete, which is likely why modifying rules using a subscription client throws an exception. I would recommend relying on the ManagementClient.
Regarding a better way: your idea sounds right, as long as the value is of a type the new client accepts. You can also raise an issue with the library team at https://github.com/Azure/azure-sdk-for-net/issues if you'd like to know why TimeSpan is no longer supported.
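For what it's worth, here's a minimal sketch of that double-based idea (the HoursUntilStart property name and the 24-hour window are assumptions for illustration): the sender stamps the remaining window on the message as a double, and the rule then compares plain doubles, which the HTTP-based ManagementClient accepts.

// Sender side: compute the window up front as a double-valued user property.
var message = new Message(body);
message.UserProperties["HoursUntilStart"] = (startDate - DateTime.UtcNow).TotalHours;

// Rule side: double is in the list of types supported through HTTP.
var filter = new SqlFilter("HoursUntilStart <= @immediateWindowHours");
filter.Parameters.Add("@immediateWindowHours", 24.0d);

await managementClient.UpdateRuleAsync(topicPath, subscriptionName, new RuleDescription
{
    Name = RuleDescription.DefaultRuleName,
    Filter = filter,
});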
Strange problem here. In local development of an ASP.NET WebForms (4.5/4.7) app, I am finding that HttpRuntime.Cache always returns null, even when the item was properly set. I tried it on another IIS Express workstation and found the same behavior, even with a single-page test page. That same page in production on IIS 7.5 works, storing and delivering from cache. The specific code is below, but I have also tried a test that stores a simple string in HttpRuntime.Cache.
var cache = System.Runtime.Caching.MemoryCache.Default;
var luCacheKey = "lu_" + dsName;
var ic = HttpRuntime.Cache.Get(luCacheKey) as ICollection;
if (ic == null) {
And from the test page:
var item = HttpRuntime.Cache.Get("x");
if (item == null)
{
    HttpContext.Current.Cache.Insert("x", "test", null, DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
    Response.Write("added to cache<br>");
}
else
{
    Response.Write("already in cache");
}
So, I am wondering if there is something in web.config that I could look at, or is this expected IIS Express behavior? Note that System.Runtime.Caching does work properly:
var cache = System.Runtime.Caching.MemoryCache.Default;
var ic = cache[luCacheKey] as ICollection;
if (ic == null)
{
    var filterCriteria = new BinaryOperator("LookupGroup", dsName, BinaryOperatorType.Equal);
    var lookups = xpoSession.GetClassInfo(typeof(Lookups));
    ic = xpoSession.GetObjects(lookups, filterCriteria, new SortingCollection(), 0, 0, false, false);
    var cachePolicy = new System.Runtime.Caching.CacheItemPolicy() { AbsoluteExpiration = DateTime.Now + TimeSpan.FromMinutes(30) };
    cache.Add(new System.Runtime.Caching.CacheItem(luCacheKey, ic), cachePolicy);
}
You are adding your object to the cache incorrectly.
Instead of DateTime.Now, follow the docs and use DateTime.UtcNow. This resolves a common issue where your machine is in a "non-zero" time zone, which prevents the cache's internal logic from managing your expirations correctly.
From the docs
To avoid possible issues with local time such as changes from standard time to daylight saving time, use UtcNow rather than Now for this parameter value.
https://msdn.microsoft.com/en-us/library/4y13wyk9(v=vs.110).aspx
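Applied to the test page above, the fix is a one-line change (a sketch of the corrected call):

// Absolute expiration expressed in UTC, per the documented recommendation.
HttpContext.Current.Cache.Insert("x", "test", null,
    DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration);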
Adding more information as a follow-up on why the behavior may change between servers.
This change in behavior may be caused by having .NET 4.7 installed on the machine. The article linked below says that Microsoft will fix this in the next version of .NET and in the next hotfix.
Quoting parts of the Microsoft page:
Symptoms:
Assume that you have Microsoft .NET Framework 4.7 installed on a computer. When you try to insert items into the Cache object by using the Cache.Insert(string, object, CacheDependency, DateTime, TimeSpan) overload method, you may notice that the inserted Cache items expire much earlier or later than the specified DateTime (expiration time).
Cause:
The internal implementation of System.Web.Caching.Cache uses a Coordinated Universal Time (UTC) time-stamp for absolute expiration. But this particular Cache.Insert(string, object, CacheDependency, DateTime, TimeSpan) overload method does not make sure that the expiration time is converted to UTC. Therefore, expiration for items that are inserted into the Cache object by using this overload will occur earlier or later than expected, depending on the computer time zone's difference from Greenwich Mean Time (GMT).
Workaround:
The temporary workaround for this issue is to use either the Cache.Add method or a different Cache.Insert overload method.
Resolution:
This issue will be fixed in the next version of the .NET Framework, and will also be available in the next hotfix for the .NET Framework 4.7.
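As a sketch of that suggested workaround, the full Cache.Add overload takes the same expiration arguments but is not the affected Insert overload:

// Cache.Add per the KB's suggested workaround; unlike Insert, it returns the
// existing entry (or null) instead of overwriting it.
HttpContext.Current.Cache.Add("x", "test", null,
    DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration,
    CacheItemPriority.Normal, null);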
References:
https://support.microsoft.com/en-us/help/4035412/fix-expiration-time-issue-when-you-insert-items-by-using-the-cache-ins
http://vimvq1987.com/2017/08/episerver-caching-issue-net-4-7/
I want to get, from a list of files, all the files whose file date is greater than today's cutoff, so I have the following snippet:
string[] MyFiles = Directory.GetFiles(MyConfig.pathTransmittedFiles, "*.adf")
.Where(file => new FileInfo(file).LastWriteTime > dtCutOff).ToArray();
I have a file whose LastWriteTime is "{11/3/2015 1:33:26 PM}" being picked up by my collection when dtCutOff is "{11/3/2015 1:33:26 PM}"! So '>' didn't seem to work.
First, I would try running it without the Where clause, just to make sure that all files you expect are indeed part of the initial array returned from Directory.GetFiles. It's entirely possible that date/time comparison is not the source of the discrepancy. It may be more related to the issue Ivan linked to in the question comments, or it may be permission related, or some other thing.
Next, be aware that DateTime violates SRP in that it has a Kind property, which is one of the three DateTimeKind enumeration values. It's either Local, Utc, or Unspecified.
In the case of DateTime.Now, the Kind will be DateTimeKind.Local. File.GetLastWriteTime also returns its value with local kind. Therefore, if you always derive your dtCutOff from DateTime.Now in the manner you showed in the question, the comparison will almost always be correct.
The "almost" stems from the fact that DateTimeKind.Local can actually represent two different kinds under the covers. In other words, there are actually four kinds, but two of them are exposed by one. This is described as "DateTime's Deep Dark Secret" in Jon Skeet's blog post More Fun with DateTime, and is also mentioned in the comments in the .NET Framework Reference Source. In practice, you should only encounter this in the ambiguous hour during a fall-back daylight saving time transition (such as just occurred last Sunday 2015-11-01 in the US).
Now, to the more likely case: if your dtCutOff is actually derived not from DateTime.Now but rather from user input, a database lookup, or some other mechanism, then it's possible that it actually represents the local time in some time zone other than the one on your computer. In other words, if dtCutOff has a Kind of DateTimeKind.Utc, then the value is in terms of UTC. If it has a Kind of DateTimeKind.Unspecified, then the value might be in terms of UTC, or the local time zone, or some other time zone entirely.
Here's the kicker: Comparison of two DateTime values only evaluates the value underlying the Ticks property. It does not consider Kind.
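A quick illustration, with made-up values:

// Two values with identical Ticks compare equal, even though they represent
// different instants whenever the machine's time zone isn't UTC.
var utc = new DateTime(2015, 11, 3, 13, 33, 26, DateTimeKind.Utc);
var local = new DateTime(2015, 11, 3, 13, 33, 26, DateTimeKind.Local);
Console.WriteLine(utc == local); // True: Kind is ignored by the comparison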
Since file times are absolute points in universal time (on NTFS, anyway), you really should use the File.GetLastWriteTimeUtc method rather than the methods that work in local time.
There are two approaches you could use:
Load the modified property as UTC, using:
myResult.modified = File.GetLastWriteTimeUtc(myFile);
Populate dtCutOff appropriately.
If you're loading from the current time, then use DateTime.UtcNow.
If you're loading from other input, ensure the value is converted to UTC to match the input scenario. For example, use .ToUniversalTime() if the value is in terms of the local time zone, or use the conversion functions in the TimeZoneInfo class if the value is in another time zone.
OR
Change your modified property to be a DateTimeOffset instead of a DateTime.
Load that using:
myResult.modified = new DateTimeOffset(File.GetLastWriteTimeUtc(myFile));
Define dtCutOff as a DateTimeOffset, and populate appropriately.
If you're loading from the current time, then use DateTimeOffset.UtcNow.
If you're loading from other input, ensure the offset is set to match the input scenario. Use TimeZoneInfo functions if you need to convert from another time zone.
DateTimeOffset has many advantages over DateTime, such as not violating SRP. It's always representing an absolute moment in time. In this scenario, it helps to know that comparison operators on DateTimeOffset always reflect that absolute moment. (In other words, it internally adjusts to UTC before doing the comparison.)
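Applied to the snippet from the question, approach 1 might look like this (a sketch; the one-day cutoff is illustrative):

// Compare absolute instants: file times in UTC against a UTC-based cutoff.
DateTime dtCutOffUtc = DateTime.UtcNow.AddDays(-1); // illustrative cutoff
string[] MyFiles = Directory.GetFiles(MyConfig.pathTransmittedFiles, "*.adf")
    .Where(file => File.GetLastWriteTimeUtc(file) > dtCutOffUtc)
    .ToArray();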
This code works:
var cutoffDate = new DateTime(2015, 1, 1); // or whatever
var allFiles = Directory.GetFiles(MyConfig.pathTransmittedFiles, "*.adf");
var datedFiles = allFiles.Where(f => new FileInfo(f).LastWriteTime > cutoffDate);
Update:
Since your issue seems to be a precision-related one, you could change the comparison to:
const long precision = 10; // vary this as needed
allFiles.Where(f =>
    new FileInfo(f).LastWriteTime.ToFileTime() / precision > cutoffDate.ToFileTime() / precision);
Alternatively, you could use ...LastWriteTime.Ticks / TimeSpan.TicksPerMillisecond.
In addition, you may need to convert all DateTime values to UTC (LastWriteTimeUtc and DateTime.UtcNow) to make sure it's not some weird time zone issue.
Since the files are fed into the queue once a day, precision down to the millisecond is not required, so a one-second TimeSpan allowance is acceptable to do the trick and make my case work:
string[] MyFiles = Directory.GetFiles(MyConfig.pathTransmittedFiles, "*.adf")
.Where(file => new FileInfo(file).LastWriteTime - TimeSpan.FromSeconds(1) > dtCutOff)
.ToArray();
Now my file with modified date "{11/3/2015 1:33:26 PM}" doesn't go into my collection when my cutoff date is "{11/3/2015 1:33:26 PM}", while my other file with modified date "{11/3/2015 1:33:27 PM}" successfully passes into my collection as expected! So it works, and that's how it should work after all this advice. Thanks, all.
Looks like your Where clause lambda might be incorrect. Try this.
string[] MyFiles = Directory.GetFiles(MyConfig.pathTransmittedFiles, "*.adf").Where(file => new FileInfo(file).LastWriteTime > dtCutOff).ToArray();
I was wondering if anyone can offer any pointers on this one. I'm trying to return ItemStats from the Tridion UGC web service, but I'm getting the following error when trying to bind the results:
The closed type TridionWebUGC.CDS.ItemStat does not have a corresponding LastRatedDate settable property.
An example of the code is:
WebServiceClient ugcCall2 = new WebServiceClient();
Uri uri = new Uri("http://new.ugc.service/odata.svc");
CDS.ContentDeliveryService cds = new CDS.ContentDeliveryService(uri);
var myItemStats = cds.ItemStats.Where(p => p.PublicationId == 68 && p.Id == 17792 && p.Type==16);
I can get comments and ratings with no problem. E.g.
var myComments = cds.Comments.Where(p => p.ItemId == 17805).OrderBy(p => p.CreationDate);
It's just ItemStats that are giving me an issue. Anybody any ideas?
Thanks
John
Unfortunately, the metadata of the UGC web service is not correct with regard to ItemStats. For you it means that the web service metadata does not expose the fact that the ItemStat entity contains the LastRatedDate property. This leaves your .NET proxies unaware of the property and makes your query fail.
To work around this defect you have two options:
Add the following property setting to your service context: cds.IgnoreMissingProperties = true;. The advantage of this approach is that you're done with it in two seconds. The disadvantage is that you will not be able to access that property (in case you actually need it).
Modify the proxies generated by Visual Studio and manually add that property to the ItemStat class. The advantage of this approach is that you will be able to access the property from your project. The disadvantage is that it's hard to manage from a coding point of view: you need to be careful when you upgrade or regenerate the proxies, and it's easy to make a mistake while manually adding the property. A sketch of both options follows.
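Sketches of both options (assuming the ContentDeliveryService context derives from DataServiceContext and that the proxies are generated as partial classes, as Visual Studio typically emits; the DateTime? property types are my assumption):

// Option 1: tolerate entity properties that the $metadata doesn't declare.
cds.IgnoreMissingProperties = true;

// Option 2: declare the missing properties next to the generated proxy,
// ideally in a separate file so a regeneration is easier to reconcile.
public partial class ItemStat
{
    public DateTime? LastRatedDate { get; set; }
    public DateTime? LastCommentedDate { get; set; }
}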
Note 1: to access the metadata of your web service from the browser, you can go to /odata.svc/$metadata.
Note 2: on closer inspection, there are 2 properties missing from the web service metadata: LastRatedDate and LastCommentedDate.
Hope this helps.
For a web application, I need to get a list or collection of all SalesOrders that meet the following criteria:
Have a WarehouseKey.ID equal to "test", "lucmo" or "Inno"
Have Lines that have a QuantityToBackorder greater than 0
Have Lines that have a RequestedShipDate greater than the current day.
I've successfully used these two methods to retrieve documents, but I can't figure out how to return only the ones that meet the above criteria:
http://msdn.microsoft.com/en-us/library/cc508527.aspx
http://msdn.microsoft.com/en-us/library/cc508537.aspx
Please help!
Short answer: your query isn't possible through the GP Web Services. Even your warehouse key isn't an accepted criterion for GetSalesOrderList. To do what you want, you'll need to drop to eConnect or direct table access. eConnect has come a long way in .NET if you use the Microsoft.Dynamics.GP.eConnect and Microsoft.Dynamics.GP.eConnect.Serialization libraries (which I highly recommend). Even in eConnect, though, you're stuck with querying based on the document header rather than line item values, so direct table access may be the only way you're going to make it work.
In eConnect, the key piece you'll need is generating a valid RQeConnectOutType. Note the "FORLIST = 1" part. That's important. Since I've done something similar, here's what it might start out as (you'd need to experiment with the capabilities of the WhereClause; I've never done more than a straightforward equal):
private RQeConnectOutType getRequest(string warehouseId)
{
    eConnectOut outDoc = new eConnectOut()
    {
        DOCTYPE = "Sales_Transaction",
        OUTPUTTYPE = 1,
        FORLIST = 1,
        INDEX1FROM = "A001",
        INDEX1TO = "Z001",
        WhereClause = string.Format("WarehouseId = '{0}'", warehouseId)
    };

    RQeConnectOutType outType = new RQeConnectOutType()
    {
        eConnectOut = outDoc
    };
    return outType;
}
If you have to drop to direct table access, I recommend going through one of the built-in views. In this case, it looks like ReqSOLineView has the fields you need (LOCNCODE for the warehouseIds, QTYBAOR for backordered quantity, and ReqShipDate for requested ship date). Pull the SOPNUMBE and use them in a call to GetSalesOrderByKey.
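If you go the direct-table route, here's a rough sketch of that query (the gpConnectionString name and the exact predicates are assumptions; the column names are the ones from the view described above):

// Query ReqSOLineView for the SOP numbers matching the criteria, then feed
// each SOPNUMBE to GetSalesOrderByKey. GP pads char columns, hence TrimEnd.
var sopNumbers = new List<string>();
using (var conn = new SqlConnection(gpConnectionString))
using (var cmd = new SqlCommand(
    @"SELECT DISTINCT SOPNUMBE
      FROM ReqSOLineView
      WHERE LOCNCODE IN ('test', 'lucmo', 'Inno')
        AND QTYBAOR > 0
        AND ReqShipDate > @today", conn))
{
    cmd.Parameters.AddWithValue("@today", DateTime.Today);
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            sopNumbers.Add(reader.GetString(0).TrimEnd());
    }
}
// Call GetSalesOrderByKey for each entry in sopNumbers.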
And yes, hybrid solutions kinda suck rocks, but I've found you really have to adapt if you're going to use GP Web Services for anything with any complexity to it. Personally, I isolate my libraries by access type and then use libraries specific to whatever process I'm using to coordinate them. So I have Integration.GPWebServices, Integration.eConnect, and Integration.Data libraries that I use practically everywhere and then my individual process libraries coordinate on top of those.