How long does the MarkLogic cts:register() method keep a registered query? Will using this method generate this error after a period of time? There are now many places in the system that use this method, so how should I troubleshoot this problem?
We have a Xamarin Forms app.
We have a few existing users who have been using the app since last year, where the database file name is MyAppName.db.
Our new app strategy requires us to ship the SQLite .db file along with the app, so that it includes persistent information and does not require an internet connection when you install the app; this means we intend to overwrite the db file for existing users.
After we published this change, where the database file is now MyNewDbFile.db, our users complain that they do not see any data, although the app does not crash.
We capture error reports, and we can see a common error in our tool stating "row value misused". A quick search indicates the values are not present, so the "SELECT * FROM TABLE WHERE COLUMNNAME" query does not work, causing the exception.
We can confirm we are using
Xamarin.Forms version 4.5.x
sqlite-net-sqlcipher version 1.6.292
We do not have any complex logic and cannot see a specific cause, as the issue is not reproduced when we test our apps from the Alpha channel or TestFlight.
We investigated the heart of the problem and can confirm that the culprit is the device culture information.
We had been testing with English as the device language; however, the minute the app is used with any language other than English, the query throws an exception.
We make use of the Xamarin.Essentials library to get the device location (which is optional).
The latitude and longitude returned are culture-specific.
So if the device language is Spanish, the decimal separator in the latitude and longitude values is a , (comma) and NOT a . (dot).
That is, a latitude value that would normally look like 77.1234, separated by a . (dot), is instead formatted for users with other region settings as 77,1234, separated by a , (comma).
The query requires the values as strings, and when they are not separated by a . (dot) it fails to execute:
Query:
if (deviceLocation != null)
queryBuilder.Append($" ORDER BY((Latitude - {deviceLocation.Latitude})*(Latitude - {deviceLocation.Latitude})) +((Longitude - {deviceLocation.Longitude})*(Longitude - {deviceLocation.Longitude})) ASC");
Here the deviceLocation object is of type Xamarin.Essentials.Location, which exposes double Latitude and Longitude values; but when they are interpolated into a string, or when you explicitly call deviceLocation.Latitude.ToString(), the values come back formatted according to the device CultureInfo.
We can confirm that the problem is due to this format mismatch, which causes the query to throw an exception and, in turn, makes it look as if there is no data.
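One way to address this, as a minimal sketch (assuming the same queryBuilder and deviceLocation as above), is to format the coordinates with CultureInfo.InvariantCulture so that the decimal separator is always a . (dot) regardless of the device language:
using System.Globalization;
if (deviceLocation != null)
{
    // Invariant-culture formatting keeps the '.' decimal separator on every device.
    string lat = deviceLocation.Latitude.ToString(CultureInfo.InvariantCulture);
    string lon = deviceLocation.Longitude.ToString(CultureInfo.InvariantCulture);
    queryBuilder.Append($" ORDER BY ((Latitude - {lat})*(Latitude - {lat})) + ((Longitude - {lon})*(Longitude - {lon})) ASC");
}
Alternatively, binding the doubles as query parameters (? placeholders with sqlite-net's args overloads) avoids string formatting of the coordinates entirely.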
I am running the following Wavefront query to get a SINGLE VALUE, which is the average over the given time range for which the query is being called. This works fine in the Wavefront dashboard, but it times out when called through the Wavefront REST API. Can this query be optimized so that it does not time out, or is there an issue running it through the REST API?
mavg(1vw, avg(ts(telegraf.response.times.99.percentile , accountid="123" and env="prod" and myvar!="true”)))
I tried the following, but it does not help:
mavg(1vw, avg(align(900s, mean, ts(telegraf.response.times.99.percentile , accountid=“123” and env=“prod” and myvar!=“true”))))
I found some special characters in place of the double quotes that came in while copy-pasting the query. It worked fine when I typed the whole query out manually.
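For reference, the same query with plain ASCII double quotes (the form that parses correctly):
mavg(1vw, avg(ts(telegraf.response.times.99.percentile, accountid="123" and env="prod" and myvar!="true")))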
I am trying to access the Time Entries object via the Kronos API v2.
The documentation says that there are two required Query Parameters: start_date and end_date.
I am able to query the endpoint with one of the parameters at a time, but I am not able to pass both. And I find the documentation quite lacking.
The root of the endpoint is:
https://secure.saashr.com/ta/rest/v2/companies/{cid}/time-entries
Here are the suffixes I have tried appending to the above endpoint:
?start_date=2019-11-01&end_date=2019-12-01
?start_date=2019-11-01|end_date=2019-12-01
?start_date=2019-11-01 end_date=2019-12-01
?start_date=2019-11-01?end_date=2019-12-01
?start_date=2019-11-01:end_date=2019-12-01
?filter=start_date:=:2019-11-01|end_date:=:2019-12-01
I also tried including quotes around the dates.
Everything results in a 400-level error when querying the API. With most of the above suffixes, it recognizes start_date but not end_date. In this case, the error is:
{'code': 400, 'message': 'Missing required: end_date'}]
Note: above, {cid} is replaced with the company's ID.
In summary, how should I include two query parameters in the Kronos API?
The first option is correct.
https://secure.saashr.com/ta/rest/v2/companies/{cid}/time-entries?start_date=2019-11-01&end_date=2019-12-01
should work just fine.
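For reference, a minimal C# sketch of building the request (hypothetical code inside an async method; HttpClient with illustrative variable names, authentication headers omitted). The query string starts with a single ? and each additional parameter is appended with &:
// Both parameters share one query string, separated by '&'.
var url = $"https://secure.saashr.com/ta/rest/v2/companies/{cid}/time-entries" +
          "?start_date=2019-11-01&end_date=2019-12-01";
using (var client = new HttpClient())
{
    // Add whatever authentication headers your Kronos setup requires before sending.
    var response = await client.GetAsync(url);
    Console.WriteLine(await response.Content.ReadAsStringAsync());
}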
Could you provide the full URL you set in the request?
I am attempting to create a database of Digital Object Identifier (DOI) found on the internet.
By manually searching the CommonCrawl Index Server I have obtained some promising results.
However, I wish to develop a programmatic solution.
This may mean my process only needs to read the index files and not the underlying WARC data files.
The manual steps I wish to automate are these:
1) For each currently available CommonCrawl index collection:
2) I search ... "Search a url in this collection: (Wildcards -- Prefix: http://example.com/* Domain: *.example.com)", e.g. link.springer.com/*
3) This returns almost 6 MB of JSON data that contains approximately 22K unique DOIs.
How can I browse all available CommonCrawl indexes instead of searching for specific URLs?
From reading the API documentation for CommonCrawl I cannot see how I can browse all the indexes to extract all DOIs for all domains.
UPDATE
I found this example Java code, https://github.com/Smerity/cc-warc-examples/blob/master/src/org/commoncrawl/examples/S3ReaderTest.java, that shows how to access a Common Crawl dataset.
However, when I run it, I receive this exception:
"main" org.jets3t.service.S3ServiceException: Service Error Message. -- ResponseCode: 404, ResponseStatus: Not Found, XML Error Message: <?xml version="1.0" encoding="UTF-8"?><Error><Code>NoSuchKey</Code><Message>The specified key does not exist.</Message><Key>common-crawl/crawl-data/CC-MAIN-2016-26/segments/1466783399106.96/warc/CC-MAIN-20160624154959-00160-ip-10-164-35-72.ec2.internal.warc.gz</Key><RequestId>1FEFC14E80D871DE</RequestId><HostId>yfmhUAwkdNeGpYPWZHakSyb5rdtrlSMjuT5tVW/Pfu440jvufLuuTBPC25vIPDr4Cd5x4ruSCHQ=</HostId></Error>
In fact every file I try to read results in the same error. Why is that?
What are the correct Common Crawl URIs for their datasets?
The data set location changed more than a year ago; see the announcement. However, many examples and libraries still contain the old pointers. You can access the index files for all crawls back to 2013 at s3://commoncrawl/cc-index/collections/CC-MAIN-YYYY-WW/indexes/cdx-00xxx.gz - replace YYYY-WW with the year and week of the crawl, and expand xxx to 000-299 to get all 300 index parts. New crawl data is announced on the Common Crawl group, or read more about how to access the data.
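A minimal C# sketch of reading one index part directly (inside an async method), assuming the public HTTPS mirror of the s3://commoncrawl bucket and using link.springer.com as an example filter. Each cdx line consists of a URL key, a timestamp, and a JSON record; note the parts are block-gzipped (concatenated gzip members), so the decompressor must read all members to see every record:
using System;
using System.IO;
using System.IO.Compression;
using System.Net.Http;
// Sketch: stream one cdx index part and print the records for a domain of interest.
var url = "https://commoncrawl.s3.amazonaws.com/cc-index/collections/CC-MAIN-2013-48/indexes/cdx-00000.gz";
using (var http = new HttpClient())
using (var stream = await http.GetStreamAsync(url))
using (var gzip = new GZipStream(stream, CompressionMode.Decompress))
using (var reader = new StreamReader(gzip))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // The JSON part of each line contains the originally crawled URL.
        if (line.Contains("link.springer.com"))
            Console.WriteLine(line);
    }
}
Looping the same code over cdx-00000.gz through cdx-00299.gz, and over each CC-MAIN-YYYY-WW collection, covers all index parts.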
To get the example code to work replace lines 24 and 25 with:
String fn = "crawl-data/CC-MAIN-2013-48/segments/1386163035819/warc/CC-MAIN-20131204131715-00000-ip-10-33-133-15.ec2.internal.warc.gz";
S3Object f = s3s.getObject("commoncrawl", fn, null, null, null, null, null, null);
Also note that the commoncrawl group has an updated example.
Background
I have an issue where, roughly once a month, the AIFQueueManager table is populated with ~150 records relating to messages that were sent to AX over 6 months ago (where they "successfully failed", i.e. errored due to violation of business rules but returned an exception as expected).
Question
What tables are involved in the AIF inbound message process, and in what order do events occur? E.g. the XML file is picked up and recorded in the AifDocumentLog, the data is extracted and added to the AifQueueManager and AifGatewayQueue tables, records from there are then inserted into the AifMessageLog, etc.
Thanks in advance.
There are 4 main AIF classes; I will be talking about inbound only, focusing on the included file system adapter and flat XML files. I hope this makes things a little less hazy.
AIFGatewayReceiveService - Uses adapters/channels to read messages in from different sources, and dumps them in the AifGatewayQueue table
AIFInboundProcessingService - This processes the AifGatewayQueue table data and sends to the Ax[Document] classes
AIFOutboundProcessingService - This is the inverse of #2. It creates XMLs with relevant metadata
AIFGatewaySendService - This is the inverse of #1, where it uses adapters/channels to send messages out to different locations from the AifGatewayQueue
For #1
So #1 basically fills the AifGatewayQueue, which is just a queue of work. It loops through all of your channels and then finds the relevant adapter by ClassId. The adapters are classes that implement AifIntegrationAdapter and AifReceiveAdapter if you wanted to make your own custom one. When it loops over the different channels, it then loops over each "message" and tries to receive it into the queue.
If it can't process the file for some reason, it catches the exceptions and records them in the SysExceptionTable [Basic>Periodic>Application Integration Framework>Exceptions]. These messages are scraped from the infolog, and they are generated mostly by the receive adapter, which would be AifFileSystemReceiveAdapter in my example.
For #2
So #2 is processing the inbound messages sitting in the queue (ready/inprocess). The AifRequestProcessor\processServiceRequest does the work.
From this method, it will call:
Various calls to Classes\AifMessageManager, which puts records in the AifMessageLog and the AifDocumentLog.
This key line: responseMessage = AifRequestProcessor::executeServiceOperation(message, endpointActionPolicy); which actually does the operation against the Ax[Document] classes by eventually getting to AifDispatcher::callServiceMethod(...)
It gets the return XML, packages it into an AifMessage called responseMessage, and returns that where it may be logged. It also takes that return value and, if there is a response channel, submits it back into the AifGatewayQueue.
AifQueueManager is actually cleared and populated on the fly by calling AifQueueManager::createQueueManagerData().