azure time series insights explorer issues - azure-timeseries-insights

We have started working on integrating Azure Time Series Insights. When we send a simple payload format to TSI, we are able to see the data in TSI, e.g.: { "DeviceID":"dev1","temp":10.4,"pressure":"20.4"}.
We were able to see the data in the explorer (in all explorers) as dev1 -> temp or dev1 -> pressure and were able to plot it.
But when we try to send the packet in the format below, we are unable to get the tagId values under deviceId. We can only see eventData, and we can plot it, but when we explore the event data we can see that the data actually exists. I am not sure what I am missing here. By the way, we are using the Gen2 (L2) version.
[{
"deviceId": "RDevice01",
"timestamp": "2020-25-01A09:25:45:4840",
"series": [{
"tagId": "Axis1",
"value": 0.75
}, {
"tagId": "Axis2",
"value": 0.001
}, {
"tagId": "Axis3",
"value": 0.001
}, {
"tagId": "Axis4",
"value": -4.08319
}, {
"tagId": "Axis5",
"value": -1.93166
}, {
"tagId": "Axis6",
"value": -4.08319
}, {
"tagId": "ErrorAxis1",
"value": "String 0"
}, {
"tagId": "ErrorAxis2",
"value": "String 1"
}, {
"tagId": "ErrorAxis3",
"value": "String 2"
}, {
"tagId": "ErrorAxis4",
"value": "String 3"
}, {
"tagId": "ErrorAxis5",
"value": "String 4"
}, {
"tagId": "ErrorAxis6",
"value": "String 5"
}]
}]

If you've changed your JSON telemetry payload such that you now have nested JSON and your series are within an array, you'll need to have a TS ID (composite or singular) within the array objects to trigger flattening; see here: https://learn.microsoft.com/en-us/azure/time-series-insights/concepts-json-flattening-escaping-rules#example-b
You can re-create the TSI environment with a composite TS ID of deviceId and tagId, and then it will work as you expect.
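For reference, with a composite Time Series ID of (deviceId, tagId), each object in the series array should be flattened into its own event, roughly like the sketch below (a simplified illustration only; the exact stored property names follow the flattening and escaping rules in the linked article):
{ "deviceId": "RDevice01", "tagId": "Axis1", "value": 0.75, "timestamp": "2020-01-25T09:25:45.4840" }
{ "deviceId": "RDevice01", "tagId": "Axis2", "value": 0.001, "timestamp": "2020-01-25T09:25:45.4840" }
In the explorer you should then be able to browse RDevice01 -> Axis1, RDevice01 -> Axis2, and so on, instead of a single eventData blob.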

Related

Not able to filter out required property in Azure TSI Gen1 Get Events API response

I am using the below request body to fetch only the required property values.
"searchSpan": {
"from": {
"dateTime": "2021-11-20T00:00:00.000Z"
},
"to": {
"dateTime": "2021-11-20T23:00:00.000Z"
}
},
"predicateString": "[Params.Name] = 'power'",
"take": 100
}
}
The URL is like below:
https://12345678a-bcde-3e91-blah-2292933292aa.env.timeseries.azure.com/events?api-version=2016-12-12
Despite specifying the required property, the response returns all properties, as if it had not seen the predicate string. What might I be doing wrong?
{
"warnings": [],
"events": [
{
"schema": {
"rid": 0,
"$esn": "my-event-hub",
"properties": [
{
"name": "mytimestamp",
"type": "DateTime"
},
{
"name": "Params.Name",
"type": "String"
},
{
"name": "Params.Value",
"type": "Double"
}
]
},
"$ts": "2021-11-20T10:01:50Z",
"values": [
"2021-11-20T10:01:50Z",
"energy",
60
]
},
{
"schemaRid": 0,
"$ts": "2021-11-20T10:01:50Z",
"values": [
"2021-11-20T10:01:50Z",
"power",
10
]
},
{
"schemaRid": 0,
"$ts": "2021-11-20T10:01:50Z",
"values": [
"2021-11-20T10:01:50Z",
"strength",
200
]
}
]
}
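For reference, the Gen1 query syntax docs describe the Get Events body with the predicate string wrapped in a predicate object and the row limit under top; as far as I can tell, the documented shape would be roughly this (same values as above; property names per the Gen1 query docs, shown only as a sketch):
{
"searchSpan": {
"from": { "dateTime": "2021-11-20T00:00:00.000Z" },
"to": { "dateTime": "2021-11-20T23:00:00.000Z" }
},
"predicate": { "predicateString": "[Params.Name] = 'power'" },
"top": {
"sort": [ { "input": { "builtInProperty": "$ts" }, "order": "Asc" } ],
"count": 100
}
}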
Edit
I'm getting "Properties count error" in the TSI overview page. This might quite be the root cause but I don't know for sure
"For Time Series Insights environment ABC: You have used all 641/600 properties in your environment".

Logic App > Cosmos PartitionKey Not Matching Error

I'm scared to put this out there because it should be so easy. I am facing the same issue as the posts here, here and here, and I have tried each of the answers to no avail. Below are the current resulting input (redacted) and the related code view of the inputs.
The Result
{
"method": "post",
"headers": {
"x-ms-documentdb-raw-partitionkey": "\"2020\""
},
"path": "/dbs/xxxx/colls/smtp/docs",
"host": {
"connection": {
"name": "/subscriptions/..."
}
},
"body": {
"category": [
[
"cat facts"
]
],
"email": "example#test.com",
"event": "processed",
"id": "yada",
"partitionKey": "\"2020\"",
"sg_event_id": "yada yada",
"sg_message_id": "yada",
"smtp-id": "yada",
"timestamp": 1604345542
}
}
The Code View
{
"inputs": {
"body": {
"category": [
"#items('For_each')['category']"
],
"email": "#items('For_each')['email']",
"event": "#items('For_each')['event']",
"id": "#items('For_each')['sg_message_id']",
"partitionKey": "\"#{formatDateTime(utcNow(),'yyyy')}\"",
"sg_event_id": "#items('For_each')['sg_event_id']",
"sg_message_id": "#items('For_each')['sg_message_id']",
"smtp-id": "#items('For_each')['smtp-id']",
"timestamp": "#items('For_each')['timestamp']"
},
"headers": {
"x-ms-documentdb-raw-partitionkey": "\"#{formatDateTime(utcNow(),'yyyy')}\""
}
}
The error I'm getting is the usual one: "PartitionKey extracted from document doesn't match the one specified in the header".
I just can't see what I'm missing here now.
Thanks all.
First, as Matias commented, check your partition key path.
Then, change "partitionKey": "\"@{formatDateTime(utcNow(),'yyyy')}\"", to "partitionKey": "@{formatDateTime(utcNow(),'yyyy')}", in your document body.
It works fine on my side.
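For clarity, a sketch of the code view after that change (only the body partitionKey loses the escaped quotes; the x-ms-documentdb-raw-partitionkey header keeps the quoted value):
"body": {
...
"partitionKey": "@{formatDateTime(utcNow(),'yyyy')}",
...
},
"headers": {
"x-ms-documentdb-raw-partitionkey": "\"@{formatDateTime(utcNow(),'yyyy')}\""
}
That way the document's partitionKey property is the plain string 2020, which should match the raw partition key "2020" specified in the header.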

Data Factory: JSON data is interpreted as expression - ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression

I want to copy items from
CosmosDB databaseA/productCollection
to
CosmosDB databaseB/productCollection
Therefore I decided to use Azure Data Factory.
I also activated "Export as-is to JSON files or Cosmos DB collection".
The read operation works as expected.
Unfortunately, the write operation stops because of an error related to the data:
ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression 'Currency'
{
"ProductName": "Sample",
"Price": {
"#Currency": "GBP",
"$": "2624.83"
}
}
I'm not able to change the input data itself.
The output data has to equal the input data.
Is there a possibility that @Currency will not be interpreted as an expression?
In ARM, this part is failing:
Price.{@Currency}
I had the same problem and was able to resolve it as follows.
I am using a pipeline with a source that is a dataset referencing JSON data.
Opening the dataset's JSON definition (via the code-view button), I had to change the JSON from
{
"name": "SourceDataset",
"properties": {
"linkedServiceName": {
"referenceName": "StorageAccountLink",
"type": "LinkedServiceReference"
},
"annotations": [],
"type": "Json",
"typeProperties": {
"location": {
"type": "AzureBlobStorageLocation",
"container": "test-data"
}
},
"schema": {
"type": "object",
"properties": {
"#context": {
"type": "string"
},
"value": {
"type": "array",
"items": {
"type": "object",
"properties": {
"id": {
"type": "string"
}
}
}
}
}
}
}
}
To (escaping the @ with @@):
{
"name": "SourceDataset",
"properties": {
"linkedServiceName": {
"referenceName": "StorageAccountLink",
"type": "LinkedServiceReference"
},
"annotations": [],
"type": "Json",
"typeProperties": {
"location": {
"type": "AzureBlobStorageLocation",
"container": "test-data"
}
},
"schema": {
"type": "object",
"properties": {
"##context": {
"type": "string"
},
"value": {
"type": "array",
"items": {
"type": "object",
"properties": {
"id": {
"type": "string"
}
}
}
}
}
}
}
}
I tried to reproduce your issue, but it works for me. I used a copy activity to transfer data from account A to account B.
Additionally, if this operation only needs to be executed once, please consider using the Azure Cosmos DB Data Migration Tool. It's free to use. You could export the data from Cosmos DB A as a JSON file and then import it into Cosmos DB B very simply. It can also be run from the command line, so it can be set up as a scheduled job on Windows.
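If you go the migration-tool route, a rough sketch of a one-off command-line copy might look like the following (the connection strings are placeholders, and the switches are from memory of the tool's documentation, so please verify them before use):
dt.exe /s:DocumentDB /s.ConnectionString:"AccountEndpoint=https://<accountA>.documents.azure.com:443/;AccountKey=<keyA>;Database=databaseA" /s.Collection:productCollection /t:DocumentDBBulk /t.ConnectionString:"AccountEndpoint=https://<accountB>.documents.azure.com:443/;AccountKey=<keyB>;Database=databaseB" /t.Collection:productCollection
The same tool can also target a JSON file first (and read it back as a source) if you prefer the two-step export/import described above.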

How can I select a filtered child document collection when querying a top level document in cosmos db

I'm trying to filter the child documents returned when querying a parent document using the SQL API in Cosmos DB.
For example, given this document:
{
"customerName": "Wallace",
"customerReference": 666777,
"orders": [
{
"date": "20181105T00:00:00",
"amount": 118.84,
"description": "Laptop Battery"
},
{
"date": "20181105T00:00:00",
"amount": 81.27,
"description": "Toner"
},
{
"date": "20181105T00:00:00",
"amount": 55.12,
"description": "Business Cards"
},
{
"date": "20181105T00:00:00",
"amount": 281.00,
"description": "Espresso Machine"
}]
}
I would like to query the customer to retrieve the name, reference, and orders over 100.00, to produce a result like this:
[{
"customerName": "Wallace",
"customerReference": 666777,
"orders": [
{
"date": "20181105T00:00:00",
"amount": 118.84,
"description": "Laptop Battery"
},
{
"date": "20181105T00:00:00",
"amount": 281.00,
"description": "Espresso Machine"
}]
}]
The query I have so far is as follows:
SELECT c.customerName, c.customerReference, c.orders
from c
where c.customerReference = 666777
and c.orders.amount > 100
This returns an empty set:
[]
If I remove "and c.orders.amount > 100", it matches the document and returns all orders.
To reproduce this issue I simply set up a new database, added a new collection, and copied the JSON example in as the only document. The indexing policy is left as the default, which I've copied below.
{
"indexingMode": "consistent",
"automatic": true,
"includedPaths": [
{
"path": "/*",
"indexes": [
{
"kind": "Range",
"dataType": "Number",
"precision": -1
},
{
"kind": "Range",
"dataType": "String",
"precision": -1
},
{
"kind": "Spatial",
"dataType": "Point"
}
]
}
],
"excludedPaths": []
}
Cosmos DB doesn't support the deep filtering in the way I attempted in my original query.
To achieve the results described, you need to use a subquery combining ARRAY and VALUE, as follows:
SELECT
c.customerName,
c.customerReference,
ARRAY(SELECT VALUE ord FROM ord IN c.orders WHERE ord.amount > 100) AS orders
FROM c
WHERE c.customerReference = 666777
Note the use of 'ord'; 'order' is a reserved word.
The query then produces the correct result, e.g.:
[{
"customerName": "Wallace",
"customerReference": 666777,
"orders": [
{
"date": "20181105T00:00:00",
"amount": 118.84,
"description": "Laptop Battery"
},
{
"date": "20181105T00:00:00",
"amount": 281.00,
"description": "Espresso Machine"
}
]
}]
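As an aside, a JOIN would also apply the filter, but it flattens the output to one row per matching order instead of keeping the orders nested under the customer; a sketch:
SELECT c.customerName, c.customerReference, ord.date, ord.amount, ord.description
FROM c
JOIN ord IN c.orders
WHERE c.customerReference = 666777 AND ord.amount > 100
The ARRAY(SELECT VALUE ...) subquery above is what keeps the filtered orders as a nested array.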

Google Calendar Events after 7pm are not retrieved until the Next Day using Google Events API

I've updated this because this question makes more sense than the original. I'm still including the test code that I used to show what's happening.
Please see the comments for more up to date information on what I've found.
I executed the API to grab events for today (there should be one):
/calendar/v3/calendars/{calid}/events?calendarId={calid}&singleEvents=true&timeMin=2013-04-24T00:00:00.000Z&timeMax=2013-04-25T00:00:00.000Z
It retrieves nothing. No events are returned.
If I execute the following (i.e., for the next day):
/calendar/v3/calendars/{calid}/events?calendarId={calid}&singleEvents=true&timeMin=2013-04-25T00:00:00.000Z&timeMax=2013-04-26T00:00:00.000Z
It retrieves the following, which is set for the 24th (though after 7 PM), not the 25th:
{
"kind": "calendar#events",
"etag": "\"GZxpEFttRDAOmLHnWRxLHHWPGwk/01XVNYQjwJ5jTmd05uIgK9e6Uhw\"",
"summary": "test calendar",
"description": "test calendar",
"updated": "2013-04-24T13:09:12.000Z",
"timeZone": "America/Chicago",
"items": [
{
"kind": "calendar#event",
"etag": "\"GZxpEFttRDAOmLHnWRxLHHWPGwk/Z2NhbDAwMDAxMzY2MTQ4ODczODI3MDAw\"",
"id": "d1mdj3dasor22f0nm0lbohru7s",
"status": "confirmed",
"htmlLink": "https://www.google.com/calendar/event?eid=ZDFtZGozZGFzb3IyMmYwbm0wbGJvaHJ1N3MgYnZzdG9vbHMuY29tX2xidWt1ZmlnczJjMmFycjViODgycDVhYWhvQGc",
"created": "2013-04-16T21:47:53.000Z",
"updated": "2013-04-16T21:47:53.827Z",
"summary": "this should happen 4/24/2013",
"description": "testing event",
"creator": {
"email": "{my email address}",
"displayName": "my name"
},
"organizer": {
"email": "{calid}",
"displayName": "test calendar",
"self": true
},
"start": {
"dateTime": "2013-04-24T21:21:19-05:00"
},
"end": {
"dateTime": "2013-04-24T21:21:19-05:00"
},
"iCalUID": "d1mdj3dasor22f0nm0lbohru7s#google.com",
"sequence": 0,
"extendedProperties": {
"private": {
"evtKey": "key"
}
},
"reminders": {
"useDefault": true
}
}
]
}
It should be retrieved by the first API call (for the 24th), not the second API call (for the 25th), unless I'm missing something.
Any ideas? Thanks!
A bit late, but in case you still need it: the event you retrieved (d1mdj3dasor22f0nm0lbohru7s) appears to be in the default time zone of the calendar, which is America/Chicago. That explains why it only showed up when you set your query to the 25th in UTC time.
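If you want the query window to follow the calendar's local day instead of UTC, you could express timeMin/timeMax with the America/Chicago offset (UTC-05:00 on that date), e.g.:
/calendar/v3/calendars/{calid}/events?calendarId={calid}&singleEvents=true&timeMin=2013-04-24T00:00:00-05:00&timeMax=2013-04-25T00:00:00-05:00
That window includes the event starting at 2013-04-24T21:21:19-05:00, which in UTC falls on the 25th.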
