App insights: Can you concatenate two properties together? - azure-application-insights

I have a custom event with a JSON (string) property called EventInfo. Sometimes this property will be larger than the 150-character limit set on event properties, so I have to split it into multiple properties, i.e. EventInfo0, EventInfo1, etc.
For example (shortened for simplicity)
EventInfo0: [{ "label" : "likeButton", "stat],
EventInfo1: [us" : "success" }]
I found out how to look at EventInfo as a json in app insights like:
customEvents
| where name == "people"
| extend Properties = todynamic(tostring(customDimensions.Properties))
| extend type = parse_json(Properties['EventInfo'])
| mv-expand type
| project type, type.label, type.status
Is there a way I can concatenate EventInfo0 and EventInfo1 to create the full json string, and query that like above?

According to the documentation, the 150-character limit applies to the property key, not to the entire payload, so the splitting you're doing may not actually be required.
https://learn.microsoft.com/en-us/azure/azure-monitor/app/data-model-event-telemetry#custom-properties
That said, to answer your question - while it's not efficient to do this at query time, the following could work:
datatable(ei0:string, ei1:string)
[
'[{ "label" : "likeButton", "stat]', '[us" : "success" }]',
'[{ "lab]', '[el" : "bar", "hello": "world" }]'
]
| project properties = parse_json(strcat(substring(ei0, 1, strlen(ei0) - 2), substring(ei1, 1, strlen(ei1) - 2)))
| project properties.label
properties_label
----------------
likeButton
bar
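Applied to the original customEvents query, that could look something like the following (an untested sketch, assuming EventInfo0 and EventInfo1 are properties under customDimensions.Properties, as in the question):

```
customEvents
| where name == "people"
| extend Properties = todynamic(tostring(customDimensions.Properties))
// pull out the two string halves
| extend ei0 = tostring(Properties['EventInfo0']), ei1 = tostring(Properties['EventInfo1'])
// strip the wrapping [ ] from each half, concatenate, and parse the result
| extend EventInfo = parse_json(strcat(substring(ei0, 1, strlen(ei0) - 2), substring(ei1, 1, strlen(ei1) - 2)))
| project EventInfo.label, EventInfo.status
```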

Related

using KQL to Identify the absence of a value within a JSON message

I am storing JSON messages within an ADX table. The datatype of the JSON column is a string. Within the JSON message is an array that looks like this
"FilingEntities": [
{
"FilingEntity": 0,
"FilingMethod": 1,
"FilingSubMethod": -1
},
{
"FilingEntity": 29,
"FilingMethod": 1,
"FilingSubMethod": -1
},
{
"FilingEntity": 66,
"FilingMethod": 2,
"FilingSubMethod": -1
}
]
what I am trying to do is write a query that identifies the messages where there is only one instance of a filing array. For example, it looks like this
"FilingEntities": [
{
"FilingEntity": 0,
"FilingMethod": 1,
"FilingSubMethod": -1
}
]
So far I have been trying to just get the JSON parsed using
EventReceivedRaw
| extend DynamicJson = todynamic(JSONRaw)
| mv-expand DynamicJson
| project UniqueEventGuid, TimeStampInCST, DynamicJson, JSONRaw
but can't really figure out how to interrogate the message to get to the result I am looking for.
The datatype of the JSON column is a string
for efficiency, you should strongly consider re-typing this column to be dynamic, so that you don't have to do query-time parsing.
what I am trying to do is write a query that identifies the messages where there is only one instance of a filing array
you could use the array_length() function.
for example:
EventReceivedRaw
| extend DynamicJson = todynamic(JSONRaw)
| where array_length(DynamicJson.FilingEntities) == 1
| project UniqueEventGuid, TimeStampInCST, DynamicJson, JSONRaw
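On the title's point about identifying the absence of a value, a similar query-time check could use mv-apply with countif - here a sketch that finds messages where FilingEntity 66 (a hypothetical value) never appears in the array:

```
EventReceivedRaw
| extend DynamicJson = todynamic(JSONRaw)
// count matching array elements per record, then keep records with none
| mv-apply fe = DynamicJson.FilingEntities on
(
    summarize matches = countif(fe.FilingEntity == 66)
)
| where matches == 0
| project UniqueEventGuid, TimeStampInCST
```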

Azure Data Explorer: how to update a table from a raw JSON event using Kusto and update policies?

I ingest raw telemetry data as JSON records into a single-column table called RawEvents, in the column called Event. This is what a record/event looks like:
{
"clientId": "myclient1",
"IotHubDeviceId": "myiothubdevice1",
"deviceId": "mydevice1",
"timestamp": "2022-04-12T10:29:00.123",
"telemetry": [
{
"telemetryId: "total.power"
"value": 123.456
},
{
"telemetryId: "temperature"
"value": 34.56
},
...
]
}
The RawEvents table is created and set up like this:
.create table RawEvents (Event: dynamic)
.create table RawEvents ingestion json mapping 'MyRawEventMapping' '[{"column":"Event","Properties":{"path":"$"}}]'
There is also the Telemetry table that will be used for queries and analysis. The Telemetry table has the strongly-typed columns that match raw data record structure from the RawEvents table. It gets created like this:
.create table Telemetry (ClientId:string, IotHubDeviceId:string, DeviceId:string, Timestamp:datetime, TelemetryId:string, Value: real)
In order to get Telemetry table updated with records whenever a new raw event gets ingested into RawEvents, I have tried to define a data transformation function and to use that function inside an update policy which would be attached to the Telemetry table.
To that end, I have used the following script to verify that my data transformation logic works as expected:
datatable (event:dynamic)
[
dynamic(
{
"clientId": "myclient1",
"IotHubDeviceId": "myiothubdevice1",
"deviceId": "mydevice1",
"timestamp": "2022-04-12T10:29:00.123",
"telemetry": [
{
"telemetryId": "total.power",
"value": 123.456
},
{
"telemetryId": "temperature",
"value": 34.56
}
]
}
)
]
| evaluate bag_unpack(event)
| mv-expand telemetry
| evaluate bag_unpack(telemetry)
Executing that script gives me the desired output which matches the Telemetry table structure:
clientId deviceId IotHubDeviceId timestamp telemetryId value
myclient1 mydevice1 myiothubdevice1 2022-04-12T10:29:00.123Z total.power 123.456
myclient1 mydevice1 myiothubdevice1 2022-04-12T10:29:00.123Z temperature 34.56
Next, I have created a function called ExpandTelemetryEvent which contains that same data transformation logic applied to RawEvents.Event:
.create function ExpandTelemetryEvent() {
RawEvents
| evaluate bag_unpack(Event)
| mv-expand telemetry
| evaluate bag_unpack(telemetry)
}
And as a final step, I have tried to create an update policy for the Telemetry table which would use RawEvents as a source and ExpandTelemetryEvent() as the transformation function:
.alter table Telemetry policy update @'[{"Source": "RawEvents", "Query": "ExpandTelemetryEvent()", "IsEnabled": "True"}]'
This is where I got the error message saying
Error during execution of a policy operation: Caught exception while validating query for Update Policy: 'IsEnabled = 'True', Source = 'RawEvents', Query = 'ExpandTelemetryEvent()', IsTransactional = 'False', PropagateIngestionProperties = 'False''. Exception: Request is invalid and cannot be processed: Semantic error: SEM0100: 'mvexpand' operator: Failed to resolve scalar expression named 'telemetry'
I sort of understand why the policy cannot be applied. With the sample script, the data transformation worked because there was enough information to infer what the telemetry is, whereas in this case there is nothing in the RawEvents.Event which would provide the information about the structure of the raw events which will be stored in the Event column.
How can this be solved? Is this the right approach at all?
As the bag_unpack plugin documentation indicates:
The plugin's output schema depends on the data values, making it as "unpredictable" as the data itself. Multiple executions of the plugin, using different data inputs, may produce different output schema.
Use a well-defined transformation instead. Note that the source column is named Event (not event), and that the output column names and types must match the Telemetry table for the update policy to validate:
RawEvents
| project ClientId = tostring(Event.clientId), IotHubDeviceId = tostring(Event.IotHubDeviceId), DeviceId = tostring(Event.deviceId), Timestamp = todatetime(Event.timestamp), Event.telemetry
| mv-expand Event_telemetry
| extend TelemetryId = tostring(Event_telemetry.telemetryId), Value = toreal(Event_telemetry.value)
| project-away Event_telemetry
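Putting it together, the function and policy commands from the question would then become something like the following (a sketch, not validated against a live cluster):

```
.create-or-alter function ExpandTelemetryEvent() {
    RawEvents
    // explicit, typed projection instead of bag_unpack, so the output
    // schema is fixed and matches the Telemetry table
    | project ClientId = tostring(Event.clientId), IotHubDeviceId = tostring(Event.IotHubDeviceId), DeviceId = tostring(Event.deviceId), Timestamp = todatetime(Event.timestamp), Event.telemetry
    | mv-expand Event_telemetry
    | extend TelemetryId = tostring(Event_telemetry.telemetryId), Value = toreal(Event_telemetry.value)
    | project-away Event_telemetry
}

.alter table Telemetry policy update @'[{"Source": "RawEvents", "Query": "ExpandTelemetryEvent()", "IsEnabled": "True"}]'
```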

Aggregate values from customMeasurements column

For my company I need to extract data from Azure Application Insights.
All the relevant data is stored in the customMeasurements. Currently, the table looks something like this:
name    | itemType    | customMeasurements
--------|-------------|------------------------------------------------
AppName | customEvent | { "Feature1": 1, "Feature2": 0, "Feature3": 0 }
AppName | customEvent | { "Feature1": 0, "Feature2": 1, "Feature3": 0 }
I'm trying to find a Kusto query which will aggregate all enabled features (which would have a value of '1'), but I'm unable to do so.
I tried several things to get this resolved like the following:
customEvents
| extend test = tostring(customMeasurements["Feature2"])
| summarize count() by test
This actually showed me the number of rows that have Feature2 set to '1', but I want to be able to extract all features that have been enabled without specifying them in the query (as they can have custom names).
Could somebody point me in the right direction please?
perhaps, something like the following could give you a direction:
datatable(name:string, itemType:string, customMeasurements:dynamic)
[
'AppName', 'customEvent', dynamic({"Feature1":1,"Feature2":0,"Feature3":0}),
'AppName', 'customEvent', dynamic({"Feature1":0,"Feature2":1,"Feature3":0}),
]
| mv-apply customMeasurements on
(
extend feature = tostring(bag_keys(customMeasurements)[0])
| where customMeasurements[feature] == 1
)
| summarize enabled_features = make_set(feature) by name
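If a per-feature count is more useful than a set per app, the same mv-apply can feed a summarize by feature instead (same sample data as above):

```
datatable(name:string, itemType:string, customMeasurements:dynamic)
[
    'AppName', 'customEvent', dynamic({"Feature1":1,"Feature2":0,"Feature3":0}),
    'AppName', 'customEvent', dynamic({"Feature1":0,"Feature2":1,"Feature3":0}),
]
| mv-apply customMeasurements on
(
    // each expanded bag holds a single key/value pair
    extend feature = tostring(bag_keys(customMeasurements)[0])
    | where customMeasurements[feature] == 1
)
// count how many events had each feature enabled
| summarize enabled_count = count() by feature
```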

kusto query with dynamic object value without key

I have a lot of data looking like
{"tuesday":"<30, 60>"}
{"friday":"<0, 5>"}
{"saturday":"<5, 10>"}
{"friday":"<0, 5>"}
{"saturday":"<5, 10>"}
{"sunday":"0"}
{"monday":"<0, 5>"}
All I want is the value, regardless of the key.
My query:
customEvents
| where name == "eventName"
| extend d = parse_json(tostring(customDimensions['Properties']))
| project d
| take 7
d is a dynamic object and I can do d.monday for the value, but I'd like to get the value without the key. Is this possible with Kusto?
Thanks
for the case of a single property, as you've demonstrated above, using the parse operator could work:
datatable(d:dynamic)
[
dynamic({"tuesday":"<30, 60>"}),
dynamic({"friday":"<0, 5>"}),
dynamic({"saturday":"<5, 10>"}),
dynamic({"friday":"<0, 5>"}),
dynamic({"saturday":"<5, 10>"}),
dynamic({"sunday":"0"}),
dynamic({"monday":"<0, 5>"})
]
| parse d with * ':"' value '"' *
| project value
Notes:
In case your values are not necessarily encapsulated in double quotes (e.g. are numerics), then you should be able to specify kind=regex for the parse operator, and use a conditional expression for the existence of the double quotes.
In case you have potentially more than 1 property per property bag, using extract_all() is an option.
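For example, a sketch of the extract_all() option (the regex here is an assumption about the value format, i.e. that every value is double-quoted):

```
datatable(d:dynamic)
[
    dynamic({"friday":"<0, 5>", "sunday":"0"})
]
// capture every quoted value following a "key": prefix;
// extract_all returns one array of captures per row
| project values = extract_all(@'"[^"]+"\s*:\s*"([^"]*)"', tostring(d))
| mv-expand values
```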
Relevant Docs:
https://learn.microsoft.com/en-us/azure/kusto/query/parseoperator
https://learn.microsoft.com/en-us/azure/kusto/query/extractallfunction

Application Insights Extract Nested CustomDimensions

I have some data in Application Insights Analytics that has a dynamic object as a property of custom dimensions. For example:
| timestamp               | name    | customDimensions                 | etc |
|-------------------------|---------|----------------------------------|-----|
| 2017-09-11T19:56:20.000 | Spinner | {                                | ... |
                                      MyCustomDimension: "hi",
                                      Properties: {
                                        context: "ABC",
                                        userMessage: "Some other"
                                      }
                                    }
Does that make sense? So a key/value pair inside of customDimensions.
I'm trying to bring up the context property to be a proper column in the results. So expected would be :
| timestamp               | name    | customDimensions                 | context | etc |
|-------------------------|---------|----------------------------------|---------|-----|
| 2017-09-11T19:56:20.000 | Spinner | {                                | ABC     | ... |
                                      MyCustomDimension: "hi",
                                      Properties: {
                                        context: "ABC",
                                        userMessage: "Some other"
                                      }
                                    }
I've tried this:
customEvents | where name == "Spinner" | extend Context = customDimensions.Properties["context"]
and this:
customEvents | where name == "Spinner" | extend Context = customDimensions.Properties.context
but neither seems to work. They give me a column at the end named "Context", but the column is empty - no values.
Any ideas?
EDIT:
Added a picture for clarifying the format of the data:
edited to working answer:
customEvents
| where name == "Spinner"
| extend Properties = todynamic(tostring(customDimensions.Properties))
| extend Context = Properties.context
You need an extra tostring and todynamic in here to get what you expect (and what I expected!).
the explanation i was given:
Dynamic field "promises" you the upper/outer level of key / value access (this is how you access customDimensions.Properties).
Accessing internal structure of that json depends on the exact format of customDimensions.Properties content. It doesn’t have to be json by itself. Even if it looks like a well structured json, it still may be just a string that is not exactly well formatted json.
So basically, by default it won't attempt to parse strings inside of a dynamic/json block, because they don't want to spend a lot of time possibly trying and failing to convert nested content to json infinitely.
I still think that extra tostring shouldn't be required there, since todynamic should already accept both string and dynamic input, so I'm checking to see if the team that owns the query language can make that step better.
Thanks so much. Just to expand on the answer from John: we needed to graph the duration of endpoints using custom events. This query made it so we could specify the duration as our Y-axis in the chart:
customEvents
| extend Properties = todynamic(tostring(customDimensions.Properties))
| extend duration = todouble(todecimal(Properties.duration))
| project timestamp, name, duration
