I am wondering how to get data from Azure Log Analytics with the AzureKusto library for R.
The server parameter reads:
server: The URI of the server, usually of the form
'https://clustername.location.kusto.windows.net'.
Aliases: addr, address, network address, datasource, host.
I have no idea where to find this information.
Thanks in advance : )
The SDK you're looking at has been written to query Kusto/ADX clusters, not Azure Log Analytics resources.
That said, you could potentially still use it, via "Query data in Azure Monitor using Azure Data Explorer (Preview)".
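Under that preview, the "server" is an Azure Monitor proxy endpoint built from your workspace's resource ID, not a *.kusto.windows.net address. A minimal sketch with AzureKusto, where the subscription, resource group and workspace names are placeholders for your own:

library(AzureKusto)

# the ADX-over-Log-Analytics proxy endpoint (all names are placeholders)
server <- paste0(
    "https://ade.loganalytics.io",
    "/subscriptions/<subscription-id>",
    "/resourcegroups/<resource-group>",
    "/providers/microsoft.operationalinsights",
    "/workspaces/<workspace-name>"
)

# when going through the proxy, the "database" is the workspace name
endpoint <- kusto_database_endpoint(server = server, database = "<workspace-name>")

# any KQL the workspace understands should work, e.g.
run_query(endpoint, "Heartbeat | take 10")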
How can HTTP queries be run in Azure Monitor workbooks?
I read all the documentation here and still cannot find how I could use my application's health check HTTP endpoints to report on my application's status in an Azure Monitor workbook.
I have an ASP.NET application if it matters. It exposes endpoints which I would like to call from the workbook and do different visualizations depending on the data returned.
You'd use the "Custom Endpoint" data source in the Query step.
https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/workbooks-data-sources#custom-endpoint
The endpoint needs to support CORS, because the calls would be coming from the Azure portal, and it also needs to support HTTPS, because the portal itself is loaded via HTTPS.
The endpoint is also expected to return JSON content, and then you can use JSONPath inside the settings of the custom endpoint's Result Settings tab to transform it into grid data.
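For illustration, here is roughly what such an endpoint needs to provide, sketched in R with plumber for brevity (your ASP.NET app would do the equivalent); the route and payload shape are made up for the example:

# plumber.R -- run with: plumber::plumb("plumber.R")$run(port = 8000)
library(plumber)

#* @filter cors
function(req, res) {
    # the workbook calls cross-origin from the portal, so CORS headers are required
    res$setHeader("Access-Control-Allow-Origin", "https://portal.azure.com")
    plumber::forward()
}

#* Health summary for the workbook to poll (serve this over HTTPS)
#* @get /health
#* @serializer unboxedJSON
function() {
    list(
        status = "Healthy",
        checks = list(
            list(name = "db",    status = "Healthy"),
            list(name = "cache", status = "Degraded")
        )
    )
}

A JSONPath such as $.checks[*] in Result Settings would then turn the checks array into grid rows.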
Our ASP.NET Core app logs trace messages to App Insights. We need to be able to query them and filter by some customDimensions. However, I have found 3 APIs and am not sure which one to use:
App Insights REST API
Azure Log Analytics REST API
Azure Data Explorer .NET SDK (Preview)
Firstly, I don't understand the relationships between these options. I thought that App Insights persisted its data to Log Analytics, but if that were the case I would expect to be able to query only through Log Analytics.
Regardless, I just need to know which is the best to use and I wish that documentation were clearer. My instinct says to use the App Insights API, since we only need data from App Insights and not from other sources.
The difference between #1 and #2 is mostly historical, and the two are converging.
Application Insights existed as a product before Log Analytics, and the two were based on different underlying database technologies.
Both Application Insights and Log Analytics have converged to use the same underlying database, based on ADX (Azure Data Explorer), and the exact same REST API service to query either. So while your #1 and #2 links are different, they point to effectively the same service backend, run by the same team; the pathing/semantics are just subtly different in where the service looks depending on the inbound request.
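To make the convergence concrete: the same KQL runs against either service. A sketch against the App Insights query API (your option #1) using httr, where the app ID, API key and customDimensions field are placeholders:

library(httr)

resp <- GET(
    "https://api.applicationinsights.io/v1/apps/<app-id>/query",
    add_headers(`x-api-key` = "<api-key>"),  # an API key created on the AI resource
    query = list(
        # filter traces on a customDimensions property (field name is illustrative)
        query = "traces | where customDimensions.Environment == 'prod' | take 50"
    )
)
content(resp)  # a JSON payload of tables/columns/rows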
Both AI and LA introduce the concept of multi-tenancy and a specific set of tables/schema on top of their Azure resources. They effectively hide the entire database from you and make it look like one giant database.
There is now also the option (and it's the suggested one) to have your Application Insights data placed in a Log Analytics workspace:
https://learn.microsoft.com/en-us/azure/azure-monitor/app/create-workspace-resource
This lets you put the data for multiple AI applications/components into the SAME Log Analytics workspace, to simplify querying across different apps, etc.
Think of ADX as any other kind of database offering. If you create an ADX cluster instance, you have to create databases, manage schema, manage users, etc. AI and LA do all that for you. So in your question above, the third link (the ADX SDK) would be used to talk to an ADX cluster/database directly. I don't believe you can use it to talk directly to any AI/LA resources, but there are ways to enable an ADX cluster to query AI/LA data:
https://learn.microsoft.com/en-us/azure/data-explorer/query-monitor-data
And there are ways to have a LA/AI query also join with an ADX cluster, using the adx() keyword in your query:
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/azure-monitor-data-explorer-proxy
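Both directions can be exercised from R with AzureKusto; a sketch with placeholder cluster and workspace names:

library(AzureKusto)

# direction 1 (the query-monitor-data link): from your own ADX cluster,
# reach into a Log Analytics workspace via the ade.loganalytics.io proxy
adx <- kusto_database_endpoint(
    server   = "https://mycluster.westeurope.kusto.windows.net",
    database = "mydb"
)
run_query(adx, paste0(
    "cluster('https://ade.loganalytics.io/subscriptions/<sub-id>",
    "/resourcegroups/<rg>/providers/microsoft.operationalinsights",
    "/workspaces/<workspace>').database('<workspace>').Heartbeat | count"
))

# direction 2 (the data-explorer-proxy link): inside a LA/AI query, KQL like
#   adx('mycluster.westeurope.kusto.windows.net/mydb').MyTable | take 10
# joins cluster data into the Log Analytics query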
tl;dr: I want to reference an external data source from a Kusto query in Application Insights.
My application is writing logs to Application Insights, and we're querying it using Kusto in the Azure portal. To give an example of what I'm trying to do:
We're currently looking at these logs to find an event that fires when a visitor views a blog post on our site. This works well at the per-blog-post level, but now we want to group this data by the category these blog posts are in, or by the tags they have, and that's not information I have within the logs.
The information we log contains unique info about that blog post (unique URL, our internal ID, etc.) that I could use to look up this information in another data source (e.g. our SQL DB where this relation is stored), but I have no idea if/how this is possible. So that's the question: is this possible? Can I query a SQL DB, or get data as JSON via a URL or something?
Alternative solutions would be to move the reporting elsewhere (e.g. Power BI) and just use AI as a data source, or to actually log all the category/tag info, but I'd really rather not go down that route.
Kusto supports accessing external data (blobs, Azure SQL, Cosmos DB); however, Application Insights / Azure Monitor and other multi-tenant services block this functionality due to security and resource governance concerns.
You could try setting up your own Azure Data Explorer (Kusto) cluster, where this functionality is available, and then access your Application Insights data using a cross-cluster query, or by exporting the data from Application Insights and hooking up Event Grid ingestion into your Kusto cluster.
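As a sketch of what that unlocks: assuming your page-view telemetry has landed in a hypothetical PageViews table in your own cluster (via the export/Event Grid path below), the sql_request plugin can pull the category relation from Azure SQL and join on it. All names and the connection string are placeholders:

library(AzureKusto)

ep <- kusto_database_endpoint(
    server   = "https://mycluster.westeurope.kusto.windows.net",
    database = "telemetry"
)

query <- '
PageViews
| join kind=inner (
    evaluate sql_request(
        "Server=tcp:myserver.database.windows.net,1433;Database=blog;...",
        "select Url, Category from dbo.Posts")
    ) on $left.url == $right.Url
| summarize Views = count() by Category
'
run_query(ep, query)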
Relevant links:
Kusto supporting external data:
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/schema-entities/externaltables
Querying data inside Application Insights:
https://learn.microsoft.com/en-us/azure/data-explorer/query-monitor-data
Continuous export data from Application Insights:
https://learn.microsoft.com/en-us/azure/azure-monitor/app/export-telemetry
Data ingestion into Kusto from EventGrid:
https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-event-grid
I want to validate that my ARM template deployed OK, and to get an understanding of the telemetry options...
Under what circumstances do the following get logged to Log Analytics?
DataPlaneRequests
MongoRequests
QueryRuntimeStatistics
Metrics
Here's what I've been able to tell, after arduously connecting in different ways over the last few days.
DataPlaneRequests are logged for:
SQL API calls
Table API calls, even when the account was set up for the SQL API
Graph API calls against an account set up for the Graph API
Table API calls against an account set up for the Table API
MongoRequests are logged for:
Mongo requests, even when the account was set up for the SQL API
However, I haven't been able to see anything for QueryRuntimeStatistics (even when turning on PopulateQueryMetrics), nor have I seen any AzureMetrics appear.
Thanks, Alex, for spending the time to try out the different logging options for Azure Cosmos DB.
There are primarily two types of monitoring paths for Azure Cosmos DB.
Metrics: These are low-latency (<5 min), aggregated metrics which are exposed on the Azure Monitor API for consumption. These metrics are primarily used for diagnosing live-site issues in the app.
Logs: These are raw request logs arriving with 2+ hours of latency, used by customers primarily for audit scenarios, to understand who accessed the data.
Depending on your need, you can choose either approach.
DataPlaneRequests by default shows all requests across all the APIs, while MongoRequests shows only Mongo-specific calls. Please note that Mongo requests also appear in DataPlaneRequests.
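A quick way to see which categories are actually flowing is to count them. A sketch using the ADX-over-Log-Analytics proxy with AzureKusto (the same KQL works pasted into the workspace's Logs blade; all names are placeholders):

library(AzureKusto)

la <- kusto_database_endpoint(
    server = paste0(
        "https://ade.loganalytics.io/subscriptions/<sub-id>",
        "/resourcegroups/<rg>/providers/microsoft.operationalinsights",
        "/workspaces/<workspace>"),
    database = "<workspace>"
)

run_query(la, '
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.DOCUMENTDB"
| summarize Requests = count() by Category
| order by Requests desc
')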
Metrics would not be seen in Log Analytics due to a known issue which our partner team is fixing.
Let me know if you have any further questions here.
I've installed "bigrquery" like this:
devtools::install_github("hadley/bigrquery")
library(bigrquery)
And I get this error when trying to extract data:
Error: Access Denied: Job triple-xxx-xxx:job_zu6P-qSxxx7DBVICij6_QyDv0: RUN_QUERY_JOB
I've looked here and on the web, and everyone says that you just need two things to extract data from Google BigQuery:
1. Have a project for it (with BigQuery enabled).
2. Set up a billing address for BigQuery.
I've done that, but I still get the problem.
IMPORTANT:
For other packages that interact with Google products (Google Analytics), e.g. RGA, you need to create a client ID (OAuth). Do I need to do this with "bigrquery"?
Can someone share an up-to-date method to get the data?
Ps. I can get the data in the browser (with the web interface provided by Google), but not in R via "bigrquery". I'm using the version hosted on CRAN.
Ps2. I don't want the authentication to be stored in the cache. Is there a way to make "bigrquery" ask for authentication every time it tries to connect to BigQuery?
I found this issue mentioned in the following post, but the solution is out of date:
Google App Engine authorization for Google BigQuery
This error means that the user running the query was not authorized to run jobs in the project (triple-xxx-xxx). You'd need to add the user that is running the query to the project via the Developers Console (https://console.developers.google.com/project).
To answer some of your other questions:
You don't need to create a client ID to use BigQuery.
I'm not sure if there is a way to force bigrquery to re-authorize every time. That said, looking at the source code (https://github.com/hadley/bigrquery/blob/master/R/auth.r), you may be able to call set_access_cred with NULL to clear the authentication.
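A sketch of that cache-clearing idea, based on the auth.r file linked above (set_access_cred may not be exported, hence the ::: form; untested across versions):

library(bigrquery)

# wipe the cached credential so the next request triggers a fresh OAuth flow
bigrquery:::set_access_cred(NULL)

# the next query against BigQuery should now prompt for authentication again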