Kusto Query for aggregated Graphs in Dashboards (ADX) - azure-data-explorer

I am creating dashboards in Azure Data Explorer (ADX) with Kusto Query Language. In ADX I have the option to create multiple charts within a single dashboard by using the Add tile option. But I would like to know whether there is a way to use one query to create aggregated charts (skipping the Add tile option).
For example, I have a table with virtual machine OS, status, size, and cost. In a single dashboard I need two separate graphs: 1) status of the different VMs by OS, and 2) size and cost of each VM. So I need to aggregate the graphs within the dashboard, but I am not sure how to separate two queries in the same dashboard. I searched and did not find any results. Thanks for your support.

This capability does not exist today, but it is high on the list of future improvements. Feel free to add a suggestion in the Azure Data Explorer user voice.

Related

Ingesting Google Analytics data into S3 or Redshift

I am looking for options to ingest Google Analytics data (including historical data) into Redshift. Any suggestions regarding tools or APIs are welcome. I searched online and found Stitch as one of the ETL tools; help me understand this option better, as well as any other options you may have.
Google Analytics has an API (Core Reporting API). This is good for getting the occasional KPIs, but due to API limits it's not well suited to exporting large amounts of historical data.
For big data dumps it's better to use the Link to BigQuery ("Link" because I want to avoid the word "integration" which implies a larger level of control than you actually have).
Setting up the link to BigQuery is fairly easy - you create a project in the Google Cloud Console, enable billing (BigQuery comes with a fee, it's not part of the GA360 contract), add your email address as BigQuery Owner in the "IAM&Admin" section, go to your GA account and enter the BigQuery Project ID in the GA Admin section, "Property Settings/Product Linking/All Products/BigQuery Link". The process is described here: https://support.google.com/analytics/answer/3416092
You can select between standard updates and streaming updates - the latter comes with an extra fee, but gives you near-realtime data. The former updates the data in BigQuery three times a day (every eight hours).
The exported data is not raw data, this is already sessionized (i.e. while you will get one row per hit things like the traffic attribution for that hit will be session based).
You will pay three different kinds of fees - one for the export to BigQuery, one for storage, and one for the actual querying. Pricing is documented here: https://cloud.google.com/bigquery/pricing.
Pricing depends on region, among other things. The region where the data is stored may also be important when it comes to legal matters - e.g. if you have to comply with the GDPR, your data should be stored in the EU. Make sure you get the region right, because moving data between regions is cumbersome (you need to export the tables to Google Cloud Storage and re-import them in the proper region) and kind of expensive.
You cannot just delete data and do a new export - on your first export BigQuery will backfill the data for the last 13 months; however, it will do this only once per view. So if you need historical data, better get this right, because if you delete data in BQ you won't get it back.
I don't actually know much about Redshift, but as per your comment you want to display data in Tableau, and Tableau directly connects to BigQuery.
We use custom SQL queries to get the data into Tableau (Google Analytics data is stored in daily tables, and custom SQL seems the easiest way to query data across many tables). BigQuery has a per-user cache that lasts 24 hours as long as the query does not change, so you won't pay for the query every time the report is opened. It is still a good idea to keep an eye on the cost: cost is based not on the result size, but on the amount of data that has to be scanned to produce the wanted result, so if you query over a long timeframe and do a few joins, a single query can run into the dozens of euros (multiplied by the number of users who run the query).
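As a minimal sketch of what such a custom SQL query over the daily `ga_sessions_YYYYMMDD` tables might look like (the project and dataset names are placeholders, and the helper function is just an illustration, not part of any library):

```python
def build_ga_query(project: str, dataset: str, start: str, end: str) -> str:
    """Build a standard-SQL query over the daily ga_sessions_YYYYMMDD tables.

    start/end are dates in YYYYMMDD form; the table wildcard plus
    _TABLE_SUFFIX restricts the scan (and therefore the cost) to the
    requested date range instead of the whole export history.
    """
    return f"""
        SELECT
          date,
          trafficSource.source AS source,
          SUM(totals.visits)   AS sessions
        FROM `{project}.{dataset}.ga_sessions_*`
        WHERE _TABLE_SUFFIX BETWEEN '{start}' AND '{end}'
        GROUP BY date, source
        ORDER BY date
    """

# Hypothetical project/dataset IDs; in a real GA export the dataset
# is named after the GA view ID.
sql = build_ga_query("my-project", "1234567", "20240101", "20240131")
print(sql)
```

Restricting `_TABLE_SUFFIX` like this is what keeps the scanned-bytes cost proportional to the timeframe you actually query.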
scitylana.com has a service that can deliver Google Analytics Free data to S3.
You can get 3 years or more.
The extraction is done through the API. The schema is hit level and has 100+ dimensions/metrics.
Depending on the amount of data in your view, I think this could be done with GA360 too.
Another option is to use Stitch's own specification, singer.io, and related open source packages:
https://github.com/singer-io/tap-google-analytics
https://github.com/transferwise/pipelinewise-target-redshift
The way you'd use them is piping data from one into the other:
tap-google-analytics -c ga.json | target-redshift -c redshift.json
I like the Skyvia tool: https://skyvia.com/data-integration/integrate-google-analytics-redshift. It doesn't require coding. With Skyvia, I can create a copy of Google Analytics report data in Amazon Redshift and keep it up to date with little to no configuration effort. I don't even need to prepare the schema — Skyvia can automatically create a table for the report data. You can load 10,000 records per month for free — this is enough for me.

Possible to jump from new Metrics Explorer to Search?

One highly useful feature of the Classic Metrics Explorer is the ability to click the chart to jump directly to a Search blade with corresponding filters and time range already set up.
For example, if I have a Classic Metrics Explorer chart on my dashboard with Failed Requests for the last 12 hours filtered to a specific Cloud Role Name, I can click the chart to get to a dedicated blade for that chart, then click the chart on that blade (heck, I can even click-drag to easily filter time further first) to get to a Search blade which shows failed requests for the last 12 hours for that specific Cloud Role Name. This allows me with just a few clicks to easily drill down into the specific telemetry items related to anything I see in the original chart, such as traces for the failed requests.
The gif below demonstrates this, clicking on a Classic Metrics Explorer chart on the dashboard to get to its own blade, and then clicking the chart there to get to the Search blade:
This does not seem to be possible with the new Metrics explorer. Clicking a chart on the dashboard gets you to the Metrics blade for that chart, but clicking the chart there does not have any effect. See the gif below:
This means that whenever I want to drill down into some data I see on a Metrics chart/blade, I have to go to the search blade and manually set up the correct filters. When there's a high number of services, this takes significantly more time to the point of being completely out of the question, which is a shame, because the new Metrics charts are better in other ways (e.g. integration with the dashboard-wide time range).
Have I missed something, or is it simply not possible to go from the new Metrics charts to relevant telemetry items? Is there a workaround that makes the job of going from charts to related telemetry items easier when using new Metrics charts?
Unfortunately, this is a known limitation of the new metrics explorer.

Create dashboard views that are based off of current date (ie. show resource overview for next 2 months)

I'm attempting to create a report that is based off the current date. So, for example, creating a line graph that shows total work for all resources for the next 2 months. It would be very similar to the resource overview dashboard, but it wouldn't be pulling in data from the entire project.
The 'Resource Usage' view below has been very helpful, as it provides a visual aid based on the hour allocations shown.
We can create a graph like the one below in the reporting module; I would like the graph to only look at the next 2 months (instead of the entire project duration).
The goal is to look at capacity and future work allocation to easily look at resource availability to aid in assigning future tasks.
Thoughts? Tips? Advice?
You should be able to use the built-in Report capability in MS Project 2016. Try modifying the Progress Versus Cost chart in the Cost Overview report.

Get more than 7 dimensions in google analytics

I am fetching my data from the Google Analytics Core Reporting API. I have learned that the API allows fetching only 7 dimensions per query, but I need to fetch more than 7 dimensions with the correct metrics. Is there any way (other than using paid Google Analytics) to fetch more than 7 dimensions with correct metrics?
If not, is there a mathematical way to find the intersection of dimension sets fetched by 2 different queries that share one dimension in common?
Thanks
The only way I have found around the 7-dimension limit is to be creative with filters. It will mean that you are sending more queries to the server, but if you aren't worried about your quota limit, then it's doable.
Example: ga:visitorType has only two values: New Visitor and Returning Visitor.
So you could run a query with your 7 dimensions and a filter of ga:visitorType=New Visitor, then run the same 7 dimensions with a filter of ga:visitorType=Returning Visitor.
Bingo, you just got an 8th dimension.
Again, you need to be a little careful, because you multiply the number of queries by the number of values of each dimension you use for creative filtering. I wouldn't recommend doing it with, say, ga:landingPagePath, because the number of landing pages you have is too big; you would end up running too many queries. But there are a few you can do this with: ga:isMobile, ga:deviceCategory, ga:date. Just remember: the more creative filters you add, the more queries you end up running. I had one case where I ended up expanding to 80 requests just to get the data back I needed. Scary, but it worked.
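The creative-filter idea above can be sketched roughly like this (the `run_report` callable is a hypothetical stand-in for your actual Core Reporting API call, not a real client method): issue one filtered query per value of the extra dimension, and tag each returned row with that value as an additional column.

```python
def creative_filter(run_report, base_dimensions, metrics, extra_dim, extra_values):
    """Work around the 7-dimension limit: run one filtered query per value
    of the extra dimension and append that value to each returned row."""
    rows = []
    for value in extra_values:
        # run_report is assumed to accept a GA-style filter string and
        # return rows as lists of dimension + metric values.
        for row in run_report(base_dimensions, metrics,
                              filters=f"{extra_dim}=={value}"):
            rows.append(row + [value])  # the filter value becomes the 8th dimension
    return rows

# Demonstration with a stubbed API call:
def fake_report(dims, metrics, filters):
    # pretend the server returns one row per query
    return [["2024-01-01", "42"]]

combined = creative_filter(fake_report, ["ga:date"], ["ga:sessions"],
                           "ga:visitorType",
                           ["New Visitor", "Returning Visitor"])
```

Note how the total request count is the number of base queries multiplied by the number of filter values, which is exactly why high-cardinality dimensions such as ga:landingPagePath are a poor choice here.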
If you have specific dimensions that can be used to identify a session at a specific time (e.g. a session ID and browser timestamp), you can execute multiple queries then patch them together.
I built a python program that will do exactly this: https://github.com/aiqui/ga-download
This program can bring together multiple groups of dimensions, so that any number of dimensions can be downloaded and combined into a single CSV file.
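The patching-together step can be illustrated with a small sketch (the column layout is an assumption for illustration): if both result sets begin with the same key columns, such as a session ID and a timestamp, you can join them row by row.

```python
def merge_on_key(rows_a, rows_b, key_len):
    """Join two query results whose rows start with the same key columns
    (e.g. a session ID and browser timestamp), concatenating the
    remaining columns of matching rows."""
    index = {tuple(row[:key_len]): row[key_len:] for row in rows_b}
    merged = []
    for row in rows_a:
        key = tuple(row[:key_len])
        if key in index:  # keep only sessions present in both result sets
            merged.append(row + index[key])
    return merged

# Two queries sharing (sessionId, timestamp) as their first two columns:
a = [["s1", "t1", "google", "organic"]]
b = [["s1", "t1", "/home", "Chrome"]]
print(merge_on_key(a, b, key_len=2))
# [['s1', 't1', 'google', 'organic', '/home', 'Chrome']]
```

Each additional group of dimensions is just another result set merged on the same key, which is how any number of dimensions can end up in one combined table.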

Google Analytics: How to display traffic data of all my sites as one graph?

As a webmaster I support N different client sites. How can I conveniently display all their traffic data on one joint graph?
If "n" is something in the hundreds or thousands you can use the recently announced APIs for Large Companies.
If your needs are somewhat smaller you can pull the data into a Google spreadsheet (via Google apps script) and use the Chart functions (or Google Charts) to create a combined graph.
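Whichever way you pull the data, the combining step amounts to merging each site's daily series into one table keyed by date. A minimal sketch, assuming you already have per-site daily visit counts (the site names and numbers below are made up):

```python
from collections import defaultdict

def combine_traffic(site_series):
    """Merge per-site daily visit counts into one table keyed by date,
    suitable for plotting all sites as lines on a single chart.

    site_series: {site_name: {date: visits}}
    Returns a date-sorted list of (date, {site_name: visits}) rows;
    a site missing a date is simply absent from that row.
    """
    by_date = defaultdict(dict)
    for site, series in site_series.items():
        for date, visits in series.items():
            by_date[date][site] = visits
    return sorted(by_date.items())

rows = combine_traffic({
    "site-a.com": {"2024-01-01": 120, "2024-01-02": 135},
    "site-b.com": {"2024-01-01": 80},
})
print(rows)
```

In a spreadsheet this same shape (one date column, one column per site) is exactly what the Chart functions expect for a combined line graph.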
