I just started working on Google Analytics stuff and I'm pretty new to this. I have now been granted access to the GA account of my organization's marketing website for several European countries (single login).
My requirement is to copy the GA data for the different European countries into a single table structure in SQL Server. I'm wondering if any of you have done this before. Any suggestions are highly appreciated.
As already written earlier, there are several ways of doing this. I prefer to integrate Google Analytics and SQL Server with no coding, using the Skyvia tool: Google Analytics and SQL Server Integration. It allows me to create a copy of Google Analytics report data in SQL Server and keep it up to date with little to no configuration effort. I don't even need to prepare the schema, since Skyvia can automatically create a table for the report data. You can load 10,000 records per month for free, which is enough for me.
There are a number of ways of doing this. Google Analytics does have the ability to export data as CSV, but it's going to be hard to match up the data properly.
If you are up for a bit of programming, start with the Google Analytics API; it will allow you to extract data from Google Analytics and insert it wherever you like. You can use any programming language that is capable of performing an HTTP POST and HTTP GET. However, I recommend looking into one of Google's client libraries.
If you have the ability to use SSIS, you can use Targit Google Analytics SSIS, a custom connection manager and data reader for extracting data from Google Analytics; it is free to use. Note: full disclosure, I am the lead developer on that project.
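For a rough idea of what that approach looks like, here is a minimal Python sketch using the Reporting API v4 client and pyodbc. The view ID, table name, column names, and connection string are all placeholders; you would run it once per country view (or loop over a list of view IDs) so the rows for every country land in the same table.

```python
# Minimal sketch: pull sessions per country from one GA view and insert into SQL Server.
# Assumes a service-account key with read access to the view; all IDs, table and
# column names, and the connection string below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build
import pyodbc

creds = service_account.Credentials.from_service_account_file(
    "key.json", scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analyticsreporting", "v4", credentials=creds)

report = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "11111111",  # the view (profile) for one country site
        "dateRanges": [{"startDate": "7daysAgo", "endDate": "yesterday"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:date"}, {"name": "ga:country"}],
    }]
}).execute()

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
                      "DATABASE=Analytics;Trusted_Connection=yes")
cur = conn.cursor()
for row in report["reports"][0]["data"].get("rows", []):
    date, country = row["dimensions"]
    sessions = int(row["metrics"][0]["values"][0])
    cur.execute("INSERT INTO dbo.GaSessions (GaDate, Country, Sessions) VALUES (?, ?, ?)",
                date, country, sessions)
conn.commit()
```

Scheduling this per view (cron, Task Scheduler, SQL Agent) gets you the single combined table the question asks for.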
Our ASP.NET Core app logs trace messages to App Insights. We need to be able to query them and filter by some customDimensions. However, I have found 3 APIs and am not sure which one to use:
App Insights REST API
Azure Log Analytics REST API
Azure Data Explorer .NET SDK (Preview)
Firstly, I don't understand the relationships between these options. I thought that App Insights persisted its data to Log Analytics; but if that's the case I would expect to only be able to query through Log Analytics.
Regardless, I just need to know which is the best one to use, and I wish the documentation were clearer. My instinct says to use the App Insights API, since we only need data from App Insights and not from other sources.
The difference between #1 and #2 is mostly historical, and the two are converging.
Application Insights existed as a product before Log Analytics, and the two were based on different underlying database technologies.
Both Application Insights and Log Analytics have since converged on the same underlying database, based on ADX (Azure Data Explorer), and the exact same REST API service to query either. So while your #1 and #2 links are different, they point to effectively the same service backend run by the same team, but the pathing/semantics are subtly different in terms of where the service looks depending on the inbound request.
Both AI and LA introduce the concept of multi-tenancy and a specific set of tables/schema on top of their Azure resources. They effectively hide the entire database from you and make it look like one giant database.
There is now the possibility (and it is the suggested approach) to have your Application Insights data placed in a Log Analytics workspace:
https://learn.microsoft.com/en-us/azure/azure-monitor/app/create-workspace-resource
This lets you put the data for multiple AI applications/components into the SAME Log Analytics workspace, to simplify querying across different apps, etc.
Think of ADX as any other kind of database offering. If you create an ADX cluster instance, you have to create databases, manage schema, manage users, etc. AI and LA do all of that for you. So in your question above, the third link to the ADX SDK would be used to talk to an ADX cluster/database directly. I don't believe you can use it to talk directly to any AI/LA resources, but there are ways to enable an ADX cluster to query AI/LA data:
https://learn.microsoft.com/en-us/azure/data-explorer/query-monitor-data
And there are ways to have an LA/AI query also join with an ADX cluster, using the adx keyword in your query:
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/azure-monitor-data-explorer-proxy
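If it helps, here is a minimal Python sketch of querying through option #1 (the App Insights REST API); the same Kusto query works through the Log Analytics endpoint as well. The application ID and API key are placeholders (they come from the API Access blade of the AI resource), and the customDimensions filter is just a hypothetical example.

```python
# Minimal sketch: run a Kusto query against the App Insights REST API.
# APP_ID and API_KEY are placeholders; the OrderId custom dimension is hypothetical.
import requests

APP_ID = "<application-id>"
API_KEY = "<api-key>"

query = """
traces
| where timestamp > ago(1d)
| where tostring(customDimensions.OrderId) == "12345"
| project timestamp, message, customDimensions
"""

resp = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query",
    headers={"x-api-key": API_KEY},
    params={"query": query},
)
resp.raise_for_status()

# The response is a set of tables, each with column metadata and rows.
table = resp.json()["tables"][0]
columns = [c["name"] for c in table["columns"]]
for row in table["rows"]:
    print(dict(zip(columns, row)))
```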
tl;dr: I want to reference an external data source from a Kusto query in Application Insights.
My application is writing logs to Application Insights, and we're querying it using Kusto in the Azure portal. To give an example of what I'm trying to do:
We're currently looking at these logs to find an action that triggers when a visitor views a blog post on our site. This works well at a per-blog-post level, but now we want to group this data by the category these blog posts are in, or by the tags they have, and that's not information I have within the logs.
The information we log contains unique info about that blog post (unique URL, our internal ID, etc.) that I could use to look up this information in another data source (e.g. our SQL DB where this relation is stored), but I have no idea if/how this is possible. So that's the question: is this possible? Can I query a SQL DB, or get data in JSON via a URL or something?
Alternative solutions would be to move the reporting elsewhere (e.g. PowerBI) and just use AI as a data source, or to actually log all the category/tag info, but I really don't want to go down that route.
Kusto supports accessing external data (blobs, Azure SQL, Cosmos DB); however, Application Insights / Azure Monitor and other multi-tenant services block this functionality due to security and resource governance concerns.
You could try setting up your own Azure Data Explorer (Kusto) cluster, where this functionality will be available, and then access your Application Insights data using a cross-cluster query, or by exporting the data from Application Insights and hooking up Event Grid ingestion into your Kusto cluster.
Relevant links:
Kusto supporting external data:
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/schema-entities/externaltables
Querying data inside Application Insights:
https://learn.microsoft.com/en-us/azure/data-explorer/query-monitor-data
Continuous export data from Application Insights:
https://learn.microsoft.com/en-us/azure/azure-monitor/app/export-telemetry
Data ingestion into Kusto from EventGrid:
https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-event-grid
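To make the cross-cluster route more concrete, here is a rough Python sketch using the azure-kusto-data package. It assumes you already have your own ADX cluster, an external table over your SQL DB (called BlogPostMetadata here purely as a placeholder), and the Azure Monitor proxy URL for your AI resource; every resource name, column name, and subscription/resource-group value below is a placeholder.

```python
# Minimal sketch: from your own ADX cluster, join an external SQL table with
# Application Insights traces via the Azure Monitor cross-service proxy.
# Cluster URL, database, external table, AI resource path, and column names
# are all placeholders for this illustration.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://mycluster.westeurope.kusto.windows.net")
client = KustoClient(kcsb)

query = """
external_table('BlogPostMetadata')
| join kind=inner (
    cluster('https://ade.applicationinsights.io/subscriptions/<sub-id>/resourcegroups/<rg>/providers/microsoft.insights/components/<ai-app>')
      .database('<ai-app>').traces
    | where timestamp > ago(7d)
    | extend url = tostring(customDimensions.Url)
) on $left.Url == $right.url
| summarize Views = count() by Category
"""

response = client.execute("MyDatabase", query)
for row in response.primary_results[0]:
    print(row["Category"], row["Views"])
```

The same join could equally run in the other direction (from AI/LA out to ADX with the adx keyword), depending on where you prefer to keep the query.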
I am looking for a way to quickly upload offline data into Google Analytics. This is possible using Data Import, which is a feature provided by Google Analytics itself. But doing this on a daily basis is a hectic task. Is there any other functionality available with which I can automatically upload data on a daily basis and view the report?
You can automate data imports by using the Management API. Data Import is documented here.
To follow the examples, you first need to install the Google API client for the programming language of your choice. Then you create the custom data source (same as for a manual upload) and send data to it via the uploadData method. Run this on a schedule (e.g. via cron) and the task stops being hectic.
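A minimal sketch of such a scheduled upload in Python, using the v3 Management API client; the key file, account, property, and data source IDs, and the CSV name are placeholders:

```python
# Minimal sketch: upload a CSV to an existing GA custom data source via the Management API.
# Account, property, and data-source IDs plus the key file and CSV are placeholders;
# run this daily from cron or another scheduler.
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

creds = service_account.Credentials.from_service_account_file(
    "key.json", scopes=["https://www.googleapis.com/auth/analytics.edit"])
analytics = build("analytics", "v3", credentials=creds)

media = MediaFileUpload("daily_costs.csv", mimetype="application/octet-stream",
                        resumable=False)
upload = analytics.management().uploads().uploadData(
    accountId="12345678",
    webPropertyId="UA-12345678-1",
    customDataSourceId="abcdEFGHijklMNOP",
    media_body=media,
).execute()
print("Upload status:", upload.get("status"))
```

Point cron (or Task Scheduler, or a cloud function timer) at this script and the daily import takes care of itself.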
Say I'm developing a Windows application (if the OS is important) that will be available to download for free, and I would then like to collect some usage statistics; in the simplest case, a count of application launches. It seems superfluous to maintain a server (e.g. a VDS) just for this.
I've been thinking of using Google Analytics for this (manually sending requests to the GA server). This will probably work, but it is not what GA is designed for; the idea looks like a hack.
What are the options here?
I don't think this is a hack. It's all just data about user interaction. There is little logical difference between opening a desktop app and clicking a button vs opening a web page and following a link. Both are measurable user actions you can track, aggregate and put on graphs.
In fact, Google provides a lower level HTTP based "Measurement Protocol" that is intended for exactly that.
https://developers.google.com/analytics/devguides/collection/protocol/v1/
From the overview:
The Google Analytics Measurement Protocol allows developers to make HTTP requests to send raw user interaction data directly to Google Analytics servers. This allows developers to measure how users interact with their business from almost any environment.
Just put an HTTP request with the correct parameters in your application launch or button click code and it will collect the data. Any data you want to collect.
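For instance, a minimal Python sketch of a launch-count hit against the (Universal Analytics) Measurement Protocol might look like this; the tracking ID is a placeholder, and the client ID should really be generated once and persisted per installation rather than regenerated on every run as it is here:

```python
# Minimal sketch: send an "app launched" event via the GA Measurement Protocol.
# The tracking ID is a placeholder; cid should be a UUID generated once per
# installation and stored locally, not recreated on every launch.
import uuid
import requests

payload = {
    "v": "1",               # protocol version
    "tid": "UA-XXXXXXX-1",  # your tracking ID (placeholder)
    "cid": str(uuid.uuid4()),  # anonymous client ID
    "t": "event",           # hit type
    "ec": "application",    # event category
    "ea": "launch",         # event action
}
requests.post("https://www.google-analytics.com/collect", data=payload, timeout=5)
```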
In other answers to this question there are suggestions like building web services or storing the data locally, but why reinvent the wheel? Google Analytics already provides the collection and reporting tools, and it seems like a good solution.
My GA account has a number (50) of profiles associated with it, and I am trying to build an API which shows me basic information like visits, bounce rates, etc. for each profile.
This query gets me what I want from GA, but for each profile:
URL ="https://www.google.com/analytics/feeds/data?ids=ga:11111&start-date=2011-07-01&end-date=2011-07-02&metrics=ga:visitors&prettyprint=true&alt=json"
The id is the table ID, and the metrics parameter gives me the information I want.
Now the problem is, I want to show all the information together. So each time, I will have to send 50 requests to the API, which just doesn't work out. Is there a way I can get the information for all the profiles associated with my account in a single request?
You unfortunately will be required to perform 50 requests if you want metrics for 50 different profiles. You can easily automate this, however, by using a combination of the Management API and the Data Export API.
The Management API allows you to pull information about the account. For example, you can very easily pull all profile IDs and names associated with an Analytics account through this API for use in an automated query.
The Data Export API, which I am sure you are already familiar with, is the only way to pull collected data/statistics for individual profiles.
If you are concerned about speed, you might want to build an automated process that uses both the Management API and the Data Export API. Pull all of the profiles associated with your account with the Management API, then loop through each and pull the basic data you'd like through the Data Export API. Have this run at regular intervals based on your needs and cache it between runs. This way it won't execute every time the page is hit (though you honestly might be fine, depending on your traffic - I've found it to be extremely quick).
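A rough Python sketch of that two-step loop, using the v3 client library (where the Core Reporting API is the successor to the Data Export API); the key file, date range, and metric names are placeholders you would adjust, and the caching between runs is left out:

```python
# Minimal sketch: list every profile via the Management API, then pull basic
# metrics for each one via the Core Reporting API (successor to the Data Export API).
# Key file, date range, and metrics are placeholders; add your own caching around this.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "key.json", scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analytics", "v3", credentials=creds)

# Management API: enumerate all profiles the account can see.
profiles = analytics.management().profiles().list(
    accountId="~all", webPropertyId="~all").execute()

# Reporting API: one request per profile, collected into a single result set.
for profile in profiles.get("items", []):
    data = analytics.data().ga().get(
        ids="ga:" + profile["id"],
        start_date="2011-07-01",
        end_date="2011-07-02",
        metrics="ga:visitors,ga:visits,ga:visitBounceRate",
    ).execute()
    print(profile["name"], data.get("totalsForAllResults", {}))
```

Run this on a schedule, write the results somewhere cheap (a table, a JSON file, a cache), and serve your page from that instead of hitting the API on every request.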