Scheduling an Unsampled Report via the API - google-analytics

I would like to schedule a few unsampled reports to run monthly on the first of the month. For each report I need one unsampled report for the previous month, and another for the previous year. Using the GA web interface, I can schedule a monthly report for 6 months, but I don't see a way to schedule a report to include the previous year's worth of data. Some other limitations are that I have to remember to schedule the report every 6 months, and that I can't see what reports have already been scheduled. All of which leads me to the conclusion that I have to use the API if I want to accomplish this.
According to the documentation, I believe I should be able to do this via the "Unsampled Reports: insert" API method.
https://developers.google.com/analytics/devguides/config/mgmt/v3/mgmtReference/management/unsampledReports/insert
First off, is that a correct assumption? Does the insert trigger an unsampled report for immediate processing?
Secondly, can I configure a report in the API the same way as I configure it in the web interface? For example, for certain reports I set the type to Flat Table. I'm not sure how I would specify that in the API, or whether it's irrelevant for a custom report.
Thirdly, does the output end up in Google Drive the same as if I ran the unsampled report via the web interface?

I'd strongly recommend reading the developer overview of the unsampled reporting methods of the Management API. It'll give you a good sense of how the process works.
To answer some of your specific questions:
First off, is that a correct assumption? Does the insert trigger an unsampled report for immediate processing?
The process isn't necessarily immediate, but yes, it does trigger a new unsampled report for processing.
Secondly, can I configure a report in the API the same way as I configure it in the web interface? For example, for certain reports I set the type to Flat Table.
No, you don't get those report types; you just get the data back. You can, however, configure the report the same way you'd configure a Core Reporting API request. To play around with how that works, check out the Query Explorer.
Thirdly, does the output end up in Google Drive the same as if I ran the unsampled report via the web interface?
Yes, the end result should be the same.
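To make the flow concrete, here is a minimal sketch of scheduling the "previous month" insert, assuming google-api-python-client with an already-authorized Analytics v3 `service` object; the account, property, and profile IDs (and the metric choices) are placeholders, not values from the question:

```python
# Sketch only: insert an unsampled report for the previous calendar month.
# Assumes `service` = googleapiclient.discovery.build("analytics", "v3", ...)
# with valid credentials. IDs below are placeholders.
import calendar
from datetime import date

def previous_month_range(today):
    """Return (start, end) ISO dates for the calendar month before `today`."""
    year, month = (today.year, today.month - 1) if today.month > 1 else (today.year - 1, 12)
    last_day = calendar.monthrange(year, month)[1]
    return date(year, month, 1).isoformat(), date(year, month, last_day).isoformat()

def insert_previous_month_report(service, today=None):
    start, end = previous_month_range(today or date.today())
    # The body is configured like a Core Reporting API query (no Flat Table
    # concept here -- you just pick metrics/dimensions).
    body = {
        "title": "Previous month - unsampled",
        "start-date": start,
        "end-date": end,
        "metrics": "ga:sessions,ga:pageviews",
        "dimensions": "ga:date",
    }
    return service.management().unsampledReports().insert(
        accountId="12345", webPropertyId="UA-12345-1",
        profileId="67890", body=body).execute()
```

Running this from a monthly cron job on the first of the month would cover the scheduling limitation described in the question; a second call with a year-long date range covers the "previous year" report.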

Related

SOAP API method for SSRS report usage statistics

I've developed a .Net web application using the SOAP API (ReportingService2010) to list details on SSRS reports.
For the next step, I need to get some usage statistics, such as which reports are accessed the most and how frequently they are accessed.
I know you can get some of this from the ExecutionLog table, but I'd like to avoid the SQL approach. Is there a way to get usage statistics like this directly through the SOAP API?
Thanks.
Nope. Best you can get from the stock API is snapshot/cache history information. You could, however, extend the existing API (pulling the information from ExecutionLogStorage). Even though you'd still be building the methods yourself, at least you could wrap them up nicely within the existing webservice.

Execute an SSRS Report from ASP.Net (unseen) supplying parameters, including printer

I'm new to SQL Server Reporting Services. I can't seem to find the right walkthrough, but maybe I missed it.
I have an application that generates an order, and, once completed, I want it to generate a report that is sent directly to a printer based on the user generating the report. Is there a good way to do this with SSRS? I have the sample report deployed and I can run it from the Reports page.
Can anyone tell me if there is just not a good way to do this? Years ago, I used Word with VB6 to automatically send reports on a data sync with tablet inspection software for about 10 different inspectors, all organized in MS Access. But that was not server-side. Is it really not possible to fire off a report with a couple of parameters and have SQL Server Reporting Services send it to a specified printer?

How to log and analyze certain user actions on my website

I have a simple page that provides a search experience. It allows users to search via a query form, filter results, and perform more in-depth searches based on the results of the first search.
I would like to get some metrics around the user experience and how they are using the page. Most of the user actions translate in a new query string. For example:
how many users perform a search and then follow up with another search / filter
how many times a wildcard is used in the search query
how many results does a user browse before a new search
I am also limited in using Google Analytics and the like because of copyright issues (maybe I can make a case for Open Web Analytics or something similar if that really is the way to go). Server-side, I am thinking of using cookies to track users and log4net to log what they do, then dumping the info into a database and doing the analysis from there. Or logging to the event viewer and using the Log Viewer to get the info from there.
What do you think is the overall better approach?
I would recommend you use an existing, off-the-shelf solution for this, rather than building your own - it's the kind of project that very rapidly grows in size. You go from the 3 metrics in your question to "oh, and can you break that down by the country from which the user browses?", "what languages affect the questions?", "do they end up buying anything if they click results for bananas?". And then, before you know it, you've built your own web analytics tool...
So, you can either use "web analytics as service" offerings like Google Analytics, or use a more old-fashioned log-parsing solution. Most of the questions you want to answer can be derived from the data in the IIS web logs; there are numerous applications to parse that data, including open source and free solutions.
It's been a long while since I used a log-file-based analytics tool, but my ISP provides AWStats, which seems pretty good. To do what you want, you'll have to set up specific measurements around your search page; I'm not sure whether AWStats supports that (Google Analytics definitely does). Check the Wikipedia list of log file analysis tools for ones that do.
Obviously you need to log every submit of the search page.
In particular you need to log:
DateTime.Now
SearchString
SessionID
You could also store a counter in the Session that is incremented each time a user loads a page that is not the search page.
When the user performs a search, you could read that value from the session, store it in the database, and reset the counter.
Be aware that the metric of "how many results does a user browse before a new search" should only be taken as an estimate and not as an exact metric, due to cookie support, multi-tab browsing, page reloads, et cetera.
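The counter logic described above can be sketched in a framework-agnostic way; here `session` is a plain dict standing in for whatever session store your stack provides (e.g. ASP.Net Session), and the field names are illustrative:

```python
# Sketch of the per-session counter: increment on every non-search page
# view, then snapshot and reset it when the user submits a new search.
from datetime import datetime

def on_page_view(session):
    """Called for every page load that is NOT the search page."""
    session["pages_since_search"] = session.get("pages_since_search", 0) + 1

def on_search(session, search_string, log):
    """Called on each submit of the search page: log, then reset."""
    log.append({
        "timestamp": datetime.now().isoformat(),   # DateTime.Now
        "search": search_string,                   # SearchString
        "session_id": session.get("id"),           # SessionID
        "pages_browsed": session.get("pages_since_search", 0),
    })
    session["pages_since_search"] = 0  # reset the counter
```

With log4net the `log.append` line would instead write a structured log entry, which you can later bulk-load into the database for analysis.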

Fastest way to get basic information from google analytics api

My GA account has a number (50) of profiles associated with it, and I am trying to build an API client that shows me basic information like visits, bounce rates, etc. for each profile.
This query gets me what I want from GA, but for each profile:
URL ="https://www.google.com/analytics/feeds/data?ids=ga:11111&start-date=2011-07-01&end-date=2011-07-02&metrics=ga:visitors&prettyprint=true&alt=json"
The id is the table ID, and the metrics parameter gives me the information I want.
Now the problem is, I want to show all the information together. So every time, I will have to send 50 requests to the API, which just doesn't work out. Is there a way I can get the information for all the profiles associated with me in a single request?
You unfortunately will be required to perform 50 requests if you want metrics for 50 different profiles. You can easily automate this, however, by using a combination of the Management API and the Data Export API.
The Management API allows you to pull information about the account. For example, you can very easily pull all profile IDs and names associated with an Analytics account through this API for use in an automated query.
The Data Export API, which I am sure you already are familiar with, is the only way to pull collected data/statistics for individual profiles.
If you are concerned about speed, you might want to build an automated process that uses both the Management API and the Data Export API. Pull all of the profiles associated with your account with the Management API, then loop through each and pull the basic data you'd like through the Data Export API. Have this run at regular intervals based on your needs and cache it between runs. This way it won't execute every time the page is hit (though you honestly might be fine, depending on your traffic - I've found it to be extremely quick).
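The loop described above can be sketched as follows; the profile IDs here are placeholders for the list you would normally pull from the Management API, and the URL shape simply mirrors the one in the question:

```python
# Build one Data Export API query URL per profile, then fetch them in a
# loop (and cache the results between runs). Profile IDs are placeholders.
BASE = "https://www.google.com/analytics/feeds/data"

def build_query(table_id, start, end, metrics="ga:visitors"):
    """Query URL for a single profile, mirroring the URL in the question."""
    return (BASE + "?ids=ga:" + table_id +
            "&start-date=" + start + "&end-date=" + end +
            "&metrics=" + metrics + "&alt=json")

# Normally fetched via the Management API's profile listing:
profile_ids = ["11111", "22222"]
urls = [build_query(pid, "2011-07-01", "2011-07-02") for pid in profile_ids]
```

Each URL is then fetched with an authorized HTTP request; running this on a schedule and caching the merged results keeps page loads fast even with 50 profiles.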

Tracking CPU and memory usage with Google Analytics

I'm looking for a good way to track CPU and memory usage for a variety of web applications and to be able to cross-reference this information with information on Google Analytics. For example, I'd like to be able to generate a report that shows the CPU and memory usage along with number of hits averaged over minute periods. One way I thought this could be solved is by adding custom page-level variables to Google Analytics for tracking CPU and memory usage. My questions:
For those familiar with GA reporting as it pertains to custom variables, is this possible?
Is there a better way to generate the kind of report I'm seeking? Perhaps even without using GA?
Thanks.
You can use the Google Analytics API to push this data directly from the web page via JavaScript, or from the server using whatever language is relevant.
I've seen at least one large implementation use the API for UX A/B testing by way of event tracking, but there's no reason you couldn't store whatever related data you'd like.
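For the "without GA" variant the question asks about, one server-side approach is to sample CPU/memory yourself and bucket the samples by minute, so they can be joined against per-minute hit counts exported from GA. This is a sketch under that assumption; how you actually read CPU/memory (psutil, perf counters, etc.) is left abstract:

```python
# Sketch: bucket resource samples by minute so they can be cross-referenced
# with per-minute hit counts from GA. The sampling source itself is not
# shown -- record_sample() is fed whatever cpu%/mem% values you collect.
from datetime import datetime

def minute_key(ts):
    """Truncate a timestamp to the minute -- the join key against GA data."""
    return ts.strftime("%Y-%m-%d %H:%M")

def record_sample(samples, ts, cpu_pct, mem_pct):
    """Append one (cpu, mem) sample to its minute bucket."""
    samples.setdefault(minute_key(ts), []).append((cpu_pct, mem_pct))

def minute_averages(samples):
    """Average cpu%/mem% per minute, ready to join with GA hit counts."""
    return {k: (sum(c for c, _ in v) / len(v),
                sum(m for _, m in v) / len(v))
            for k, v in samples.items()}
```

Joining `minute_averages()` output with GA's per-minute visit counts on the minute key produces exactly the kind of combined report described in the question.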