I'm currently writing a script that populates Google Calendars for 50 people over 6 months, at one event per day. These events can have different start/end times and may or may not be whole-day events. The calendars will all live on the same Google account, and users will subscribe to them from their personal accounts.
The data for the calendars is originally stored in a Google Spreadsheet. My script reads this spreadsheet in one step using the getRange().getValues() method, as recommended, but I can't find a way to "batch" the writing except by calling the .createEvent() method inside a for loop. This loop gets killed about 1/10th of the way through because it runs too long (about 9,000 calendar events for the whole batch, which exceeds the 5-minute execution limit).
If I comment out the .createEvent line, the script runs completely just fine.
Is there a way to optimize writing to Google Calendar (passing an array to .createEvent(), or something else)?
There are no batch methods for the Calendar API; you have to insert events one by one in the "slow" loop.
The solution I use in my scripts is to divide the load into doable chunks. You can mark each row as processed or not, or save somewhere (e.g. in a cell or a script property) the row you stopped at last, and then configure your script to stop by itself once it has done its share (e.g. 3,000 events).
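Here is a minimal sketch of that approach, assuming each row holds a title, a start time and an end time, and using a placeholder calendar ID; a time-based trigger can re-run the function until the stored resume point is cleared:

```javascript
function createEventsInChunks() {
  var props = PropertiesService.getScriptProperties();
  var startRow = Number(props.getProperty('lastRow') || 2); // resume where we left off
  var sheet = SpreadsheetApp.getActiveSheet();
  var lastRow = sheet.getLastRow();
  if (startRow > lastRow) return; // nothing left to do

  // Read all remaining rows in one call: [title, start, end]
  var data = sheet.getRange(startRow, 1, lastRow - startRow + 1, 3).getValues();
  var calendar = CalendarApp.getCalendarById('your-calendar-id@group.calendar.google.com');
  var startTime = Date.now();

  for (var i = 0; i < data.length; i++) {
    // Bail out well before the execution-time limit and remember where we stopped.
    if (Date.now() - startTime > 4 * 60 * 1000) {
      props.setProperty('lastRow', String(startRow + i));
      return; // a time-based trigger can call this function again later
    }
    // For whole-day rows you would call createAllDayEvent instead.
    calendar.createEvent(data[i][0], new Date(data[i][1]), new Date(data[i][2]));
  }
  props.deleteProperty('lastRow'); // all rows processed
}
```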
I'm not sure if you have seen it already, but there's a new dashboard where you can see your quotas for the various APIs. It seems that after you solve this time-limit quota, you'll run into the Calendar quotas. The good news is that this spread-the-load solution will also help you deal with those.
I have successfully created a GTM trigger and tag using the click_text parameter. When I previewed, and again when I published the change, the tag showed up successfully in my Google Analytics 4 debug and real-time tabs. However, I cannot find a recorded total for this new tag trigger anywhere in either GTM or GA4. Does this exist in either of them, or do I need to create an event in GA4 unrelated to what I set up in GTM? I have read most of Google's documentation on this specific step, and it stops flat at this point.
Thank you in advance.
If you see your event in the real-time data report in GA, you're good: the data is in that property. It is, however, not yet available for aggregation, so you won't be able to count these events or use them in other reports yet.
You should wait up to two days for the data to appear in the non-real-time reports. The vast majority of the data will be available for aggregation within one day, however, and some starts showing up within hours. GA 360 (the paid version of GA) shortens the two days to four hours until all data is there.
I'd also suggest using the Adswerve plugin for GA debugging: it prints to the console all dataLayer changes as well as everything that is sent to GA. It's much more comfortable than using the real-time hits report, and it shows you all the dimensions being sent to GA.
I have a mobile app which plays audio tracks. It uses Firebase Analytics to record events such as the names of tracks played. Within the Firebase StreamView one can access trending events and see which tracks are the most played at any given moment. I would like to gain access to this list and use it within my app to display a list of "tracks being played now".
I've looked into gaining access to Analytics trending-event data and think Firebase Cloud Functions may provide a method of extracting the information I need. However, I'm not certain this is the correct, or easiest, method.
Could someone let me know whether extracting trending events is possible and, if so, point me in the correct direction?
Thanks
EDIT - Actually, there is a much better and easier way to get access to real-time events that have occurred in your app over the last 30 minutes. You can do so using the Google Analytics Data API.
Using the API you can filter through the event data for the past 30 minutes and inspect the relevant custom dimensions on the play_track event for the track that was played (or provide a custom-dimension filter to further narrow the event data you get back).
This would be the ideal way to achieve what you're looking for. You might still want to use Cloud Firestore if you'd like to keep a longer record of trending tracks (e.g. over the last hour or the last 24 hours). Also note that the API is still in alpha.
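As an illustration, here's a minimal Node.js sketch against the Data API's realtime report; the property ID is a placeholder, and it assumes an event-scoped custom dimension named track_name has been registered on the property:

```javascript
const { BetaAnalyticsDataClient } = require('@google-analytics/data');

const client = new BetaAnalyticsDataClient();

async function getTrendingTracks() {
  const [response] = await client.runRealtimeReport({
    property: 'properties/123456789', // placeholder property id
    dimensions: [{ name: 'customEvent:track_name' }],
    metrics: [{ name: 'eventCount' }],
    // Only look at play_track events from the last 30 minutes.
    dimensionFilter: {
      filter: {
        fieldName: 'eventName',
        stringFilter: { value: 'play_track' },
      },
    },
  });
  return (response.rows || []).map((row) => ({
    track: row.dimensionValues[0].value,
    plays: Number(row.metricValues[0].value),
  }));
}
```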
-- END OF EDIT
Other Solutions
Option 1 - Use Cloud Firestore
This is probably the easiest solution: you can record which tracks are being played whenever the event occurs by creating a simple collection in Cloud Firestore and updating the records for played tracks there. It requires additional effort to log and retrieve played tracks beyond just using Google Analytics, but it should be straightforward to implement (see the sketch after the pricing note below).
Note you'll probably want to check out the Firestore pricing guide first before selecting this option.
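A minimal sketch with the Firebase v9 web SDK, assuming a hypothetical trendingTracks collection keyed by track ID:

```javascript
import { getFirestore, doc, setDoc, increment, serverTimestamp } from 'firebase/firestore';

// Call this wherever the app logs the play_track analytics event.
async function logTrackPlay(trackId, trackName) {
  const db = getFirestore();
  await setDoc(
    doc(db, 'trendingTracks', trackId), // hypothetical collection name
    {
      name: trackName,
      playCount: increment(1),          // atomic counter, no read required
      lastPlayedAt: serverTimestamp(),
    },
    { merge: true }                     // creates the document on first play
  );
}
```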
Option 2 - Using Firebase Cloud Functions
You can trigger a Cloud Function each time a play_track event is logged. The event will need to be marked as a conversion event in order for it to trigger a Cloud Function. Within the function you can access the event parameters to identify which track is being played and, over time, maintain a record somewhere of which tracks are played, so you can determine the most trending ones. To maintain state you could use something like Firestore to keep track of which tracks are being played at the moment (see the sketch after the caveats below).
A couple of caveats about this approach:
You'll want to check out the Cloud Functions for Firebase pricing guide to make sure it falls within an acceptable range for your needs.
Cloud Functions triggers for analytics events currently only work for Android and iOS apps (there is no support for web apps).
Google Analytics triggers for Cloud Functions are currently in beta.
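With those caveats in mind, here is a minimal sketch of such a trigger, assuming play_track is marked as a conversion and carries a track_name parameter (both assumptions about your setup):

```javascript
const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

// Fires once per logged play_track conversion event (Android/iOS only).
exports.recordTrackPlay = functions.analytics.event('play_track').onLog((event) => {
  const trackName = event.params && event.params.track_name;
  if (!trackName) return null;

  // Keep a running play counter per track in Firestore.
  return admin.firestore().collection('trendingTracks').doc(trackName).set(
    { playCount: admin.firestore.FieldValue.increment(1) },
    { merge: true }
  );
});
```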
Option 3 - Using BigQuery for your analytics data
This option requires a bit more effort to set up, but you can export your Google Analytics data to BigQuery and query the generated intraday tables to see which tracks are trending, along with many additional insights.
The caveats with this approach are that you'll also need to check the BigQuery pricing guide to make sure it fits your needs, and that you'll need to execute a query to retrieve the list of tracks (or fetch a cached result).
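For example, a minimal Node.js sketch against a GA4 BigQuery export, where the project and dataset names are placeholders and track_name is again an assumed event parameter:

```javascript
const { BigQuery } = require('@google-cloud/bigquery');

async function getTrendingTracks() {
  const bigquery = new BigQuery();
  // Today's export lands in the events_intraday_* tables.
  const query = `
    SELECT
      (SELECT value.string_value FROM UNNEST(event_params)
       WHERE key = 'track_name') AS track,
      COUNT(*) AS plays
    FROM \`your-project.analytics_123456789.events_intraday_*\`
    WHERE event_name = 'play_track'
    GROUP BY track
    ORDER BY plays DESC
    LIMIT 10`;
  const [rows] = await bigquery.query({ query });
  return rows; // [{ track: '...', plays: 42 }, ...]
}
```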
In our project we have stored all user event data in our database for over a year, but it's not indexed.
Now we are going to use Google Analytics to store our analytics data and analyze the reports using the Google Analytics dashboard.
But before we start using Google Analytics, I would like to migrate all the old statistics (about 2 million events) to Google Analytics.
For this I should use the Measurement Protocol, and its limits allow me to transfer 2 million hits with no problem.
But I couldn't figure out how to set the time of an event. The Measurement Protocol has Queue Time, but Google says:
Values greater than four hours may lead to hits not being processed.
How is it possible to transfer 2 million events to Google Analytics with their original event times?
Thanks
You are correct that you can use the Measurement Protocol to send event data directly to Google Analytics, and I don't see any problem with sending 2 million events. However, it's not possible to set an event time more than four hours in the past.
Queue Time is used to set the time that the event occurred. As you can see, it can't be more than four hours ago, and I have found that if you do set it to the full four hours, it's a bit fuzzy whether the data comes out correct. This feature is probably most useful on mobile devices, which may go offline for a short time: you can store the data and then send it all once the device is back online.
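For reference, a minimal sketch of a Universal Analytics Measurement Protocol event hit using Queue Time (qt, in milliseconds); the tracking ID and client ID are placeholders:

```javascript
const params = new URLSearchParams({
  v: '1',                          // protocol version
  tid: 'UA-XXXXX-Y',               // tracking id (placeholder)
  cid: '555',                      // client id (placeholder)
  t: 'event',                      // hit type
  ec: 'audio',                     // event category
  ea: 'play',                      // event action
  qt: String(2 * 60 * 60 * 1000),  // "this happened 2 hours ago"
});

fetch('https://www.google-analytics.com/collect', {
  method: 'POST',
  body: params.toString(),
});
```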
So the dates will be the dates on which you sent the events to Google Analytics; you can't backdate the data by more than four hours. I am therefore not sure how much use the data will be to you once it's all inserted.
There is no way to do this, but you can make it easier on yourself.
Unfortunately, there is no way to add, remove, or otherwise edit Google Analytics hit data retrospectively, except to delete all of it. You also cannot copy it, move it between accounts, or download it all.
You are not the first to have to come to terms with this.
In this situation, we recommend that our clients run their new and old systems in parallel for a testing period (usually 6 months or a year) before switching one of them off.
Yes, it's difficult to let go of old data, but sometimes it has to be done.
I lost two days of event tracking due to a trigger being changed. I have the data in an Excel file and need to import it into my GA reports. Is this even possible? From what I have read, it seems that I could create a custom metric for the lost data and use a custom dimension that I already send with the event as the key. Does anyone have any insight into a solution?
Data Import does not change data that has already been collected (unless you have GA Premium, which can apparently apply data via query-time imports). Also, you cannot create hits with Data Import, so you cannot create events retrospectively. (I also don't quite understand the last part of the question: if you haven't sent the event, then you haven't sent any custom dimensions attached to it; custom dimensions are only sent together with interaction hits.)
If it were just a few hours, you might experiment with Measurement Protocol hits and the queue time parameter, but even though the 4-hour maximum queue time is apparently not a hard limit, it will certainly not work for two days. Plus, you would have a hard time connecting those hits to an existing session.
All in all I don't think there is a way to do this.
I'm using the Google Analytics API to get the number of page views for each page of my website. To reduce the number of API calls, I fetch on a set interval and cache the data on my server. On each API call, I try to get the page views of every page on my site and update them in my database.
Is there a way to get only CHANGED DATA from a specific timestamp? For example, only page views that changed within the last 2 hours.
I think it would be some kind of filter (if one exists), but I could not find it in the documentation here: https://developers.google.com/analytics/devguides/reporting/core/v3/reference#filters
You could add a filter on ga:dateHour so that only the last two hours come back. But the problem is that it takes Google around 4 hours to process the data, so you wouldn't get anything back for two hours ago.
If you want to see data that fresh, you have to use the Realtime API: https://developers.google.com/analytics/devguides/reporting/realtime/v3/
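For instance, a minimal sketch of a Real Time Reporting API v3 request; the view (profile) ID and access token are placeholders:

```javascript
const params = new URLSearchParams({
  ids: 'ga:XXXXXXXX',        // view (profile) id, placeholder
  metrics: 'rt:pageviews',
  dimensions: 'rt:pagePath',
});

fetch(`https://www.googleapis.com/analytics/v3/data/realtime?${params}`, {
  headers: { Authorization: 'Bearer YOUR_ACCESS_TOKEN' }, // placeholder token
})
  .then((res) => res.json())
  .then((data) => console.log(data.rows));
```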
What exactly is your query currently? If you request ga:date, ga:dateHour, and ga:pagePath with ga:pageviews, the results will all be returned in one query (not counting next pages); that's a long way from the 10,000-queries-per-day limit.
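That suggestion as a raw Core Reporting API v3 request, following the same pattern as the realtime sketch above (view ID and token are again placeholders):

```javascript
const params = new URLSearchParams({
  ids: 'ga:XXXXXXXX',        // view (profile) id, placeholder
  'start-date': 'today',
  'end-date': 'today',
  metrics: 'ga:pageviews',
  dimensions: 'ga:date,ga:dateHour,ga:pagePath',
});

fetch(`https://www.googleapis.com/analytics/v3/data/ga?${params}`, {
  headers: { Authorization: 'Bearer YOUR_ACCESS_TOKEN' }, // placeholder token
})
  .then((res) => res.json())
  .then((data) => console.log(data.rows));
```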
On a side note: what do you mean by "changed"? Nothing is going to change in data that has already been processed.