Google has directed me to post any questions regarding BigQuery on Stack Overflow.
I have an app linked to Firebase collecting a few custom events. This has been running since August 2019. I've linked my Firebase project to BigQuery (Sandbox).
I've run a query to get a breakdown of 2 events, which works fine, but only for the last 2 days. When I put in a date range from, say, 1 August 2019 to 1 April 2020, I get the exact same number. If I choose a date range prior to the last 2 days, I get no results.
I believe that only the last 2 days' worth of data is being linked across.
How do I get access to all the data I have in Firebase Analytics in BigQuery? Is this a Sandbox limitation?
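For illustration, the kind of query I'm running looks roughly like this (project, dataset, and event names below are placeholders, not my real ones):

```python
# Sketch of the query, using the google-cloud-bigquery client and the
# standard Firebase export wildcard tables. All IDs and names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

query = """
SELECT
  event_date,
  event_name,
  COUNT(*) AS event_count
FROM
  `my-project.analytics_123456789.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20190801' AND '20200401'
  AND event_name IN ('my_event_one', 'my_event_two')
GROUP BY
  event_date, event_name
ORDER BY
  event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.event_name, row.event_count)
```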
Thanks
I have a script that creates Google Calendar events (via the Google Calendar API), inviting/auto-accepting the classroom, teacher and students (all users within our G Suite for Education account). (Yes, it is written in Perl but I don't think that is the problem.) Using this script, I manage 500-600 calendar events per school day.
There is enough rate limiting in the code and quota available in Google API Console that I can create a couple months worth of events in a nightly run. So I usually push one grading period into Google Calendar at a time. (I have over 37,000 events for this 2021-22 school year already pushed to Google Calendar.) This has worked since August 2018.
But, for the past month or so, if I try to create events after mid-January 2022, I get a "Forbidden (403)" after about 50 events are created. However, if I need to change 2021 or early 2022 events (for example, there is an assembly scheduled at school and the class times change, or a class moves from one room to another), I can delete/update/create the usual thousands of events per run with no problem.
As an example, tonight's run deleted and re-created 517 events for January 5, 2022 (there was a schedule change for that day) and made a few other miscellaneous changes, but only created 50 events for January 13, 2022 before a "Forbidden (403)". I'm not going to be able to create anything for a few hours. But, after that (or tomorrow), I'll be able to create 50 more events and then hit the same error again.
Did I miss a change to the API effective with events scheduled in the second week of 2022?
If you are confident that the quotas you use are way below the limits, you might be affected by a bug.
There is already an issue filed on Google's Issue Tracker where several users complain about the limit being reached while the console shows only 2% of the limit being used, which is similar to your case.
I recommend you "star" this issue to increase its visibility.
However, beforehand it might be worth contacting Google Workspace support. They can look into your quota usage and see if you are hitting some "hidden quota" you are not aware of.
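Separately, if the 403s turn out to be rate-limit responses rather than a hard quota, the standard mitigation is exponential backoff around the insert call. A minimal Python sketch (your script is Perl, so this is purely illustrative; it assumes an authorized service object built with google-api-python-client):

```python
# Illustrative only: retry event creation with exponential backoff on 403/429.
# Assumes `service` was built with googleapiclient.discovery.build("calendar", "v3", ...).
import time
from googleapiclient.errors import HttpError

def insert_event_with_backoff(service, calendar_id, event_body, max_retries=5):
    delay = 1
    for attempt in range(max_retries):
        try:
            return service.events().insert(
                calendarId=calendar_id, body=event_body
            ).execute()
        except HttpError as err:
            # Back off on rate-limit style errors; re-raise anything else.
            if err.resp.status in (403, 429) and attempt < max_retries - 1:
                time.sleep(delay)
                delay *= 2
            else:
                raise
```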
I'm getting campaigns/adGroups reports from the Sponsored Brands/Sponsored Products Amazon Advertising API. When I select a reportDate older than 60 days, I get the error "Report date is too far in the past. Reports are only available for 60 days." (code 406). Is it really not possible to get older reports? Or do older reports need to be queried differently? Also, is it possible to get a report for a time period longer than one day in a single request?
There is information about the "reportDate" parameter saying it is "The date for which to retrieve the performance report in YYYYMMDD format. The time zone is specified by the profile used to request the report. If this date is today, then the performance report may contain partial information. Reports are not available for data older than 60 days." - but does that apply to all reports, always?
It seems strange to me, as other services normally offer more than 2 months of stats, and there is also a note in the documentation that says "Note: New-to-brand metrics are calculated from November 1, 2018. If a report date is requested earlier than this date, the metrics will be calculated from November 1, 2018."
Thank you for explanation and your help!
Ela
No, you cannot get data older than 60 days through the API. The data does exist within Amazon's databases, but cannot be accessed via API.
If you have a vendor manager contact or something similar, it's theoretically possible to request this data from them, but they'd probably only do it as a one-off for a large client.
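In practice, the workaround is to pull each day's report while it is still inside the 60-day window and archive it yourself, so that history accumulates over time. A rough sketch (request_report below is a placeholder for your actual report request/download logic):

```python
# Sketch: walk backwards over the last ~60 days and archive each daily report
# locally. `request_report` is a placeholder for the real Amazon Advertising
# API report request/poll/download sequence.
import json
from datetime import date, timedelta
from pathlib import Path

ARCHIVE_DIR = Path("ad_reports")
ARCHIVE_DIR.mkdir(exist_ok=True)

def request_report(report_date: str) -> dict:
    """Placeholder: request and download the campaign report for one day."""
    raise NotImplementedError

today = date.today()
for offset in range(1, 60):  # the API only serves roughly the last 60 days
    report_date = (today - timedelta(days=offset)).strftime("%Y%m%d")
    out_file = ARCHIVE_DIR / f"campaigns_{report_date}.json"
    if out_file.exists():
        continue  # already archived on a previous run
    out_file.write_text(json.dumps(request_report(report_date)))
```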
I recently started using BigQuery with Firebase and Google Analytics and it was working fine a few days ago, but not anymore. As I can see, there are 2 tables:
app_events_, which should contain the history, and
app_events_intraday_, which should hold the current day (realtime).
Today is 13 July and app_events_ is "stuck" on 8 July, while app_events_intraday_ is "stuck" with data for 9 July and 10 July.
Is this normal behavior? Has anyone experienced this problem? Is it possible that this is a bug in the platform?
Recently Firebase changed the way they send data to BigQuery. They used to create a dataset per platform, and the app_events_ and app_events_intraday_ tables would go in that dataset. Now they send the data to a single dataset whose name starts with analytics_, which contains events and events_intraday tables for all platforms. You may want to check this dataset.
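For example, you can list the tables in that dataset to see which daily shards actually exist (project and dataset IDs below are placeholders):

```python
# Sketch: list the tables in the analytics_<property_id> dataset to check
# which events_YYYYMMDD / events_intraday_YYYYMMDD shards are present.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID
for table in client.list_tables("my-project.analytics_123456789"):
    print(table.table_id)
```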
We are having some issues pulling yesterday's Google Analytics data from BigQuery. Can anyone explain at what point a previous day's GA data is finalized?
There is some explanation here of the intraday tables, but it's not very clear:
https://support.google.com/analytics/answer/3437719?hl=en
To get the previous day's data, do you need to use the intraday tables at all? Do you have access to the fully processed dataset at 8 am local time? Or is it 8 hours after the current day ends in UTC+14:00 (etc.)?
I had a similar question and asked their support; this is the reply:
"According to this Google Analytics documentation , it states that '1 file will be exported each day that contains the previous day’s data, and 3 files will be exported each day that contain the current day's data'. In such, the minimum time that the data from Google Analytics to be exported to BigQuery was 8 hours. Although Google Analytics can be linked to BigQuery, the availability of data depends on how it was served by Google Analytics 360."
But based on experience, it's really a minimum time. Sometimes there are delays of 4-5 hours.
My team has been pressing Google's support to provide SLAs for the BigQuery dump, so they updated the documentation:
This feature is not governed by a service-level agreement (SLA).
In practice we are experiencing regular delays of anywhere between 2 and 12 hours.
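Given those delays, a common pattern is to poll for the previous day's table before kicking off downstream jobs. A rough sketch, assuming the standard ga_sessions_YYYYMMDD naming and placeholder project/dataset IDs:

```python
# Sketch: wait until yesterday's ga_sessions_YYYYMMDD table has been exported
# before running downstream queries. Project and dataset IDs are placeholders.
import time
from datetime import date, timedelta

from google.cloud import bigquery
from google.cloud.exceptions import NotFound

client = bigquery.Client(project="my-project")
yesterday = (date.today() - timedelta(days=1)).strftime("%Y%m%d")
table_id = f"my-project.123456789.ga_sessions_{yesterday}"

while True:
    try:
        client.get_table(table_id)  # raises NotFound until the export lands
        break
    except NotFound:
        time.sleep(15 * 60)  # check again in 15 minutes
```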
I'm using "Reporting google Analitics API" and I can’t find information about what the last “end date” with data in Analytics is.
For example, let's suppose you want to retrive the last month’s data.
When do you have to perform the query?
The first day of the current month?
...or the second one?
...or maybe the third one?
And just one more question: is the returned data for days in Pacific time?
Google Analytics API is supposed to have access to the same data you have in the interface.
Google says that data can take up to 24h to process. The time it takes to really update the data depends on the type and size of the account. Small accounts are updated multiple times a day and can have data available in just a few hours. Once you reach 1M hits a month you are moved to a different mode where the data on your account is updated only once a day. Google Analytics Premium customers have updates more often even for large amounts of traffic.
There's no way to tell through the API exactly what time the last processed hit occurred. You can query the data for today by the hour and see for yourself though.
Usually you don't care and just want to make sure that the data you're querying has been fully processed for that day.
So if you query data for yesterday there's a chance it has not been completely updated; for example, if it's just past midnight, yesterday ended only a couple of minutes ago and the data probably hasn't been completely processed yet. The safest bet in this case is to query data for 2 days ago.
So if today is 2012-06-15 and you want to get 1 month of data, a safe approach is to query with start-date=2012-05-13 and end-date=2012-06-13. This will most of the time give you data for days that have been fully processed, but it's not 100% safe either. Google Analytics has had outages in the past where data took longer than that to process, though these are not usual. When you get the data out it's really hard to tell just from the API whether the data for those days has been fully processed or not; using the 2-days-ago idea you just make it more likely that it has.
The days are aggregated following the timezone settings configured on the Google Analytics profile.
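To make the "2 days ago" rule concrete, the date arithmetic looks like this (only the start and end dates are computed; the actual report request is omitted):

```python
# Sketch: compute a "safe" one-month window ending two days ago, so the
# requested days are likely to be fully processed by Google Analytics.
from datetime import date, timedelta

end_date = date.today() - timedelta(days=2)   # e.g. 2012-06-13 when today is 2012-06-15
start_date = end_date - timedelta(days=31)    # roughly one month earlier, e.g. 2012-05-13

params = {
    "start-date": start_date.isoformat(),
    "end-date": end_date.isoformat(),
}
print(params)  # pass these dates to the Reporting API query
```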