Already exists (HTTP 409) error in BigQuery/Google Analytics export

We have recently set up our import from Google Analytics to Google BigQuery, but alongside a successful 'Update Table' there are two errors: 'Failed: Create Dataset' and 'Failed: Insert Dataset'.
The error message is below:
Already exists (HTTP 409): Already Exists
Does anyone know why I'm getting this error? There doesn't seem to be an issue with the data so I'm not sure what is causing it.

This is a known "issue" with the BigQuery export from Google Analytics. As far as I know, it is a side effect of how the export is currently configured.
The Google Analytics team is working on a fix, but you should be able to ignore the errors as long as the exports themselves succeed.
In other words, if you are not experiencing any other issues, you shouldn't be concerned about those two error messages.

Authenticating Google Cloud Storage in RStudio

I know a similar question has been asked (link), but the response didn't work for me.
TLDR: I keep running into errors when trying to authenticate Google Cloud Storage in RStudio. I'm not sure what is going wrong and would love advice.
I have downloaded both the GCS_AUTH_FILE (I created a service account with service admin privileges and downloaded the key associated with it) and the GAR_CLIENT_WEB_JSON (I created an OAuth 2.0 Client ID and downloaded the associated JSON file).
I've tried authenticating my Google Cloud Storage in several ways and hit different errors.
Way 1 - automatic setup:
gcs_setup()
Then I select any one of the options and get the error: 'Error in if (file.exists(local_file)) { : argument is of length zero'. That error happens no matter which of the three options I select.
Way 2 - basic, following manual setup instructions from the package:
Sys.setenv("GCS_DEFAULT_BUCKET" = "my-default-bucket",
           "GCS_AUTH_FILE" = "/fullpath/to/service-auth.json")
gcs_auth()
In this case, GCS_AUTH_FILE is the file that I mentioned at the beginning of this post, and GCS_DEFAULT_BUCKET is the name of the bucket. The first line seems to work (nothing goes awry and it runs just fine), but when I run gcs_auth() I get taken to a web browser page that states:
"Authorization Error
Error 400: invalid_request
Missing required parameter: client_id"
Way 3: Following the method from the post that I linked above
This way involves manually setting the .Renviron file with the GCS_AUTH_FILE and GAR_CLIENT_WEB_JSON locations, and then running gar_auth(). Yet again, I get the exact same error as in Way 2.
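For reference, the .Renviron entries for Way 3 would look something like this (the paths are placeholders for wherever the two JSON files were saved):

```
GCS_AUTH_FILE="/fullpath/to/service-auth.json"
GAR_CLIENT_WEB_JSON="/fullpath/to/client-web.json"
```

Note that the R session has to be restarted after editing .Renviron for the values to be picked up.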
Any ideas about what could be going wrong? Thanks for your help. I wasn't sure how to put in totally reproducible code in this case, so if there is a way I should do that, please let me know.

How to solve a data source error when loading Google Analytics data in Power BI?

I would like to load data from Google Analytics into Power BI.
After transforming the data in the Query Editor, I apply the changes.
At first, I see the message 'Waiting for www.googleapis.com' and the number of rows increases.
After a while, I get the following error message:
Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: [DataSource.Error] There was an internal error..'
Rows with errors have been removed in one of the steps and I have a stable Internet connection.
Does anyone have suggestions on how to solve this?
I was also facing this kind of refresh issue. First, go to the Query Editor and verify the data types, changing them if needed. If you still face the error after that, keep app.powerbi.com open while refreshing your Power BI dashboard. I followed the above steps and my issue is now resolved.

Why does Stackdriver mess up my error grouping?

In my experience the Stackdriver Error Reporting service groups unrelated errors together. This is a big problem for me on several levels:
The titles often do not correlate with the errors reported in "recent samples", so I have to look at the samples for each error to see which errors really happened, because the title really can't be trusted.
I might set an error to "muted" and as a result other errors that are grouped under the same title don't get reported anymore. It might take me months to discover that certain errors have been happening that I wasn't aware of.
In general, I have no overview of which errors are happening at what rate.
This all seems to violate basic functionality for an error reporting system, so I think I must be missing something.
The code is running on Firebase Functions, i.e. the Firebase flavour of Google Cloud Functions, and is written in TypeScript (compiled to JavaScript with a Firebase predeploy script).
I log errors using console.error with arguments formatted as Error instances, like console.error(new Error('some error message')). AFAIK that is the correct way for code running on Node.js.
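To illustrate the pattern in question, here is a minimal sketch of that logging style; Error Reporting groups by the parsed stack trace, so passing a real Error instance (which carries a .stack property) is what gives it something to parse, while a bare string leaves it only the message text:

```javascript
// Hypothetical function standing in for any application code that fails.
function doWork() {
  throw new Error("some error message");
}

try {
  doWork();
} catch (err) {
  // Logging the Error instance serializes its stack trace,
  // which the error-reporting backend can parse and group on.
  console.error(err);
  // By contrast, this logs only "Error: some error message" with no stack.
  console.error(String(err));
}
```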
Is there anything special I can do to make Stackdriver understand my code better?
I have this in a root of my functions deployment:
import * as sourceMaps from "source-map-support";
sourceMaps.install();
Below is a screenshot of one error category. You see that the error title is "The service is currently unavailable", yet the samples contain errors for "Request contains an invalid argument" and "This request was already locked..."
The error about service and invalid argument could be related to the FCM service, so there is some correlation although I think these are very different errors.
The error about request lock is really something completely unrelated. The word "request" in this context means something really different but the word is the only relationship I can see.
Error Reporting supports JavaScript but not TypeScript, as mentioned in the documentation for the product. Nevertheless, you should take a look at your logs and check whether they are properly formatted to be ingested by Error Reporting.
Also, keep in mind that errors are grouped based on the guidelines in this document, so you may not get the grouping you expect because of them.
Hope you find this useful.

Why is Bing Custom Search API throwing an error when I make a call through node-js?

I had been using the Bing Custom Search API for the past week with no problems thanks to the free trial, but today I tried upgrading to the S1 plan, since the API had started sending error messages. I tried regenerating the key as well, but despite doing both of these things I was still getting errors and was unable to use the API.
However, I was able to make calls using: https://www.customsearch.ai/applications, where I was able to use my API key to test endpoints and get the results I expected. What baffles me is that my nodejs code which hasn't been modified aside from the subscription key should still work with the upgraded plan, but it doesn't.
I should be able to help here. Let's do a bit of troubleshooting:
First, go to https://www.customsearch.ai/applications -> click on your instance name -> click on the "Production" tab at the top -> try making an API call on that page by providing the query and the subscription key you got. If this works, go to the next step.
On the page mentioned above you will have seen the Custom Configuration ID and the Subscription Key. Make sure those two are the same in your Node.js code. Ideally this should work.
If it still doesn't work, please share your error code so that I can get a better understanding of the error you are getting.
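As a sanity check for step 2, it can help to build and inspect the request before sending it. A minimal sketch, assuming the v7 Custom Search endpoint with its "q" and "customconfig" query parameters and the Ocp-Apim-Subscription-Key header; the key and config id values here are placeholders, not real credentials:

```javascript
// Build (but do not send) a Bing Custom Search v7 request, so the
// URL and headers can be inspected against the Production tab values.
function buildSearchRequest(query, customConfigId, subscriptionKey) {
  const url = new URL("https://api.cognitive.microsoft.com/bingcustomsearch/v7.0/search");
  url.searchParams.set("q", query);
  url.searchParams.set("customconfig", customConfigId);
  return {
    url: url.toString(),
    headers: { "Ocp-Apim-Subscription-Key": subscriptionKey },
  };
}

const req = buildSearchRequest("my query", "YOUR_CUSTOM_CONFIG_ID", "YOUR_SUBSCRIPTION_KEY");
console.log(req.url);
```

Comparing the logged URL and header against what works in the portal usually narrows the problem down to a mismatched key or configuration id.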

icCube - How to retrieve full error messages for failed Google Analytics API calls

When you hit the "Refresh Datatable" button for a misconfigured datatable inside the Google Analytics plugin, the command fails with the message:
Failed to load columns for table 'My Table' due to following error: 'Google Analytics Error'
This is not very descriptive.
So, how to get the fully qualified error-message which comes from the Google Analytics API?
Is there a way inside the icCube to log the error message which comes from the Google-API?
I tried to set all log-levels to DEBUG, but it did not help.
Alternatively, does Google log these errors, so that I can view them anywhere at the developer console?
It's a bug in the current icCube version and will be fixed in 6.0 RC2 (Issue 254).
When you refresh the datatable, the API call analytics.data.ga.get is triggered, so you can simulate this call inside the Google APIs Explorer.
There you just need to enter the metrics and dimensions as you did in the icCube IDE.
How to get the profile ID is explained here.
After hitting the Execute button you can see the full error message.
A common error is asking for more dimensions or metrics than allowed (max. 10 metrics, 7 dimensions).
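For orientation, a request in the APIs Explorer amounts to filling in parameters for analytics.data.ga.get along these lines; the profile id, dates, metrics, and dimensions below are placeholders for whatever the datatable is configured with:

```
ids:        ga:12345678
start-date: 2019-01-01
end-date:   2019-01-31
metrics:    ga:sessions,ga:pageviews
dimensions: ga:date
```

If the configuration is invalid (e.g. too many metrics), the Explorer's response body then shows the full error message that icCube swallows.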
Of course, it would be much more convenient if the icCube plugin would read the error message and print it inside the IDE or the log...
