I have a script that imports data from Google Analytics to my BI. It was working for years. Recently it stopped working.
After investigation, it turns out that removing ga:adMatchedQuery from dimensions solves the issue.
I checked the changelogs (https://developers.google.com/analytics/devguides/reporting/core/v4/changelog?hl=en, https://developers.google.com/analytics/devguides/reporting/changelog?hl=en, etc.) and do not see any deprecation notices. Are there any?
Regardless, is there any alternative way of getting the visitor's search term for my ads via a Google API?
Update:
Further investigation showed that I can make the request if ga:adMatchedQuery is the only dimension. Adding a second dimension (for example, my custom dimension or ga:hour) breaks the request.
For example:
https://ga-dev-tools.appspot.com/query-explorer/?start-date=2022-09-01&end-date=2022-09-08&metrics=ga%3AgoalCompletionsAll&dimensions=ga%3Asource%2Cga%3Amedium%2Cga%3AadKeywordMatchType%2Cga%3AadMatchedQuery%2Cga%3Acountry%2Cga%3Adate%2Cga%3Ahour&sort=-ga%3AgoalCompletionsAll%2C-ga%3Adate&ids=ANALYTICS_ID produces an empty set.
https://ga-dev-tools.appspot.com/query-explorer/?start-date=2022-09-01&end-date=2022-09-08&metrics=ga%3AgoalCompletionsAll&dimensions=ga%3Asource%2Cga%3Amedium%2Cga%3AadKeywordMatchType%2Cga%3AadMatchedQuery%2Cga%3Acountry%2Cga%3Adate%2C&sort=-ga%3AgoalCompletionsAll%2C-ga%3Adate&ids=ANALYTICS_ID gives me results.
The only difference is that I removed ga:hour from the list of dimensions in the second link.
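For completeness, here is roughly the equivalent request expressed as a Reporting API v4 call in Python, in case someone wants to reproduce this outside the Query Explorer (the credentials file and VIEW_ID are placeholders):

from googleapiclient.discovery import build
from google.oauth2 import service_account

# Placeholder service-account key and Analytics view ID.
creds = service_account.Credentials.from_service_account_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

body = {
    "reportRequests": [{
        "viewId": "VIEW_ID",
        "dateRanges": [{"startDate": "2022-09-01", "endDate": "2022-09-08"}],
        "metrics": [{"expression": "ga:goalCompletionsAll"}],
        "dimensions": [
            {"name": "ga:source"}, {"name": "ga:medium"},
            {"name": "ga:adKeywordMatchType"}, {"name": "ga:adMatchedQuery"},
            {"name": "ga:country"}, {"name": "ga:date"},
            # {"name": "ga:hour"},  # adding this extra dimension is what empties the result
        ],
        "orderBys": [
            {"fieldName": "ga:goalCompletionsAll", "sortOrder": "DESCENDING"},
            {"fieldName": "ga:date", "sortOrder": "DESCENDING"},
        ],
    }]
}
report = analytics.reports().batchGet(body=body).execute()["reports"][0]
print(report["data"].get("rows", []))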
Same here. It seems Google made an update on Sept. 1 to restrict search queries that don't meet their higher privacy threshold; however, it looks like they completely shut off search query reporting in GA. Hopefully they recognize this bug and fix it.
Check https://support.google.com/analytics/thread/178348751/search-console-report-not-showing-search-queries-for-landing-pages?hl=en&authuser=1
I had the same issue and here is my solution.
Since Universal Analytics will no longer process new data in standard properties beginning July 1, 2023, Google is asking us to switch over to a Google Analytics 4 property. I guessed this might be the way to stay up to date, so I migrated my property to GA4 and managed to access that dimension.
In GA4, ga:adMatchedQuery can now be found as firstUserGoogleAdsQuery, sessionGoogleAdsQuery, or googleAdsQuery.
I also found out that the API changed.
The service in Google Cloud that used to provide the data was the "Google Analytics Reporting API"; now you need to use the "Google Analytics Data API" service instead.
Here is the quickstart guide:
https://developers.google.com/analytics...
Metrics and Dimensions:
https://ga-dev-tools.web.app/ga4/dime...
I'm coding in Python, and that video helped me a lot:
https://www.youtube.com/watch?v...
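To make this concrete, here is a minimal sketch of what a Python call against the Data API looks like; the property ID is an example, and the dimension and metric names are ones you should verify against the GA4 dimensions list above:

from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key
# with access to the GA4 property.
client = BetaAnalyticsDataClient()

request = RunReportRequest(
    property="properties/123456789",  # example GA4 property ID
    dimensions=[
        Dimension(name="googleAdsQuery"),  # one GA4 counterpart of ga:adMatchedQuery
        Dimension(name="sessionSourceMedium"),
    ],
    metrics=[Metric(name="conversions")],
    date_ranges=[DateRange(start_date="2022-09-01", end_date="2022-09-08")],
)
response = client.run_report(request)
for row in response.rows:
    print([d.value for d in row.dimension_values],
          [m.value for m in row.metric_values])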
We have had the same issue since 5 September.
I found a solution on GitHub: they say you have to replace ga:adMatchedQuery with ga:keyword.
This works for me. I'm not entirely sure it is exactly the same as adMatchedQuery, but the results look good.
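In practice it is just a one-line change wherever the dimensions are listed in the request (a sketch; the rest of the request stays as it was):

# Before: stopped returning rows when combined with other dimensions.
dimensions = [{"name": "ga:source"}, {"name": "ga:medium"}, {"name": "ga:adMatchedQuery"}]

# After: works for me, though the values may not be strictly identical.
dimensions = [{"name": "ga:source"}, {"name": "ga:medium"}, {"name": "ga:keyword"}]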
Hope this helps you too.
I am currently using the Google Places API on a free trial. I am interested in paying for the API but can't find the exact cost of the two commands that I use: google_places() and google_place_details(). I have contacted the Google sales team and looked at the Places billing page, but I have not managed to find out exactly how much it would cost to execute these two commands.
For google_places(), this is an example of a command I would execute:
google_places(search_string = "Cafeteria in Madrid, Spain", key=key)
From the Places billing page, it seems like this counts as a Text Search, so each time the code is executed it would cost $0.032. Is this the case?
For google_place_details(), here is an example of the command I would execute:
google_place_details(place_id = "ChIJf_XA-F0U04kR1IPYSdTJ4so", key=key)
This command, as well as giving basic place details (which cost $0.017 according to the billing page), returns information that counts as Contact Data (an extra $0.003) and Atmosphere Data (an extra $0.005). It also provides photo data ($0.007 according to the billing page), which I am not interested in but which is automatically included in the results anyway. Does this mean that the cost of executing this command once is these four prices summed up?
I am interested in knowing exactly how much it would cost to execute the two commands I have listed.
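To make the assumption explicit, this is the calculation I have in mind (prices copied from the billing page as quoted above; I may be misreading which SKUs actually apply):

# Back-of-the-envelope per-call cost, assuming every listed SKU is billed on each call.
text_search = 0.032         # google_places() text search

details_basic = 0.017       # Place Details, Basic Data
details_contact = 0.003     # Contact Data fields
details_atmosphere = 0.005  # Atmosphere Data fields
details_photo = 0.007       # photo data returned with the details

place_details_total = details_basic + details_contact + details_atmosphere + details_photo
print(f"google_places():        ${text_search:.3f} per call")
print(f"google_place_details(): ${place_details_total:.3f} per call")  # 0.032 if all four apply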
Perhaps this helps:
First of all, you are billed monthly, and only for what exceeds the 200 Euro/Dollar credit that Google gives you for free each month (this is probably what you described as the "free plan"). So every month you get a bill showing how many requests of each type you sent to Google, with the quantity and price of each "unit" written out quite clearly; from that you can easily work out the per-request cost.
A second option would be your Google API Cockpit.
It tracks your requests quite precisely on different time bases, so sending the commands you want only once in a day can give you an exact total price.
The Cockpit is super handy for different things; if you want, you can even set limits, which is probably helpful in your case too.
Here is the link to the billing monitor as well: Billing Google API Cockpit.
There is also a description of how Google charges you: look here.
best regards
I'm pulling distance/time information for a large number of origin/destination pairs using the Google Maps API in R. I'm currently using the gmapsdistance package but have looked at a few others.
My premium API key includes 100k free queries per day. Are there any packages that can return how many are remaining? For example, the ggmap package has geocodeQueryCheck(). The problem is, I don't think this function actually returns the number remaining on your account: it doesn't ask for your API key, so my guess is that it just keeps track of how many queries it has made today. The latest GitHub version has a register_google() function that does allow you to set your API key, but when I make API requests with the gmapsdistance package, geocodeQueryCheck() doesn't update.
In summary, I just want to know how many queries are left, even if I need to construct the URL request directly. When I look at the API documentation, I don't even see an endpoint for it, which doesn't give me much hope.
As confirmed by @SymbolixAU, there is currently no way to do this.
Sorry, I guess this is late, but have you tried this?
sum(.GoogleDistQueryCount$elements)
I recently attempted to query the Google Analytics API for a report using device category, source, and medium as dimensions. The report covered about four weeks of time. Despite the fact that I was able to build the equivalent ad-hoc report in the UI and get results based on 100% of sessions, I couldn't get the API to give me results based on any more than 1.3% or so of sessions. The client I'm using is based on the v3 API, but I got the same results when using Google's v4 testing tool, so it's not a function of the API version.
According to Google's documentation, ad-hoc reports are supposed to use pre-aggregated unsampled data where possible:
Ad-hoc reports are based on any non-standard query of Analytics data. For example, if you apply a segment or secondary dimension to a standard report, then Analytics has to issue a new, non-standard query of the data to return that information.
The new query goes first to the tables of aggregated data to see if all of the requested information is available there. If the information is not available there, then Analytics queries the complete, unfiltered set of data and computes new aggregates to satisfy the application of the segment or secondary dimension.
This is apparently true of the web UI, but not necessarily of the API. I was under the impression that the web UI was making calls equivalent to those exposed in the API under the hood, but it seems that this isn't the case. Does anybody know whether it's possible to force the API query to use the pre-aggregated data sets that I know are available?
The difference in sampling threshold between the web UI and the API does indeed explain this. This happens to be a 360 account, for which the sampling threshold is much higher than the API permits (the documentation is cagey about exact numbers but apparently it can be "up to 100M sessions"). The same test on a standard account showed equivalent behavior between the API and the web UI. Google's issue tracker for the GA API indicates that they do not plan to increase the sampling threshold beyond 1M sessions even for 360 accounts.
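For anyone else hitting this, the closest thing the Reporting API exposes is the samplingLevel field on the v4 request, which asks for a larger sample but is still capped by the API's own session threshold described above. A minimal sketch (the credentials file and VIEW_ID are placeholders):

from googleapiclient.discovery import build
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    "credentials.json",  # placeholder service-account key
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

body = {
    "reportRequests": [{
        "viewId": "VIEW_ID",  # placeholder view ID
        "dateRanges": [{"startDate": "28daysAgo", "endDate": "yesterday"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [
            {"name": "ga:deviceCategory"},
            {"name": "ga:source"},
            {"name": "ga:medium"},
        ],
        # Request the largest sample the API will return; this does not
        # force the query onto the pre-aggregated tables.
        "samplingLevel": "LARGE",
    }]
}
data = analytics.reports().batchGet(body=body).execute()["reports"][0]["data"]
# These fields only appear when the response is based on sampled data.
print(data.get("samplesReadCounts"), data.get("samplingSpaceSizes"))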
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
I'm trying to determine why an enterprise wouldn't want to use Google Analytics.
Here are the main reasons I've seen mentioned:
Inability to track clients that have Javascript disabled.
Lack of ownership of the statistics - Google owns the data.
Most of the web clients with Javascript disabled will probably be bots/spiders. This data is interesting, but probably not very useful.
As for the ownership issue, this is a bit paranoid IMO.
What am I missing here? When is Google Analytics not good enough?
Here are my findings from additional research:
Google Analytics is limited to 5 million page views per month - source
If a web site generates more than 5 million pageviews per month, it will need to be linked to an active AdWords account to avoid interruption of service.
Lack of / slow technical support
All Google support is handled through email and response times can take a week or more. Commercial analytics products often have much faster & personalized support.
Inability to track files (PDFs, images, etc.)
GA relies on Javascript and files lack the ability to execute Javascript. The workaround to this problem is to tag the link, but this won't track requests that go directly to the file.
Limited ability to customize
This is a selling point that I see pushed by commercial analytics tools (WebTrends). However it's never explained what customizations are denied by GA but allowed by WebTrends.
The Google Analytics EULA does not allow you to track individual users by identifying them. So if you wanted to add a custom variable for username to track how many times each user logs in, then you would be in a gray zone if not outright violating the EULA.
I use Google Analytics on about 10 sites right now and it's a great tool. In addition to all the analytics stats, you can tie it in with AdSense and it becomes a marketing/revenue tool and not just "wow look at all these cool user stats". If there were a way to track by user ID in certain circumstances (e.g. if users agreed to it, or if they work for the company that owns the site), then I would have no issues.
Besides, it's free and all you have to do is add JavaScript to the files, so give it a try and see what you think after a few months.
One reason that was, surprisingly, not posted:
timing / speed of reaction
It takes at least 4 hours (up to 24) for GA to update your data.
This is ok for me personally in most of the cases, but when reacting fast is crucial (news sites, one-off events, etc.) you may want to employ some other solution (Mint comes to mind, but it's not the only one out there of course).
Thought I'd add my two pence worth to this thread, as this is a topic close to my heart and one I've debated with colleagues for years. We've used WebTrends in house for as long as I can remember, back to version 4 of the log analyzer (how different things were back then!). Since Google Analytics came along, we've come under increasing pressure from certain parts of our business to switch, as 'it does everything we need from an analytics tool'.
Well, true in many senses it does, especially these days. But I championed the integration of our CRM and web analytics tools back in 2006, and as our business isn't e-commerce (the 'conversion' happens offline, sometimes months after the visitor acquisition) we need to integrate in this way to get a true picture of campaign effectiveness, and notion of ROI.
All of this means we need access to the raw data and need to be able to join visitor records on sessionID, etc.; without this access we'd be screwed. I'd love it if we could roll without it, but the current requirements mean we can't, so this alone is a HUGE reason why Google Analytics is not good enough.
Over and out
For tracking desktop software or creating a white-label solution there are better options.
For white-label, integration-based analytics, I use Mixpanel. For desktop software, I use DeskMetrics.
Google Analytics does not work well with mobile phones. While the iPhone and the Palm may be supported, many of the existing handsets do not support the JavaScript that Google uses.
If you're based in the UK, then theoretically you could be breaking the Data Protection Act by using Analytics.
If information about your users (like which web pages they're looking at) goes "outside the European Economic Area" and onto Google's servers in the US, then you're breaking the DPA.
Pretty obscure, but you did ask :)
Piwik avoids the problem because you host it on your own servers.
Lack of ownership of the statistics - Google owns the data.
... As for the ownership issue, this is a bit paranoid IMO.
One problem with it is that we can't even access the raw data. We had a use case this week where we wanted a visitor map for an executive presentation. We needed more flexibility in how the visitor map is displayed (we wanted to view the map in the Google Earth plug-in). In GA, you can't: you take what they give you. You can see a map of how many visits came from each city, but you can't export a data file of cities and number of visits to run the data through other tools. So, paranoia aside, there are significant limitations on what you can accomplish with GA.
However this is not a problem if you use Urchin, the self-hosted version of GA: you can export the data and do what you want with it. (And the exported data is richer than the web server log's, as it includes some analysis already.)
Since Piwik is open source, and pluggable, I imagine you could enhance the visitor map plug-in any way you wanted to. And export whatever data you want.
Whether this limitation affects you depends on your needs, obviously.
Update: I've now looked at the GA Data Export API, and it turns out that things you cannot do through the UI (as you can with Urchin), you can do with this API. It does look like you can export the visit data I was talking about, via a feed (although there are daily traffic caps on those requests). So sprinkle salt heavily on what I wrote above.
A couple more points that I've come across:
GA doesn't let you dig beyond full-day statistics; I would often like the ability to investigate whether a traffic dip the previous day was caused by the design update I did at 1pm or the soccer match on TV at 8pm.
GA doesn't offer a workaround for traffic spikes caused by DDoS attacks, Slashdotting, etc. When I'm looking at a GA visitor graph of 2009, all I can see is the 2-million-pageview spike on October 16th, pushing the entire rest of the year down flat against the horizontal axis of the graph. To get a meaningful graph, GA should offer the ability to trim or exclude outlying data points, or the ability to limit/bracket the graph window itself.
GA doesn't have an event monitoring client (think Reinvigorate's Snoop tool)
While GA is very user-friendly, I've found it's not as granular as some of the other stats programs (or maybe I'm not looking in the right places). Before the marketing monkeys I work with began pushing GA, we were very satisfied with AWStats. The sheer scope of the data helped us on several occasions hone sites to better suit their audience. While GA is very shiny and laid out well, I personally still prefer the raw numbers like I used to get through AWStats.
Slow data processing speed - can be as low as 15-30 mins for page views, but may be up to 48 hours for eCommerce
EULA is limiting in some cases
You won't own or have any control of the data. Google's engineers might use it (anonymously) for testing
Anything more complex requires customization - downloads and such are taken care of without issue, but there are limits
Cross domain tracking by linker is faulty at best
Visit-based - proper tools report at the visitor level, while GA mostly works on visit-based reporting
Limited number of custom vars used at one time (5)
No tech support, if you're realistic
Usually when there is a downtime notice, it's already gone
API limitations (4 dimensions and 10 metrics at one time, not all can be used together in addition to that)
I have many more, but at the end of the day it is a good tool for its price.
From a non-technical point of view, I think the most important reason is that some enterprises have high-level data security policies: all of the data must be controlled and managed by themselves.
If you use Google Analytics, the data is stored on Google's servers. For certain special enterprises, like insurance or financial companies, that policy has to be followed.
I would NOT go with server logs. In fact, I have them disabled on my server. Why, you ask?
For the simple reason that every time you hit my server, that stupid logging program makes an entry in the physical log file on my HDD. So if my server gets 100,000 hits in a day, that's 100,000 HDD write operations.
You think that's cool? Well, it's not. It's slowing your server down, especially if the log file is huge.
Why would someone even consider doing that to their server, especially when we're working so hard to minify JavaScript and CSS and make image files 2 KB smaller?
Please do yourself a favor: don't log directly on your server.
At least Google Analytics logs it on Google's servers, so my server stays healthier.
I wouldn't use it for any of my sites, because you're forcing the user to accept your proprietary JavaScript code in their browser, which is bad. Also, giving your data to Google is a really bad idea.
See Piwik for something you can run yourself as free software, eliminating both of these problems.