Google Cloud billing is not updating during the free trial (which is on monthly payments), and I cannot change it to a faster update cycle. As per https://cloud.google.com/free-trial/docs/billing-during-free-trial the bill should come every month.
This makes it hard to see how much of the $300 credit is left.
Is there any way to at least see how many TBs my queries used? This should be by far the biggest item on the bill.
I am concerned that the trial might run out in the middle of some important queries that I could otherwise have managed better, so that at least partial results would be available after the trial ends.
BigQuery analysis & storage costs should be listed under your GCP billing transactions:
https://console.cloud.google.com/billing/<INSERT_YOUR_BILLING_ID_HERE>/history?e=13803970,13803205
Another way to see how much you have queried is by enabling audit logging as described here.
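If the billing page is too slow to update, a rough self-service check is to sum the bytes billed from BigQuery's own job metadata. This is a minimal sketch, assuming the google-cloud-bigquery client library is installed, that the INFORMATION_SCHEMA.JOBS_BY_PROJECT view is available in your region, and that your-project-id is replaced with your real project:

# Minimal sketch: sum bytes billed for query jobs in the last 30 days.
# "your-project-id" and the "region-us" qualifier are placeholders; adjust
# them to your project and the region your jobs run in.
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")

sql = """
SELECT
  SUM(total_bytes_billed) / POW(1024, 4) AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
"""

tib = list(client.query(sql).result())[0].tib_billed or 0
print(f"Approx. TiB billed by queries in the last 30 days: {tib:.3f}")

Multiplying that figure by the current on-demand analysis price per TiB gives a rough idea of how much of the credit the queries alone have consumed.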
I am currently using the Google Places API on a free trial. I am interested in paying for the API, but I can't find the exact cost of the two commands that I use: google_places() and google_place_details(). I have contacted the Google sales team and looked at the places and billing url, but I have not managed to find out exactly how much it would cost to execute these two commands.
For google_places(), this is an example of a command I would execute:
google_places(search_string = "Cafeteria in Madrid, Spain", key=key)
From the places and billing url, it seems like this counts as a text search, so each time the code is executed it would cost $0.032. Is this the case?
For google_place_details(), here is an example of the command I would execute:
google_place_details(place_id = "ChIJf_XA-F0U04kR1IPYSdTJ4so", key=key)
This command, as well as giving basic place details (which cost $0.017 according to the billing url),
returns information that counts as contact data (an extra $0.003) and atmosphere data (an extra $0.005). It also provides photo data ($0.007 according to the billing url), which I am not interested in but which is automatically included in the results anyway. Does this mean that the cost of executing this command once is these four prices summed up?
I am interested in knowing exactly how much it would cost to execute the two commands I have listed.
Probably this helps:
First of all, you are billed monthly once you exceed the 200 euros/dollars of monthly credit that Google gives you for free (which is probably what you described as the "free plan"). So every month you get a bill showing how many requests of each function you sent to Google. Everything is written out quite clearly there, including the amount and price of each "unit", so you can easily divide it out yourself.
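As a rough sketch of that arithmetic, using the per-request prices quoted in the question (those figures and the monthly volume below are assumptions; the current Places pricing sheet is the authoritative source):

# Rough cost sketch using the prices quoted in the question (assumed, not authoritative).
TEXT_SEARCH = 0.032          # google_places() text search, $ per request
DETAILS_BASIC = 0.017        # google_place_details() basic data
DETAILS_CONTACT = 0.003      # contact data fields
DETAILS_ATMOSPHERE = 0.005   # atmosphere data fields
DETAILS_PHOTO = 0.007        # photo data, returned whether wanted or not

per_details_call = DETAILS_BASIC + DETAILS_CONTACT + DETAILS_ATMOSPHERE + DETAILS_PHOTO
calls_per_month = 1000       # hypothetical volume
monthly_credit = 200.0       # recurring free monthly credit mentioned above

total = calls_per_month * (TEXT_SEARCH + per_details_call)
print(f"Per details call: ${per_details_call:.3f}, per search call: ${TEXT_SEARCH:.3f}")
print(f"Estimated bill for {calls_per_month} of each: ${total:.2f}, "
      f"of which ${min(total, monthly_credit):.2f} is covered by the credit")

If the underlying Details request can be restricted to specific fields, some of those SKUs may not be charged at all, so treat the summed figure as an upper bound.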
A second option would be your Google API Cockpit.
It tracks your requests quite precisely over different time ranges, so sending each of your commands exactly once on a given day will show you an exact per-call price.
The Cockpit is super handy for other things too. If you want, you can even set quota limits, which is probably helpful in your case as well.
Here is the link to the billing monitor as well: Billing Google API Cockpit
Furthermore, here is the description of how Google charges you. Look here
best regards
I work for a non-profit that needs to see how our fundraising efforts are going in 'real-time'.
We look at results in blocks of about half an hour - so we need to report on how we finished the last 24 hours or so, and also where we're at in the current half hour. We're accomplishing this through Google Analytics, as we have multiple fundraising streams all pointing to a common GA account.
I have tried using Data Studio to report against the GA API, but that connector does not seem to refresh at a reliable rate - sometimes it'll pull fresh data within a minute, sometimes it can take twenty minutes to report on recent transactions. I believe the 'real-time' API could be used to get fresher GA data, but as far as I can tell, that will only report 'live' data and not prior/historical data (say, from four hours ago). Does anyone know what API, if any, I could use to pull all data from historical through the current datetime?
I apologize if this request is vague, but I'm just looking for a conceptual approach at this point to get the freshest data - preferably in one fell swoop (API call). There is more complexity post-data-intake (I have to then compare it to goals we've set for each half hour, amongst other nuances to the transactions themselves), so I wanted to start with this fundamental piece/question.
Thanks!
Given the context provided, I believe that the API solution would not be feasible. Among other reasons:
The real time API only offers a limited amount of dimensions and metrics. For example, e-commerce data is not available.
https://ga-dev-tools.appspot.com/dimensions-metrics-explorer/
https://developers.google.com/analytics/devguides/reporting/realtime/dimsmets
The standard intraday processing SLA for the Core Reporting API is < 24 hours for standard properties. Processing occurs on a best-effort basis, meaning that hourly availability can occur from time to time but cannot be guaranteed.
https://support.google.com/analytics/answer/7084038?hl=en
As an alternative to the API solution, you could consider using an App + Web property, which would allow you to stream event data in real time to BigQuery. However, this solution has some cost implications and would introduce you to a new tracking paradigm.
https://developers.google.com/analytics/devguides/collection/app-web/tag-guide
https://support.google.com/firebase/answer/6318765?hl=en
https://www.simoahava.com/analytics/getting-started-with-google-analytics-app-web/
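For illustration, once the App + Web export to BigQuery is linked, a single query over the intraday export table can cover everything from earlier today up to (near) real time. This is a minimal sketch, assuming the standard events_intraday_* export tables and a purchase event; the project, dataset ID, and field choices are placeholders:

# Minimal sketch: hourly purchase counts and revenue from an App + Web (GA4)
# BigQuery export. Project and dataset IDs are placeholders; the ecommerce
# revenue field is assumed to be populated by your tagging.
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")

sql = """
SELECT
  TIMESTAMP_TRUNC(TIMESTAMP_MICROS(event_timestamp), HOUR) AS hour,
  COUNT(*) AS purchases,
  SUM(ecommerce.purchase_revenue) AS revenue
FROM `your-project-id.analytics_123456789.events_intraday_*`
WHERE event_name = 'purchase'
GROUP BY hour
ORDER BY hour
"""

for row in client.query(sql).result():
    print(row.hour, row.purchases, row.revenue)

Combining this with the ordinary daily export tables in the same query would give the "last 24 hours plus current half hour" view in one call.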
I am looking for options to ingest Google Analytics data (historical data as well) into Redshift. Any suggestions regarding tools or APIs are welcome. I searched online and found Stitch as one of the ETL tools; help me understand this option better, along with other options if you have them.
Google Analytics has an API (the Core Reporting API). This is good for getting the occasional KPIs, but due to API limits it's not great for exporting large amounts of historical data.
For big data dumps it's better to use the Link to BigQuery ("Link" because I want to avoid the word "integration" which implies a larger level of control than you actually have).
Setting up the link to BigQuery is fairly easy: you create a project in the Google Cloud Console, enable billing (BigQuery comes with a fee; it's not part of the GA360 contract), add your email address as a BigQuery Owner in the "IAM & Admin" section, then go to your GA account and enter the BigQuery project ID in the GA Admin section under "Property Settings / Product Linking / All Products / BigQuery Link". The process is described here: https://support.google.com/analytics/answer/3416092
You can select between standard updates and streaming updates - the latter comes with an extra fee but gives you near-realtime data. The former updates the data in BigQuery three times a day, every eight hours.
The exported data is not raw data; it is already sessionized (i.e. while you will get one row per hit, things like the traffic attribution for that hit will be session based).
You will pay three different kinds of fees - one for the export to BigQuery, one for storage, and one for the actual querying. Pricing is documented here: https://cloud.google.com/bigquery/pricing.
Pricing depends on region, among other things. The region where the data is stored might also be important when it comes to legal matters - e.g. if you have to comply with the GDPR, your data should be stored in the EU. Make sure you get the region right, because moving data between regions is cumbersome (you need to export the tables to Google Cloud Storage and re-import them in the proper region) and kind of expensive.
You cannot just delete data and do a new export - on your first export BigQuery will backfill the data for the last 13 months; however, it will do this only once per view. So if you need historical data, better get this right, because if you delete data in BQ you won't get it back.
I don't actually know much about Redshift, but as per your comment you want to display data in Tableau, and Tableau directly connects to BigQuery.
We use custom SQL queries to get the data into Tableau (Google Analytics data is stored in daily tables, and custom SQL seems the easiest way to query data over many tables). BigQuery has a user-based cache that lasts 24 hours as long as the query does not change, so you won't pay for the query every time the report is opened. It is still a good idea to keep an eye on the cost - cost is not based on the result size, but on the amount of data that has to be searched to produce the wanted result, so if you query over a long timeframe and maybe do a few joins, a single query can run into dozens of euros (multiplied by the number of users who run the query).
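For illustration, a custom SQL query over the daily export tables usually relies on a table wildcard plus a _TABLE_SUFFIX filter, which is also what keeps the scanned bytes (and thus the cost) bounded. A minimal sketch, with the project and dataset IDs as placeholders:

# Minimal sketch: sessions and revenue per day from the GA360 daily export tables.
# "your-project-id" and the dataset ID "123456789" are placeholders; the
# _TABLE_SUFFIX filter limits how much data the query scans (and therefore costs).
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")

sql = """
SELECT
  date,
  SUM(totals.visits) AS sessions,
  SUM(totals.totalTransactionRevenue) / 1e6 AS revenue
FROM `your-project-id.123456789.ga_sessions_*`
WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
GROUP BY date
ORDER BY date
"""

for row in client.query(sql).result():
    print(row.date, row.sessions, row.revenue)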
scitylana.com has a service that can deliver Google Analytics Free data to S3.
You can get 3 years or more.
The extraction is done through the API. The schema is hit level and has 100+ dimensions/metrics.
Depending on the amount of data in your view, I think this could be done with GA360 too.
Another option is to use Stitch's own specification, singer.io, and related open-source packages:
https://github.com/singer-io/tap-google-analytics
https://github.com/transferwise/pipelinewise-target-redshift
The way you'd use them is by piping data from one into the other:
tap-google-analytics -c ga.json | target-redshift -c redshift.json
I like Skyvia tool: https://skyvia.com/data-integration/integrate-google-analytics-redshift. It doesn't require coding. With Skyvia, I can create a copy of Google Analytics report data in Amazon Redshift and keep it up-to-date with little to no configuration efforts. I don't even need to prepare the schema — Skyvia can automatically create a table for report data. You can load 10000 records per month for free — this is enough for me.
Considering doing some relatively large scale event tracking on my website.
I estimate this would create up to 6 million new events per month in Google Analytics.
My questions are, would all of this extra data that I'm now hanging onto:
a) Slow down GA UI performance
and
b) Increase the amount of data sampling
Notes:
I have noticed that GA seems to be taking longer to retrieve results for longer timeframes on my website lately, but I don't know whether that has to do with the increased amount of event tracking I've been doing or not – it may be that GA is fighting for resources as it matures and as more and more people collect more and more data...
Finally, one might guess that adding events may only slow down reporting on events, but this isn't necessarily so, is it?
Drewdavid,
The amount of data being loaded will influence the speed of GA performance, but nothing really dramatic, I would say. I am running a website/app with 15+ million events per month, and even though all the reporting is automated via the API, every now and then we need to find something specific and use the regular GA UI.
More than speed, I would be worried about sampling. That's the reason we automated the reporting in the first place, as there are some ways you can eliminate it (with some limitations). See this post, for instance, which describes using Analytics Canvas, one of my favorite tools (I am not affiliated in any way :-).
Also, let me ask: what would be the purpose of your events? Think twice about whether you would actually use them later on...
Slow down GA UI performance
Standard Reports are precompiled and will display as usual. Reports that are generated ad hoc (because you apply filters, segments etc.) will take a little longer, but not so much that it hurts.
Increase the amount of data sampling
If by "sampling" you mean throwing away raw data, Google does not do that (I actually have that in writing from a Google representative). However the reports might not be able to resolve all data points (e.g. you get Top 10 Keywords and everything else is lumped under "other").
However, those events will count towards your data limit, which is ten million interaction hits (pageviews, events, transactions, any single product in a transaction, user timings, and possibly others). Google will not drop data or close your account without warning (again, I have that in writing from a Google Sales Manager), but they reserve the right either to force you to collect fewer interaction hits or to close your account some time after they have issued a warning (actually they will ask you to upgrade to Premium first, but chances are you don't want to spend that much money).
Google is pretty lenient when it comes to violations of the data limit, but other people's leniency is not a good basis for a reliable service, so you want to make sure that you stay within the limits.
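As a back-of-the-envelope check against that limit (a sketch only: the ten-million figure is the commonly cited free-tier cap, and the existing hit volume below is a made-up assumption):

# Back-of-the-envelope check of monthly interaction hits against the free-tier limit.
# The existing pageview/hit volume is a placeholder; the 6 million events figure
# comes from the question above.
HIT_LIMIT = 10_000_000

planned_events = 6_000_000      # new events per month (from the question)
existing_hits = 2_000_000       # hypothetical current pageviews and other hits

total_hits = planned_events + existing_hits
print(f"Projected hits/month: {total_hits:,} "
      f"(headroom before the limit: {HIT_LIMIT - total_hits:,})")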
The company I work for has just purchased four 32" LCD screens to be mounted at the front of the office for demonstration purposes. Whilst we are not demonstrating (most of the time), the screens are to be used as development information screens for the whole team.
What information would people recommend displaying to be most useful to the team? Our focus is on hosted business web-apps but I am interested in what other teams doing other types of development find useful too. Pointers on how to gather the displayed information would be useful also.
Information about your continuous integration status.
Major Development Milestones that have been hit in the last week
Releases within the last month (including a short description why this release is awesome)
Use it as a motivational board. The achievements of software development are seldom communicated well enough.
Since you're hosting apps for your customers, server and network status information would probably be useful.
Heck, why not create a "chat room" for the dev team to discuss issues and post a streaming version of that as well?
Schedule information, Scrum notes from that morning, a gantt chart...the possibilities abound.
Outstanding bug count, sorted by priority and severity. You can likely get this from your bug-tracking tool programmatically (see the sketch after this list).
Depending on your process management system, possibly a list of feature requests and the percentage complete on each of them. Again, you can probably get this programmatically from your process management / time tracking tool.
Time spent in the current development cycle, and time remaining. Again, this should be available from your process management / time tracking tool. You may want to use this data with your bug counts as well to give a bugs/day fix rate.
If you're a public company with a profit-sharing plan (i.e. stock or options), the current price of the stock (this can be surprisingly strongly motivating). You can get stock data from several sources online programmatically (although a small delay may be injected unless you're paying for the service).
The movie 'Office Space'
Weather radar from intellicast.com
Latest check-in.
Number of check-ins per day.
Number of customers that use the software.
Metrics on bugs found/fixed and the ratio.
One screen could be an aggregated RSS feed of development topics pulled from sites such as Stack Overflow (or even Coding Horror). I'm not sure what your goal for these screens is, but I could see it being useful to me if you had a feed with topics specific to your development team headlined. If I were there, I'd glance at them, maybe catch an interesting thread, and go learn something. Funnel a bunch of keywords and tags through a Yahoo Pipe and dump it to the screen.
That's if they are more "informal and informational."
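A minimal sketch of that aggregation using the feedparser library (the feed URLs are only examples; swap in the tags and sites relevant to your team):

# Minimal sketch: merge a few development RSS/Atom feeds into one headline list
# suitable for a wall display. The feed URLs below are examples only.
import feedparser

FEEDS = [
    "https://stackoverflow.com/feeds/tag/web-applications",  # example tag feed
    "https://blog.codinghorror.com/rss/",
]

entries = []
for url in FEEDS:
    entries.extend(feedparser.parse(url).entries)

# Newest first; show the top 20 headlines.
entries.sort(key=lambda e: tuple(e.get("published_parsed") or ()), reverse=True)
for entry in entries[:20]:
    print(entry.get("title", "(no title)"), "-", entry.get("link", ""))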
I think most popular pages from your webapp(s) would be a fun/interesting thing to show on a big monitor up front.
Another would be a live feed of your error reporting.
We have one monitor showing all meetings for the day, with start-end, subject, and room. I find this helpful, not only for my orientation, but also to see what other people do at our company.
xkcd, bunny, dilbert and savage chickens :-)