I am currently using the Google Places API on a free trial. I am interested in paying for the API, but I can't find the exact cost of the two commands that I use: google_places() and google_place_details(). I have contacted the Google sales team and looked at the Places billing page, but I have not managed to find out exactly how much it would cost to execute these two commands.
For google_places(), this is an example of a command I would execute:
google_places(search_string = "Cafeteria in Madrid, Spain", key=key)
From the Places billing page, it seems like this counts as a Text Search, so each time the code is executed it would cost $0.032. Is this the case?
For google_place_details(), here is an example of the command I would execute:
google_place_details(place_id = "ChIJf_XA-F0U04kR1IPYSdTJ4so", key=key)
This command, as well as giving basic place details (which cost $0.017 according to the billing page), returns information that counts as contact data (an extra $0.003) and atmosphere data (an extra $0.005). It also returns photo data ($0.007 according to the billing page), which I am not interested in but which is automatically included in the results anyway. Does this mean that the cost of executing this command once is these four prices summed up?
I am interested in knowing exactly how much it would cost to execute the two commands I have listed.
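For clarity, here is the back-of-the-envelope arithmetic I am assuming (a purely illustrative Python sketch); whether all four Place Details SKUs really are billed on every call is exactly what I am asking to confirm:

```python
# Hypothetical cost sketch using the per-request SKU prices quoted above (USD).
# Whether all four Place Details SKUs are billed on every call is exactly
# what this question asks to confirm.
TEXT_SEARCH = 0.032      # google_places() text search
BASIC_DATA = 0.017       # google_place_details() basic data
CONTACT_DATA = 0.003     # contact data fields
ATMOSPHERE_DATA = 0.005  # atmosphere data fields
PHOTO_DATA = 0.007       # photo data (returned even though I don't need it?)

details_cost = BASIC_DATA + CONTACT_DATA + ATMOSPHERE_DATA + PHOTO_DATA
print(f"google_places() per call:        ${TEXT_SEARCH:.3f}")
print(f"google_place_details() per call: ${details_cost:.3f}")  # 0.032 if all four SKUs apply
```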
Probably this helps:
First of all, you are billed monthly after you exceed the 200 Euro/Dollars of monthly credit that Google gives you for free (probably what you described as the "free trial"). So after every month you get a bill showing how many requests of each function you sent to Google. There, everything is written quite clearly, including the amount and price of each "unit", so you can easily divide it out.
A second option would be your Google API Cockpit.
It tracks your requests quite precisely on different time bases, so sending each of your commands exactly once on a given day can give you an exact total price.
The Cockpit is super handy for other things as well. If you want, you can even set quota limits, which is probably helpful in your case too.
Here is the link to the billing monitor as well: Billing Google API Cockpit
Furthermore, here is the description of how Google charges you: look here.
best regards
I am looking for options to ingest Google Analytics data (historical data as well) into Redshift. Any suggestions regarding tools or APIs are welcome. I searched online and found Stitch as one of the ETL tools; please help me understand this option better, as well as other options if you have any.
Google Analytics has an API (the Core Reporting API). This is good for getting the occasional KPIs, but due to API limits it's not great for exporting large amounts of historical data.
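For the occasional KPI pull, a minimal sketch against the Core Reporting API v4 with the Python client (google-api-python-client) could look like this; the key file and view ID are placeholders:

```python
# Sketch: pull sessions per day from the GA Core Reporting API v4.
# Assumes a service account with read access to the view;
# 'client_secrets.json' and the view ID are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "client_secrets.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "123456789",  # placeholder view ID
        "dateRanges": [{"startDate": "7daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:date"}],
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```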
For big data dumps it's better to use the Link to BigQuery ("Link" because I want to avoid the word "integration" which implies a larger level of control than you actually have).
Setting up the link to BigQuery is fairly easy: you create a project in the Google Cloud Console, enable billing (BigQuery comes with a fee; it's not part of the GA360 contract), add your email address as BigQuery Owner in the "IAM & Admin" section, then go to your GA account and enter the BigQuery project ID in the GA Admin section under "Property Settings/Product Linking/All Products/BigQuery Link". The process is described here: https://support.google.com/analytics/answer/3416092
You can select between standard updates and streaming updates - the latter comes with an extra fee, but gives you near-realtime data. The former updates the data in BigQuery three times a day, every eight hours.
The exported data is not raw data; it is already sessionized (i.e. while you will get one row per hit, things like the traffic attribution for that hit will be session-based).
You will pay three different kinds of fees - one for the export to BigQuery, one for storage, and one for the actual querying. Pricing is documented here: https://cloud.google.com/bigquery/pricing.
Pricing depends on region, among other things. The region where the data is stored might also be important when it comes to legal matters - e.g. if you have to comply with the GDPR, your data should be stored in the EU. Make sure you get the region right, because moving data between regions is cumbersome (you need to export the tables to Google Cloud Storage and re-import them in the proper region) and kind of expensive.
You cannot just delete the data and do a new export - on your first export BigQuery will backfill the data for the last 13 months, but it will do this only once per view. So if you need historical data, better get this right, because if you delete data in BQ you won't get it back.
I don't actually know much about Redshift, but as per your comment you want to display data in Tableau, and Tableau directly connects to BigQuery.
We use custom SQL queries to get the data into Tableau (Google Analytics data is stored in daily tables, and custom SQL seems the easiest way to query data over many tables). BigQuery has a user-based cache that lasts 24 hours as long as the query does not change, so you won't pay for the query every time the report is opened. It still is a good idea to keep an eye on the cost - cost is not based on the result size, but on the amount of data that has to be searched to produce the wanted result, so if you query over a long timeframe and maybe do a few joins a single query can run into the dozens of euros (multiplied by the number of users who use the query).
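To illustrate the custom-SQL-over-daily-tables approach, here is a rough sketch using the google-cloud-bigquery Python client; the project/dataset names and date range are placeholders, and ga_sessions_* is the usual naming of the export tables:

```python
# Sketch: query several daily GA export tables at once via a table wildcard.
# 'my-project.my_ga_dataset' and the date range are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT
  date,
  SUM(totals.visits)    AS sessions,
  SUM(totals.pageviews) AS pageviews
FROM `my-project.my_ga_dataset.ga_sessions_*`
WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
GROUP BY date
ORDER BY date
"""
for row in client.query(sql).result():
    print(row.date, row.sessions, row.pageviews)
```

Keep in mind the cost caveat above: the wildcard controls how many daily tables (and therefore how many bytes) each query scans.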
scitylana.com has a service that can deliver Google Analytics Free data to S3.
You can get 3 years or more.
The extraction is done through the API. The schema is hit level and has 100+ dimensions/metrics.
Depending on the amount of data in your view, I think this could be done with GA360 too.
Another option is to use Stitch's own specification, singer.io, and related open source packages:
https://github.com/singer-io/tap-google-analytics
https://github.com/transferwise/pipelinewise-target-redshift
The way you'd use them is by piping data from one into the other:
tap-google-analytics -c ga.json | target-redshift -c redshift.json
I like Skyvia tool: https://skyvia.com/data-integration/integrate-google-analytics-redshift. It doesn't require coding. With Skyvia, I can create a copy of Google Analytics report data in Amazon Redshift and keep it up-to-date with little to no configuration efforts. I don't even need to prepare the schema — Skyvia can automatically create a table for report data. You can load 10000 records per month for free — this is enough for me.
This seems really basic, but I am struggling with it.
We have a client who runs a travel website.
They have a few different search bars, e.g. Flights, Hotels, Car hire.
I am trying to track the performance of each: "What % of people who ran a Flight search completed a sale?" The same for Hotels, and for Car hire.
Any ideas for the best way to get this info in GA?
Many thanks
There are a few ways to get this information, each with their pros and cons. The options that I see immediately available are segments and goals.
Segments are great because they are retrospective and generally more flexible, with the ability to be changed if you find your criteria aren't quite right. You create a segment and specify sessions that go through the search results pages, etc.
Then you can create another segment for the booking confirmation page, and any other intermediary steps that you'd like to report on. The main con of segments is that you can only pull in 4 at a time, but if you have more you can pull them 4 at a time and copy and paste the data into an Excel sheet or Google Sheet. Segments can also be pulled via the Core Reporting API and Data Studio, which makes them great for automating into dashboards.
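For example, a saved segment can be referenced by its ID in a Core Reporting API v4 request; a minimal Python sketch (view ID, segment ID and key file are all placeholders) might look like:

```python
# Sketch: apply a saved segment to a Core Reporting API v4 request.
# The view ID, segment ID and 'client_secrets.json' are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "client_secrets.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

body = {
    "reportRequests": [{
        "viewId": "123456789",
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}, {"expression": "ga:transactions"}],
        "dimensions": [{"name": "ga:segment"}],        # required when segments are used
        "segments": [{"segmentId": "gaid::-abc123"}],  # placeholder: your segment's ID
    }]
}
report = analytics.reports().batchGet(body=body).execute()
```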
Goals are cool because they pull into the default reports, and basically track sessions through a particular page, event or sequence. The main con I see, and the reason I don't use them, is that they only start tracking from the time you create them, and if you change the configuration it does not affect historical data, so your data can get messed up quickly if you don't have sandbox GA views or sandbox goals for testing before putting a goal into a dedicated goal slot. You can also only have 10 or 20 goals depending on your plan, and once data is tracked against a goal slot you can't remove or clear it.
Google Cloud billing is not updating with the free trial (on monthly payments), and I cannot change it to a faster update cycle. As per https://cloud.google.com/free-trial/docs/billing-during-free-trial, the bill should come every month.
It is therefore not easy to see how much of the $300 is left.
Is there any way to at least see how many TBs my queries used? This should be by far the biggest item on the bill.
I am concerned that I might get 'stuck' in the middle of some important queries that I could otherwise have managed better, so that at least partial results would be available after the trial ends.
BigQuery analysis & storage costs should be listed under your GCP billing transactions:
https://console.cloud.google.com/billing/<INSERT_YOUR_BILLING_ID_HERE>/history?e=13803970,13803205
Another way to see how much you have queried is by enabling audit logging as described here.
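If you mainly want a rough total of bytes scanned, another option (a sketch, not an official billing number) is to list your recent query jobs with the google-cloud-bigquery Python client and sum total_bytes_billed; the project ID is a placeholder, and only jobs run in that project are counted:

```python
# Sketch: sum the bytes billed for recent query jobs in one project.
# 'my-project' is a placeholder; this is an estimate, not the billing system's number.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
total = 0
for job in client.list_jobs(max_results=1000):  # up to 1000 recent jobs
    if isinstance(job, bigquery.QueryJob) and job.total_bytes_billed:
        total += job.total_bytes_billed
print(f"~{total / 1e12:.3f} TB billed across the listed query jobs")
```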
This may be a duplicate of this question, but according to all the Google Analytics documentation I really should be able to pull my list of custom segments.
Since I have a very large list of them, it would be suboptimal for me to manually copy the segment ids over one at a time.
I'm following this walk through. Steps to reproduce:
Create a custom segment using date of first session in your Google Analytics account.
Authorize the Google Analytics guide to access your Google Analytics account.
Try their on-page query tester, and inspect whether your custom segment is there.
One thing I've already ruled out was the user that created the segment. I've manually created a segment with the same user that I'm querying the API with and it still does not show. Is there a flag I need to set somewhere to include custom segments?
Edit:
It turns out that it will list some custom segments, but not ones created with date of first session, so this is a duplicate of this question, which means that there is a bug in the Google Analytics API.
There was a bug, which is now fixed, so it is now possible to list Date of First Session segments in the Google Analytics Management API by calling the segments.list() method.
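For reference, a minimal sketch of that call with the Python client (Management API v3) could look like this; note that custom segments belong to the authorized user, and the client-secrets file is a placeholder:

```python
# Sketch: list all segments visible to the authorized user, including custom ones.
# 'client_secrets.json' is a placeholder OAuth client file; authorize as the
# user who owns the custom segments.
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

flow = InstalledAppFlow.from_client_secrets_file(
    "client_secrets.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
creds = flow.run_local_server(port=0)

analytics = build("analytics", "v3", credentials=creds)
segments = analytics.management().segments().list().execute()
for seg in segments.get("items", []):
    print(seg["id"], seg["name"], seg["type"])  # type is BUILT_IN or CUSTOM
```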
So after days of trying to solve this one I've come to the conclusion that it cannot be done as asked.
There is, however, another way to do it. For every segment, set up a daily (or weekly, etc.) email report delivered to an email address as a TSV. In each email body, specify the name of the segment, so that when you're consuming the emails you know which segment the attached TSV is for. It doesn't look like the daily reports were designed with segments in mind, since none of the metadata included in the TSV mentions which segment it is for.
From there it's trivial. Connect to the email address using an IMAP client once a day and update the numbers.
Note that the daily email only contains the numbers for that day (not a specified range), so you'll need to first generate the report one time with the historical data to load in.
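A rough sketch of the consuming side, assuming the reports arrive as TSV attachments over IMAP (the server, credentials and the subject-carries-the-segment-name convention are all placeholders for whatever you set up):

```python
# Sketch: pull unread GA report emails over IMAP and read the TSV attachments.
# Server, credentials and the "segment name in the subject" convention are
# placeholders for whatever you configure in the scheduled email.
import csv
import email
import imaplib
import io

imap = imaplib.IMAP4_SSL("imap.gmail.com")
imap.login("reports@example.com", "app-password")  # placeholders
imap.select("INBOX")

_, data = imap.search(None, "UNSEEN")
for num in data[0].split():
    _, msg_data = imap.fetch(num, "(RFC822)")
    msg = email.message_from_bytes(msg_data[0][1])
    segment_name = msg["Subject"]  # or parse it out of the body
    for part in msg.walk():
        if part.get_filename() and part.get_filename().endswith(".tsv"):
            rows = csv.reader(
                io.StringIO(part.get_payload(decode=True).decode("utf-8")),
                delimiter="\t",
            )
            for row in rows:
                print(segment_name, row)  # update your numbers here
imap.logout()
```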
While hacky, one nice thing about this approach is that it keeps your reports in sync with your (faked-through-email) API code (provided you match the column headings in the TSV). So if, for example, a new filter is included in a report, the new daily fields will continue to update.
Unfortunately though, the past data won't be reflected in the change.
Obviously this isn't great, but if you are monitoring daily cohorts it's the best you've got if you need to stay with Google Analytics. I have raised this as a bug to the Google Analytics developers, but I haven't heard back as to whether or not they plan to fix it.
I think this question has been answered here before, but I could not find the desired topic. I am a newbie in web scraping. I have to develop a script that will take all the Google search results for a specific name. Then it will grab the related data against that name, and if more than one is found, the data will be grouped according to their names.
All I know is that Google has some kind of restriction on scraping. They provide a Custom Search API. I still have not used that API, but I am hoping to get all the resulting links corresponding to a query from it. However, I could not understand what the ideal process would be for scraping the information from those links. Any tutorial link or suggestion is very much appreciated.
You should have provided a bit more about what you have been doing; it does not sound like you even tried to solve it yourself.
Anyway, if you are still on it:
You can scrape Google in two ways: one is allowed, one is not.
a) Use their API; you can get around 2k results a day.
You can up it to around 3k a day for 2,000 USD/year, and you can up it further by getting in contact with them directly.
You will not be able to get accurate ranking positions from this method, but if you only need a low number of requests and are mainly interested in getting some websites for a keyword, that's the choice.
Starting point would be here: https://code.google.com/apis/console/
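For route a), a minimal sketch against the Custom Search JSON API with google-api-python-client could look like this; the API key and search engine ID (cx) are placeholders you create in the console:

```python
# Sketch: fetch the first page of results for a query from the Custom Search
# JSON API. API_KEY and CX are placeholders created in the Google API Console /
# Programmable Search Engine control panel.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"
CX = "YOUR_SEARCH_ENGINE_ID"

service = build("customsearch", "v1", developerKey=API_KEY)
result = service.cse().list(q="some specific name", cx=CX, num=10).execute()

for item in result.get("items", []):
    print(item["title"], item["link"])
```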
b) You can scrape the real search results
That's the only way to get the true ranking positions, for SEO purposes or to track website positions. It also allows you to get a large number of results, if done right.
You can Google for code, the most advanced free (PHP) code I know is at http://scraping.compunect.com
However, there are other projects and code snippets.
You can start off at 300-500 requests per day, and this can be multiplied across multiple IPs. Look at the linked article if you want to go that route; it explains things in more detail and is quite accurate.
That said, if you choose route b) you break Google's terms, so either do not accept them or make sure you are not detected. If Google detects you, your script will be blocked by IP bans or captchas. Not getting detected should be a priority.