Google Cloud usage API - google-translate

My company is interested in the Translation API.
I need to know whether there is a way to retrieve the usage of an account for a specific period.
(For example, I would like to be able to know that last month I translated X characters, corresponding to Y USD.)
I'm sure such an API exists, but I really can't find any reference to it.

The Translation API bills per character translated, and it counts translated characters across the whole project in order to produce the customer's bill.
To view your current billing status, including usage and your current bill, see the Billing page (GCP Console => Billing). Billing reports for a particular service contain the fields Product name (Translate), Usage (number of characters) and Cost.
See Cloud Billing > Docs > View your billing reports and cost trends.
More detailed information about Translation API consumption is not provided; at this time the Translation API does not offer such functionality.
A similar problem was raised on Issue Tracker in 2017 but to no avail: https://issuetracker.google.com/35903950.
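If you need those numbers programmatically, one workaround is to enable Cloud Billing export to BigQuery and query the exported usage yourself. Here is a minimal sketch, assuming the export is already enabled; the project, dataset and table names are placeholders, and depending on the export the service may be labelled 'Cloud Translation API' or 'Translate API':

from google.cloud import bigquery

client = bigquery.Client()
query = """
SELECT
  sku.description AS sku,
  SUM(usage.amount) AS usage_amount,   -- characters for Translation SKUs
  ANY_VALUE(usage.unit) AS usage_unit,
  SUM(cost) AS cost
FROM `my-project.my_billing_dataset.gcp_billing_export_v1_XXXXXX`
WHERE service.description = 'Cloud Translation API'
  AND usage_start_time >= TIMESTAMP('2019-01-01')
  AND usage_start_time <  TIMESTAMP('2019-02-01')
GROUP BY sku
"""
for row in client.query(query).result():
    print(row.sku, row.usage_amount, row.usage_unit, row.cost)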

Related

Pricing for specific Googleway commands in RStudio

I am currently using the Google Places API on a free trial. I am interested in paying for the API, but I can't find the exact cost of the two commands I use: google_places() and google_place_details(). I have contacted the Google sales team and looked at the places and billing URL, but I have not managed to find out exactly how much it would cost to execute these two commands.
For google_places(), this is an example of a command I would execute:
google_places(search_string = "Cafeteria in Madrid, Spain", key=key)
From the places and billing URL, it seems like this counts as a Text Search, so each time the code is executed it would cost $0.032. Is this the case?
For google_place_details(), here is an example of the command I would execute:
google_place_details(place_id = "ChIJf_XA-F0U04kR1IPYSdTJ4so", key=key)
This command, as well as giving Basic place details (which cost $0.017 according to the billing URL), returns information that counts as Contact data (an extra $0.003) and Atmosphere data (an extra $0.005). It also provides Photo data ($0.007 according to the billing URL), which I am not interested in but which is automatically included in the results anyway. Does this mean that the cost of executing this command once is these four prices summed up?
I am interested in knowing exactly how much it would cost to execute the two commands I have listed.
Perhaps this helps:
First of all, you are billed monthly, and only after you exceed the $200 of monthly credit that Google gives you for free (which is probably what you described as the "free trial"). So after every month you get a bill showing how many requests of each function you sent to Google. There everything is written quite clearly, including the amount and price of each "unit", so you can easily divide it up.
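For example, taking the per-SKU prices quoted in the question (and assuming all four Place Details SKUs really are billed on every call): one google_place_details() call would cost about $0.017 + $0.003 + $0.005 + $0.007 = $0.032, one google_places() Text Search call $0.032, and 1,000 of each would come to roughly $64, against which the $200 monthly credit is applied first.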
A second option would be your Google API Cockpit.
It tracks your requests quite precisely, on different time bases, so sending each of your commands exactly once on a given day can give you an exact total price.
The Cockpit is super handy for other things too. If you want, you can even set limits, which is probably helpful in your case as well.
Here is the link to the billing monitor as well: Billing Google API Cockpit
Furthermore, here is the description of how Google charges you: Look here
Best regards

Download Google Analytics information with a unique user ID

I'm looking to download hit data from a Google Analytics view for a small period of time, including a unique ID for each session and the URL that was viewed. I believe I could do this going forward by mapping something in Google Tag Manager to a Custom Dimension, but I was looking to avoid that (we already have a good number of custom dimensions), and it wouldn't let me go backward anyway.
Is it possible to do something like this in the free version of GA? I picture the output having the URLs on my x-axis and my users on the y-axis, with counts.
I'll be looking to take this data and do a cluster analysis to determine user behavior types.
Nope. Google Analytics does not expose a user-specific ID via the API or via data exports in a standard account (in GA360 you could use BigQuery to extract the client ID).
You either have to set up a custom dimension (as you said, this does not work for historic data), or try to use calculated fields in Google Data Studio in the hope that, if you aggregate enough different dimensions into one field, you will end up with something specific per user.

How to check how many tokens have been sold for my smart contract

I want to create a smart contract and launch it for an ICO. I am also creating a website where people can buy my token. I want to know how to check how many tokens have been sold (live), so I can create a live bar counter showing what percentage of the tokens has already been sold.
Or is there a way I can monitor the token sale process in the smart contract?
A token contract is no different from any other smart contract. There are no special built-in Solidity features or logic associated with tokens; they are just regular smart contracts that follow a specification.
So, if you want access to the number of tokens sold, you code that into your contract. While tokens sold is not part of the standard ERC20/ERC721 interface, nothing prevents you from adding a constant function to retrieve this information. In fact, if you're using the basic Zeppelin Crowdsale contract, you can calculate it from the public state variables as weiRaised / rate. (Chances are you should be creating your own Crowdsale subcontract anyway, so it's better to add the functionality you want there.)
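For the live bar counter, here is a minimal sketch of reading those public variables from your website's backend with web3.py, assuming a Zeppelin-style Crowdsale; the node endpoint and contract address are placeholders:

from web3 import Web3

w3 = Web3(Web3.HTTPProvider('https://mainnet.infura.io/v3/YOUR_KEY'))  # placeholder node endpoint

# Minimal ABI covering just the two public getters we need.
abi = [
    {"name": "weiRaised", "inputs": [], "outputs": [{"name": "", "type": "uint256"}],
     "stateMutability": "view", "type": "function"},
    {"name": "rate", "inputs": [], "outputs": [{"name": "", "type": "uint256"}],
     "stateMutability": "view", "type": "function"},
]
crowdsale = w3.eth.contract(address='0xYourCrowdsaleAddress', abi=abi)  # placeholder address

wei_raised = crowdsale.functions.weiRaised().call()
rate = crowdsale.functions.rate().call()
# As above: tokens sold = weiRaised / rate. In some Crowdsale versions `rate`
# is defined as token units per wei, in which case you multiply instead.
tokens_sold = wei_raised // rate
print(tokens_sold)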
We can use the Etherscan Developer API to review transactions against a given contract address and find out the total supply or number of items available for sale.
There is a lot you can do with the Etherscan Developer API. For example, here's one URL that pulls data from Ethereum Mainnet -> Etherscan -> JSON parser -> Shields.io and renders it as an image to calculate the number of Su Squares remaining for sale:
Source: https://img.shields.io/badge/dynamic/json.svg?label=Su+Squares+available&url=https%3A%2F%2Fapi.etherscan.io%2Fapi%3Fmodule%3Daccount%26action%3Dtokenbalance%26contractaddress%3D0xE9e3F9cfc1A64DFca53614a0182CFAD56c10624F%26address%3D0xE9e3F9cfc1A64DFca53614a0182CFAD56c10624F%26tag%3Dlatest%26apikey%3DYourApiKeyToken&query=%24.result
^ I don't know if SO is going to cache the image here. But that URL is a live URL which pulls the number of Su Squares available hot off the blockchain.
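If you would rather consume the same data in your own backend than as a badge image, here is a minimal sketch using the same Etherscan endpoint and parameters as the URL above (the API key is a placeholder):

import requests

params = {
    "module": "account",
    "action": "tokenbalance",
    "contractaddress": "0xE9e3F9cfc1A64DFca53614a0182CFAD56c10624F",
    "address": "0xE9e3F9cfc1A64DFca53614a0182CFAD56c10624F",
    "tag": "latest",
    "apikey": "YourApiKeyToken",  # placeholder
}
resp = requests.get("https://api.etherscan.io/api", params=params).json()
remaining = int(resp["result"])  # tokens still held by the contract, i.e. unsold
print(remaining)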

Getting MCF Conversions path data from Google Bigquery

I am using Google Bigquery to extract data on conversion paths from Google Analytics (GA).
When I analyze the conversion paths in the exported dataset, the last-click conversions match the Acquisition report in GA, but not the Multi Channel Funnel (MCF) data. Apparently BigQuery doesn't really export raw data, but transforms it by deleting all last direct clicks, as described here: https://support.google.com/analytics/answer/1319312?hl=en.
Is it possible to get the BigQuery data to correspond to the Multi Channel Funnel (MCF) conversion path data? That is, to undo the last non-direct click processing and get proper 'raw' user-level data?
All of the trafficSource fields in BigQuery Export for Google Analytics use campaign attribution as described in this processing flow, which will overwrite direct traffic with the most recent campaign (if there is one and it is within the specified timeout), as you mentioned.
If you are using Universal Analytics, you can adjust the campaign timeout to be shorter than the 6 month default. For example, if you set the campaign timeout to be one day, any direct visits that come in at least one day after a visit with a campaign will be attributed to direct instead of the previous campaign. This can be done with Classic Analytics as well using _setCampaignCookieTimeout. This technique will affect data collection from the time it is implemented going forward.
This thread is rather dated, so I thought I'd update just in case anyone else comes across this same question.
There is a field that was introduced (both in the Google Analytics interface and the BigQuery export) that allows you to match the numbers in the MCF reports. In BigQuery, look for the field trafficSource.isTrueDirect.
From the BigQuery Export Schema, trafficSource.isTrueDirect:
True if the source of the session was Direct (meaning the user typed the name of your website URL into the browser or came to your site via a bookmark). This field will also be true if 2 successive but distinct sessions have exactly the same campaign details. Otherwise NULL.
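Here is a minimal sketch of using that field to reconstruct MCF-style source numbers, assuming the standard GA BigQuery export; the project, dataset and table names are placeholders:

from google.cloud import bigquery

client = bigquery.Client()
query = """
SELECT
  -- Sessions flagged isTrueDirect were really direct even if trafficSource
  -- was overwritten with an earlier campaign; count them as direct again.
  IF(trafficSource.isTrueDirect, '(direct)', trafficSource.source) AS mcf_style_source,
  COUNT(*) AS sessions
FROM `my-project.my_ga_dataset.ga_sessions_20170801`
GROUP BY mcf_style_source
ORDER BY sessions DESC
"""
for row in client.query(query).result():
    print(row.mcf_style_source, row.sessions)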

How to include custom segments in the list of segments when querying the Google Analytics API?

This may be a possible duplicate of this question, but according to all the Google Analytics documentation I really should be able to pull my list of custom segments.
Since I have a very large list of them, it would be suboptimal for me to manually copy the segment ids over one at a time.
I'm following this walkthrough. Steps to reproduce:
Create a custom segment using date of first session in your Google Analytics account.
Authorize the Google Analytics guide to access your Google Analytics account.
Try their on-page query tester, and inspect whether your custom segment is there.
One thing I've already ruled out is the user that created the segment: I manually created a segment with the same user that I'm querying the API with, and it still does not show. Is there a flag I need to set somewhere to include custom segments?
Edit:
It turns out that it will list some custom segments, but not ones created with date of first session, so this is a duplicate of this question, which means that there is a bug in the Google Analytics API.
There was a bug which is now fixed. So it is now possible to list the Date of Session Segments in the Google Analytics Management API by calling the segments.list() method.
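Here is a minimal sketch of that call, assuming service-account credentials that have been granted read access to the Google Analytics account; the key file path is a placeholder:

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder key file; the service account must have read access to the GA account.
creds = service_account.Credentials.from_service_account_file(
    'service-account.json',
    scopes=['https://www.googleapis.com/auth/analytics.readonly'],
)
analytics = build('analytics', 'v3', credentials=creds)

# segments.list() returns built-in and custom segments, including
# Date-of-Session segments now that the bug is fixed.
for seg in analytics.management().segments().list().execute().get('items', []):
    print(seg['segmentId'], seg['type'], seg['name'])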
So after days of trying to solve this one I've come to the conclusion that it cannot be done as asked.
There is, however, another way to do it. For every segment, set up a daily (or weekly, etc.) email report sent to an email address as a TSV. In each email body, specify the name of the segment, so that when you consume the emails you know which segment the attached TSV is for. It doesn't look like the daily reports were designed with segments in mind, since none of the metadata included in the TSV mentions which segment it is for.
From there it's trivial: connect to the email address using an IMAP client once a day and update the numbers (a sketch follows below).
Note that the daily email only contains the numbers for that day (not a specified range), so you'll need to first generate the report one time with the historical data to load in.
While hacky, one nice thing about this approach is that it keeps your reports in sync with your (faked-through-email) API code, provided you match the column headings in the TSV. So if, for example, a new filter is included in a report, the new daily fields will continue to update.
Unfortunately though, the past data won't be reflected in the change.
Obviously this isn't great, but if you are monitoring daily cohorts it's the best you've got if you need to stay with Google Analytics. I have raised this as a bug to the Google Analytics developers, but I haven't heard back as to whether or not they plan to fix it.
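For the consuming side, here is a minimal sketch of that IMAP step, assuming the reports arrive as TSV attachments and that you put the segment name in the subject line (host, address and password are placeholders):

import csv
import email
import imaplib
import io

mail = imaplib.IMAP4_SSL('imap.example.com')  # placeholder host
mail.login('reports@example.com', 'password')  # placeholder credentials
mail.select('INBOX')

_, ids = mail.search(None, '(UNSEEN SUBJECT "Analytics")')
for msg_id in ids[0].split():
    _, data = mail.fetch(msg_id, '(RFC822)')
    msg = email.message_from_bytes(data[0][1])
    segment_name = msg['Subject']  # or parse it out of the body, as suggested above
    for part in msg.walk():
        filename = part.get_filename()
        if filename and filename.endswith('.tsv'):
            tsv = part.get_payload(decode=True).decode('utf-8')
            for row in csv.reader(io.StringIO(tsv), delimiter='\t'):
                pass  # update the stored numbers for segment_name here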
