Analytics Dashboard - google-analytics

I am planning to develop a website that will allow registered users to
view their analytics data from various services, such as Google Analytics, in one
dashboard, somewhat similar to http://www.cyfe.com/, which provides an all-in-one
dashboard.
I am thinking of two approaches to implement this application.
Approach #1: once the user logs in to my web application and requests data, my application would call the analytics service's API (e.g. the Google Analytics API) and display the response data.
Approach #2: run a job at a fixed interval (say, every 30 minutes) that retrieves analytics data for all registered users
and saves it in my application's database. When a user requests data, my application would display the data from its own database instead of sending a request to the analytics service.
Can anyone suggest the pros/cons of each approach and which one is better to implement?

Remember that Google Analytics data isn't done processing for 24-48 hours, so requesting data every 30 minutes is overkill: the data won't be complete or accurate yet. Run your application once a day to get data for two days ago.
The main problem you are going to have is the limit of 7 dimensions and 10 metrics per request. There is no primary key, so there is no way of linking data from one request back to the data of another request.
Another issue is that you can return at most 10,000 rows per request. Depending on how many rows your query produces, you may end up making a large number of requests against the API, which will be hard on your quota.
You may also run into quota issues: you can make a maximum of 10,000 requests per profile per day. Once you have hit that quota, you will not be able to make any more requests against that profile until the next day. This quota cannot be extended.
You can also make a maximum of 10 requests a second per user/profile. You can tweak this a little using quotaUser, but your application will not be able to run very fast: it takes on average half a second for each request to return data. Things are going to take time unless you run multiple instances of your extraction application, but again that requires tweaking quotaUser. This quota cannot be extended.
Your application can make a maximum of 50,000 requests against the API per day across all profiles. Once you reach 80% of that quota, I recommend you apply for an extension; it can take a month or more to get it, so it is a good idea to plan ahead.
Note: I am the lead developer of a business intelligence application that exports data from Google Analytics into a data warehouse daily, and I have run into each of these issues. What you are planning is possible; you just need to understand the limitations of the Google Analytics API before you begin your development process.
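A once-a-day extraction job of the kind described could page through a report along these lines. This is a minimal sketch: `fetch_page` is a stand-in for the real Core Reporting API call, which additionally needs authentication and the `start-index`/`max-results` request parameters, and the pacing interval is illustrative.

```python
import time

PAGE_SIZE = 10_000        # the Core Reporting API caps rows per request at 10k
REQUEST_INTERVAL = 0.1    # pacing so the job stays under the per-second quota

def fetch_page(all_rows, start_index):
    """Stand-in for one Core Reporting API request (1-based start-index)."""
    return all_rows[start_index - 1:start_index - 1 + PAGE_SIZE]

def extract_all_rows(all_rows):
    """Page through a report; note that every page counts against your quota."""
    rows, start_index = [], 1
    while True:
        page = fetch_page(all_rows, start_index)
        if not page:
            break
        rows.extend(page)
        start_index += PAGE_SIZE
        time.sleep(REQUEST_INTERVAL)   # pace requests between pages
    return rows

print(len(extract_all_rows(list(range(25_000)))))  # 25000, fetched in 3 paced pages
```

The key point the sketch illustrates: a 25,000-row report costs three quota-counted requests, so wide date ranges multiply quickly against the daily limits.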


How to stop bots from racking up your firebase bill on a web app?

I am learning web development and I am using Firebase as my backend. I plan to prevent bots from racking up my Firestore bill with brute-force attacks by:
Only allowing logged-in users to write to the database with security rules.
Using Firestore security rules to allow only a certain number of writes per second per user. If a user goes over this limit, they will be banned for a certain time period.
Will these 2 security measures stop most bot attacks?
Also, I was looking at the pricing for hosting and saw that it costs $0.15 per 1 GB of data transferred.
My entire web application is only about 5 MB, but that means if it is loaded 1,000 times, it would cost me 5 MB × 1,000 = 5 GB, and 5 GB × $0.15 = $0.75, which I can handle. But say someone got a bot to reload the page 1,000,000 times: then 5 MB × 1,000,000 = 5,000 GB, and 5,000 GB × $0.15 = $750. Obviously $750 is a lot of money and I can't handle that.
How do I prevent bots from reloading my page many times and racking up my hosting bill? I can't use the strategy listed above because I want users who don't have an account to still be able to view my website.
First, a few clarifiers…
Cost of Firebase Hosting is $0.15 per GB after you have exceeded the free tier of 10 GB/month.
Your entire application may be 5 MB, but that is likely not the amount of data transmitted on each load unless you missed a build step. In a React app, resources are loaded only as needed, and responses are further compressed by Firebase to save bandwidth.
Assets are generally cached and not reloaded on subsequent page views. It’s a bit more nuanced with bots but this helps keep data transfer rates down overall.
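Putting numbers on those clarifiers, a rough bill estimate can be sketched as follows (assuming the $0.15/GB rate and the 10 GB/month free tier mentioned above, and ignoring compression and caching, which only lower the figure):

```python
def hosting_cost(gb_transferred, free_gb=10, rate_per_gb=0.15):
    """Rough Firebase Hosting bill: the first `free_gb` per month are free."""
    return max(0.0, gb_transferred - free_gb) * rate_per_gb

print(hosting_cost(5))       # 0.0 — 1,000 loads of a 5 MB app stay in the free tier
print(hosting_cost(5_000))   # roughly $748.50 — the worst-case bot scenario
```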
Now to your question…
Mitigating repeat page visits the way you describe is handled mostly through rate limiting. This limits the number of requests to your site in a given period.
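The idea of rate limiting can be sketched with a token bucket. This is illustrative only: in practice you would enforce it at a CDN, proxy, or in security rules rather than in application code, and the numbers here are arbitrary.

```python
import time

class TokenBucket:
    """Allow at most `rate` requests per `per` seconds for one client."""
    def __init__(self, rate, per):
        self.rate, self.per = rate, per
        self.tokens = float(rate)
        self.updated = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # refill proportionally to elapsed time, capped at the bucket size
        self.tokens = min(self.rate,
                          self.tokens + (now - self.updated) * self.rate / self.per)
        self.updated = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=5, per=60)   # 5 requests per minute per client
verdicts = [bucket.allow() for _ in range(8)]
print(verdicts.count(True))  # 5 — the burst is cut off once the bucket is empty
```

A per-client bucket like this caps bursts while still letting a legitimate visitor's traffic through at the steady refill rate.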
Implementing Google Analytics can also help in detecting bot traffic by reporting unusual bursts of activity.
You could also use a CAPTCHA to reduce bot attempts to log in or submit other form data.
Two-factor auth is another good tool in preventing brute force attacks.
Finally, I would suggest creating budget alerts for your project. This is easy to do and highly customizable. You can even set it for $1 if you wish. It will not stop your project automatically, only send you an email. But if handled quickly, it will save you from getting an even bigger bill.

I want to increase the Real Time Reporting API request limit

We are using the Core Reporting API and Real Time Reporting API.
This API limits requests to 10,000/day per view, but I want to increase this limit.
Is this possible?
If possible, please let me know how we can increase the quota limits, and the price for the case of 20,000 requests/day.
There are several quotas for the Google Analytics APIs and Google APIs in general.
requests/day: 50,000
requests/100 seconds/user: 100
requests/day/view: 10,000
Your application can make 50,000 requests per day by default. This can be extended, but it takes a while to get permission; when you are getting close to this limit (around 80%), it's best to request an extension.
Each user can make at most 100 requests per 100 seconds, which must have gone up recently; last I knew it was only 10 requests a second. A user is denoted by IP address. There is no way to extend this quota beyond the max; you can't apply for it or pay for it.
Then there is the last quota, the one you asked about: you can make at most 10,000 requests per day to a view. This isn't just application-based; if the user runs my application and your application, together we have only 10,000 requests between us. This quota is a pain, if you ask me. Now for the bad news: there is no way to extend this quota. You can't apply for it, you can't pay for it, and you can't beg the Google Analytics dev team (I have tried).
Answer: no, you can't extend the per-view per-day quota limit.
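The interplay of the two extendable/non-extendable daily quotas can be summarized with a small bookkeeping sketch (this is just arithmetic over the documented limits, not any official API):

```python
DAILY_APP_QUOTA = 50_000    # per application, extendable on request
DAILY_VIEW_QUOTA = 10_000   # per view, shared by all applications, not extendable

def requests_left(app_used, view_used):
    """Requests remaining today before either quota cuts this view off."""
    return min(DAILY_APP_QUOTA - app_used, DAILY_VIEW_QUOTA - view_used)

def should_request_extension(app_used, threshold=0.8):
    """Only the app-level quota can be extended; ask once you pass ~80% usage."""
    return app_used >= DAILY_APP_QUOTA * threshold

print(requests_left(app_used=45_000, view_used=4_000))  # 5000 — app quota binds first
print(should_request_extension(45_000))                 # True
```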

Google Analytics real-time - keep alive

I have a realtime platform where users stay on pages for a long duration. I found that after about 5 minutes the GA realtime report stops showing them, so I created a timer that sends a pageview every 4 minutes; this way all users remain "connected" to GA.
I wonder whether this is a good approach or whether it can produce inaccurate data in the reports later.
Has anyone experienced this?
Your terminology seems a little off - users do not become "disconnected" from Google Analytics. The difference between realtime reports and data from the Reporting API is that the former shows only a subset of ad-hoc computed dimensions and metrics, whereas the Reporting API shows, after some processing latency, the full set of metrics and dimensions, including things that require more processing time, like session- and user-scoped data.
Other than that, your approach is fine. There is a limit on the number of API calls you are allowed to make - the documentation has an example of how to calculate your calls to stay within the limits, and Google suggests implementing some sort of server-side caching if you do need a lot of realtime dashboards.
But this is not going to affect the data quality of reports in any way. The Realtime API is a read-only API; the worst thing that can happen is that you exceed your quota and get blocked for the rest of the day. So there is no way this would create "inaccurate data in the reports later".
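The server-side caching Google suggests can be as simple as a small TTL cache, so that many dashboard viewers share one Real Time API call instead of each triggering their own. This is a sketch; `fetch_realtime` stands in for the actual API request, and the 10-second TTL is an arbitrary choice.

```python
import time

class TTLCache:
    """Tiny server-side cache: all viewers within `ttl_seconds` share one fetch."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}

    def get(self, key, fetch_fn):
        entry = self.store.get(key)
        now = time.monotonic()
        if entry and now - entry[1] < self.ttl:
            return entry[0]            # fresh enough: serve the cached value
        value = fetch_fn()             # stale or missing: fetch and re-cache
        self.store[key] = (value, now)
        return value

cache = TTLCache(ttl_seconds=10)
hits = {"n": 0}
def fetch_realtime():                  # stand-in for the Real Time API call
    hits["n"] += 1
    return {"activeUsers": 42}

cache.get("view:123", fetch_realtime)
cache.get("view:123", fetch_realtime)
print(hits["n"])  # 1 — the second dashboard view was served from cache
```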

Google Analytics API Quota for several clients

I tried searching but I didn't find any useful resource that would answer my question.
I'm trying to develop a service for my customers where I will need to connect to their analytics data and combine it with information from other services that I already provide. However, with the quota on API requests, how can I get it to work for several customers?
I mean, the limitation is 10,000 requests per month, and I will probably make around 40-50 requests per day per customer. That means that if I get more than 7 customers to use it, I would reach the monthly quota. What is the best approach to make this scalable?
Thank you in advance!
I think you are a little confused about the Google Analytics API limits.
The Management API and Metadata API have a limit of 10,000 requests per day and 10 requests per second.
The Core Reporting API allows 10,000 requests per day per user and/or view (used to be called profile), and 50,000 requests per day per application. You can request that the 50k be extended, but you need to show that there aren't a lot of errors coming from your application.
It might be a good idea to also send either userIp or quotaUser with all of your requests; this ensures that each of your users gets their own 10k requests each day. If you don't send quotaUser or userIp, Google lumps them all under the same quota user, and as a group they are limited to the 10k. This may or may not be a problem if you can ensure that several users won't be requesting the same data from the same view (used to be profile).
Another thing to remember is that nextLinks count towards the limit as well, so you should either refine your requests so that you don't get too many rows back, or set max-results high enough that you don't get too many nextLinks.
You can read more about how and why you should use quotaUser here: Google Analytics quotaUser.
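One way to attach a stable quotaUser per application user is to hash your internal user ID. This is a sketch: the view ID and the exact parameter spelling below are illustrative, and the 40-character cap on quotaUser values is an assumption you should verify against the current documentation.

```python
import hashlib

def quota_user_for(user_id):
    """Derive a stable, opaque quotaUser string from an internal user ID."""
    return hashlib.sha1(user_id.encode("utf-8")).hexdigest()[:40]

# Hypothetical request parameters for a Core Reporting API call
params = {
    "ids": "ga:12345678",        # illustrative view ID
    "metrics": "ga:sessions",
    "max-results": 10_000,       # fewer nextLinks means fewer quota-counted requests
    "quotaUser": quota_user_for("alice@example.com"),
}
print(len(params["quotaUser"]))  # 40
```

Hashing keeps the value stable across requests (so each user lands in their own quota bucket) without leaking the raw user ID to Google.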
The quota is 10,000 per day per profile.
You should be fine, especially if each of your clients has a separate profile.
https://developers.google.com/analytics/devguides/reporting/core/v2/limits-quotas#core_reporting

Linkedin API throttle limit

Recently I was developing an application using the LinkedIn people-search API. The documentation says that a developer registration gets 100,000 (1 lakh) API calls per day, but when I registered for this API and ran a Python script, after some 300 calls it said the throttle limit was exceeded.
Has anyone faced this kind of issue using the LinkedIn API? Comments are appreciated.
Thanks in advance.
It's been a while, but the stats suggest people still look at this; I'm experimenting with the LinkedIn API and can provide some more detail.
The typical throttles are stated as both a max (e.g. 100K) and a per-user-token number (e.g. 500). Together, those numbers mean you can make up to a maximum of 100,000 calls per day to the API, but even as a developer, a single user token allows a maximum of 500 per day.
I ran into this, and after setting up a barebones app and getting some users, I can confirm a daily throttle of several thousand API calls. [Deleted discussion of what was probably, upon further consideration, an accidental back door in the LinkedIn API.]
As per the Throttle Limits published by LinkedIn:
LinkedIn API keys are throttled by default. The throttles are designed
to ensure maximum performance for all developers and to protect the
user experience of all users on LinkedIn.
There are three types of throttles applied to all API keys:
Application throttles: These throttles limit the number of each API call your application can make using its API key.
User throttles: These throttles limit the number of calls for any individual user of your application. User-level throttles serve
several purposes, but in general are implemented where there is a
significant potential impact to the user experience for LinkedIn
users.
Developer throttles: For people listed as developers on their API keys, they will see user throttles that are approximately four times
higher than the user throttles for most calls. This gives you extra
capacity to build and test your application. Be aware that the
developer throttles give you higher throttle limits as a developer of
your application. But your users will experience the User throttle
limits, which are lower. Take care to make sure that your application
functions correctly with the User throttle limits, not just for the
throttle limits for your usage as a developer.
Note: To view current API usage of your application and to ensure you haven't hit any throttle limits, visit
https://www.linkedin.com/developer/apps and click on "Usage & Limits".
The throttle limit for individual users of People Search is 100, with 400 being the limit for the person that is associated with the Application as the developer:
https://developer.linkedin.com/documents/throttle-limits
When you run into a limit, view the API usage for your application on the application page to see which throttle you are hitting.
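When a throttle does hit mid-run, backing off and retrying is the usual remedy. Here is a generic sketch: `ThrottledError` stands in for whatever HTTP 429 exception your client library raises, and the delays are scaled down for illustration.

```python
import random
import time

class ThrottledError(Exception):
    """Stand-in for an HTTP 429 'throttle limit exceeded' response."""

def call_with_backoff(request_fn, max_retries=5, base_delay=0.01):
    """Retry a throttled call with exponential backoff plus a little jitter."""
    for attempt in range(max_retries):
        try:
            return request_fn()
        except ThrottledError:
            # wait longer after each failure; jitter avoids synchronized retries
            time.sleep(base_delay * 2 ** attempt + random.random() * base_delay)
    raise RuntimeError(f"still throttled after {max_retries} retries")

# Usage: a fake endpoint that is throttled twice, then succeeds
state = {"calls": 0}
def flaky_endpoint():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ThrottledError()
    return "ok"

print(call_with_backoff(flaky_endpoint))  # ok
```

Note that backoff only smooths over short-term rate limits; once a daily quota is exhausted, retrying won't help until the quota resets.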
