What does "Estimated Savings" mean in Google Lighthouse?

Does anybody know what "Estimated Savings" means in a Google Lighthouse (or PageSpeed Insights) report?
It tells me I can save 3.6 s. But 3.6 s of what? Page load time? I can't find this documented anywhere.

From this comment: "the 'estimated savings' time in most of our opportunities is derived exactly from the estimated impact to the TTI metric."

Related

What does "Origin Summary" mean in this PageSpeed Insight report?

I recently noticed that Google updated the PageSpeed Insights report page. There is now an "Origin Summary".
What exactly does it mean? And why is it slightly different from the field data?
If the page you are testing has enough visitors, you will see "field data": real-world performance for that page.
The Origin Summary is real-world performance data aggregated across all pages on the domain, where there is enough data.
Basically it is page performance (Field Data) vs. site performance as a whole (Origin Summary).

Google Analytics suddenly started Sampling Data, 3k sessions for property over time period

We are using the free level of GA and have been creating reports using Custom Dimensions and Metrics since last summer.
We also use the Google Sheets Analytics add-on to post process data pulled from the API.
Overnight on 16-17 May (UK time), our reports suddenly started showing as sampled. Prior to that we had no sampling at all; because our reports are scheduled, I can look back through the revision history and see exactly when the change occurred.
This sampling occurs both in custom reports viewed in the GA platform and in GA Sheets. I've done some analysis and it appears to occur only when more than one Custom Dimension is added to a report, or when the GA dimensions ga:hour or ga:dateHour are used (ga:date does not trigger sampling).
All our Custom Dimensions and Custom Metrics are set at hit level (I've read a post where this was claimed to be due to mixing scopes on dimensions and metrics, but we are not doing that).
If I reduce the date range of a query (suggested as a solution on many blogs), the sampling level actually gets worse rather than better.
For the month of May we didn't even hit 4k sessions at property level. I can't find any reference anywhere to changes being made to GA that would cause sampling to apply to our reports (change documentation, Google blogs, etc.).
Is anyone else experiencing this, or can anyone shed any light on why it might be happening? Given how we use GA, if we can't resolve this it's a year of work down the drain, so I'm really keen to at least know why this has suddenly happened even if ultimately nothing can be done about it.
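One way to narrow this down is to run the same query through the Core Reporting API and inspect the sampling fields in the response, which state explicitly whether, and how heavily, a given dimension combination is sampled. Below is a minimal sketch, assuming the Core Reporting API v3 via google-api-python-client; ga:XXXXXX and ga:dimension1 are placeholders for your view ID and custom dimension:

    from googleapiclient.discovery import build

    # Assumes `creds` is an already-authorized credentials object for the
    # Analytics scope (OAuth or service account); setup not shown here.
    service = build('analytics', 'v3', credentials=creds)

    result = service.data().ga().get(
        ids='ga:XXXXXX',                         # placeholder view (profile) ID
        start_date='2017-05-01',
        end_date='2017-05-31',
        metrics='ga:sessions',
        dimensions='ga:dateHour,ga:dimension1',  # the combination that seems to trigger sampling
        samplingLevel='HIGHER_PRECISION',
    ).execute()

    # The response states explicitly whether sampling was applied, and at what rate.
    if result.get('containsSampledData'):
        print('Sampled: %s of %s sessions' % (result['sampleSize'], result['sampleSpace']))
    else:
        print('Report is based on 100% of sessions.')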

Unsampled data with Google Analytics API

I am trying to automate a weekly report. Currently I use the Google Analytics website to get the data for my report, with the sampling level set to Higher precision.
I tried to get the same data via the Google Analytics API with samplingLevel set to HIGHER_PRECISION, but I am still getting sampled data.
For FASTER the sample size is roughly 25%, whereas for DEFAULT and HIGHER_PRECISION it is roughly 50%.
On the Google Analytics website it says "This report is based on 100% of sessions". Can I get the same level of accuracy with the Google API? I am using Google Apps Script, and the response for HIGHER_PRECISION does not match.
Sumit, the API and the Google Analytics UI are certainly different, and sampling affects each of them differently; it must be handled properly to get anything useful out of the API.
As was mentioned in the comment, you can achieve high-precision, unsampled reports by (typically) shortening the date range you are querying for and then "walking" the data.
To walk the data, you gradually increment that small date range as you move through the desired period.
The Unsampled Reports API is, well, not the best. Considering unsampled data is exactly what Google avoids giving the end user in the first place, that offering is not a good long-term or large-project-friendly solution. I would recommend small date ranges and a data walk, as sketched below.
Happy Coding
There are several tools that avoid the sampling issue in Google Analytics by automating the export of data over short date ranges.
I prefer this tool, it's pretty simple to use: MadStats.io

If I link Google Analytics and AdWords, will Google AdWords be able to see my conversion rates and ROI and change my ad costs?

This may seem like a silly question, but I've not been able to find an answer anywhere in either StackOverflow or Google.
The question is: if I link my Google Analytics and AdWords accounts, in theory wouldn't Google be able to see my ROI? And if they see my ROI as incredibly high, couldn't they increase the PPC ad cost behind the scenes to make Google more money? I suppose I could just trust Google to "do the right thing" and let the market-based algorithms dictate pricing. But let's say I'm paying $2.00 per click and making an average of $100 in revenue per click -- an outrageous ROI -- what's to stop Google from seeing this data and increasing the PPC cost to $5 or even $10?
No, they don't do that. First, anyone can set an arbitrary number as their conversion value and see an inflated ROI in their reports; there are more genuine use cases for doing this than you might imagine.
Second, your idea of how the market works is flawed. Google cannot arbitrarily increase costs (well, they can indirectly through Quality Score). The cost is determined by the bids of your competitors. If you and all your competitors got together and colluded to decrease your bids, Google could do nothing and everyone would benefit (except Google). The fact that competition is geographically dispersed and online makes this situation unlikely, so competitors bid based on their ROAS.
Besides, if Google were found to be doing this, nobody would use them anymore.

Google Analytics data adjustment?

I've been using an SSIS integration component to download data from Google Analytics in order to keep a historical view of some websites and track their evolution. Basically the metrics we track are Visits (now Sessions) and Visitors (now Users), and the dimensions are Year and Month.
However, today I noticed that the data I downloaded for July had a variation in the Users metric. I heard that Google Analytics uses an estimation method to "calculate" some (if not all) of its metrics. Could it be that they later "adjust" the data with more accurate information? If so, is this mentioned in the documentation? (A link would be highly appreciated, since the users are complaining that we are not delivering the real GA data.) I looked on the Google Analytics documentation page with no luck.
Thanks for your time.
PS: Sorry for my English, it isn't my native language.
If you are using the standard version of Google Analytics (you'll know if you are paying $150k for premium), data is sampled depending on volume. Have a read of this article: can-you-trust-your-google-analytics-data
I have seen slightly differing results returned when repeatedly calling the API with the same historical parameters. In my case the figures only differed by 1-2 over a daily set of several thousand, but nevertheless they differed.
If you want to guarantee your results, consider upgrading to premium.
Sampling could be an issue if what you are requesting is over 50,000 rows for the time period you are requesting. To avoid it, you can download more often, such as daily.
But I think your issue is that there is a processing delay in Google Analytics: if you are downloading at 3 am on the 1st, it is probable that the processing for the previous day has not finished.
The Google Analytics Premium SLA is for 4-hour data freshness, so even that would have trouble. Pragmatically, you should allow 24 hours before you download data for the previous day, and 48 hours for e-commerce data.
Thirdly, make sure it is not Unique Visitors you are requesting, as that metric depends on the time period you are requesting.
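To make the freshness point concrete, a scheduled export can simply stop its date window a couple of days short of today. A minimal sketch (the 48-hour lag is the conservative figure suggested above for e-commerce data):

    from datetime import date, timedelta

    # Allow 48 hours for Google Analytics processing before trusting a day's data.
    PROCESSING_LAG_DAYS = 2

    def safe_date_range(days_back=30, today=None):
        """Return (start, end) dates for a query that exclude days whose
        processing may not have finished yet."""
        today = today or date.today()
        end = today - timedelta(days=PROCESSING_LAG_DAYS)
        start = end - timedelta(days=days_back - 1)
        return start, end

    start, end = safe_date_range()
    print('Safe to query %s through %s' % (start, end))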
