Facing an issue on Google PageSpeed Insights

I am trying to check my website's page speed with Google PageSpeed Insights, but it shows a question mark as the result. I want to know what this question mark means and how to fix it.
My website is built with WordPress.

I recommend Lighthouse in the Chrome Developer Tools.
Simply follow the steps, and you can perform your first audit without PageSpeed Insights!
The first step for your website is to serve appropriately sized images, to save cellular data and improve load time.
For example: "07/besp_long1-1.png", which weighs 1,761 KB, could be served at a smaller size and in a next-gen format like .webp.

The URLs you provided show the following errors in Google PageSpeed Insights:
Field Data - The Chrome User Experience Report does not have sufficient real-world speed data for this page.
Origin Summary - The Chrome User Experience Report does not have sufficient real-world speed data for this origin.
This means your website does not yet have a sufficient number of distinct samples to provide a representative, anonymized view of the URL's performance, per the Google PageSpeed Insights documentation.
As an alternative, I would recommend using the Lighthouse tool to avoid such issues.
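If you want that Lighthouse lab data outside the browser, here is a minimal sketch of running a performance-only audit from Node; the URL is a placeholder, and it assumes npm install lighthouse chrome-launcher.

// Run a headless Lighthouse performance audit and print the score.
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

async function audit(url: string): Promise<void> {
  const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
  const result = await lighthouse(url, {
    port: chrome.port,
    output: "json",
    onlyCategories: ["performance"],
  });
  if (result) {
    // Score is 0..1; multiply by 100 for the familiar PSI-style number.
    console.log("Performance score:", result.lhr.categories.performance.score);
  }
  await chrome.kill();
}

audit("https://example.com").catch(console.error);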

Related

How can you measure page to page speed on a website?

There are many tools to measure page load speed: GTmetrix, Pingdom, WebPageTest, and all of Google's tools.
However, these all measure the load speed of a single page in isolation. How do I measure the page-to-page speed of a site?
Well, each tool is accurate in its own way.
For example, Pingdom and GTmetrix let you measure the speed of your website from a computer on a leased internet line.
If you want to measure the speed of your whole website, you need to categorize your pages by template and then calculate the time between first byte received and DOMContentLoaded yourself (see the sketch below).
Once this is in place, send the measurements to an analytics tool (Google Analytics, for example) and evaluate them against your real user experience.
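As a rough browser-side sketch of that measurement (the /collect/timing endpoint is a placeholder; wire it to Google Analytics events or your own collector):

// Collect TTFB and DOMContentLoaded via the Navigation Timing API and
// beacon them to an analytics endpoint once the page has fully loaded.
window.addEventListener("load", () => {
  const [nav] = performance.getEntriesByType(
    "navigation"
  ) as PerformanceNavigationTiming[];
  if (!nav) return;

  const payload = {
    page: location.pathname,           // group by template server-side
    ttfb: nav.responseStart,           // first byte received (ms)
    dcl: nav.domContentLoadedEventEnd, // DOMContentLoaded (ms)
  };

  // sendBeacon is fire-and-forget and survives navigating away.
  navigator.sendBeacon("/collect/timing", JSON.stringify(payload));
});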

How to find the Page Load Sample Report in Google Analytics

In this article about making sense of the site speed data in Google Analytics, the author mentions a report that shows both Page Load Sample and Avg. Page Load Time. I am unable to find this Page Load Sample report or add it as an additional dimension next to Avg. Page Load Time in my Site Speed reports. Any direction in this regard would be helpful. I have googled around, and it seems this may have been phased out of Google Analytics.
Try switching to the "Technical" tab in the Site Speed > Page Timings report.

Issue with (not set) fields inside Google Analytics on various reports

A month ago I noticed an issue with Google Analytics on the website www.cashflowtallinn.ee: a large amount of our traffic (over 80%) is reported by Google as (not set).
This causes various problems, such as:
If we want to see which pages users visit most, the biggest share is (not set)
If we want to see default languages, the biggest share is (not set)
If we want to see traffic sources, most of it is shown as Direct traffic, but this is not true, since most of our traffic comes from social networks.
I tried to resolve the issue:
Installed Google Tag Assistant, but it reports all is good.
Examined the <head> section and found that the website has several <title> instances; could this cause issues?
Found this from Google Support: https://support.google.com/analytics/answer/2820717?hl=en
Found this, but couldn't find a solution: http://www.lunametrics.com/blog/2015/06/25/11-places-google-analytics-not-set/
Any ideas how to handle this (and issues like this one)?
There is always the option to turn off all plugins and switch back to the default theme (since it's a WordPress site), but I would like to test this on the live site and make it work live, so switching off plugins and changing the theme is not really a good option.
All best,
I noticed your GA code is outside of the head tag.
I'd consider fixing the location of the GA code (moving it inside the head tag).
You also have 2 identical GA codes installed (bad practice).
Images attached.

How can I verify that overall a site is faster after compression has been switched on?

I have suggested that we enable dynamic content compression on IIS7 to improve the user experience. The QA department wants me to demonstrate that it is in fact faster. I've been using Firebug to view the load-time waterfall chart under the Net tab, but it is inconsistent with the overall (total) page load time from page to page, i.e. sometimes faster but sometimes slower.
Dynamic pages by themselves are always faster, but now some static uncompressed content appears to be slower.
What I would like is a tool (a Firefox add-in) that can add together all page load times during a typical workflow (usage) of a site and give me a final time figure. I would then use that with dynamic compression enabled and disabled to see what the total net effect is. Any suggestions?
Use Fiddler from Microsoft; it interacts with Internet Explorer at a much lower level and can easily be used to produce comparative graphs.
Firebug is very useful for a lot of things, but for any kind of measurement that involves the network, Fiddler is much better: instead of examining the page, it works as a local proxy and can therefore examine the network traffic much more thoroughly.
Site link: http://www.fiddler2.com/fiddler2/
The Net tab in Firebug gives you timings for every item the browser downloads. There's also an option to disable the cache to guarantee accurate timings. I believe that by default it clears the timings after every page load, but that can be disabled.
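If no add-in fits the workflow requirement, one rough alternative is to script the workflow yourself. Below is a minimal Node 18+ sketch (the URLs are placeholders for a typical workflow): run it once with dynamic compression enabled and once with it disabled, and compare the totals.

// Sum full-download times across a typical workflow of pages.
const workflow = [
  "https://example.com/",
  "https://example.com/products",
  "https://example.com/checkout",
];

async function workflowTotalMs(): Promise<number> {
  let total = 0;
  for (const url of workflow) {
    const start = performance.now();
    const res = await fetch(url);
    await res.arrayBuffer(); // force the full body to download
    total += performance.now() - start;
  }
  return total;
}

workflowTotalMs()
  .then((ms) => console.log(`workflow total: ${ms.toFixed(0)} ms`))
  .catch(console.error);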

Is Google Analytics Accurate? [closed]

My records show a particular page of my web site was visited 609 times between July 2 and November 15.
Google Analytics reports only 238 page views during that time.
I can't explain this discrepancy.
For Google Analytics to track a page view event, the client browser must have JavaScript enabled and be able to access Google's servers. I doubt 60% of my visitors have either disabled JavaScript or firewalled outbound traffic to Google's tracking servers.
Do you have any explanation?
More Info
My application simply puts a record into a database as it serves up a page.
It doesn't do anything to distinguish a bot viewer from a human.
The disparity is almost certainly from crawlers. It's not unheard of for crawler traffic to be 10x user traffic.
That said, there's a really easy way to validate what's going on: add an ASPX page which emits an uncacheable, 1x1-pixel clear GIF image (aka a "web bug"), and include an IMG tag referencing that image on every page on your site (e.g. in a header or footer). Then parse your logs for hits to that image, looking at a query-string parameter on the image call (e.g. "referer=") so you'll know the actual URL of the pageview.
Since crawlers and other bots don't pull images (well, Google Images will, but not images sized as 1x1 pixels in the IMG tag!), you'll get a much more accurate count of pageviews. Behind the scenes, most analytics software (including Google Analytics) uses a similar approach, except that they use JavaScript to build the image URL and make the image request dynamically. But if you use Fiddler to watch the HTTP requests made on a site that uses Google Analytics, you'll see a 1px GIF returned from www.google-analytics.com.
The numbers won't line up exactly (for example, users who quickly cancel a navigation via the back button may have downloaded one image but not the other), but you should see roughly comparable results. If you don't, then chances are you don't have Google Analytics set up correctly on all your pages.
Here's a code sample illustrating the technique.
In your header (note the random number to prevent caching):
<img src="PageviewImage.aspx?rand=<%=new System.Random().NextDouble()%>&referer=<%=Request.UrlReferrer==null ? "" : Server.HtmlEncode(Request.UrlReferrer.ToString()) %>"
width="1" height="1" hspace="0" vspace="0" border="0" alt="pageview check">
The image generator, PageviewImage.aspx:
// Code-behind: serve the transparent 1x1 GIF with the right content type
// so that each request shows up in the server logs as a pageview.
private void Page_Load(object sender, System.EventArgs e)
{
    Response.ContentType = "image/gif";

    // Stream a pre-made transparent GIF from disk.
    string filepath = Server.MapPath("~/images/clear.gif");
    Response.WriteFile(filepath);
}
BTW, if you need the image file itself, do a Save As from here.
This is of course not a substitute for a "real" analytics system like Google's, but if you just want to cross-check, the approach above should work fine.
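For the log-parsing step, here is a minimal sketch; it assumes Node, and that each log line contains the request path and query string in plain text (as in IIS W3C logs), with the file name and the referer extraction as placeholders to adapt to your log layout.

// Count tracking-pixel hits per page by scanning the server log.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

async function countPixelHits(logFile: string): Promise<Map<string, number>> {
  const counts = new Map<string, number>();
  const lines = createInterface({ input: createReadStream(logFile) });

  for await (const line of lines) {
    if (!line.includes("PageviewImage.aspx")) continue;
    // Pull the originating page out of the pixel's "referer" parameter.
    const match = line.match(/referer=([^&\s]+)/);
    const page = match ? decodeURIComponent(match[1]) : "(unknown)";
    counts.set(page, (counts.get(page) ?? 0) + 1);
  }
  return counts;
}

countPixelHits("u_ex231001.log").then((counts) => {
  for (const [page, n] of counts) console.log(`${n}\t${page}`);
}).catch(console.error);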
Could the rest of the page views be from crawlers - either Googlebot or others?
Are you looking at unique page views in Analytics and total page views in your logs?
Probably crawlers. Our website was being hit every couple of hours by robots.
Are you positive the site is working properly in all browsers? I've seen analytics thrown off by pages that fail to render properly in Firefox but work fine in IE, and vice versa.
Maybe the tracker on your web pages records every hit, even if it comes from the same IP address (the same visitor hits the page twice).
It is not; many visitors have JavaScript turned off or have the CustomizeGoogle Firefox extension installed.
Given the time stamp of the last comment, I thought I'd leave an update here: Google Analytics recently announced they'd let people opt out of Google Analytics on the user side, meaning that if you didn't want website owners to track your movements, you could effectively become invisible on sites measured by Google Analytics. This could further offset your data points. In a separate thread, I suggested running two web analytics tools (there are many free ones to choose from) to measure against each other.
Justin's answer is very good. I would just add this as a comment but I'm lacking powerpoints :P
One thing to keep in mind, too, when comparing analytics systems, is that there's always some discrepancy to be expected:
The methodology of page tagging with JavaScript in order to collect visit data has now been well established over the past 8 years or so. Given a best-practice deployment of Google Analytics, Nielsen SiteCensus or Yahoo Web Analytics, high-level metrics remain comparable. That is, they can be expected to lie within 10-20% of each other. [link]
