I recently moved my domain fully onto the Cloudflare CDN: I pointed my nameservers to Cloudflare and enabled SSL/TLS encryption in Full mode. Now the page is blank when it starts loading, and then all the content appears at once. This makes some visitors think something is wrong during the first few seconds. I ran a speed test and got the result in the attached screenshot. Can anyone explain what this test means and how I can reduce the 2.2 seconds before visitors to my website see any content? Is this 2.2 seconds the reason for the blank page while loading, and how can I improve it?
Without knowing the exact domain, a number of things can be at play.
Long first byte times - If the server's time to first byte is long, users will see a white screen for a while.
Poor JavaScript integration - JavaScript can manipulate the DOM and block images and other resources from loading.
Lots of synchronous requests - Even with HTTP/2, the browser can only make so many requests in parallel. If there are lots of image, CSS, and JS requests, the page will appear white while they load.
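A quick way to check whether the first byte time is the issue is the browser's Navigation Timing API. This is just a sketch to paste into the console on the affected page:

    // Rough breakdown of where the time goes before anything renders.
    // Paste into the browser console after the page has loaded.
    const [nav] = performance.getEntriesByType('navigation');
    if (nav) {
      console.log('Time to first byte (ms):', Math.round(nav.responseStart - nav.requestStart));
      console.log('DOMContentLoaded (ms):', Math.round(nav.domContentLoadedEventEnd - nav.startTime));
      console.log('Full load (ms):', Math.round(nav.loadEventEnd - nav.startTime));
    }

If the first number is close to your 2.2 seconds, the delay is mostly server/origin response time rather than rendering.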
Providing the precise URL could help shed light on what the problem is.
When I check the site speed of https://www.readonlinenewspaper.com using PageSpeed Insights, I am not able to see any results and get an error message like the one below:
Lighthouse returned an error: FAILED_DOCUMENT_REQUEST. Lighthouse was unable to reliably load the page you requested. Make sure you are testing the correct URL and that the server is properly responding to all requests. (Details: net::ERR_CONNECTION_FAILED)
It is probably caused by one of two things:
1. The site just takes too long to load.
Your page takes well over 40 seconds to load (on a high-speed desktop connection, albeit in the UK, and I am guessing the server is somewhere else given the long delay on requests), so PageSpeed Insights thinks it is broken, as the page never finishes loading within its timeout period.
Your country flags are the main cause of this; consider a CSS image sprite or inline SVGs instead, as the total of 438 requests on your page is so high that you will never get good performance (generally only about 8 requests can be made at once, which means over 50 round trips to your server for resources).
If each set of eight resources takes 200 ms to complete, that is 10 seconds of latency (dead time waiting for a response) on its own; for me they were taking 800 to 1000 ms each!
That is particularly slow, so perhaps there is something wrong with your hosting configuration or website setup? (You aren't storing the flag URLs in the database and looking them up one at a time in a loop, by any chance?)
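If you want to confirm how much of that is down to the flags, the Resource Timing API gives a rough picture. This is just a sketch (adjust the '/flags/' filter to match however the images are actually named on your site):

    // Count requests on the page and the time spent on the flag images.
    // Paste into the browser console after the page has loaded.
    const resources = performance.getEntriesByType('resource');
    console.log('Total resource requests:', resources.length);

    const flags = resources.filter((r) => r.name.includes('/flags/')); // guess at the path
    const totalMs = flags.reduce((sum, r) => sum + r.duration, 0);
    console.log('Flag requests:', flags.length, 'combined duration (ms):', Math.round(totalMs));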
2. Hotjar
For some reason PageSpeed Insights doesn't seem to play well with Hotjar.
It is something to do with WebSockets, but I never got to the bottom of it; I just know it is a problem I often see when people use Hotjar (maybe something to do with the wss:// protocol or their implementation).
Try disabling Hotjar, run the test again and see if it works then (perhaps test another page while investigating this, as it is only the homepage that is unbearably slow to load because of the flags, as per point one).
P.S. The resource online-newspapers-banner-02.jpg is not being loaded over HTTPS, so fix that. It has nothing to do with your question; I just noticed the site was showing as "not secure" and I think that is the cause.
I'm currently working on a web scraping service with all that goes with it (rotating proxies, rotating user agents, random delays).
It works when the number of concurrent requests is low (up to 4). But when I increase the concurrency, after some time I get blocked by the website, and it doesn't block just one proxy but rather all the other proxies too, even ones being used for the first time. I would like to understand how they can detect this. Can I prevent it by setting some additional request headers and rotating them as well?
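To show what I mean by rotating extra headers, roughly something like this (the header values and URL are just placeholders, and proxy rotation is left out):

    // Sketch: rotate User-Agent / Accept-Language and add a random delay per request.
    // Node 18+ (global fetch).
    const userAgents = [
      'Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...', // placeholder strings
      'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) ...',
    ];
    const languages = ['en-US,en;q=0.9', 'de-DE,de;q=0.9', 'fr-FR,fr;q=0.8'];

    const pick = (arr) => arr[Math.floor(Math.random() * arr.length)];
    const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

    async function fetchPage(url) {
      await sleep(500 + Math.random() * 2000); // random delay between requests
      return fetch(url, {
        headers: {
          'User-Agent': pick(userAgents),
          'Accept-Language': pick(languages),
          Accept: 'text/html,application/xhtml+xml',
        },
      });
    }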
Thanks in advance!
I have a PHP/MySQL/JS-jQuery based web site that records finish times for racers and sends each time back to the server. The server inserts the finish time in the DB, calculates the finish place based on a handicapping formula, stores that, and sends the finish place back to the web page, where it is updated on the screen.
It uses jQuery AJAX calls, so the page never gets reloaded.
Everything works fine if the data connection is good.
If the data connection was bad, my first version of this page would just put up a message saying the connection was bad.
Now I am trying to make it a bit smarter, so I have started with the HTML5 feature that tells the browser whether it is online or offline (I realize this may not be the best approach, but it works for concept testing).
When a new finish time is recorded (or updated) while we are offline, the JS just adds a notSent class to the finish time's tag. The finish place, and all the finish places that would normally come from the server, are greyed out to indicate the data is no longer valid (until it can communicate with the server).
When the browser finds itself back online, a simple jQuery .each() loop over the notSent elements re-sends the AJAX requests, and if they all complete it processes the returned finish place information and displays it as up to date.
It also disables all external links on the page while the browser is offline, which keeps the user from accidentally losing the data entry page by clicking a link that would just give them a "page not found" error.
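The relevant bits currently look roughly like this (class names other than notSent, and the endpoint URL, are simplified placeholders):

    // Record a finish time; if offline, mark it notSent and grey out the places.
    function recordFinishTime($row, time) {
      if (!navigator.onLine) {
        $row.addClass('notSent').data('time', time);
        $('.finishPlace').addClass('stale'); // greyed out via CSS
        return;
      }
      sendTime($row, time);
    }

    // Placeholder AJAX call to the server endpoint that stores the time
    // and returns the recalculated finish places.
    function sendTime($row, time) {
      return $.post('/api/finish-time.php', { id: $row.data('id'), time: time })
        .done(function (resp) {
          $row.removeClass('notSent');
          $('.finishPlace').removeClass('stale');
          // ...update the finish places on screen from resp here...
        });
    }

    // When the connection comes back, re-send everything marked notSent.
    $(window).on('online', function () {
      $('.notSent').each(function () {
        var $row = $(this);
        sendTime($row, $row.data('time'));
      });
    });

    // Disable external links while offline so the data entry page isn't lost.
    $(window).on('offline', function () {
      $('a[href]').on('click.offline', function (e) { e.preventDefault(); });
    });
    $(window).on('online', function () {
      $('a[href]').off('click.offline');
    });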
So my last issue is the browser's reload and close buttons: if the user clicks these while offline, they will lose the data entry screen and are out of luck until the connection comes back.
Can I disable these as well? A quick Stack Overflow search indicates it can be done, but most answers give the old "you really shouldn't, and if you think you need to, you should rethink your design" warning.
So, rethinking my design, I started learning about:
HTML5 local storage (decided I don't need it, since my data is already stored in an input box)
AppCache manifest for controlling the caching of the page, so that an offline reload would get the cached version. After much reading I came to the conclusion that this could work for a static page, but not for mine where the data is updated all the time. Then I found that most browsers are deprecating it anyway.
Service workers, which seem to be the future for controlling offline caching, but not all browsers support them, they are fairly cumbersome to learn, and they are still very new.
Now I am stuck, leaning towards preventing browser reloads and deferring learning service workers until there is more support and there are better examples for dynamic content pages like mine.
Bottom line: am I missing something here? Is there an easy solution?
I think the best option is to use PouchDB to sync between the client and the server, and to use Background Sync to wake a service worker when you regain connectivity. If service workers are not supported in the browser, it can sync the next time your user opens the browser.
There is a similar example of deferred requests explained in the Service Worker Cookbook.
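A minimal sketch of the Background Sync part (assuming a service worker at /sw.js and a hypothetical resendQueuedTimes() helper that replays the queued requests):

    // In the page: register the worker and request a one-off sync.
    if ('serviceWorker' in navigator && 'SyncManager' in window) {
      navigator.serviceWorker.register('/sw.js')
        .then(() => navigator.serviceWorker.ready)
        .then((reg) => reg.sync.register('resend-finish-times'))
        .catch(() => {
          // No Background Sync support: fall back to resending on the 'online' event.
        });
    }

    // In /sw.js: the sync event fires when connectivity returns,
    // even if the tab was closed in the meantime.
    self.addEventListener('sync', (event) => {
      if (event.tag === 'resend-finish-times') {
        event.waitUntil(resendQueuedTimes()); // hypothetical: replay requests queued in IndexedDB/PouchDB
      }
    });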
I have been trying out Meteor and so far so good! I have read that all HTML and JavaScript files are concatenated when sent to a client, so my question is: how does this scale when the app gets large, say 150+ pages, several of them complex (×3 for read/write/delete), plus all the scripts and business logic that go along with the pages?
Am I right in thinking that the data sent to the client could be substantial?
Is it only sent the first time, or every time the browser is closed and re-opened (I don't mean the 'hot' updates sent while the client is connected)?
It definitely can be a big initial payload, though this is mitigated by (as you said) minification and concatenation, as well as gzipping and caching of the payload, which are both also done.
With that number of pages, you could dynamically generate them from templates, with the content going over the wire after initial page load.
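As a rough illustration of that idea (assuming Blaze, a template named page whose data context includes a slug, and a hypothetical server method pages.get that returns the page content), the client side could look something like this:

    // Client-side sketch: one Blaze template reused for many pages.
    // 'pages.get' is a hypothetical Meteor method returning content for a slug.
    import { Meteor } from 'meteor/meteor';
    import { Template } from 'meteor/templating';
    import { ReactiveVar } from 'meteor/reactive-var';

    Template.page.onCreated(function () {
      this.content = new ReactiveVar(null);
      const slug = Template.currentData().slug; // provided by the route/data context
      Meteor.call('pages.get', slug, (err, content) => {
        if (!err) this.content.set(content);
      });
    });

    Template.page.helpers({
      content() {
        return Template.instance().content.get();
      },
    });

That way the page content itself never becomes part of the initial bundle.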
I have a few Meteor apps in production and the largest initial load for them runs 700kb, including javascript, css, fonts and images. 200k of that is javascript. It opens fast enough and once open, everything feels instantaneous.
That said, it really depends on your site and how it's designed.
I have a website set up in IIS 7 with HTTPS, and every time a user accesses it for the first time the load time is about 15 seconds.
THIS IS NOT the compile/warm-up "problem" described, for instance, here: Slow first page load on asp.net site
I know about that "problem" and I have it too, but that is of course expected and not the issue here.
It is not only when the application loads for the first time after a recycle/start: if I open a second browser and access the site after already doing so in another browser, it takes the same amount of time. So it seems the delay happens every time a session is started. All subsequent requests from the same user/browser are as quick as expected.
This is for an admin interface site and I use ASP.NET Membership, although the delay happens even before the user has logged in, so I'm not sure whether that is the culprit.
I am a bit unsure where to look to kill the delay. I am running session state in-process, with cookies.
Any ideas?
You need to gather a little more information. Enable tracing and track how long each step takes. You could also use Wireshark and look at the traffic between the client and server: if there is a big gap in the traffic, something is hanging at the server's end; if you see constant traffic, perhaps you have too much going on with your landing page. Another simple thing to do would be to enable dynamic caching/compression on the server to speed things up.
Warm it up...
http://learn.iis.net/page.aspx/688/using-the-iis-application-warm-up-module/