Page load time differs between capitalized and lowercase URL - google-analytics

My site has two different URLs, like www.example.com/home and www.example.com/Home. The mixed-case link is generated somewhere on the website, and we are trying to track down where and fix it.
But my question is this: as I am using a CMS, both URLs are live and serve the same content, yet the page load time differs in GTmetrix and YSlow, and Google Analytics also shows a different average load time for each. I would like to understand why there is a load-time difference between the two links even though they load exactly the same content.
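If it turns out the CMS is emitting both casings, one option I am considering is normalizing the case at the server so only one URL stays reachable, then comparing timings again. A minimal sketch, assuming a Node/Express front end (our actual stack may differ):

```javascript
// Hypothetical Express middleware: 301-redirect any mixed-case path to its
// lowercase equivalent so only one URL per page stays reachable.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  const lower = req.path.toLowerCase();
  if (req.path !== lower) {
    // Preserve the query string on the redirect.
    const query = req.url.slice(req.path.length);
    return res.redirect(301, lower + query);
  }
  next();
});

// Placeholder route standing in for the CMS-rendered page.
app.get('/home', (req, res) => res.send('Home page'));

app.listen(3000);
```

A 301 like this also consolidates caching and analytics onto a single URL, which by itself can explain part of the timing difference: one casing may be served from cache or the CDN while the other is a cache miss.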

Related

Lowering Largest Contentful Paint, Over A Full Second After FCP On Mobile

I've spent quite a bit of time on performance optimization for my site, and I'm right at the finish line for getting all-green scores in Search Console for mobile usability and Core Web Vitals. The last outstanding item is getting my LCP under 2.5 seconds on my blog posts.
I'm consistently getting 2.6-2.7 seconds and can't quite figure out why. The LCP element is the <h1> tag on my blog posts, and there are no above-the-fold images.
Example URL With Higher LCP
Let me say that I am using a Beaver Builder website and some marketing tags like GA, GTM, etc., but I've spent a lot of time analyzing my script and style loads to create an optimal experience with preconnects and preloads of various resources. I know it's a big ask to get a site like this down in load time. I'm on Kinsta hosting with low TTFB, full-page caching, and a CDN, and I'm also using the Perfmatters plugin to control various aspects of load time.
I've done everything I can think of to get the LCP down. The <h1> tag seems to be visible almost immediately but may be repainted later, towards the end of the page load, and I can't figure out the cause of this.
Anyone out there feeling generous that could take a look?
Page Speed Insights URL
As far as I can see, PageSpeed is showing the aggregated real-user experience of your entire domain, not of this specific page.
The DOM content load time for your page on mobile is around 3 seconds, which means your LCP has to be greater than that.
You need to reduce your DOM content load time first, prioritizing your network calls so that redundant dependencies are minimized.
Also, on the desktop page you are lazy-loading the first-fold image, which is not considered good user experience. It is likely contributing to the LCP of the desktop version of your page and can also impact your SEO negatively.
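If you want to confirm whether the <h1> is being repainted or out-competed by a later element, you can log every LCP candidate the browser reports. A small diagnostic sketch using the standard PerformanceObserver API (paste it into the console, or into a temporary inline script near the top of the page):

```javascript
// Log each LCP candidate the browser reports, to see whether the <h1>
// is later replaced by a larger paint as the LCP element.
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    console.log(
      'LCP candidate:',
      entry.element,                       // the DOM node currently counted as LCP
      'at', entry.startTime.toFixed(0), 'ms',
      'size', entry.size
    );
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });
```

If a second candidate appears well after the first, whatever paints that element (a late-loading font, an injected banner, a layout shift) is what you need to chase, not the <h1> itself.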

I'm unable to run experiments across subdomains using Universal Analytics

I'm currently running an experiment without redirect, using Google Analytics, but I'm running into some issues.
The case
I work for a company that has two websites, with two separate brands, selling the same product. We are now planning a merge of the brands, one reason being the lower cost of maintenance.
To see how this would affect sales, we are running an A/B test. The test consists of changing the logo of the sites and displaying information about the brand merge in the variant; the original is the website without changes.
We have some requirements to do it:
We use a CMS that has no support for the Google Analytics Experiments tag (we get errors when we install it and are unable to run it).
We need to run it across all pages of our websites. Each site also has a subdomain that the user is redirected to in order to place an order.
We don't have time to wait for the experiment to end by itself, so we came up with the idea of tracking bounces and sales using a duplicate pageview with "/variant" in the URL and in the title.
To do that, I used Content Experiments without redirects, via Google Tag Manager.
Configuration of the Experiment
In Google Tag Manager, I load the Content Experiments JavaScript API and define the choosenVariation variable on all pages of both websites and their subdomains.
I listen for the "gtm.load" event, to know when the page has finished loading all its elements, and then change the DOM in three ways: swapping the logo, adding the content about the merge, and adding an item to the main menu. All of this is done through JavaScript.
Along with the DOM changes, I push a dataLayer event called VirtualPageView and pass the corresponding URL with "/variant" and the title with "Variant".
When the dataLayer event fires, I send a new pageview with the variant information.
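In outline, the GTM tag described above looks roughly like this (the experiment ID and selectors are placeholders for our real values; cxApi comes from loading //www.google-analytics.com/cx/api.js?experiment=EXPERIMENT_ID first):

```javascript
// Custom HTML tag fired on gtm.load; choosenVariation keeps the name used above.
var choosenVariation = cxApi.chooseVariation();

if (choosenVariation === 1) {
  // Variant B: swap the logo and add the merge notice (selectors are placeholders).
  document.querySelector('#logo img').src = '/img/new-brand-logo.png';
  // ...insert the merge banner and the extra menu item here...

  // Duplicate pageview so the variant can be segmented in GA.
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'VirtualPageView',
    virtualUrl: window.location.pathname + '/variant',
    virtualTitle: document.title + ' - Variant'
  });
}
```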
The problem
The experiment runs fine, but when a user who gets the B variant of the experiment proceeds to a subdomain of our websites to place an order, it seems that another test is somehow run, and the user may end up with the A variation.
We are trying to persist the original session and the client ID across the domain and subdomain, so that a user who saw the different logo keeps seeing it on the way to ordering.
I saw this page about Running Experiments across Subdomains, but it's about Classic Analytics and the classic experiments, and we are using Universal Analytics with Content Experiments without redirects.
I don't know if my explanation is clear enough, so if anyone has doubts, please ask. I don't have deep knowledge of Google Analytics or Content Experiments either, so if you have a better way to do this, please tell me.
I came up with a solution to our problem. We agreed to use the experiment only on the pages of the main domain, so I can change the content by other means on the pages of the subdomain:
When a user visits our main domain, I create, through Google Tag Manager, a cookie that stores the result of the variation chosen for the user (0 for the original and 1 for the variation).
When this user goes to our subdomain to place an order, still via GTM, I check the cookie's value. If it equals 1 (the variation), I change the logo and the menu according to our previous configuration and send a virtual pageview to help us check the data.
So far, this is working properly.
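A minimal sketch of that cookie handoff (the cookie name and domain are placeholders for ours):

```javascript
// On the main domain, after the variation has been chosen:
function storeVariation(chosenVariation) {
  // domain=.example.com makes the cookie visible on the order subdomain too.
  document.cookie = 'abVariation=' + chosenVariation +
    '; path=/; domain=.example.com; max-age=' + (60 * 60 * 24 * 30);
}

// On the subdomain, before deciding what to render:
function readVariation() {
  var match = document.cookie.match(/(?:^|;\s*)abVariation=(\d)/);
  return match ? parseInt(match[1], 10) : 0; // default to the original
}

if (readVariation() === 1) {
  // Apply the variant logo/menu changes and fire the virtual pageview here.
}
```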

Issue with (not set) fields inside Google Analytics on various reports

A month ago I noticed an issue with Google Analytics on the website www.cashflowtallinn.ee: a large share of our traffic (over 80%) is reported by Google as (not set).
This is causing various issues, such as:
If we want to see which pages users visit most, the biggest percentage is (not set).
If we want to see default languages, the biggest percentage is (not set).
If we want to see traffic sources, most traffic is reported as Direct, but this is not true, since most of our traffic comes from social networks.
I have tried to resolve the issue:
Installed Google Tag Assistant, but it reports that all is good.
Examined the <head> section and found that the website has several <title> instances; could this cause issues?
Found this from Google Support: https://support.google.com/analytics/answer/2820717?hl=en
Found this, but couldn't find a solution: http://www.lunametrics.com/blog/2015/06/25/11-places-google-analytics-not-set/
Any ideas how to handle this (and issues like this one)?
There is always the possibility of turning off all plugins and switching back to the default theme (since it's a WordPress website), but I would like to test this on the live site and make it work live, so switching off plugins and changing the theme is not really an option.
All best,
I noticed your GA code is outside of the head tag.
I'd consider fixing the location of the GA code (moving it inside the head tag).
You also have the same GA code installed twice (bad practice).
Images attached.
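For reference, the standard Universal Analytics snippet should appear exactly once, inside <head>. A sketch with a placeholder property ID:

```javascript
// Standard analytics.js loader from Google's documentation — include it once,
// in <head>, and remove the duplicate copy. 'UA-XXXXX-Y' is a placeholder.
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');
```

Two copies of the snippet can double-count pageviews, so removing the duplicate matters beyond tidiness.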

Google Experiments A/B Transaction Tracking

I am adding a step to the buying journey on an eCommerce retail website, and I want to test whether this affects overall conversion rates. Our platform doesn't support A/B testing itself, so I'm hoping I can test this using Google Experiments.
A lot of examples for Google Experiments involve having a different page/URL to test with, say for testing a different button style, page layout, or different content. In my case this doesn't apply, because my test is a multi-page test: if I add a step here, how many of those customers end up on my order confirmation page? The business logic that determines whether to add the additional step is handled via JavaScript, so essentially I want half of my visitors to load the old JS file and half to load the new JS file, and then to see how they differ in Google Experiments.
I'm guessing the only way I can do this is to set up an experiment where the original page would be
www.mydomain.com/product_page
and my variation page would be
www.mydomain.com/product_page?some_variable=1
Then in my framework I could check for that URL parameter and load the new JS file, something like the sketch below. But would that track through the whole buying journey once the customer has left that page?
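This is roughly what I have in mind for the parameter check ('some_variable' matches the variation URL above; the file names are made up):

```javascript
// Load the old or new checkout logic depending on the experiment parameter.
var params = new URLSearchParams(window.location.search);
var script = document.createElement('script');
script.src = params.get('some_variable') === '1'
  ? '/js/checkout-with-extra-step.js' // variant: adds the extra step
  : '/js/checkout-original.js';       // original flow
document.head.appendChild(script);
```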
Any help or suggestions would be very welcome.
Thank You
Saul

Web scraping an eCommerce website using a Google Chrome extension

I am trying to scrape an eCommerce website and have looked at all the major possible solutions. The best I found is a web-scraping extension for Google Chrome. I actually want to pull out all the data available on the website.
For example, I am trying to scrape data from the eCommerce site www.bigbasket.com. While trying to create a sitemap, I am stuck at the part where I have to choose an element from a page. The same page of, say, category A contains various products as it is scrolled down, and one category page is further split into page 1, page 2, and for a few categories page 3 and so on.
If I am selecting multiple elements of the same page, say page 1, it's totally fine, but when I try to select an element from page 2 or page 3, the scraper warns that selection of a different type of element is disabled and asks me to enable it via a checkbox; after that I am able to select different elements. But when I run the sitemap and start scraping, the scraper returns null values and no data is pulled out. I don't know how to overcome this problem so that I can build a generalized sitemap and pull the data in one go.
To deter web scraping, various websites now render their content with JavaScript. The website you're targeting (bigbasket.com) also uses JS to render information into various elements. To scrape websites like these you will need to use browser automation such as Selenium instead of traditional methods (like BeautifulSoup in Python).
You will also have to check the various legal aspects of this, and whether the website permits you to crawl this data.
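A minimal sketch of the Selenium approach mentioned above, using its Node bindings (the '.product-name' selector is a guess — inspect the real markup, and you may also need to drive the pagination clicks before collecting):

```javascript
// Requires: npm install selenium-webdriver, plus chromedriver on the PATH.
const { Builder, By, until } = require('selenium-webdriver');

(async function scrape() {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get('https://www.bigbasket.com/');
    // Wait until the JS-rendered product elements actually exist in the DOM.
    await driver.wait(until.elementsLocated(By.css('.product-name')), 10000);
    const products = await driver.findElements(By.css('.product-name'));
    for (const product of products) {
      console.log(await product.getText());
    }
  } finally {
    await driver.quit();
  }
})();
```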
