Different pagespeed scores for two domains of the same website - pagespeed

We have two domains serving the same website (e.g. test.it and test.com).
When we run PageSpeed on the same page under each domain, we get different results.
I know the numbers can vary between runs, but the .it domain shows roughly double the TBT and LCP of the .com version on the same page.
Do you know what might be the cause?
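To separate a real difference from run-to-run noise, one option is to script several runs against the PageSpeed Insights v5 API with identical settings for both domains and compare the medians. A minimal sketch (the API key is a placeholder; the domains are the example ones from the question):

```javascript
// PageSpeed Insights v5 API endpoint (documented by Google).
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

// Build an API request URL for one page; 'YOUR_API_KEY' is a placeholder.
function psiRequestUrl(pageUrl, strategy = 'mobile', apiKey = 'YOUR_API_KEY') {
  const params = new URLSearchParams({ url: pageUrl, strategy, key: apiKey });
  return `${PSI_ENDPOINT}?${params}`;
}

// Matching requests for the same page on both domains:
const itRun = psiRequestUrl('https://test.it/page');
const comRun = psiRequestUrl('https://test.com/page');
```

Fetching each URL repeatedly and reading `lighthouseResult.audits` from the JSON response gives comparable TBT/LCP figures under the same lab conditions.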

Related

Does Google Optimize work inside an iframe hosted on an external domain?

I'm running an A/B test with Google Optimize on one of my pages.
This page is shown on multiple websites on different domains through an iframe.
So the domain of my page is different from the domain of the website hosting the iframe.
I have no control over the website hosting the iframe.
The problem is that I'm not seeing any data collected from the experiment in Google Optimize.
I suspect that iframes and Google Optimize don't work smoothly together.
I can't find a clear answer in the documentation or in other questions online.
Do you know if it is possible to run an A/B test inside an iframe hosted on a different domain?
If Google Optimize is the problem, do you know of other tools that work properly in this situation?
I think it is a problem with the _gaexp cookie's "SameSite" and "Secure" settings, because your iframe is on a different domain.
In these threads you can see similar problems with iframes:
Possibility to set cookie flags to _gaexp
https://support.google.com/optimize/thread/75547539?hl=en
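A sketch of the usual fix, assuming the embedded page loads analytics via gtag.js: pass `cookie_flags` so the _gaexp (and other analytics) cookies are written with `SameSite=None; Secure` and therefore survive inside a cross-site iframe. The measurement ID is a placeholder:

```javascript
// Assumed gtag.js setup; 'G-XXXXXXXXXX' is a placeholder measurement ID.
// `cookie_flags` asks analytics to append these attributes to the cookies
// it sets (including _gaexp), which third-party iframe contexts require.
const cookieConfig = { cookie_flags: 'SameSite=None;Secure' };

// In the embedded page, after loading gtag.js:
//   gtag('config', 'G-XXXXXXXXXX', cookieConfig);
```

Note that the iframe'd page must also be served over HTTPS, since browsers refuse `Secure` cookies on plain HTTP.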

UTM tracking across sub-domains

I have a main website (e.g. mybrand.com) with static pages, mostly built on Wix.com, and a full application hosted on AWS on a subdomain, e.g. app.mybrand.com.
For tracking the traffic coming from different social media channels, we are building UTMs. My understanding is that UTM tracking doesn't survive when you hop between subdomains. Can you please suggest some clever options?
One option for us is to redo the Wix website in WordPress and host it ourselves on AWS next to our web app, avoiding the domain hopping entirely. But a more elegant solution that keeps the current website would be preferred.
You can use a simple parameter in the query string (e.g. subdomain.com/page?from=domain1), then in Analytics count the unique pageviews with that parameter in the URL.
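The querystring approach above can be sketched as a small helper that decorates every link leaving the main site for the app subdomain; the function name and domains are illustrative:

```javascript
// Append a `from` parameter to an outgoing link so the traffic source
// survives the hop to the app subdomain. Domains are illustrative.
function tagCrossDomainLink(href, source) {
  const url = new URL(href);
  url.searchParams.set('from', source);
  return url.toString();
}

// Example: rewrite a link from the Wix site to the AWS-hosted app.
const tagged = tagCrossDomainLink('https://app.mybrand.com/start', 'mybrand.com');
```

On the receiving side, a segment or filter on pages containing `from=` then gives per-source counts in Analytics.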

google analytics report for desktop and mobile site

Is there a way to create something like a pie chart or report that distinguishes who has been on the m. or www. site of a domain?
Both sites use the same GA code.
I looked at the cross-domain tracking option and added a new filter as suggested, but was unable to apply it to any reports on the GA account. Plus it only shows me the page list, etc.
https://support.google.com/analytics/answer/1034342?hl=en
I want to know if I can just get a number for how many visitors were on m.mysite.com and www.mysite.com during a date range.
Any suggestions welcome.

Crawl errors 404 for multiple unknown /contact.asp urls

I have a WordPress site which has been flagged by Google Webmaster Tools because it has a huge number of 404s such as mysite/Contact.asp?7aW8/27hJSK.html, all with different strings of numbers/letters. Does anyone know what is generating these? Is it a hack (if so, no apparent harm done)?
Thank you

Does automatic redirection/geo-location have an impact on my SEO? - Detect if it's a spider that is accessing the site

I have a site whose search ranking has plummeted. It should be quite SEO-friendly, because it's built using XHTML/CSS and has been run against the SEO toolkit.
The only things I can think of that may be annoying Google are:
The keywords are the same across the whole site rather than being page-specific (can't see why this would be a massive deal).
Another URL has been set up that simply points to my site, without redirecting (again, no big deal).
Non-UK users are automatically forwarded to the US version of the site, which is a different brand. I guess this could be the problem: if Google spiders my site from the US, it will never see the UK version.
So the question is: does geo-redirection affect my SEO? And is it possible to detect whether whoever is accessing the site is actually a search-engine spider? In that case I don't want to do any geo-location.
Do not use the same keywords on the entire site; try to use specific keywords per page.
Do not let several URLs point directly at the same site, since the inlinks from the different domains will then be treated as belonging to different domains. If you point the URLs via a redirect, all inlinks are credited to the target domain and thus increase its "inlink score".
To detect whether a request is from a crawler, use the browsercaps project: http://owenbrady.net/browsercaps/
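browsercaps is an ASP.NET-oriented resource; as a rough language-agnostic illustration, the same idea is a user-agent check before applying the geo-redirect. The pattern list below is illustrative, not exhaustive, and real bots can be verified further (e.g. via reverse DNS):

```javascript
// Illustrative (not exhaustive) list of major search-engine crawler tokens.
const CRAWLER_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider|yandexbot/i;

// Return true when the user-agent string looks like a known crawler,
// in which case the geo-redirect should be skipped.
function isKnownCrawler(userAgent) {
  return CRAWLER_PATTERN.test(userAgent || '');
}
```

Serving crawlers a different flow than users carries cloaking risk, so the safer pattern is to skip the automatic redirect for everyone and offer a visible country-switch link instead.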
