Essentially, I'm concerned that a single user can be counted twice. Is there a best practice for this? I've tried googling, but I'm not sure whether I'm just not asking the right question with the right words. The platform is Sitecore.
Using the same property to track AMP and non-AMP pages will result in multiple users. See here for Google's recommendation.
It looks like you can use the Google AMP Client ID API to work around this, though.
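For reference, the opt-in takes two pieces. This is only a sketch (UA-XXXXX-Y is a placeholder): on each AMP page you add Google's opt-in meta tag, and on non-AMP pages you pass the useAmpClientId flag when creating the analytics.js tracker.

    // Sketch of the AMP Client ID API opt-in; UA-XXXXX-Y is a placeholder.
    // 1) On each AMP page, opt in with this meta tag in <head>:
    //    <meta name="amp-google-client-id-api" content="googleanalytics">
    // 2) On non-AMP pages, pass useAmpClientId when creating the tracker:
    declare function ga(...args: unknown[]): void; // provided by analytics.js

    ga('create', 'UA-XXXXX-Y', 'auto', { useAmpClientId: true });
    ga('send', 'pageview');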
I want to track particular links on my site to see where they come from. For example, I want to know which links on my navigation are being clicked, so if something is not being clicked I could potentially remove it.
I have been using UTMs, which is super easy, but it results in skewed analytics data.
I looked into Google Tag Manager, but I don't want to slow down my website. I can change the site easily, so I'm not sure if this is the best solution.
I found an article dated 2008 that says I can do this:
https://www.example.com/?from=topnav
Is that still valid? Is there a better way? I can't seem to find any information on this, and I assume somebody else must want this information too.
Thank you.
I have been using UTMs, which is super easy, but it results in skewed analytics data.
UTM codes are meant to track inbound traffic. Don't use them on internal navigation: clicking a UTM-tagged internal link starts a new session and overwrites the visitor's original acquisition source, which is exactly why your data looks skewed.
I looked into Google Tag Manager, but I don't want to slow down my website.
GTM loads asynchronously, just like GA, so performance-wise they are equivalent.
I found an article dated 2008 that says I can do this:
https://www.example.com/?from=topnav
By default GA will not track link clicks. You can indeed add parameters to URLs and then use those to build custom reports and see which links are being clicked.
Since what you're trying to do is a custom implementation, you won't find a single best answer; it's up to you to implement something that fits your needs. These are some examples, with a short sketch after the links:
https://analytical42.com/2017/track-internal-links-google-analytics-gtm/
https://www.gravitatedesign.com/blog/can-google-analytics-track-link-clicks/
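If you'd rather not append query parameters (they split one page into many page paths in your reports), a common pattern is to fire a GA event when a nav link is clicked. A minimal sketch with analytics.js, assuming your navigation links sit under an element with id="topnav" (that id is an assumption, not something from your site):

    declare function ga(...args: unknown[]): void; // provided by analytics.js

    // Fire a GA event for every click on a top-nav link.
    // 'Navigation' / 'click' / href is an arbitrary category/action/label scheme.
    document.querySelectorAll<HTMLAnchorElement>('#topnav a').forEach((link) => {
      link.addEventListener('click', () => {
        ga('send', 'event', 'Navigation', 'click', link.href);
      });
    });

Events show up under Behavior > Events, so your page-path reports stay clean.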
I'm currently using WooCommerce, and by chance, while looking through Google Search Console, I noticed that well over 15,000 pages had been indexed. This instantly raised a concern, because I know I should have no more than 400 actual pages.
After looking into it, I noticed that absolutely every possible parameter (variations, grid styles, shipping methods, etc.) is being indexed, turning what should be 400 pages into 15,000 variations.
Does this have an effect on Google ranking, showing 15,000 pages when really there are only 400?
I cannot find a single resource that explains whether Google indexing so many variations has a positive or negative impact on rankings.
Finally, how do I prevent Google or any other search engine from indexing URLs with parameters? I've seen advice to use robots.txt, but no recommendation of which standard WooCommerce filters should be excluded, or whether that's a bad idea.
I have a feeling I am losing a lot of link juice by having so many indexed pages with parameters.
WordPress by default assumes that you want to share everything, because why not? If you have 15,000 indexable items, you'd presumably be more upset if your site decided on a whim to hide parts of them without you knowing. It is understandable that this is confusing, though, and quite rightly it is always better to have control over which parts of your site get seen by search engines.
Anyway, there are plugins which allow you to customise which parts of a website are visible or hidden to search engines by providing a mechanism for generating an XML sitemap (an index of which pages should be crawlable). A very common plugin I see people use is Yoast SEO. Another one I have had some success with is Google XML Sitemaps; you may find another that works better for you, but that should give you an idea of what you need.
Also, if you are not being visited often by Googlebot, take the sitemap from one of those plugins and submit it via Google Search Console to help Google better understand your site.
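On the robots.txt question: you can stop crawlers from fetching the common WooCommerce filter and sort parameters, though keep in mind that robots.txt blocks crawling rather than indexing, so already-indexed URLs can linger (a noindex meta tag or Search Console's URL Parameters tool covers those). This is a sketch only; the exact parameter names depend on your theme and plugins, so verify them against your own URLs:

    User-agent: *
    # Common WooCommerce query parameters (verify against your own URLs)
    Disallow: /*?orderby=
    Disallow: /*?filter_
    Disallow: /*?add-to-cart=
    Disallow: /*?min_price=
    Disallow: /*?max_price=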
So I've been working on a website for a while. The GA account has been up for a couple of months, but I waited for the website to be finished before putting up the actual JS tag.
In the meantime, the website is HTTP password restricted (basic authentication), so it isn't even accessible unless you know the username/password combination.
To my surprise, I realized today that GA has logged several hundred views to the root of my website. Paths are mostly things like:
/
/?from=http://social-widget.xyz/
/?from=http://www.traffic2cash.xyz/
Bounce% and exit% both at 100% for all of them.
I realize this looks like referral spam, and that there are ways to prevent it. I came across this while googling:
http://botcrawl.com/block-social-widget-xyz-referral-spam-in-google-analytics/
My question is: how can GA log anything anyway when no tag is up and the website isn't even accessible?
Thank you very much in advance
Because it's spam. They hit Google Analytics directly with random GA codes and don't even go through your website.
GA can't tell whether these are real hits (from website visits) or fake hits (from spam bots who hit GA directly, calling the same code as they would if they were on the website). Though arguably they should do more about this.
Massively annoying - particularly when you're first starting out, as this can be a heavy proportion of your "traffic".
It's easy to set up a filter rule to catch a lot of this by filtering on hostname. As the spammers hit GA at random and don't even know which website they are hitting GA for, they don't usually set the hostname correctly. Real traffic should only come from yourwebsitedomain.com, so add a filter for that.
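For example (assuming your site is yourwebsitedomain.com, including the www variant), a custom Include filter on the Hostname field with a pattern like this drops any hit whose hostname was never set to your domain:

    ^(www\.)?yourwebsitedomain\.com$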
STRONG piece of advice: abandon the default UA-########-1 tracking code of your new website -- simply do not use it!
Create a second and third property on the Admin screen, then use the tracking code for the third property. You will immediately see a lot less spam. No filters or segments necessary!
If you want the whole sad story about spam visits in GA, I have been maintaining the Definitive Guide article for over a year now:
http://help.analyticsedge.com/spam-filter/definitive-guide-to-removing-google-analytics-spam/
What people are doing is basically taking the UA-XXXXXX code that you normally get with Analytics and generating calls against it. This is skewing my analytics stats. On top of that, in Google Webmaster Tools, it's also causing this:
It looks like these pages, with my code (or at least with the generated code) on them, are making Google Webmaster Tools think I have lots of 404s. This can't possibly be good for my rankings.
Anyone know if there is anything you can do to stop this?
Try making an async call from your server side using cURL. That way you will never expose your GA code.
I have not implemented it, but it should work in theory.
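To flesh out that idea: GA's Measurement Protocol accepts raw hits over HTTP, so your server can forward pageviews itself and the property ID never appears in your HTML. A sketch in Node/TypeScript (the UA code is a placeholder, and in practice you would persist the client ID per visitor, e.g. in a first-party cookie):

    // Sketch: send a pageview server-side via the GA Measurement Protocol,
    // so the UA-XXXXX-Y property ID never ships to the browser.
    import { randomUUID } from 'node:crypto';

    async function trackPageview(path: string, clientId: string): Promise<void> {
      const params = new URLSearchParams({
        v: '1',             // Measurement Protocol version
        tid: 'UA-XXXXX-Y',  // placeholder property ID, kept server-side
        cid: clientId,      // anonymous client ID (persist per visitor)
        t: 'pageview',
        dp: path,           // document path, e.g. '/pricing'
      });
      await fetch('https://www.google-analytics.com/collect', {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: params.toString(),
      });
    }

    // Example: call this from a request handler with a per-visitor ID.
    trackPageview('/pricing', randomUUID()).catch(console.error);

The trade-off is that you lose the client-side signals GA normally collects (screen size, page timings), and session stitching becomes your problem.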
Since you can filter by custom dimensions, you can set a "token" in a custom dimension on every page and, in your view settings, filter out any traffic that does not include the token.
Obviously this will not help against people who use the code from your website (unless you also implement shahmanthan9s suggestion, which is a lot of work but will give you cleaner data), but it will work against drive-by shooters who randomly select UA IDs to send data to (which is the situation you refer to in your comment).
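A sketch of the token idea with analytics.js (dimension1, the UA code, and the token value are all placeholders; the dimension must first be created in the GA admin, and the matching Include filter goes on your view):

    declare function ga(...args: unknown[]): void; // provided by analytics.js

    ga('create', 'UA-XXXXX-Y', 'auto');
    // Attach a secret token to every hit; a view-level Include filter
    // on this custom dimension then drops any hit that lacks it.
    ga('set', 'dimension1', 'my-secret-token'); // placeholder token value
    ga('send', 'pageview');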
I have a website with following domain and folder structure:
Main Website: www.ry.com
Subdomain1: mobile.ry.com
Subdomain2: speed.ry.com
Directory1: www.ry.com/mobile
Directory2: www.ry.com/blog
I have just started setting up Google Analytics for this, and I am totally confused as to which best practices should be followed. Should I consider them individual properties or just one property? Should I be setting up a different GA code for each one of these?
Ideally, I would like to track all of it in one place, but at the same time be able to use filters to see the traffic on any one of the subdomains/subdirectories.
I started reading up on Universal Analytics but got totally confused, and some of the posts were outdated, as GA and UA seem to have changed significantly in recent times.
Please advise me on how to set this up, or point me to any good blogs or URLs that are a rich resource.
This is one of the most complete answers I've found for this question.
http://moz.com/blog/cross-domain-subdomain-tracking-in-google-analytics
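In short, the pattern that article describes for a setup like yours: one property for ry.com and all of its subdomains, with the _ga cookie set on the top-level domain so the same client ID is shared everywhere, and separate filtered views per subdomain or directory. A sketch (the UA code is a placeholder):

    declare function ga(...args: unknown[]): void; // provided by analytics.js

    // One property covers www.ry.com, mobile.ry.com, and speed.ry.com:
    // 'auto' writes the _ga cookie on the highest-level domain (ry.com),
    // so one client ID is shared across all subdomains.
    ga('create', 'UA-XXXXX-Y', 'auto');
    ga('send', 'pageview');

Directories like /mobile and /blog need no extra tagging; create separate views with filters on hostname or request path to see each slice on its own.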