I have the following URL for my website: http://domain.com/subdirectory/. I have set up goals in 3 different ways and none seems to work.
1- /subdirectory/goal.php
2- /goal.php
3- goal.php
Note that the URL in the property settings for my website is just domain.com, not the full URL. I'm on shared hosting, so the site I want to track lives under http://domain.com/subdirectory/.
Google actually did a fairly good job explaining this exact problem here:
https://support.google.com/analytics/answer/1033158?hl=en
The most likely cause is "The Goal page is not tagged with the tracking code".
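In practice, the first thing to check is that the goal page itself contains the Analytics snippet. A minimal sketch using the classic ga.js async code (UA-XXXXX-Y is a placeholder for your own property ID):

```html
<!-- Place just before </head> on /subdirectory/goal.php -->
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-Y']); // placeholder property ID
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

If the snippet is present, the goal URL should then match the request path as Analytics records it, which is typically /subdirectory/goal.php (option 1 above).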
Not sure if this is the right forum to post a question like this. My site at https://www.usahazmat.com is set up in Analytics using www, and Search Console is set up with a connection to my Analytics account. I assumed that when my site is indexed, it's indexed using the www version of my domain.
When I use the site command site:www.usahazmat.com, I only see about 1600 pages, but when I do site:usahazmat.com (nonwww), I see 6500 pages.
I have two questions:
Why is Search Console indexing the non-www version when everything is set up for www?
With Google indexing the non-www version, is this hurting my ranking? And if so, what would be the best route to fix it?
This covers a few areas.
What you specify in Google Analytics has no effect on ranking. What you see in Google Analytics is based on where you place your tracking code.
What you register with Google Search Console has no effect on ranking. The way you register controls what you see. So if you register the www version, you only see data on that.
It looks like you redirect users to the www version of your site (good). Use that for everything.
Did you change from non-www to www? That could explain the site: search change. But you will find better data if you look at your site in Google Search Console.
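If the redirect isn't already enforced at the server level, a common approach on Apache hosting can be sketched like this (assumes mod_rewrite is enabled and `.htaccess` overrides are allowed; adapt the domain to your own setup):

```apacheconf
# .htaccess - 301 redirect non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^usahazmat\.com$ [NC]
RewriteRule ^(.*)$ https://www.usahazmat.com/$1 [R=301,L]
```

A permanent (301) redirect tells Google which version is canonical, which over time consolidates the indexed pages under the www hostname.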
I'm a new WordPress designer. My site runs Tesseract Theme and is built with Beaver Builder.
PROBLEM: When I posted my website (https://louiseclark.tech) on Facebook, it removed the post after a couple of minutes. Now when I try to post my site it gives me this message: "It looks like a link you're sharing might be unsafe. If you can, please remove this link: louiseclark.tech. Note: The unsafe link might be on the page you're linking to."
What I've done to try and resolve:
When I ran my site through the Facebook debugger I got this message:
The 'fb:app_id' property should be explicitly provided, Specify the app ID so that stories shared to Facebook will be properly attributed to the app. Alternatively, app_id can be set in url when open the share dialog.
I created an app id following this instructional video: https://www.youtube.com/watch?v=V97h03H21y0
I pasted my app id into my Yoast SEO plugin under the Facebook category.
Checked my Google Webmaster Tools sitemap: all is verified and the sitemap is set.
SSL certificate is set; checked with my hosting company SiteGround. When I asked them about this problem, they didn't feel that the security issues were from their side.
I've reported this problem to the black hole that is Facebook support.
Thank you for any insight.
In case anyone sees this thread, I found the solution.
When I moved my WordPress sites to managed WordPress hosting I also migrated my websites to https with the SSL certificates. While the pages were migrated and displaying the https just fine, the images still held their old url (http).
I did two things:
I installed SSL Content Fixer plugin. This worked for some images but not others.
I installed the Better Search Replace plugin. I had found the specific insecure images using Firefox. From my page in Firefox, I went to:
Tools -> Page Info -> Media. This showed me every image/JS/CSS call on the page. Finding these images allowed me to use the plugin to make the changes.
It worked. I'm quite sure knowing how to code my site would be much better in this situation. But I'm a newbie and this is what I could come up with.
What I learned: it's a red flag when a secure site embeds non-secure objects/images.
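For readers comfortable with the database, the same http-to-https cleanup the plugin performs can be sketched directly in SQL. This assumes the default wp_ table prefix and that none of the URLs live inside serialized data, which a plain string REPLACE would corrupt (the plugins handle serialization safely; always back up first):

```sql
-- Rewrite insecure image/asset URLs inside post content (placeholder domain)
UPDATE wp_posts
SET post_content = REPLACE(post_content,
                           'http://example.com/',
                           'https://example.com/');
```

The same pattern can be applied to other tables (wp_postmeta, wp_options) where absolute URLs are stored.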
Two days ago we posted a new blog on a site with the aim of being picked up for the search term "live comedy in chippenham". It’s been indexed by Google and we’re now 2nd in the results for the search query. The bad news is that for some reason the post has been indexed as a https URL so all browsers give a warning when the link is clicked.
Firefox gives this error:
The owner of www.neeld.co.uk has configured their website improperly. To protect your information from being stolen, Firefox has not connected to this website.
The host has confirmed that it's not a server config error and we have other posts and pages on the site that are being indexed correctly. We're using WordPress and the Yoast plugin. I can't see anywhere in Webmaster Tools that could be causing the problem.
Can anyone offer any advice please? If you search Google for "live comedy in chippenham" you'll see the issue (it's the link https://www.neeld.co.uk/live-comedy-in-chippenham/).
It's a really strange one but something I've experienced before.
It has most likely been caused by an external link to the page using the https protocol, which Google followed before indexing the page. Google is very keen to index https pages at the moment, so we might start seeing this kind of issue more often.
There's not a lot you can do other than wait for Google to realise their mistake and list the correct URL in the SERPS. You can help speed this along with a canonical link (which I can see is there), XML sitemap (which you've got) and a server level redirect of https to http.
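The server-level redirect mentioned above can be sketched in `.htaccess` on Apache (assumes mod_rewrite; a temporary 302 is used here since the site may well move to https properly later, at which point this rule should be removed):

```apacheconf
# .htaccess - send https requests back to the canonical http URL
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.neeld.co.uk/$1 [R=302,L]
```

Combined with the canonical link and XML sitemap already in place, this gives Google a consistent signal about which protocol version to list.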
Do not try to remove the page in Webmaster Tools as this won't have the desired effect and will stop Google reindexing the page properly.
Hope this helps.
For my WordPress.org site I use the Google Sitemap Generator plugin by Arne B. While working on localhost I activated the plugin, and it works.
I usually update my website on localhost and then upload the database to my web host. So now I am wondering: will Google search results include both of the URLs below? I'm asking because I am afraid Google will consider this duplicate content.
http://127.0.0.1/beef-recipe-1/
http://www.actual-website.com/beef-recipe-1/
Google can't access your localhost (127.0.0.1), so it will most likely ignore those URLs.
If you are worried about this, the best thing you can do is delete the previous sitemaps and generate a new one while your site is online. Then, in Google Webmaster Tools, resubmit the sitemap if necessary and request a crawl of your site's main domain (e.g. mydomainname.com) so Google can crawl all the links associated with the homepage.
This way you will not lose rankings on Google, and it may even help your website.
Cheers!
I want to track traffic for mysite.com/current-campaign/ and care less about traffic on mysite.com in general.
Is it ok to place the GA tracking code in the files inside the /current-campaign/ folder or does it HAVE TO be in the root of the server for tracking to work?
GA will only track on the pages you actually put the tracking code on, regardless of where the page is located (unless you start messing with things like domain settings or filters etc..).
So IOW yes, it is okay to do that. If you don't have tracking code on mysite.com/somePage.html then it's not gonna track that page (though it might show up as the URL in some reports like referring URL or exit link or whatever, same as any other page you don't track)
In Google Analytics, you can add a filter to the profile and filter all but the chosen directories. Go to Analytics Settings > Profile Settings and look for "Add Filter" link.
In addition to Crayon's answer, you can limit tracking to a subdirectory by using the _setCookiePath() function in your tracking code. See the Analytics documentation on tracking a single subdirectory (note the link anchor does not resolve to the correct header, at least for me).
This is advised in the documentation to use when you only want to track a subdirectory and avoid clashes with Analytics trackers possibly in use in other subdirectories.
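Under the classic ga.js syntax that documentation describes, restricting the cookie path looks roughly like this (UA-XXXXX-Y is a placeholder property ID; the subdirectory path is taken from the question above):

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-Y']);            // placeholder property ID
  _gaq.push(['_setCookiePath', '/current-campaign/']); // scope cookies to the subdirectory
  _gaq.push(['_trackPageview']);
</script>
```

With the cookie path scoped this way, the tracker's cookies are only sent for pages under /current-campaign/, so they won't collide with any other tracker running elsewhere on the same domain.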
I work for a department in a large university.
The department's web page resides at www.some-uni.com/department-name/.
I only have FTP access to the sub-folder /department-name/ and nothing else on the site.
It was quite easy to get Google Analytics to track traffic within the subfolder /department-name/, ignoring the rest of the site. All I did was create a profile in GA, setting the default url to www.some-uni.com/department-name/. I then pasted the tracking code into the pages I wished to track.
It took about eight hours for anything to show up in GA, but after that it worked just fine.