Hey, I'm not sure why this is happening. GA seems to be randomly appending equals signs to some URLs. They are not present this way on the live site.
I made a fresh view with no filters or other configurations and it still occurs. I've never seen this before and it's very annoying. Any idea what's causing this and/or how to fix it? Any idea where to start looking?
Thanks in advance.
If you go into Admin > [your view] > View Settings you can choose to Exclude URL Query Parameters to address this.
When I add a website (URL) to a user in WordPress, it automatically gets "http://" added to the start of the URL. Is there a way to stop this from happening? It is causing other plugins to not function properly, as I call on the user_url but need the http:// to not be present.
Edit: I have tried editing user-edit.php to change the "Website:" field input type to text instead of url, but to no avail.
Thanks, Nick
What plugins are causing issues with that? In my opinion it usually works perfectly and shouldn't cause any issues.
But there is one trick that MIGHT work in this scenario: using protocol-relative URLs.
So instead of putting the user website as "www.yahoo.com", try putting "//www.yahoo.com" and see if it works for you.
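If the protocol-relative trick doesn't pan out, another workaround (not the answer above, just a hypothetical sketch) is to strip the scheme yourself when you read the value back out in PHP:

    // Read the stored website and strip the scheme on output.
    // $user_id here is whichever user you're dealing with.
    $url  = get_the_author_meta( 'user_url', $user_id ); // e.g. "http://www.yahoo.com"
    $bare = preg_replace( '#^https?://#i', '', $url );   // "www.yahoo.com"

Then pass $bare to whatever plugin needs the scheme-less value.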
Hi everyone,
I want to carry my ?ref=*** URL variable through my menus in a WordPress site.
I will add this variable to the URLs manually and give them to the users.
It would be good if the ref variable didn't get lost while they navigate during a browser session.
How could this be done?
Thank you, I am looking forward to the answer!
BR
Maybe this will point you in the right direction, based on the comments.
Wordpress - Update ALL menu links in header with querystring value
Another solution, which is less complex but also less advisable, is to use JavaScript to update all anchors on the page with the query string key and value.
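For what it's worth, here is a rough PHP sketch of the server-side approach, assuming the parameter is literally called ref; the filter and helper functions are standard WordPress, the rest is illustrative:

    // Append the visitor's ?ref=... value to every nav menu link.
    add_filter( 'nav_menu_link_attributes', function ( $atts ) {
        if ( isset( $_GET['ref'] ) && ! empty( $atts['href'] ) ) {
            $ref          = sanitize_text_field( wp_unslash( $_GET['ref'] ) );
            $atts['href'] = add_query_arg( 'ref', rawurlencode( $ref ), $atts['href'] );
        }
        return $atts;
    } );

Drop it in the theme's functions.php (or a small plugin) and the menu links will carry the parameter forward on every page view where it is present.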
I have one page that returns a 404 error and it is just mind-boggling why this is happening. Please see this page: http://www.cra63.com/eventos/
It's the first link, 50 Aniversario 2013.
All other links work. But, not this one. Crazy. Is it a cache issue of some sort? I don't have a cache plugin installed, so I can't think of anything else.
When in the admin panel, the preview button loads the proper page with no problem. I have looked at the URL 50 times and it seems to be correct.
Permalinks are set to /%postname%/.
I'm not a novice although this apparently silly question makes me feel like I am.
Please help. Thanks!
Please duplicate the page and update the URL accordingly (after you rename it).
You may also enable WP_DEBUG - see the Codex article on it.
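If you go that route, a minimal sketch of the usual wp-config.php lines (place them above the "That's all, stop editing!" comment):

    define( 'WP_DEBUG', true );          // turn debug mode on
    define( 'WP_DEBUG_LOG', true );      // log notices to wp-content/debug.log
    define( 'WP_DEBUG_DISPLAY', false ); // keep notices off the front end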
One more idea: consider installing an optimization plugin and running it often. A broken link checker is powerful too.
You never know what the problem might be! Hopefully it doesn't happen again...
NOTE - This was resolved by simply renaming the permalink from:
50-aniversario-2013
to ...
50th-aniversario-2013
Using a different name, aniversario-2013, didn't fix it. So all I can suggest is to rename the URL/permalink. But this is definitely not a fix in my book. Call it a bug!
I have a website based on WordPress, and recently Google has been showing extended results (sitelinks) for this site (I hope you know what I mean by extended results).
So there are the pages which should be there, like "contact", "services" or "info", but there's also one which shouldn't appear in the results: "domain.xx/xmlrpc.php?rsd". It shows in the results as "wordpress http://...", which is pretty ugly.
I tried to "demote" (sorry, I don't know the exact translation for this in English) the page in Google Webmaster Tools about one week ago, but there's been no effect. Actually, since then it appears as the second link in the results; before, it was the last one.
Any hints on how to get rid of this?
First (and recommended) way: In Google Webmaster Tools, go to Configuration -> Sitelinks and demote your xmlrpc URL.
Second way: Go to Optimization -> Remove URLs and enter your xmlrpc.php URL there. It will take some time before the URL is removed.
Third way: Block Google from accessing the xmlrpc URL with the help of a robots.txt file (see the sketch below).
Fourth way: Block Google from accessing the xmlrpc URL with the help of an .htaccess file (see the sketch below).
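Rough sketches for the third and fourth ways (these are the usual patterns, adjust to your server). robots.txt:

    User-agent: *
    Disallow: /xmlrpc.php

.htaccess (classic Apache 2.2 syntax; on Apache 2.4 use "Require all denied" instead):

    <Files xmlrpc.php>
      Order Deny,Allow
      Deny from all
    </Files>

Keep in mind that a robots.txt disallow only stops crawling; an already-indexed URL can still show up in results, so combine it with the demotion or removal steps above.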
First off, I saw similar posts already, but they weren't exactly what I am asking.
I used the Facebook Dev site to create a Like button for my website, stuck the code in, and the button showed up. The only issue is that it likes the wrong URL when I click the button.
I'm pretty sure the issue is that I have it set to redirect automatically from mydomain.com to the most recent post. I think this is gumming up the works with the like button and causing it to like mydomain.com/mostrecentpost instead of simply liking mydomain.com.
Is there a way to correct this issue without having to get rid of the redirect (because that isn't an option)?
Sorry if that was a little wordy, wanted to make sure I explained the issue fully.
Is there a way to correct this issue without having to get rid of the redirect (because that isn't an option)?
Either don’t redirect in those cases where the user agent header of the request points to it being Facebook’s scraper;
or set the canonical URL of http://example.com/mostrecentpost to just http://example.com/ using the appropriate Open Graph meta tag. (Although that would mean you would not be able to like a single post any more, because all of your posts would just point to your base domain as the “real” website; so the first is probably the better option.)
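A rough PHP sketch of the first option, assuming the redirect lives in your theme; the hook and functions are standard WordPress, but the redirect logic is just a stand-in for however yours actually works:

    // Skip the front-page redirect when Facebook's scraper is the visitor,
    // so the Like button resolves to the bare domain rather than the latest post.
    add_action( 'template_redirect', function () {
        $ua = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
        if ( false !== stripos( $ua, 'facebookexternalhit' ) ) {
            return; // let Facebook scrape the front page as-is
        }
        // Stand-in for the existing "front page -> most recent post" redirect.
        if ( is_front_page() ) {
            $recent = wp_get_recent_posts( array( 'numberposts' => 1 ) );
            if ( ! empty( $recent ) ) {
                wp_safe_redirect( get_permalink( $recent[0]['ID'] ) );
                exit;
            }
        }
    } );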