I don't know what I did, but suddenly every page on my site has a duplicate with the suffix /google.com/+cormilubr (which is my website's Google+ page).
I ran Screaming Frog and it reports over 50 404 errors for these duplicate URLs. The same pages also exist without the suffix and return code 200.
My website is http://cormilu.com.br
This is one of the duplicate URLs that have appeared:
http://cormilu.com.br/google.com/+CormiluBr
Does anybody have an idea how this could have come about, so I can resolve it? I've deactivated all plugins, so it doesn't have anything to do with them. Pointing me in the right direction would be awesome!
Thanks a lot for your help.
Best regards,
Amir
In the footer of your website you have a list of all your social networks. In the source of the Google+ link, you have entered something like
google.com/+CormiluBr
Replace this with the full URL (including http://):
http://google.com/+CormiluBr
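The underlying cause is that a scheme-less href is a relative URL: browsers and crawlers resolve it against the current page, which is exactly how /google.com/+CormiluBr ends up appended to every page. A quick demonstration of the resolution rule (the page path here is just an example):

```python
from urllib.parse import urljoin

# A scheme-less href is resolved relative to the current page:
broken = urljoin("http://cormilu.com.br/some-page/", "google.com/+CormiluBr")
print(broken)  # http://cormilu.com.br/some-page/google.com/+CormiluBr

# With an explicit scheme, the link points where intended:
fixed = urljoin("http://cormilu.com.br/some-page/", "http://google.com/+CormiluBr")
print(fixed)   # http://google.com/+CormiluBr
```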
I've got a problem. I use Yoast for my SEO, but the description Google now shows for my site is the text from my website's footer, not the text I wrote in Yoast.
How can I fix this?
Thank you
This is (probably) not a problem with your plugin. Google uses crawlers to index every page. What probably happened is that Google indexed your page before you were using Yoast, which means it indexed the text from your footer and uses that as the description of your website.
How can you fix this?
You can simply wait until Google re-indexes your page, but this might take a while.
You can also try to get Google to re-index your page. This question has covered that already: How to request Google to re-crawl my website?
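One quick sanity check is to confirm that Yoast is actually writing your description into the page source at all. A small sketch that extracts `<meta name="description">` tags from a page's HTML (the sample markup is a placeholder, not your real page):

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content of every <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.descriptions = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name") == "description" and "content" in d:
                self.descriptions.append(d["content"])

# Placeholder page head; Yoast SEO writes the description as a standard meta tag.
page = '<head><meta name="description" content="Text set in Yoast"/></head>'
parser = MetaDescriptionParser()
parser.feed(page)
print(parser.descriptions)  # ['Text set in Yoast']
```

If the tag carries your Yoast text but Google still shows the footer, the page simply has not been re-crawled yet.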
How would one find out whether this is even possible on a specific site?
For example: https://forums.eveonline.com/default.aspx?g=topics&f=257
There are many sites where I would like to display more results per page, but the option is not offered in the interface.
Without knowledge of the code base, there is no way to know whether you can change the page behavior via a URL parameter, other than trial and error.
If the site is well designed, all URL parameters will be validated against a whitelist, so it should not be possible to alter behavior by hacking the URL. So you should not rely on this.
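If you do want to try, the trial and error can at least be systematic: build candidate URLs using commonly seen page-size parameter names and open each one to see whether the page length changes. A sketch (the parameter names are generic guesses, not anything this forum is documented to support):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def candidate_urls(url, size,
                   names=("pagesize", "pageSize", "perpage",
                          "per_page", "limit", "count")):
    """Build one candidate URL per commonly used page-size parameter name."""
    parts = urlparse(url)
    base_query = parse_qsl(parts.query)
    urls = []
    for name in names:
        query = urlencode(base_query + [(name, str(size))])
        urls.append(urlunparse(parts._replace(query=query)))
    return urls

for u in candidate_urls("https://forums.eveonline.com/default.aspx?g=topics&f=257", 100):
    print(u)  # open each in a browser and see if the result count changes
```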
I know this is not answering the real question, and John Wu is right: you can't control this via the query string if it isn't implemented server side. What I think is that there is often another way.
For example, in this case you can use the RSS feed (the button placed at the bottom of the page):
https://forums.eveonline.com/default.aspx?g=rsstopic&pg=Topics&f=257
If an aged, ranked page in Google is something like http://website.com/fun.html
and I change it to http://www.website.com/fun, will this affect the ranking or the previous link juice?
I ask because we are rebuilding a site for a client that has been around for many years,
with hundreds of pages with URLs like that. Basically, it's taking off the .html.
Also, we have a link like http://website.com/books.html. I'm assuming that
if we change it to http://website.com/services/books.html it will completely
destroy the SEO. Am I right?
P.S. The new site is a WordPress site.
The direct answer is: yes. Google looks at fun.html and /fun as two separate pages. To associate the two, you will need to add a 301 redirect from fun.html to fun, either in the .htaccess file in your WordPress root or via a plugin that takes care of it for you.
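For reference, a minimal sketch of such a redirect in the WordPress root .htaccess, placed above the standard WordPress rewrite block. It assumes every old .html URL maps to the same path without the extension; pages that also move into a new directory (like /services/books.html) would need their own explicit rules:

```apache
# 301-redirect any /path.html to /path, preserving the captured path.
RewriteEngine On
RewriteRule ^(.+)\.html$ /$1 [R=301,L]

# Pages that moved to a new location need explicit one-off rules, e.g.:
# Redirect 301 /books.html /services/books.html
```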
I have a website about social media, http://www.vuub.net, but my site is very slow, and I thought it might be the theme. How can I reduce the server load time in an easy way? I am using the WP Super Cache plugin but still have a problem with server load time. Thanks for any advice!
Here are a few articles; I hope they help:
http://wp.tutsplus.com/tutorials/10-quick-tips-optimizing-speeding-up-your-wordpress-site/
http://graphpaperpress.com/blog/10-seo-tips-for-wordpress-websites/
http://www.inkthemes.com/12-ways-to-optimize-speed-of-large-wordpress-websites/01/
First of all, I like the look of your website, but as you say, site performance is not good right now. Here are some steps to take before moving on to more advanced ones:
1) Deactivate all of your plugins.
2) Try W3 Total Cache instead of WP Super Cache.
After trying these methods, come back and we can talk again. Also, since you are from Türkiye, you can check my website for more advanced steps on WordPress and WordPress optimization: http://www.fatihtoprak.com
Cheers,
I created a site two years ago in PHP and am now converting it to ASP.NET MVC. I would like to get all the pages indexed by Google so I can validate that they all still work on the new site.
Searching Google with "site:mysite.com" shows 21,000 results. How can I get those 21,000 results and validate that they all work on the new site?
I don't know if there is a tool that gives you a list. However, one approach is to keep an eye on your Google Webmasters account for errors: any page that Google can't reach is a page you need to look at and fix. This isn't a fast solution, but it's a reliable one.
If your previous website's URLs follow a structure, it should be easy to replicate that using routes in ASP.NET.
I think this topic should be moved to the Webmasters section of Stack Exchange.
Generally, what people do with a change of URLs is to give a 301 redirect from each old page to its new page (i.e., URL rewriting).
To verify that all your links work, you can use a tool called Xenu; being a webmaster, you might already know it. It goes through the entire list of URLs in your pages and verifies them. In your case, since you want to check that all your old PHP links still work, you can get a sitemap of the existing site and run the check by pointing that sitemap at the new site.
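That sitemap-based check is easy to script. A sketch that extracts the URLs from a standard XML sitemap so each can then be requested and its status code recorded (the sitemap content here is an inline placeholder; in practice you would load the old site's sitemap.xml):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract all <loc> entries from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Placeholder sitemap; in practice, fetch it from the old site.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://mysite.com/page.php?id=1</loc></url>
  <url><loc>http://mysite.com/about.php</loc></url>
</urlset>"""

for url in sitemap_urls(sitemap):
    print(url)
    # To check each URL against the new site, you could then do e.g.:
    # status = urllib.request.urlopen(url).status  # 200 means it still works
```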