Will a sitemap on localhost create a duplicate content issue? - wordpress

For my WordPress.org site I use the Google Sitemap Generator plugin by Arne B. While working on localhost I activated the plugin and it works.
I usually update my website on localhost and then upload the database to my web host. So now I am wondering: will Google search results end up including both URLs below? I am asking because I am afraid Google will consider this duplicate content.
http://127.0.0.1/beef-recipe-1/
http://www.actual-website.com/beef-recipe-1/

Google can't access your localhost (127.0.0.1), so it will most likely ignore those URLs.

If you are worried about this, the best thing you can do is delete all the previously generated sitemaps and generate a new one while your site is online. Then, in Google Webmaster Tools, resubmit the sitemap if necessary and request a crawl of your website's main domain (e.g. mydomainname.com) so Google can follow all the links associated with the homepage.
This way you will not lose rankings on Google, and it may even help your website.
Cheers!
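A quick safeguard before each upload is to scan the generated sitemap for local URLs. Below is a minimal Python sketch; the sitemap.xml path is an assumption, so point it at whatever file the plugin actually writes.

# Minimal sketch, assuming the plugin writes its output to sitemap.xml:
# scan the file for localhost/127.0.0.1 URLs before uploading, so no
# local links leak into the live sitemap.
import xml.etree.ElementTree as ET

LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

def find_local_urls(sitemap_path):
    """Return every <loc> entry that points at a local address."""
    tree = ET.parse(sitemap_path)
    locs = (loc.text.strip() for loc in tree.getroot().iter(LOC))
    return [u for u in locs if "127.0.0.1" in u or "localhost" in u]

if __name__ == "__main__":
    bad = find_local_urls("sitemap.xml")
    for url in bad:
        print("local URL found:", url)
    print("OK to upload" if not bad else "regenerate the sitemap first")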

Related

When I search for my website domain, Google shows spam links. How do I remove them?

The website is a WordPress site and it was attacked via an XSS attack. I've already installed Wordfence and MalCare to scan for and remove the malicious code and files, but the Google search results still show spam links under the main result. Most of the pages redirect to 404 pages, and I was told Googlebot would remove them automatically, but the issue still remains after 4 days. If any expert has solutions or advice regarding this, I would much appreciate it.
You can try resubmitting your sitemap to Google in the Search Console.
Otherwise, try using the Google Removals tool to temporarily hide these links; hopefully they will have dropped out of the search results by the time the temporary removals expire.
Tutorial: https://support.google.com/webmasters/answer/9689846?hl=en

How to list all pages in Google search results?

My website was recently hacked and Google flagged it with "This site may be hacked", so I removed the entire WordPress website and changed servers. I installed a new WordPress website; however, Google is still crawling the old pages. They run to 20+ pages of Google results, meaning there are over 200 links generated by the hack.
Now I would like to 301 redirect all of these links using .htaccess so that Google refreshes its cache faster and removes them.
How do I list all the links displayed in the search results? Or is there a better way to do this? Yes, I have asked Google for a review, but they said it will take several weeks :(
You can see all your indexed pages by typing this into Google search:
site:example.com
Just write your domain name instead of example.com.
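If you then want to build the 301 redirects mentioned in the question, a rough Python sketch like the one below can turn a hand-collected list of those indexed URLs into .htaccess rules. The hacked_urls.txt file name (one URL per line) and the / redirect target are assumptions; adjust both to your situation.

# Rough sketch: convert a list of hacked URLs (collected from the
# site: search) into .htaccess "Redirect 301" rules.
from urllib.parse import urlparse

def build_redirects(url_file, target="/"):
    rules = []
    with open(url_file) as f:
        for line in f:
            path = urlparse(line.strip()).path
            if path and path != "/":
                rules.append(f"Redirect 301 {path} {target}")
    return rules

if __name__ == "__main__":
    for rule in build_redirects("hacked_urls.txt"):
        print(rule)  # paste the output into your .htaccess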

Website not posting to Facebook: security & app id issues

I'm a new WordPress designer. My site runs the Tesseract theme and is built with Beaver Builder.
PROBLEM: When I posted my website (https://louiseclark.tech) on Facebook, it removed my post after a couple of minutes. Now when I try to post my site it gives me this message: "It looks like a link you're sharing might be unsafe. If you can, please remove this link: louiseclark.tech Note: The unsafe link might be on the page you're linking to."
What I've done to try and resolve:
When I ran my site through the Facebook debugger I got this message:
The 'fb:app_id' property should be explicitly provided, Specify the app ID so that stories shared to Facebook will be properly attributed to the app. Alternatively, app_id can be set in url when open the share dialog.
I created an app id following this instructional video: https://www.youtube.com/watch?v=V97h03H21y0
I pasted my app id into my Yoast SEO plugin under the Facebook category.
Checked my Google Webmaster Tools sitemap: everything is verified and the sitemap is set.
The SSL certificate is set - checked with my hosting company, SiteGround. When I asked them about this problem, they didn't really feel that the security issues were on their side.
I've reported this problem to the black hole that is Facebook support.
Thank you for any insight.
In case anyone sees this thread, I found the solution.
When I moved my WordPress sites to managed WordPress hosting, I also migrated my websites to https with the SSL certificates. While the pages were migrated and displaying over https just fine, the images still held their old URLs (http).
I did two things:
I installed the SSL Content Fixer plugin. This worked for some images but not others.
I installed the Better Search Replace plugin. I found the specific insecure images using Firefox: from my page in Firefox, I went to
Tools -> Page Info -> Media, which showed me every image/JS/CSS call on the page. Finding these images allowed me to use the plugin to make the changes.
It worked. I'm quite sure knowing how to code my site would serve me better in this situation, but I'm a newbie and this is what I could come up with.
What I learned: it's a red flag when a secure site embeds non-secure objects/images.
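For anyone who prefers to script that discovery step instead of using Firefox's Page Info dialog, here is a rough Python equivalent: fetch a page over https and list every http:// resource it embeds. It assumes the third-party requests library is installed, and the page URL is a placeholder.

# Sketch: list insecure (http://) src/href resource URLs on an https page.
import re
import requests

def find_insecure_resources(page_url):
    html = requests.get(page_url, timeout=10).text
    # src/href attributes that still point at plain http
    return sorted(set(re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html)))

if __name__ == "__main__":
    for url in find_insecure_resources("https://example.com/"):
        print(url)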

Google listed a blog post with https and I don't know why?

Two days ago we posted a new blog post on a site with the aim of being picked up for the search term "live comedy in chippenham". It's been indexed by Google and we're now 2nd in the results for that query. The bad news is that for some reason the post has been indexed with an https URL, so browsers give a warning when the link is clicked.
Firefox gives this error:
The owner of www.neeld.co.uk has configured their website improperly. To protect your information from being stolen, Firefox has not connected to this website.
The host has confirmed that it's not a server config error, and we have other posts and pages on the site that are being indexed correctly. We're using WordPress and the Yoast plugin. I can't see anything in Webmaster Tools that could be causing the problem.
Can anyone offer any advice please? If you search Google for "live comedy in chippenham" you'll see the issue (it's the link https://www.neeld.co.uk/live-comedy-in-chippenham/).
It's a really strange one but something I've experienced before.
It has most likely been caused by an external link to the page using the https protocol, which Google followed before indexing the page. Google is very keen to index https pages at the moment, so we might start seeing this kind of issue more often.
There's not a lot you can do other than wait for Google to realise its mistake and list the correct URL in the SERPs. You can help speed this along with a canonical link (which I can see is there), an XML sitemap (which you've got), and a server-level redirect of https to http.
Do not try to remove the page in Webmaster Tools, as this won't have the desired effect and will stop Google reindexing the page properly.
Hope this helps.
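If it helps, once the server-level redirect is in place you can confirm it with a small Python check like the sketch below (it uses the requests library; verify=False is there only because the https side currently serves a misconfigured certificate, which is what Firefox is complaining about).

# Sketch: request the https URL without following redirects and print the
# status code and Location header to confirm the https -> http redirect.
import requests

def check_redirect(url):
    r = requests.get(url, allow_redirects=False, timeout=10, verify=False)
    print(r.status_code, r.headers.get("Location"))

check_redirect("https://www.neeld.co.uk/live-comedy-in-chippenham/")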

How to get all the content of a website from the Google cache?

My Gmail account was hacked today, and I can't log in or request a new password anymore. I lost all the content of my Blogspot too.
I looked around and found that it is stored in the Google cache. But I had more than 200 articles, and I would need to go through more than 200 URLs to copy all the content.
Is there any method that can help me retrieve all my content from the Google cache?
A web crawler could help you retrieve many pages of information.
Also, maybe you could use the Internet Archive's Wayback Machine to retrieve some of the lost information.
Another tip: Google advanced search could help you too, in particular the site or domain parameter.
Update: Maybe this script can do all the work for you: Retrieving Google's Cache for a Whole Website
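For the Wayback Machine route, its public CDX API can list every captured URL for a domain, which saves you from clicking through 200+ pages by hand. A minimal Python sketch, with example.blogspot.com standing in for the lost blog's domain:

# Sketch: list one archived capture per URL via the Wayback Machine's
# CDX API, then fetch individual snapshots as needed.
import requests

CDX = "http://web.archive.org/cdx/search/cdx"

def list_snapshots(domain):
    params = {
        "url": f"{domain}/*",        # every captured URL under the domain
        "output": "json",
        "fl": "timestamp,original",  # capture time + original URL
        "filter": "statuscode:200",
        "collapse": "urlkey",        # one capture per distinct URL
    }
    rows = requests.get(CDX, params=params, timeout=30).json()
    return rows[1:]  # the first row is the field-name header

def fetch_snapshot(timestamp, original):
    return requests.get(
        f"http://web.archive.org/web/{timestamp}/{original}", timeout=30).text

if __name__ == "__main__":
    for ts, url in list_snapshots("example.blogspot.com"):
        print(ts, url)  # call fetch_snapshot(ts, url) to save each page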
