I am using the IIS Search Engine Optimization Toolkit to scan a site of mine. One of the complaints it raises is that there are multiple canonical URLs for my static assets, e.g.
http://example.com links to http://example.com/styles.css
https://example.com links to https://example.com/styles.css
This is of course correct: the same file is linked differently on the secure pages. It is only happening for static resources; the actual HTML pages all have single canonical URLs.
Should I leave this as is and ignore the toolkit, or is there a better arrangement? I need to consider all angles, e.g.
Performance (browser caching, server load)
SEO (duplicate content penalties)
Usability (mixed content warnings)
Thanks for your help!
You need to decide whether it's better for you to receive this message from your analysis toolkit, or for your customers to see a browser warning that not all items on the page are secure.
I know what my choice would be...
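If you do standardize on HTTPS, one way to give every asset a single canonical URL is to force the whole site onto HTTPS. A minimal sketch, assuming the site runs ASP.NET on IIS, in Global.asax:

void Application_BeginRequest(object sender, EventArgs e)
{
    // Redirect any plain-HTTP request to its HTTPS equivalent so each
    // asset ends up with exactly one canonical URL.
    if (!Request.IsSecureConnection)
    {
        string target = "https://" + Request.Url.Host + Request.RawUrl;
        Response.RedirectPermanent(target); // 301, so search engines update too
    }
}

This also removes the mixed-content warnings, since no page will link to an http:// asset any more.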
I've looked around and seen a number of questions related to serving MP4, but I have not yet seen an answer to "How can I serve MP4 from WordPress and/or Apache?" I believe it is possible, as the Twenty Seventeen theme IIRC lets you host your own.
I am looking to host my own, if possible, after this question had an anchor linking to this blog post, and the custom-generated minimized code did not work on this page on my site. Both that page and the present homepage have the offered HTML solution but fail to do what is intended, namely hide related videos the way the (now retired) rel=0 parameter did.
What options, if any, do I have to serve MP4 gracefully, with or without streaming, from WordPress under Apache? I would ideally like something as graceful as YouTube, but without related videos.
Thanks,
You definitely want to take a look at the plugins WordPress offers for video streaming. For example, Easy Video Player allows you to "embed both self-hosted videos or videos that are externally hosted using direct links".
It's just a one-liner. Can't get simpler than that:
[evp_embed_video url="http://example.com/wp-content/uploads/videos/myvid.mp4"]
As for functionality, it will work under Apache, since the web server only transfers the video over HTTP and playback happens client-side with HTML5. However, you must check that your hosting gives you enough bandwidth to stream content to all your visitors.
By default, WordPress provides a [video] shortcode. I'm trying to see if that will work, as a matter of using WordPress's default functionality.
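For what it's worth, the built-in shortcode also takes a direct file URL; a minimal example, reusing the sample upload path from above (the width/height attributes are optional):

[video src="http://example.com/wp-content/uploads/videos/myvid.mp4" width="640" height="360"]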
We have just moved to Drupal and are trying to proactively identify all broken external web (http://, https://) links.
I've seen some references to validation of links, but wasn't sure whether that only meant validating the syntax of a link, as opposed to checking whether the link actually works (e.g. doesn't return a 404).
What is the easiest way to go through all web links in a Drupal site and identify all of the broken external links? This is something we'd like to automate and schedule every day/week.
As someone else mentioned, use the Link Checker module. It's a great tool.
In addition, you can check the Crawl errors report in Google Webmaster Tools for links that return a 404.
Clicking any URL there will show you where it was linked from, so you can update any internal broken links. Be sure to use canonical URLs to avoid that kind of breakage.
Make sure you're using a proper internal linking strategy to avoid broken internal links in the first place, too: http://www.daymuse.com/blogs/drupal-broken-internal-link-path-module-tutorial
Essentially: use canonical, relative links to avoid broken internal links in the future when you change aliases. In simple Drupal terms, be sure you're linking to "node/23" instead of "domain.ext/content/my-node-title" since multiple parts of that might change in the future.
I have not found a Drupal-based approach for this. The best free piece of software I've found for finding bad links on sites is the Screaming Frog SEO Spider Tool.
http://www.screamingfrog.co.uk/seo-spider/
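If you'd rather script the check yourself and run it from cron, the idea is simple: fetch a page, collect the external hrefs, and issue a HEAD request to each. A rough stand-alone sketch (written in C# here; the start URL is a placeholder, and a real crawler would use an HTML parser rather than a regex):

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.RegularExpressions;
using System.Threading.Tasks;

class LinkChecker
{
    static async Task Main()
    {
        var http = new HttpClient();
        string page = await http.GetStringAsync("http://example.com/");

        // Crude extraction of absolute links; fine for a sketch.
        var links = new HashSet<string>();
        foreach (Match m in Regex.Matches(page, "href=\"(https?://[^\"]+)\""))
            links.Add(m.Groups[1].Value);

        foreach (string link in links)
        {
            try
            {
                using var resp = await http.SendAsync(
                    new HttpRequestMessage(HttpMethod.Head, link));
                if ((int)resp.StatusCode >= 400)
                    Console.WriteLine((int)resp.StatusCode + " " + link);
            }
            catch (HttpRequestException)
            {
                Console.WriteLine("UNREACHABLE " + link);
            }
        }
    }
}

That said, the Link Checker module already does this inside Drupal and hooks into cron, so scripting it yourself is mostly useful if you want the report outside Drupal.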
I have a site whose search ranking has plummeted. It should be quite SEO-friendly because it's built using XHTML/CSS and has been run against the SEO toolkit.
The only things I can think of that may be annoying Google are:
The keywords are the same across the whole site rather than being page-specific (can't see why this would be a massive deal)
Another URL has been set up that simply points to my site without redirecting (again, no big deal)
Non-UK users are automatically forwarded to the US version of the site, which is a different brand. I guess this could be the problem: if Google spiders my site from the US, it will never see the UK version
So the question is: does geo-redirection affect my SEO? Is it possible to detect whether a visitor is actually a search engine spidering my site? In that case I don't want to do any geo-location.
Do not use the same keywords on the entire site. Try to use specific keywords for each page.
Do not let several URLs point directly to the same site, since this will cause the inlinks from the different domains to be treated as belonging to different domains. If you point URLs via redirects instead, all inlinks will be credited to the target domain and thus increase its "inlink score".
To detect whether a request is from a crawler, use the browsercaps project: http://owenbrady.net/browsercaps/
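ASP.NET surfaces the browser-definition data (which browsercaps extends) through Request.Browser, so a minimal sketch of skipping the geo-redirect for crawlers could look like this. ApplyGeoForwarding() is a hypothetical stand-in for your existing non-UK-to-US forwarding logic:

protected void Page_Load(object sender, EventArgs e)
{
    // Request.Browser.Crawler comes from the browser definition files,
    // so its accuracy depends on keeping those definitions up to date.
    if (!Request.Browser.Crawler)
    {
        ApplyGeoForwarding(); // hypothetical geo-forwarding helper
    }
}

That way crawlers always see the version of the site they requested, while human visitors still get forwarded.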
Is it advisable to implement URL routing for an ASP.NET (WebForms) website which is one year old? What are the factors to be considered before implementing it?
Edit:
It is a web-based product website developed by my company, and users pay to use it.
Some of the factors I can think of off the top of my head:
Does your boss/sponsor/client/guy-who-pays-the-bill understand the importance & want it done?
How large is the user base? If it is an internal site with few users, it might not be a big deal to ask them to update their links, but for a large public-facing site it might not be so simple, as users might have many bookmarks etc. pointing to the content
What kind of site is it? If it is a news site, I think people visit it for the new content rather than very old articles, but if it is a knowledge base of some kind (read: MSDN-like), you can expect people to keep a lot of bookmarks etc. handy.
Is the site SEO'd, & how important is not losing the traffic that arrives via the old URLs?
What is the plan to ensure that search engines re-index your site's pages & that the old URLs are given a permanent (301) move?
There is a great advantage in having good, simple URLs, but are your users tech-savvy enough to use them, or is this just a "next-shiny-thing" initiative pushed by the developers?
HTH.
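If the decision is to go ahead, WebForms has routing built in from .NET 4.0, so you don't need a separate framework. A minimal sketch of registering a friendly URL in Global.asax (the route name, pattern, and page are illustrative):

using System;
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    void Application_Start(object sender, EventArgs e)
    {
        // Map a friendly URL onto an existing physical page.
        RouteTable.Routes.MapPageRoute(
            "ProductDetails",        // route name
            "products/{productId}",  // friendly URL pattern
            "~/ProductDetails.aspx"  // the existing .aspx page
        );
    }
}

Inside ProductDetails.aspx you can then read the value with Page.RouteData.Values["productId"].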
User-friendly URLs will improve your site from the SEO point of view. If your site is public and present in search engines, it will benefit from this change.
I have to disagree with Sunny in relation to old URLs. It's not true that users won't be able to access the old URLs; normally you can create redirect rules that send users hitting the previous format to the new one.
So the factors I would evaluate are:
- How important it is to improve the site from an SEO point of view
- Whether old URLs can be translated to the new URL format through redirect rules, and how important that is (see the sketch below).
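For those redirect rules, a minimal sketch in Global.asax, assuming the old pages used query strings (the URL shapes are illustrative):

void Application_BeginRequest(object sender, EventArgs e)
{
    // Send the legacy URL format to the new routed one with a 301 so
    // bookmarks keep working and search engines transfer their credit.
    if (Request.Url.AbsolutePath.Equals("/ProductDetails.aspx",
            StringComparison.OrdinalIgnoreCase))
    {
        string id = Request.QueryString["id"];
        if (!string.IsNullOrEmpty(id))
            Response.RedirectPermanent("/products/" + id);
    }
}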
I have developed a multi-lingual site in ASP.NET, which detects the user's system culture, and displays content in the appropriate language.
This is all working nicely, but my client has since had an SEO audit. The SEO agency has expressed a concern that this is not good SEO practice, as there are not unique URLs for each language.
They have suggested that the site may be accused of cloaking, and that Google may not index the site correctly for each different language.
Any ideas on whether these are valid concerns, and if there is an advantage to having unique URLs for each language version of the site?
Although you have done a beautiful job switching language automatically, the SEO agency is correct!
That Google may not index the site correctly for each different language.
This is true! Google doesn't send the Accept-Language header, last time I checked. This means that Google will only index the default language.
They have suggested that the site may be accused of cloaking,
This depends on your exact implementation, but it is possible your site will receive a penalty!
There IS an advantage to having unique URLs for each language version of the site!
First of all, for your users: they can link to the language they prefer. Second, for the search engines: they can index your site correctly.
I advise, most of the time, redirecting the user only on the home page for a language switch, using a 302 redirect to the correct URL (and so the correct language). (Edit: you can review the post by Matt Cutts, "SEO Advice: Discussing 302 redirects".)
To verify my advice: install Fiddler and surf to http://www.ibm.com. As shown below, I received a 302 redirect to the appropriate language, arriving at www.ibm.com/be/en.
#   Result   Protocol   Host          URL       Body    Caching    Content-Type
4   302      HTTP       www.ibm.com   /         209                text/html
5   200      HTTP       www.ibm.com   /be/en/   5.073   no-cache   text/html;charset=UTF-8
There are a few ways you can solve this:
Start rewriting URLs (e.g. adding a directory for the language)
If you don't want to go through the hassle of adding directories (or rewriting URLs), adding a query string would be the easiest solution (although try to limit it to a maximum of 2 parameters)
Another option is using different sub-domains! www.website.com for the default language, then es.website.com, fr.website.com, and so on
Just make sure you always supply the same content for the same URL.
Good luck with it!
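If you go with the directory-per-language option, here is a minimal sketch of an ASP.NET HttpModule that sets the culture from the first URL segment instead of the Accept-Language header (the supported-language list and URL shape are assumptions):

using System;
using System.Globalization;
using System.Threading;
using System.Web;

public class CultureFromUrlModule : IHttpModule
{
    static readonly string[] Supported = { "en", "fr", "es" };

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            // First path segment, e.g. "fr" in /fr/products/widget
            string[] segments = app.Context.Request.Url.AbsolutePath
                                   .Trim('/').Split('/');
            string lang = segments[0];
            if (Array.IndexOf(Supported, lang) >= 0)
            {
                Thread.CurrentThread.CurrentUICulture = new CultureInfo(lang);
                Thread.CurrentThread.CurrentCulture =
                    CultureInfo.CreateSpecificCulture(lang);
            }
        };
    }

    public void Dispose() { }
}

With this in place, crawlers that never send Accept-Language still get a fully indexable URL per language.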
Hopefully we will see answers from people who know about the internals of Google (anyone?). But most suppositions about how Google's and others' crawlers work are just that... suppositions, and subject to change.
My guess is that you should use separate URLs for languages, even if they differ only by a ?language= parameter (although a truly different URL would be better). I believe this because when you go to google.it it says "Google.com in English", and that link goes to... google.com. In other words, Google itself uses different URLs for different languages.
Also, another big site, Microsoft (they probably know about SEO) uses
http://www.microsoft.com/en/us/default.aspx
for US-English and
http://www.microsoft.com/it/it/default.aspx
for Italy-Italian, so it's probably best practice to differentiate based on language (and country).
In any case, I am totally annoyed when I'm on an English-language computer and I can't see a site in Italian or Spanish, and vice versa. As a usability strategy (not an SEO one), the user should be able to override the language suggestion. This is how most big sites handle languages, too.