I have just started adding the new .NET 4.0 URL routing to my project. I have a question.
Let's say I have an Article.aspx that displays, well, articles. I made a route for it in Global.asax:
routes.MapPageRoute("article-browse", "article/{id}", "~/Article.aspx");
So the link contains the article's ID, which is obviously neither nice nor SEO-friendly. I would like to display the article's title in the link instead of the ID.
Do I have to pass the whole title in the parameter (instead of the ID) and then run a SQL query that searches for a database record with the matching title? That sounds scary. Maybe there is some way, similar to the Eval() methods, to turn the title back into an ID?
Thank you very much!
There is nothing to prevent you from including both the ID (for quick SQL retrieval) and the article's title (for SEO purposes) in the link. This is exactly how Stack Overflow handles its routing (check the address of this question).
routes.MapPageRoute("article-browse", "article/{id}/{title}", "~/Article.aspx");
Obviously, the title after the ID is not necessary to display the page (you only use the ID to fetch the article), but every time you generate a link in your site, generate it with the title, and the bots will use it when indexing your pages.
Oh, and you might also want to create a method that translates your title into a URL-friendly string: make it all lowercase, convert spaces and other characters to '-', etc.
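A minimal sketch of such a helper; the class and method names, and the exact character handling, are my own choice:

// needs: using System.Text.RegularExpressions;
public static class UrlHelper
{
    // Turns "My Great Article!" into "my-great-article".
    public static string ToSlug(string title)
    {
        string slug = title.ToLowerInvariant();
        // Collapse anything that isn't a letter or digit into a single hyphen.
        slug = Regex.Replace(slug, "[^a-z0-9]+", "-");
        // Trim hyphens left over from leading/trailing punctuation.
        return slug.Trim('-');
    }
}

You would then build links with something like Page.GetRouteUrl("article-browse", new { id = article.Id, title = UrlHelper.ToSlug(article.Title) }), where the article object is whatever your data layer returns.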
Related
I have an ASP.NET .aspx page (say, fruits.aspx) which lists all the fruits (apple, banana, mango, etc.) with a thumbnail, a title, and a link leading to each fruit's detail page. All of this data is retrieved from an XML file in the code-behind, with the help of an XSLT and a user control.
Since the data and the URLs of each fruit's detail page are not present statically on this page, it will not be crawled and indexed, as far as I know.
Is there a workaround I can use to get each fruit's detail page crawled and indexed?
If I only had dynamic URLs with something like "?var=value", I could solve it with static/dynamic conversion using URL rewriting. But here the URL itself is not on the page; it is generated from the code-behind.
Search engines will not see the .aspx file as it sits on your server; instead, they see the same thing your web browser does: the resulting HTML output.
This means that the parameters you speak of will be seen and indexed properly by search engines.
There is no way to do it, then. Each page you want indexed must have a unique URL. When you generate the page, just generate a unique URL: take your query parameters and paste them onto the end of your script name.
For example, say fruits.aspx is called with ?fruit=banana as a query parameter. Your best option is to generate a page with a unique static URL, for example by making the link to the banana page look like /fruits.aspx/fruit/banana.
Even better would be to rewrite the URL to remove the .aspx. Then the site looks like all static content, which is even better for indexing. If a URL looks like it is backed by a database, the search engine is less likely to index everything.
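If you are on ASP.NET 4.0, the routing from the first question gives you exactly this kind of static-looking URL; a minimal sketch, where the route name and the use of fruits.aspx as the target are my assumptions:

// Global.asax, Application_Start (needs: using System.Web.Routing;)
// Maps /fruit/banana onto the existing fruits.aspx page.
RouteTable.Routes.MapPageRoute("fruit-detail", "fruit/{name}", "~/fruits.aspx");

// fruits.aspx code-behind: read the requested fruit from the route data.
string fruit = (string)Page.RouteData.Values["name"];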
I have a company website (Visual Studio / VB / ASP.NET 4.0) and it's now localized in 10 different languages.
The problem: my URLs do NOT change when switching from, say, English to Swedish. Only the text changes, as the page pulls its information from the "sv" resource file instead of the "en" resource file. Stefan noted that this will not count against me as duplicate content.
But Tiggerito came up with an excellent suggestion. He suggested I use canonical tags in the <head> section to indicate to search engine bots that I have other languages. I'd like to follow his suggestion and add canonical tags to my master pages.
Can anybody tell me how I can go about doing this? What would the tags look like, and would I have to have one for en, es-MX, ru, sv, fr, etc.? Thanks for any guidance you can offer!
First of all, it's not good SEO to have the same page at the same URL with totally different content. You confuse the search engines, which no longer know what to show: which version should they index, and in which language? This doesn't count as duplicate content, but the indexer sees that you change the page and the language too often and doesn't know what to display.
Second, the canonical tag only works for Google as far as I know, and it doesn't take a language as an argument. The canonical tag also works the other way around: it connects many different URLs with similar content to one URL; it doesn't split one URL into many different contents.
It's better to have your home page in your default language and, when the user changes language, to change the URL or add a URL parameter.
Here is a canonical tag.
<link rel="canonical" href="http://example.com/page.html"/>
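If you do give each language its own URL, one way to emit that tag from the master page's code-behind is sketched below; the CurrentLanguage property and the URL scheme are assumptions, and the <head> needs runat="server":

// Master page code-behind (needs: using System.Web.UI.HtmlControls;)
protected void Page_Load(object sender, EventArgs e)
{
    // Point the canonical tag at the language-specific URL of this page,
    // e.g. http://example.com/sv/page.aspx. CurrentLanguage is hypothetical.
    var canonical = new HtmlLink();
    canonical.Href = "http://example.com/" + CurrentLanguage + Request.Path;
    canonical.Attributes["rel"] = "canonical";
    Page.Header.Controls.Add(canonical);
}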
About canonical tags
http://www.mattcutts.com/blog/canonical-link-tag/
http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html
Notes
In this URL that @JasonWeber gave me
http://googlewebmastercentral.blogspot.com/2010/03/working-with-multilingual-websites.html
it says very clearly:
"if you're going to localize, make it visible in the search results"
and
"And last but not least, keep the content for each language on separate URLs - don't use cookies to show translated versions."
I need a little help understanding how HTML forms work. It is my understanding that forms using GET as their method submit name/value pairs for all fields within the form tags. However, take a look at the following example from Google (I've seen this in many other places too) and use only one of the fields on the form:
http://books.google.co.uk/advanced_book_search
Rather than being sent to a page with a name/value pair for each field of the advanced search page, you are taken to a much cleaner-looking URL:
http://www.google.co.uk/search?tbo=p&tbm=bks&q=hitchiker&num=10
Despite all of the input fields on the advanced search page.
On to my problem... My own advanced search page is quite large, and at the moment it is POSTed to my search results page, which takes in the values and searches accordingly; no problems there! However, I want my users to be able to bookmark/share their searches, and to do that I need the values passed in the querystring, but I don't want massive querystrings if I don't need them. If my user has only searched by a color, for example, then I want the URL to be something like search.aspx?color=red; if they're searching by color and size, then search.aspx?color=red&size=large, and so on. Is this possible?
To complicate things even further, I'm using ASP.NET, so it's not the easiest thing to create a form that uses GET, though I do believe I have already found a way around this.
If you can give any advice or a nudge in the right direction, then thank-you! :)
What you're suggesting should be easily possible if you conditionally check the querystring on the results page to ensure the key/value is there.
// QueryString["color"] is null when the key is absent, so check for that too
if (!string.IsNullOrEmpty(Request.QueryString["color"]))
{
    // Add color to the search parameters
}
To create the GET request, I would think you'd need to POST back to your search form and redirect to the results page from there, dynamically adding key/value pairs to the querystring as and when they are required. This Post/Redirect/Get design pattern is typically used with web forms to help with bookmarking.
If you want to share bookmarked searches between users, then you'll have to share the name/value querystring options in the resulting URL. It sounds like you don't want to include a pair if no value was specified. That's easy: when processing, loop through all the input controls and dynamically build the querystring from only the pairs the user HAS provided input for; if a value was provided, append it, otherwise skip it.
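A rough sketch of that Post/Redirect/Get step, assuming text boxes named txtColor and txtSize (the control and button names are mine):

// Search form code-behind (needs: using System.Collections.Generic; using System.Web;)
// Fires on the POST; builds a querystring containing only the fields the
// user actually filled in, then redirects to the results page.
protected void btnSearch_Click(object sender, EventArgs e)
{
    var pairs = new List<string>();

    if (!string.IsNullOrEmpty(txtColor.Text))
        pairs.Add("color=" + HttpUtility.UrlEncode(txtColor.Text));
    if (!string.IsNullOrEmpty(txtSize.Text))
        pairs.Add("size=" + HttpUtility.UrlEncode(txtSize.Text));

    string url = "search.aspx";
    if (pairs.Count > 0)
        url += "?" + string.Join("&", pairs.ToArray());

    // The redirect produces the clean GET URL users can bookmark and share.
    Response.Redirect(url);
}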
Good Afternoon,
A client is interested in creating an ASP.NET 2.0 website whose purpose is to serve up a "quote of the day". He wants the quotes on static content pages, all attached to the same master page. The quote pages must be viewed in a certain sequence, and visitors cannot enter the site on any page other than the starting page. That is, everyone must go to page 001.aspx when entering the site.
Two Questions:
1. The content pages are going to be created by the client using an Excel data source and a merge process by which each quote page is created, e.g. 001.aspx, 002.aspx, etc. This seems clunky to me at best. Would ASP.NET Dynamic Data be a better solution here?
2. I'm new to ASP.NET routing and URL rewriting as a whole. How would I set up a route table to ensure that users always enter the site on the same entry page, and such that default.aspx resolves to 001.aspx?
Thanks,
Sid
I would suggest using the Excel sheet as a data source and handling the viewing of the quote pages by paging through the result set obtained from that data source.
If your client is concerned about SEO, he must recognize that his requirement to have only one entry page defeats his SEO-friendly one-quote-per-page approach.
I don't think the effort to distinguish between a human user and a search bot is worth it.
Anyway, Googlebot is capable of indexing pages with URL parameters, which lets you be SEO-friendly without generating static content (other bots should be as well).
Possible solution
To allow search bots to index your quotes, use a query parameter for the date of the quote.
If you want to force human users (hackers don't count ;-)) to enter the site only at the current date, check the user-agent string and redirect any browser not known to be a search bot to the current date's quote, unless the referrer is the previous date's page.
This solution should give you a reasonable result without too much overhead.
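A rough sketch of that check in the quote page's code-behind; the quoteDate parameter name and the page name are assumptions, and the bot list is deliberately incomplete:

protected void Page_Load(object sender, EventArgs e)
{
    string ua = Request.UserAgent ?? "";
    // Known search bots may index any dated quote page.
    bool isBot = ua.IndexOf("Googlebot", StringComparison.OrdinalIgnoreCase) >= 0
              || ua.IndexOf("bingbot", StringComparison.OrdinalIgnoreCase) >= 0;

    string today = DateTime.Today.ToString("yyyy-MM-dd");
    if (!isBot && Request.QueryString["quoteDate"] != today)
    {
        // Ordinary browsers get pushed to today's quote.
        Response.Redirect("quote.aspx?quoteDate=" + today);
    }
}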
All,
This would seem like a fairly basic ASP.NET question, but in all my years of coding, I've never really thought about it.
Say you have an ASP.NET 2.0 site with only a master page and a default.aspx, and it's a blog that saves all its data in a database. Links on the side are generated automatically. So... the URL is always just http://www.XXXXX.com/default.aspx.
So, with that being the case, what do you need to do so that, say, Google knows about all the different blog entries and links directly to the entries instead of just the base URL?
Is it as simple as changing the form's method to method="get"?
Thanks, L. Lee Saunders
There are at least two solutions:
1. Search engines understand query strings, so just add the article IDs to the URLs in your anchor tags; there is no need to even use a form control.
2. Use URL rewriting to expose one set of URLs to the outside world (like /article-title/1234/) in your anchor tags, and then rewrite the URL to default.aspx when the request arrives at your site; the page can then pull the article to be displayed from any number of places, including but not limited to a query string. (See the sketch below.)
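A sketch of the second option, assuming the site can be upgraded to .NET 4.0 routing (on 2.0 you would need an HttpModule or a third-party rewriter instead); the route name is mine:

// Global.asax: /some-article-title/1234 resolves to default.aspx.
RouteTable.Routes.MapPageRoute("blog-entry", "{title}/{id}", "~/default.aspx");

// default.aspx code-behind: pull the entry to display from the route data.
int id = int.Parse((string)Page.RouteData.Values["id"]);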
You could have a REST web service so that you can use plain URLs to navigate the site, and perhaps a front page with some recent posts, so that the spider has links to follow.
As an example, look at the URLs on SO; it is easy for a spider to navigate this database-driven website.
Create a page that just serves up an XML sitemap (the data obviously being pulled from your database) and submit the sitemap to Google.
Google will then index any links in your sitemap.
(This assumes that there is some difference between each article's URL, e.g. a querystring key/value.)
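A minimal sketch of such a page as a generic handler; the handler name, the URL scheme, and the hard-coded IDs standing in for a database query are all assumptions:

<%@ WebHandler Language="C#" Class="Sitemap" %>

using System.Web;
using System.Xml;

public class Sitemap : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/xml";
        using (var xml = new XmlTextWriter(context.Response.Output))
        {
            xml.WriteStartDocument();
            xml.WriteStartElement("urlset", "http://www.sitemaps.org/schemas/sitemap/0.9");
            // In real code, loop over post IDs pulled from your database.
            foreach (int id in new[] { 1, 2, 3 })
            {
                xml.WriteStartElement("url");
                xml.WriteElementString("loc",
                    "http://www.example.com/default.aspx?postid=" + id);
                xml.WriteEndElement();
            }
            xml.WriteEndElement();
            xml.WriteEndDocument();
        }
    }

    public bool IsReusable { get { return true; } }
}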
Useful Link(s):
Web Sitemap Generators
Google Sitemap Validator
Google Sitemaps for ASP.NET 2.0 (there are about a gazillion interesting links off the back of this as well).
Some sort of URL rewriting may be the answer.
I wouldn't recommend a postback for your situation; it can get ugly with refreshes, etc. So, yes, change the method to "get".
Then, say, your page default.aspx?postid=12345 will get translated into /mm/dd/yy/this-is-my-post.aspx