I have a website like www.example.com with dynamic pages like www.example.com/page?1, www.example.com/page?2, etc., and more pages are created every hour. I need to create a sitemap.xml file automatically, save it to a server path, and submit my latest web pages to Google. How can I do this in ASP.NET? Any clues would be appreciated.
I was looking for similar information recently; there is a similar topic. What you need is called a "web crawler": it works by finding all URLs in the HTML code, excluding links to other sites, and building a list of the links it finds. It then repeats these steps for each URL in the list, and as a result you get a list of addresses for all your web pages. From that list you can build the sitemap.xml file; I used the .NET Framework class XmlTextWriter for this. As for automatically updating the sitemap file, you could set a timer and regenerate the file, for example, once a day, or simply do it yourself every day. Good luck.
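If it helps, here is a minimal sketch of the XmlTextWriter part; the page-URL list and the output path are assumptions for illustration:

    // Minimal sketch: write a Google-style sitemap.xml with XmlTextWriter.
    // The URL list and output path are assumptions for illustration.
    using System;
    using System.Collections.Generic;
    using System.Xml;

    public static class SitemapWriter
    {
        public static void Write(IEnumerable<string> pageUrls, string outputPath)
        {
            using (var writer = new XmlTextWriter(outputPath, System.Text.Encoding.UTF8))
            {
                writer.Formatting = Formatting.Indented;
                writer.WriteStartDocument();
                writer.WriteStartElement("urlset", "http://www.sitemaps.org/schemas/sitemap/0.9");
                foreach (string url in pageUrls)
                {
                    writer.WriteStartElement("url");
                    writer.WriteElementString("loc", url);
                    // Assumption: using "now" as lastmod; a real crawl would use each page's date.
                    writer.WriteElementString("lastmod", DateTime.UtcNow.ToString("yyyy-MM-ddTHH:mm:ssZ"));
                    writer.WriteEndElement(); // </url>
                }
                writer.WriteEndElement(); // </urlset>
                writer.WriteEndDocument();
            }
        }
    }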
We are using Umbraco 7.1.3.
As per the client's requirement, we need to create more than 550 Umbraco CMS sites for different cities, all with the same template and an ASP.NET user control which accesses data from one master database.
So we created a Windows application that creates the 550+ sites, named per city, under one non-Umbraco root site.
We also managed to create a separate Umbraco database for each site as it is created, moved the published code under the non-Umbraco root site, converted each site folder to an application, and updated each site's Web.config file dynamically.
After that, whenever we found that our logic or UI was not correct, we also pushed updated DLLs, ASCX user controls, and CSS to all sites through the same Windows application.
Until now everything was going smoothly, but we now have one major change: a new document type, template, macro, and menu need to be added dynamically. Updating the published code through the Windows application was easy, but we haven't found any way to update the Umbraco databases of the 500+ sites through another application.
Some websites have already been updated by their respective site owners, so without affecting any existing changes we need to add the new macro, content, document type, and menu for each site, and we don't know which records we need to enter in each Umbraco database.
I have already posted the same in the Umbraco issue tracker: #U4-7105
and also in the Umbraco forums: #71443.
Thanks & Regards
Sounds like an interesting case!
If you want to migrate items that are in the database such as document types, templates and macros you would most likely need to get a product like Courier. I can see that due to license costs this could be an issue for you with 500+ sites.
Another option could be to take a look at uSync to see if it does what you need. I don't have much experience with this package, but from the looks of it, it handles all the database bits; everything else (files on the file system) would be handled by your application just as it is right now.
I have been given a job to re-develop a news portal. The website already has a couple of thousand unique visits a day. I am going to develop it using ASP.NET Web Forms. I am currently in the planning phase, and I am thinking of offering the main admin a page where he can change site-specific configuration information. Some of these are:
Web site title "<title>"
site URL
footer text
default image directory
whether to accept comments without authorisation or not
I listed above some settings so that you can understand my scenario better.
What I can't decide is where to store all this information. Do I store it in a DB (costly?), a custom XML file, or a .config file (e.g. ConfigurationManager.AppSettings)?
Any pros or cons would make my day!
Thank you!
My opinion is to store them in web.config and read them with WebConfigurationManager.OpenWebConfiguration("~").GetSection(), because these variables are critical and change only once, when the site is first set up.
For example, the default image directory stays the same for the rest of the site's life, and the same goes for the site URL and the others.
Also, when you change these settings you probably need to restart the web application anyway, because you will surely need to re-read them into some static variables.
And because these variables stay as they are, and you need them to start the web application (only then do you read the database and the rest), you need to have them available first-hand, from web.config.
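For illustration, a minimal sketch of the read-once idea, using the simpler WebConfigurationManager.AppSettings collection rather than GetSection(); the key names are made up:

    // Minimal sketch: read site settings once into static fields at startup.
    // The key names ("SiteTitle", "DefaultImageDirectory") are assumptions.
    using System.Web.Configuration;

    public static class SiteSettings
    {
        public static readonly string SiteTitle =
            WebConfigurationManager.AppSettings["SiteTitle"];
        public static readonly string DefaultImageDirectory =
            WebConfigurationManager.AppSettings["DefaultImageDirectory"];
    }

Note that ASP.NET recycles the application automatically whenever web.config is edited, so the restart mentioned above happens on its own.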
I made an ASP.NET MVC application which allows users to create dynamic websites. I need to add a feature which will allow downloading an offline version of a chosen website from the server as static HTML files, with menus, hyperlinks, images, documents, etc. It should work similarly to applications such as Teleport Pro, but I have to choose from the Admin Panel which content should be exported.
The client wants to burn the static website to a CD or save it on a pen drive.
Do you have any ideas on how to begin? Please help.
I have implemented that in a current project...
The user is able to change anything in the frontend, and at the end he can publish and download the offline files... the site subscribes users and shows all prizes, winners, and more information about the campaign.
It was all done in ASP.NET MVC 3 under .NET 4 and hosted on AppHarbor.
It's composed of several applications, but for what you want you develop the Backend and the Frontend, and to generate the static files you simply use the Frontend to grab the full HTML (a sketch follows at the end of this answer).
As an example, I can show what 2 users did...
Callme.dk did http://callme.julekal.info and
Sony Nordic did http://sony.julekal.info
plus, you can simply point custom domains to it as well like http://sonynordicxmas.net/
(screenshots in the original answer: publishing and generating all the files, and one part of the editing UI)
So I give the users offline access (through the .zip file), online access (through the frontend application), and the ability to use custom domains...
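For the "grab the full HTML" step, a minimal sketch might look like this; the URL list and output folder are assumptions, and rewriting relative links and downloading images/CSS are left out:

    // Minimal sketch: download rendered frontend pages as static .html files.
    // The URL list and output folder are assumptions for illustration;
    // rewriting links and fetching images/CSS would still be needed.
    using System;
    using System.IO;
    using System.Net;

    public class StaticExporter
    {
        public void Export(string[] pageUrls, string outputFolder)
        {
            Directory.CreateDirectory(outputFolder);
            using (var client = new WebClient())
            {
                foreach (string url in pageUrls)
                {
                    string html = client.DownloadString(url);
                    string fileName = new Uri(url).AbsolutePath.Trim('/');
                    if (fileName.Length == 0) fileName = "index";
                    fileName = fileName.Replace('/', '_') + ".html";
                    File.WriteAllText(Path.Combine(outputFolder, fileName), html);
                }
            }
        }
    }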
I think the only way this might be possible is if you go to every single page and then use your browser to "Save" the web page, script and all.
However this causes several issues;
You never quite get everything, and you need to massage the HTML produced, download all the images, etc., to get the page to look right.
Each HTML file now has an associated folder with the same name, and each time you do this you will get another HTML file with a folder. You can combine all the folders into a single one, but that leads me to item 3.
You will need to edit each html file to clear up any pathing issues if you want to share a single source folder.
Data is no longer dynamic!
If you want to link all the pages to each other, you need to edit every single HTML file and resolve the anchor tags.
This is too much work and I think it actually breaks the true requirement.
Don't do it! :)
I am working on an ASP.NET 3.5 Web Application project in C#. I have manually added a Google-friendly sitemap which includes entries for every page in the project - this is not a CMS.
<url>
<loc>http://www.mysite.com/events.aspx</loc>
<lastmod>2009-11-17T20:45:46Z</lastmod>
<changefreq>daily</changefreq>
<priority>0.8</priority>
</url>
The client updates events using an admin back-end. Other than that, the site is relatively static. I'm trying to decide on the best way to update the <lastmod> values for a handful of pages that are regularly updated.
In particular, I am using the QueryStringField property of the DataPager control (paired with a ListView) to enhance SEO, as described here:
https://web.archive.org/web/20211029044137/https://www.4guysfromrolla.com/articles/010610-1.aspx
http://gsej.wordpress.com/2009/05/31/using-a-datapager-with-both-a-querystringfield-and-renderdisabledbuttonsaslabels/
When the QueryStringField property is set, the DataPager renders the paging interface as a series of hyperlinks which the crawler can follow and index. However, if Google crawled my list of events two days ago, and in the meantime the admin has added another dozen events... say the page size is set to 6; in this case, the Google SERP links would now be pointing to the wrong pages. This is why I need to be sure that the sitemap reflects changes to the events page as soon as they happen.
I have already looked though other SO questions for info and didn't find what I needed. Can anyone offer some guidance or an alternative approach?
UPDATE:
Since this is a shared hosting environment, a directory watcher/service won't work:
How to create file watcher in shared webhosting environment
UPDATE:
Starting to realize that I may need to signify to Google that the containing page has been updated; should I update the Last-Modified HTTP header?
Rather than using a hand-coded sitemap, create a sitemap handler that will generate the sitemap on the fly. You can create a method in the handler that will grab pages from an existing navigation sitemap, from the database, or even from a hard-coded list of pages. You can create an XmlDocument from the list, and write the InnerXml of the document out to the handler response stream.
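A minimal sketch of such a handler, with a hypothetical GetPages() method standing in for the navigation sitemap, database, or hard-coded list:

    // Minimal sketch of a sitemap handler (e.g. mapped to sitemap.ashx).
    // GetPages() is a hypothetical method returning url/lastModified pairs.
    using System;
    using System.Collections.Generic;
    using System.Web;
    using System.Xml;

    public class SitemapHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            var doc = new XmlDocument();
            XmlElement urlset = doc.CreateElement("urlset",
                "http://www.sitemaps.org/schemas/sitemap/0.9");
            doc.AppendChild(urlset);

            foreach (KeyValuePair<string, DateTime> page in GetPages())
            {
                XmlElement url = doc.CreateElement("url", urlset.NamespaceURI);
                XmlElement loc = doc.CreateElement("loc", urlset.NamespaceURI);
                loc.InnerText = page.Key;
                XmlElement lastmod = doc.CreateElement("lastmod", urlset.NamespaceURI);
                lastmod.InnerText = page.Value.ToString("yyyy-MM-ddTHH:mm:ssZ");
                url.AppendChild(loc);
                url.AppendChild(lastmod);
                urlset.AppendChild(url);
            }

            // Write the InnerXml of the document out to the response stream.
            context.Response.ContentType = "text/xml";
            context.Response.Write(doc.InnerXml);
        }

        private IEnumerable<KeyValuePair<string, DateTime>> GetPages()
        {
            // Hypothetical: pull from Web.sitemap, the database, or a hard-coded list.
            yield return new KeyValuePair<string, DateTime>(
                "http://www.mysite.com/events.aspx", DateTime.UtcNow);
        }
    }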
Then, create a class with a method that will automatically ping search engines with the above handler's URL (like http://www.google.com/webmasters/tools/ping?sitemap=http://www.mysite.com/sitemap.ashx).
Whenever someone adds a new event, call that method. This will ping Google with your latest sitemap (freshly generated by the handler above).
You want to make sure that the ping only works if the sitemap has actually been updated. You could use File.SetLastWriteTime on events.aspx in the AddNewEvent handler to signify that the containing page has been updated.
Also, be careful to make sure there have been no pings in the last hour (as Google's guidelines discourage pinging more than once per hour).
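Putting the ping and the one-hour throttle together, a minimal sketch might look like this (the sitemap URL is the example one used above; the static timestamp is just the simplest way to remember the last ping):

    // Minimal sketch: ping Google with the sitemap URL, at most once per hour.
    using System;
    using System.Net;

    public static class SearchEnginePinger
    {
        private static DateTime _lastPing = DateTime.MinValue;
        private static readonly object _lock = new object();

        public static void PingGoogle()
        {
            lock (_lock)
            {
                if (DateTime.UtcNow - _lastPing < TimeSpan.FromHours(1))
                    return; // guidelines discourage pinging more than once per hour
                _lastPing = DateTime.UtcNow;
            }

            string pingUrl = "http://www.google.com/webmasters/tools/ping?sitemap="
                + Uri.EscapeDataString("http://www.mysite.com/sitemap.ashx");
            using (var client = new WebClient())
            {
                client.DownloadString(pingUrl); // fire the ping
            }
        }
    }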
I actually plan to implement this in the following OSS project: http://cyclemania.codeplex.com. I will let you know once it's done and you can have a look.
If you let your users add events to the website, you are probably using a database.
This means you can generate the XML-Sitemap at runtime like this:
create a page where your sitemap will be available (this doesn't need to be sitemap.xml but can also be sitemap.aspx or even sitemap.ashx).
open a database connection
loop through all records and create an XML element for each record
This blog post should help you further: Build a Search Engine SiteMap in C#.
It does not use the new XElement API from .NET 3.5, but it will work fine (a sketch of an XElement version follows below).
You can put this in an .aspx page, but adding an HttpHandler is probably better, as described in a different post on the same blog: (creating an HttpHandler for a sitemap).
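For comparison, a minimal sketch of the same loop using XElement from .NET 3.5; the connection string, table, and column names are assumptions for illustration:

    // Minimal sketch: build the sitemap with LINQ to XML (XElement).
    // The SQL query, table, and columns are assumptions.
    using System.Data.SqlClient;
    using System.Xml.Linq;

    public static class XmlSitemapBuilder
    {
        public static XDocument Build(string connectionString)
        {
            XNamespace ns = "http://www.sitemaps.org/schemas/sitemap/0.9";
            var urlset = new XElement(ns + "urlset");

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("SELECT Url, LastModified FROM Pages", conn))
            {
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        urlset.Add(new XElement(ns + "url",
                            new XElement(ns + "loc", reader.GetString(0)),
                            new XElement(ns + "lastmod",
                                reader.GetDateTime(1).ToString("yyyy-MM-ddTHH:mm:ssZ"))));
                    }
                }
            }
            return new XDocument(urlset);
        }
    }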
I'm writing a pretty straightforward ASP.NET MVC web app: only a couple of CRUD pages, some folders where clients can browse documents, and just 3 or 4 roles. The website will be used in a B2B scenario, where every client will have their "own" website.
At this point, the only thing that will change in the website, from client to client is the content (ie. the documents, and the rows of data they'll see). If this is the case, what's the best way to manage roles across all of my clients? I'm looking for the simplest possible solution because this is a proof of concept and I don't want to invest a lot of time right now.
What if it's not just the content that changes? Maybe some clients will want a few custom static pages. At this point, is my only option replicating the entire website? I'm leery of this because it'll become hard to maintain if I get a lot of clients.
I'd appreciate any help... I just don't want to shoot myself in the foot; I'm sure someone has done this before.
I create Virtual Directories in IIS for each client, all pointed back to the same folder where my ASP.NET code resides.
This allows me to support several dozen nearly-identical "web sites," each with their own database that is basically identical in form, only differs in data.
So, my site URLs look like:
http://mysite.com/clientacme/
http://mysite.com/clientbill/
http://mysite.com/clientcharlie/
There are two key implementation details I worked out for this:
I use the Virtual Directory folder name to determine which DSN my code reads from. This is accomplished by creating a simple static method that injects the folder name into a DSN string template (a sketch follows after this list). If you want to use the same database to store everyone's data, you can use the folder name as a default filter in your queries.
I store the settings for each web site (headers and footers, options, links to custom reports, etc.) in a simple "settings" table in each database (key, value) rather than in the web.config (which is shared). This allows me to extend the code base over time to customize the experience for each client without forking the code.
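A minimal sketch of the static method from point 1; the DSN template and the use of ApplicationPath to get the folder name are assumptions for illustration:

    // Minimal sketch: derive the client's DSN from the virtual directory name.
    // The template and the use of ApplicationPath are assumptions.
    using System.Web;

    public static class ClientDsn
    {
        private const string Template =
            "Server=dbserver;Database=client_{0};Integrated Security=SSPI;";

        public static string Current()
        {
            // e.g. "/clientacme" -> "clientacme"
            string folder = HttpContext.Current.Request.ApplicationPath.Trim('/');
            return string.Format(Template, folder);
        }
    }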
For user authentication, I use Basic authentication, and I keep usernames, passwords, and roles in a table in each database.
The important thing is that if you use a different SQL Server database for each client's content, you need to script any changes to your database tables, indexes, etc., and apply them across all databases at the same time (after testing, of course). One simple way to do this is to maintain an Excel sheet with a table of database names and a big "SQL" cell at the top. Beside each database name, create a formula that emits "USE databasename;" and then concatenates the SQL code from the cell at the top.
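If you'd rather not use Excel, the same fan-out can be sketched in code; the database-name list and connection-string template are assumptions:

    // Minimal sketch: apply one migration script to every client database.
    // The server name and the list of database names are assumptions.
    // Note: batch separators like "GO" are not understood by SqlCommand.
    using System.Data.SqlClient;

    public static class SchemaFanOut
    {
        public static void Apply(string[] databaseNames, string script)
        {
            foreach (string db in databaseNames)
            {
                string cs = "Server=dbserver;Database=" + db + ";Integrated Security=SSPI;";
                using (var conn = new SqlConnection(cs))
                using (var cmd = new SqlCommand(script, conn))
                {
                    conn.Open();
                    cmd.ExecuteNonQuery(); // test on a copy first!
                }
            }
        }
    }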
I'm not sure if this answers your question completely, but as far as maintaining custom "static" pages goes, I found myself implementing a system on a client's MVC website where the client can create "Pages" from their admin control panel. Each Page has a collection of "PageContent" entities, which consist of a Title and an HTML content field (populated using a WYSIWYG editor). Upon creating a page, the MVC application maps http://yoursite.com/Page/Page-Url-Specified-By-The-User to that page and renders its content there. Obviously the pages are dynamic, but as far as the client can tell they have created a brand-new custom page with little or no effort.
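A minimal sketch of how the routing side of such a system might be wired up; all the names here are hypothetical, not the actual implementation:

    // Minimal sketch: map /Page/{slug} to a stored page entity (hypothetical names).
    using System.Web.Mvc;

    // Hypothetical entity matching the Pages/PageContent description above.
    public class CmsPage
    {
        public string Url { get; set; }
        public string Title { get; set; }
        public string Html { get; set; }
    }

    public class PageController : Controller
    {
        public ActionResult Show(string slug)
        {
            CmsPage page = FindByUrl(slug); // hypothetical database lookup
            if (page == null) return HttpNotFound();
            return View("Page", page); // a view that renders page.Html
        }

        // Stub standing in for the real repository query.
        private CmsPage FindByUrl(string slug)
        {
            return null; // query the Pages table by Url here
        }
    }

    // Route registration (e.g. in Global.asax / RouteConfig):
    // routes.MapRoute("CustomPages", "Page/{slug}",
    //     new { controller = "Page", action = "Show" });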