Coverage report for ASP.NET web site pages

I have taken on an ASP.NET web site where the client is using the web server as a code repository, i.e. removing a page from the site involves nothing more than no longer linking to it. There are a stupendous number of unused files, and I would like to archive these off and arrive at a lean git repository containing only the files used by the active site.
How can I get usage or coverage data that will tell me, over an agreed-upon period (say, a month), which pages are being hit? I know there are many ways of doing this in ASP.NET, and even in plain IIS, but I'd like some suggestions on a convenient and simple way of doing it.

I would suggest the IIS logs, but that wouldn't report linked pages that haven't been accessed by users.
You could try running a spider on the site. Here's a free tool: http://www.trellian.com/sitespider/download.htm
You should be careful about which files you delete from the web server if there are cached links to the pages out there. A good strategy would be to use Google: run the search query "site:example.com", where example.com is the domain for your site, to see which pages are returned.

Look at the access logs for the agreed period and compare the list of pages visited against the full list of all pages. This seems like more work than necessary, though.
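If you go the log route, a minimal sketch of that comparison in C# is below. It assumes the default W3C log field order and uses hypothetical paths, so adjust both for your server.

using System;
using System.Collections.Generic;
using System.IO;

class LogCoverage
{
    static void Main()
    {
        // Assumed locations; point these at your own log folder and site root.
        const string logDir = @"C:\inetpub\logs\LogFiles\W3SVC1";
        const string siteRoot = @"C:\inetpub\wwwroot\mysite";

        // Collect every URI stem that appears in the logs for the period.
        var hits = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        foreach (var logFile in Directory.EnumerateFiles(logDir, "*.log"))
        {
            foreach (var line in File.ReadLines(logFile))
            {
                if (line.StartsWith("#")) continue; // skip W3C header lines
                var fields = line.Split(' ');
                // Field 4 is cs-uri-stem in the default field order; check the
                // #Fields header line if your logging is customized.
                if (fields.Length > 4) hits.Add(fields[4]);
            }
        }

        // Report every page on disk that was never requested.
        foreach (var page in Directory.EnumerateFiles(siteRoot, "*.aspx", SearchOption.AllDirectories))
        {
            var rel = "/" + page.Substring(siteRoot.Length).TrimStart('\\').Replace('\\', '/');
            if (!hits.Contains(rel))
                Console.WriteLine("never requested: " + rel);
        }
    }
}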
There is a program called Xenu's Link Sleuth (a link checker) which already contains the functionality you require: it can spider your site, and if you tell it where the files are, it will identify unused files for you.

Related

Unable to debug some aspx pages in ASP application

I have a classic ASP website running on IIS. I opened it with VS 2015 (Open Web Site in the File menu) and saved the solution (when opening it, it said this is a precompiled website, whatever that means). Then I attached to the process to debug it.
Now, the breakpoints I placed are hit on some of the .aspx pages, and not on others. Any idea why this might be the case? I checked the web.config and it has the debug option set to true. Probably some PDB files are missing. People suggest rebuilding the website, but when I click Build or Rebuild Solution, the process completes immediately with success, so I doubt anything was recompiled at all.
I can modify the code of those pages and IIS recompiles them on the next request, but I'm not sure why the breakpoints don't get hit there. They obviously are hit once I put something like Debugger.Launch() in my code, but that's not what I want.
I'm no expert so I'd be grateful if you could help me out with this.
A precompiled website is one whose pages have already been compiled before deployment, so ASP.NET does not have to compile them on the first request. This speeds up the first access to pages in your site, so if you want the faster option for the site once deployed, consider site precompilation, especially if your site is high-volume, popular and important, and must respond instantly when a customer visits. It also means the deployed assemblies may have been built without debug symbols, which would explain why breakpoints don't bind on some pages.
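For reference, site precompilation is done with the aspnet_compiler tool. A sketch, with placeholder paths; the -d switch emits debug information so breakpoints can bind:

aspnet_compiler -v /MySite -p C:\source\MySite C:\deploy\MySite -d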
Hope this is helpful.

Display web page from another site in asp page

Our customer has a requirement to extend the functionality of their existing large government project. It is an ASP.NET 3.5 (recently upgraded from 2.0) project.
The existing solution is quite a behemoth that is almost unmaintainable so they have decided that they want to provide the new functionality by hosting it on another website that is shown within the existing website.
As to how this is best done, I'm not quite sure right now, nor whether there are any security issues preventing it or that need to be considered.
Essentially the user would log on to the existing web site as normal, and when clicking on a certain link, the page would load as normal with some kind of frame or control that contains the page from the other site. I.e., they do not want to simply redirect to the other site; they want it shown embedded within the current one such that the existing menus etc. are still available.
I believe if information needed to be passed to the embedded page it would be done using query strings, as I'm not sure there is even another way to accomplish this.
Can anyone give me some pointers on where to start looking to implement this, or any potential pitfalls I should be aware of?
Thanks
If the two sites are hosted on the same network (low latency between them) you could use a state server for session management. That way, when you authenticate on one site, you will also be authenticated on the other, and user state is shared across them.
It's pretty simple: in the web.config of each web server you'd point to the state server (which could be located on one of the web servers):
<configuration>
  <system.web>
    <sessionState mode="StateServer"
                  stateConnectionString="tcpip=192.168.1.103:42424" />
  </system.web>
</configuration>
http://en.csharp-online.net/ASP.NET_State_Management%E2%80%94Storing_Session_State_out_of_Process
Create a virtual directory under the primary domain. If your domain is www.mydomain.com then create a virtual directory www.mydomain.com/site and port the new website application under the /site virtual directory. This way linking stays straightforward, and the virtual-directory application will also retain all domain cookies set by the primary domain.
I would suggest making the second website look exactly like the first one, or at least using the same MasterPage, so you can redirect from one site to the other without any visual difference.
If your site needs authentication, consider that you would need to do something to prevent the user from having to log in twice; an option could be to send an encrypted token to the second site.
All of this applies if you are forced to have a second site; if not, just use a virtual directory.
You could use something like UFrame. I've used it a couple of times and it seems to do quite a good job...
"goodness of UpdatePanel and IFRAME combined"
http://www.codeproject.com/KB/aspnet/uframe.aspx
I would use an iFrame to embed that website within your existing application. Just set the "src" attribute and pass in any query string parameters the other site needs to render correctly.
You can still pass sensitive data in the query string; however, make sure to encrypt it before sending it.
I know it is not the most elegant solution, but it gets the job done. And from the description of the existing app, it doesn't seem like your customer cares for "elegance" :)
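A minimal sketch of that approach (page, control, and helper names are hypothetical). In the .aspx markup:

<iframe id="embeddedSite" runat="server" width="100%" height="600"></iframe>

And in the code-behind:

using System;
using System.Text;
using System.Web;

public partial class HostPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Placeholder: use whatever encryption scheme both sites agree on.
        string token = EncryptToken(User.Identity.Name);

        // UrlEncode keeps the encrypted blob from breaking the query string.
        embeddedSite.Attributes["src"] =
            "https://othersite.example.com/page.aspx?token=" + HttpUtility.UrlEncode(token);
    }

    private static string EncryptToken(string value)
    {
        // Stub only; swap in real encryption shared with the second site.
        return Convert.ToBase64String(Encoding.UTF8.GetBytes(value));
    }
}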
Hope this helps

How to put an asp.net application into offline/maintenance mode?

I've developed my first web application which, surprisingly, is getting very popular.
Because the website is now live, I have a hard time making changes, for fear some people are still logged in and using the application.
I wish to avoid having a duplicated instance of the web application for testing.
Is there any way to put the website in 'maintenance mode' with only me having access to it? Like redirecting to a page with some info, telling users it's in maintenance mode.
"I wish to avoid having a duplicated instance of the web application for testing."
That's your problem right there. For anything but the most trivial sites, you should have a staging or development instance. You should be using source control and have a script to update the main instance.
You can simply drop a file called app_offline.htm in the root of your website and ASP.NET will automatically route all traffic to this page. This file can contain any HTML you wish indicating that your site is down for a short period due to maintenance.
For more information please read App_Offline.htm and working around the "IE Friendly Errors" feature:
The way app_offline.htm works is that you place this file in the root of the application. When ASP.NET sees it, it will shut down the app-domain for the application (and not restart it for requests) and instead send back the contents of the app_offline.htm file in response to all new dynamic requests for the application. When you are done updating the site, just delete the file and it will come back online.
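For reference, a minimal app_offline.htm could look like the sketch below. Note that the file must be larger than 512 bytes or Internet Explorer substitutes its own "friendly" error page, which is the issue the linked post describes:

<html>
<head><title>Down for maintenance</title></head>
<body>
  <h1>We're updating the site</h1>
  <p>We expect to be back shortly. Thanks for your patience.</p>
  <!-- Padding: repeat or lengthen this comment until the file is
       comfortably over 512 bytes, otherwise IE's friendly-errors
       feature hides the page you actually wrote. -->
</body>
</html>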
This is the answer to your question:
http://www.codeproject.com/Tips/219637/Put-the-website-in-Maintanance-Mode-Under-Construc
There's no such built-in functionality in ASP.NET except app_offline.htm, which doesn't quite fit your needs because even you will be denied access to the site. You have to build it on your own, but this is best done at the router or load-balancer level rather than at the application level. Of course this will depend on your network architecture.
Besides building a dev replica of your website to build patches and fixes on, couldn't you just announce a site closing for maintenance several days in advance? I'm not a web programmer, but you might want to look into what Hattrick, a popular online soccer management game, does for maintaining their site. They use a notification system on the homepage, after users sign in, that announces when maintenance will take place (usually late at night in Europe, where a large portion of the players and all of the devs are located), and they close down the website for a couple of hours. When they take the site down they post a page, using the same style as the rest of the site, and provide an estimate of when it will be up and running again. Simple, elegant, and when coupled with the long forewarning it seems to do a good job placating the user base.
Give users a long heads up that planned maintenance is scheduled to take place and give them some idea what it is for and most people will be able to accommodate the down time. Nothing is more frustrating than purposefully going to a web app that was up and running 10-20 minutes ago to find it suddenly unavailable and down for maintenance.
Try app_offline.htm ??
What version of ASP.NET? I'm sure there are a million more elegant ways of doing this, but you can change the Default Document in IIS to redirect to Maint.html (or similar).

Re-publishing an ASP.NET Web Application While Site is Live

I am trying to get a grasp on how to handle updates to a live, functioning ASP.NET (2.0 or greater) Application while there are users on the site.
For example, suppose SO is an ASP.NET Web Application project. The project code compiles down to the single .DLL in the BIN folder. Now, there are constantly users on SO, so what would happen to users' actions/sessions if you used the Visual Studio .NET "Publish" feature (or just FTPed everything again manually) while they are using the site?
Would creating an ASP.NET Web Site, instead, alleviate any problems that may or may not exist with the scenario above? I am beginning to develop a web site as a user-driven Web Application, and I want to make sure that my inexperience with this would not potentially annoy the [potentially] many users that I [want to] have 24/7.
EDIT: Sorry, I should have put this in a more exact context. Assume that this site is being hosted by a web hosting service with monthly fees. I won't be managing the server itself, just what the web host allows as a user of their services.
I create two Web sites in IIS. One is the production Web site, and the other is a static Web site with an HttpHandler that sends all requests to a single static "We're updating" HTML page served with an HTTP 503 Service Unavailable (a minimal sketch of such a handler appears at the end of this answer). Typically the update Web site is turned off. When it's time to update, we stop the production Web site, start the update Web site, and now we can fiddle with the production Web site all we want without worrying about DLLs being locked or worker processes needing to be spun down.
I started doing this because
App_Offline.htm really does not work well in Web Gardens, which we use.
App_Offline.htm serves its page as 404, which is bad if you're down for a meaningful period of time.
We can start the upgraded production Web site with modified settings (only listening on localhost), where we can do a last-minute acceptance/verification that everything is working before we flip the switch, turning off the update Web site and re-enabling the production Web site.
Things this does not solve include
Any maintenance that requires a restart of the server--you still have downtime where no page is served.
Any maintenance that diddles with the .NET runtime, like upgrading to the latest service pack.
Other approaches I've seen include
Having two servers. Send all load balancing requests to one server, upgrade the other one; then rinse and repeat. Most of us don't have this luxury.
Creating multiple bin directories, like bin-1.0.0.0 and bin-1.1.0.0 and telling ASP.NET which bin directory to use in the web.config file. (One advantage of this is that reverting to a previous binary is just editing a config file. A disadvantage is that it's harder to revert resources that don't end up in your binaries, like templates and images and such.) I don't remember how this actually worked--I think the application did some late assembly loading in its Global.asax based on its own web.config section (since you touched the web.config, the app had restarted, so it was okay).
If you find a better way, let me know!
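For reference, here is a minimal sketch of the "We're updating" handler mentioned at the top; the maintenance file name and retry window are assumptions:

using System.Web;

public class MaintenanceHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Tell clients and crawlers this is a temporary outage.
        context.Response.StatusCode = 503;
        context.Response.StatusDescription = "Service Unavailable";
        context.Response.AddHeader("Retry-After", "600"); // assumed 10-minute window

        // Serve the same static page for every request.
        context.Response.ContentType = "text/html";
        context.Response.WriteFile(context.Server.MapPath("~/updating.htm"));
    }
}

In the update site's web.config, a wildcard mapping such as <add verb="*" path="*" type="MaintenanceHandler"/> under httpHandlers routes every request to it.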
Changing to the ASP.NET web site model won't have any effect, as the recycle will still happen. Some changes that trigger it for sure: web.config, Global.asax, App_Code.
After the recycle, the user will still be logged in, because ASP.NET will just re-validate the authentication ticket. That is, given you use a fixed machine key; otherwise it will change on each recycle. This is something you want to do anyway, as other stuff can break if the key changes across requests, e.g. viewstate validation and embedded resources (decryption of the URL fails).
If you can put the session out of process, like in SQL Server, you will avoid losing the session. If you can't, your code will have to account for that. There are plenty of scenarios where you can avoid using session, and others where you can wrap it and re-retrieve the info if the session was cleared. This should leave you with a handful of specific cases that you know can give trouble to the users, so for those you do some of the suggestions others have already made.
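Both suggestions sketched in web.config; the key values and connection string are placeholders (generate your own random keys, and create the session database with aspnet_regsql first):

<configuration>
  <system.web>
    <!-- Fixed keys so auth tickets and viewstate survive recycles.
         Replace the placeholders with generated hex values. -->
    <machineKey validationKey="[128 hex characters]"
                decryptionKey="[64 hex characters]"
                validation="SHA1"
                decryption="AES" />
    <!-- Out-of-process session state survives app-domain recycles. -->
    <sessionState mode="SQLServer"
                  sqlConnectionString="Data Source=.;Integrated Security=SSPI" />
  </system.web>
</configuration>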
One solution could be to deploy your application into a load balanced environment (web farm).
When deploying a new version you would use the load balancer to redirect requests to the server you are not deploying to.
App_offline.htm is a great solution for this, I think.
On SO we see an "application currently unavailable" page when a deployment begins.
I am not sure how SO handles it, but we usually put up a holding page, so whatever the user has done (adding or answering questions) does not get saved. As soon as he submits something he will see a holding page asking him to try again after some time.
And if I am the user I usually press the back button to make sure what I entered is saved in the browser history so that I can post later.
Some sites we use are in a clustered environment, so I take one server offline and inform the load balancer that it will not be available, and once I make sure that the new version is working fine I make it live. I do the same thing for the next server.
Do we have any other option?
It is not a technical solution, but set up a scheduled maintenance window. You can announce it in advance, giving your user base fair warning that there is a possibility the application will not be available during that time frame.

How to deal with link redirects when migrating from classic ASP to ASP.NET?

I have a large-ish web project which is migrating from classic ASP to ASP.NET (about time), and would like to redirect requests from the old addresses, such as:
/some/path/old-page.asp?foo=bar
to the new addresses:
/other/path/new-page.aspx?qaz=bak
For a fairly long time, there will be classic ASP pages running in parallel with ASP.NET pages, with individual pages being replaced by their ASP.NET versions over time. Where possible, I want to redirect from the old pages to the new ones to keep users from receiving 404 errors, and also to keep the accumulated PageRank on the pages.
My question is, how would you do the redirection logic from the classic ASP to the new templates? The obvious solution is to replace old-page.asp with some simple VBScript that redirects to new-page.aspx, but in the long run I want to get rid of the old .asp files, so I would like to implement the redirection in such a way that they will exist also after the site is completely running in .NET.
One option would be to map the .asp extension to ASP.NET, and implement the redirection as an HttpHandler, but I guess there is no way of making the classic ASP engine run after the request has been passed to ASP.NET.
A couple of years ago, I ran into this very same issue at an eCommerce company where we upgraded their website to .NET. The main issue is that we had no idea how many customers had the old ASP pages in their favorites, as well as the issue of SEO. Yes, you can map the .asp extension to ASP.NET, but then you lose the ability to run the ASP files at all, so that would require you to update ALL ASP pages at once, which may not be feasible.
The best solution I found at the time was to implement an ISAPI redirection filter in IIS. This is a component run by IIS BEFORE the ASP or ASP.NET runtimes. It can decide, based on the URL or any rules you want, whether to let the ASP files run, redirect, or rewrite the URL to handle the request. It is not always a clean operation, since it runs before your website's requests do, and confusion can happen later if other developers don't know it is running. So if you go this route, make sure your website has plenty of comments or documentation to let developers know this thing is running in IIS...
There is a good explanation of how to implement this in Code Project. Go here and check it out. http://www.codeproject.com/KB/ISAPI/isapiredirector.aspx
Good Luck!
You should use the "HTTP 301 Moved Permanently" response code, as this is precisely what it is designed for.
http://en.wikipedia.org/wiki/HTTP_301
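In a classic ASP page, the permanent redirect looks like the sketch below (the target URL is just the example from the question). It carries the old query string along so the new page can translate the old parameters:

<%
' Permanent redirect from the retired page.
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "/other/path/new-page.aspx?" & Request.QueryString
Response.End
%>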
An ISAPI redirection filter will work in the sense that, yes, it will redirect visitors to the new URL.
However there are three key problems with the ISAPI redirection strategy.
More code to write/maintain
Bookmarks and search engine entries will never be updated with the new and correct URLs
If foo.asp is transparently redirected to bar.aspx and both are indexed by Google, you'll have two duplicate URLs to the same content in Google. This clogs up search results and is actually against their TOS I believe.
