In my ASP.NET application, I have a script that I call automatically at regular intervals to update the database. Due to a bug in this code, several invocations of it have entered an infinite loop.
I am on shared hosting, so I can't just restart IIS. I have tried “stopping” the website from the hosting provider's management site, but it had no effect.
Since they have been running for several hours now, I assume there is no timeout configured. So I would like to kill those “processes” (I assume they are actually just threads). Is there a way to do that without contacting my hosting company?
Updating the web.config will stop your application, assuming you can access the files of the website.
Have you tried stopping the application pool for the website?
The threads I wanted to kill were logging some information to a file. The code can create the file if it doesn't exist, but the same does not apply to the folder it's in.
So I temporarily renamed the folder with the log and the threads seem to have stopped.
We're having a problem where the application pool restarts (and loses all sessions) when deleting a folder in a virtual directory. This is not ImageResizer's fault, but ASP.NET's. We cannot replicate the issue on a static web site.
I'm wondering if someone has resolved this issue? We're thinking about creating a separate web page just for ImageResizer and image content. Maybe there is a simpler way?
This solution did not work for us: http://www.aaronblake.co.uk/blog/2009/09/28/bug-fix-application-restarts-on-directory-delete-in-asp-net/
IIS and ASP.NET each have their own independent FileSystemWatchers. If you disable these, the problem should go away.
See http://imageresizing.net/docs/howto/avoid-network-limit for more information.
While the article above mentions the problem from the perspective of a latent network-based storage, the problem can appear in other ways - such as upon directory deletion.
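For example, on ASP.NET 4.5 or later, one way to switch off ASP.NET's file change notifications is the fcnMode attribute on httpRuntime. A minimal web.config sketch follows; whether this setting is appropriate for your particular site (and whether you are on a framework version that supports it) is an assumption here, not something confirmed in the question:

<configuration>
  <system.web>
    <!-- Disables ASP.NET's file change notifications (FCN), so deleting a
         folder under the site no longer tears down the app domain.
         Requires .NET 4.5+; the trade-off is that ASP.NET stops watching
         content folders for changes on your behalf. -->
    <httpRuntime fcnMode="Disabled" />
  </system.web>
</configuration>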
I also suggest avoiding deleting folders on ASP.NET websites; it's a painful goal, and likely to fail due to read locks. Goodness, folder deletion rarely works reliably on Windows even when the folder is not part of a web site being actively served to clients.
It seems to be possible to use an IIS application instead of a virtual directory, and to use a different application pool for the image archive. I tried it, and the problem was solved... that app pool restarted instead, which didn't affect the main web pool. Great success!
However, I don't think we will be doing this either. It seems an IIS application needs its own bin folder, so I had to copy the ImageResizer DLLs there. I also had to write another global.asax for the automatic 404 images in Application_Start. It works... I just don't want the image directory to contain a lot of code, since it is synced from a third party.
A couple of questions:
1) How can I update a Classic ASP website/page without interrupting service (users getting an error or service unavailable message) or shutting the website down temporarily?
2) When updating/restoring a MSSQL DB via SQL Server Management Studio, will the website users get an error message?
Thanks in advance.
A smart practice is to use at least one separate development environment with the same setup as your production environment and to debug all changes there to ensure that they work. Once your entire site is running and tested in that identical environment, you should be able to simply move the files over and they should work in production. How well this model works depends on actually keeping the environments as close to identical as possible.
When updating/restoring a MSSQL DB
Be careful with your terminology; UPDATE and RESTORE are two very different commands.
If the database is locked by the changes being made, then it will be inaccessible to users and may cause error messages depending on your IIS and code setup. Scheduling a maintenance period and blocking user access to any pages that access the database will help you avoid messy errors and avoid revealing any information about your infrastructure while the changes are being made.
It seems like you might want to do some basic research on both development and databases to make sure you understand what you're doing and can cover all of your bases. Looking up commands like RESTORE and UPDATE and using them correctly is crucial.
For example, when you rewrite one or more of your website files via FTP, in that very moment when rewriting is taking place, users will get a 500 Service Unavailable error. How can I avoid this?
This really shouldn't happen, but you could upload the files to a different folder first, avoiding any delay there, and then sync them with a diff tool such as WinMerge (which also helps you keep track of changes and revert quickly) once the upload is done.
I have the following scenario:
I publish a page which contains multiple binaries which is then received by an HTTP Receiver and Deployed using an in-process Deployer all hosted in IIS in a dedicated application pool running as the Local Service user.
The Page is stored in the Broker Database, and the binaries are published to the local file system using a path like "D:\Binaries\Preview".
The preview folder is shared to a domain user as a read-only share at something like \\machinename\PreviewBinaries so that the binaries can be displayed using the web application.
Nine times out of ten everything works fine, but occasionally publishing fails, and it seems to be because the binaries cannot be overwritten due to being locked by another process. I have used Process Monitor and other tools to try to establish what might be locking these files (to no avail). Sometimes I can manually delete the images, and then publishing works again. If I restart IIS on the server, I can always delete the files and publish.
Does anyone have any suggestions on what processes could be locking these images? Has anyone seen this issue before? Could there be any issues with the fact that I am publishing to a share? Or could SiteEdit 2009 possibly be locking these files, given that this only seems to occur on our preview server while live (no SiteEdit) seems fine?
Thanks in advance
If you're on Windows 2008, you can try to delete the file from disk; the error dialog will then tell you which process has locked the file. But given that restarting IIS unlocks the file, it seems quite likely that it is IIS that keeps a lock on them.
I don't see how SiteEdit 2009 could cause a lock on these files. Given that you can have your preview server on another box, SiteEdit only talks to that server through HTTP. It never accesses the files on the preview server directly, not even through a CD API; it just makes regular requests to your web server, like a visitor would.
Again, not a direct answer but I wanted to share this anyway:
I've seen a similar situation where I published Pages to the Broker Database and Binaries to the file system. When I changed the Identity of the Application Pool to Network Service this problem disappeared, and I haven't looked into it further.
OK, well it seems the offending code was in the Presentation Framework we are using. The framework used Response.TransmitFile(binaryPath) to asynchronously transmit the binaries to the clients. It seems that this puts a temporary lock handle on the binaries (even when they are on a read only share).
We have removed this line of code and modified the application to serve binaries in another way (we now rewrite the path so that IIS can transmit the files directly). This seems to have solved the issue and improved site performance.
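As a rough illustration of that kind of path rewrite (the "~/binaries/" prefix, the "PreviewBinaries" virtual directory, and placing this in Global.asax are all assumptions for the sketch, not the actual code from this framework):

// Global.asax.cs - hypothetical sketch; adjust the prefix and the target
// virtual directory to match your own layout.
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        string path = Request.AppRelativeCurrentExecutionFilePath;
        if (path.StartsWith("~/binaries/", StringComparison.OrdinalIgnoreCase))
        {
            // Hand the request to the virtual directory that points at the
            // preview share, so IIS streams the file itself rather than our
            // code holding a handle open via Response.TransmitFile.
            Context.RewritePath("~/PreviewBinaries/" + path.Substring("~/binaries/".Length));
        }
    }
}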
Thanks for all your suggestions, it helped me rule out all the things that were not causing the issue, so I was able to find the root cause.
Are there any anti-virus or indexing services running? These tend to take very short-lived locks at just the moment you don't want them to. With anti-virus in particular, this typically happens just as one process relinquishes its lock and just before your other process tries to take one. If this is the issue, then setting up some exclusion directories should help.
I see you have used Process Monitor, but have you tried Sysinternals Process Explorer? "Find -> Find Handle or DLL" is pretty useful for this kind of thing. Or if you prefer a command-line tool, Sysinternals also makes handle.exe, which dumps everything out for you.
Basically I want the effect that would occur if I were to edit the web.config file. The application basically completely unloads itself and starts again, thus re-firing Application_Start and also ditching any dynamically created Types created by the now-defunct AppDomain.
EDIT
I need to do this in my C# code inside my web application. I know it can be done; I did it ages ago but have since lost the code and forgotten how I did it.
For full trust you can use HttpRuntime.UnloadAppDomain(). If you aren't running in full trust you can modify the last write time on the web.config file. Rick Strahl has wrapped these two approaches up in a nice class.
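A minimal sketch of those two approaches combined (this is not Rick Strahl's actual class, just the general idea; it assumes the application identity can write to web.config for the fallback to work):

using System;
using System.IO;
using System.Security;
using System.Web;

public static class AppRestart
{
    // Try the clean full-trust API first; fall back to "touching" web.config
    // when running in partial trust. Illustrative sketch only.
    public static void RestartWebApplication()
    {
        try
        {
            HttpRuntime.UnloadAppDomain();   // needs full trust
        }
        catch (SecurityException)
        {
            // Partial trust: bumping the last write time makes ASP.NET
            // recycle the app domain, provided the app identity has write
            // access to web.config.
            string configPath = HttpContext.Current.Server.MapPath("~/web.config");
            File.SetLastWriteTimeUtc(configPath, DateTime.UtcNow);
        }
    }
}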
You can "touch" the web.config file (i.e. rewrite it to disk unchanged), or any file in the bin directory to recycle the application. Of course this means the identity under which your application is running needs appropriate permissions.
Lately I seem to be answering my own questions a lot :P
Here we go:
HttpRuntime.UnloadAppDomain();
If all the options above fail, you can also create an endless recursive function as a last resort. The resulting StackOverflowException will force a reload of the application. (Don't do this when you have the Visual Studio debugger attached.)
In IIS you can recycle the worker processes. You don't need to restart IIS.
http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/24e3c22e-79a9-4f07-a407-dbd0e7f35432.mspx?mfr=true
If you have created a separate application pool for your application, you can recycle just that application pool.
In general, it's always a good idea to have a separate app pool for each application.
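If you need to do this from code rather than from IIS Manager, and the process has administrative rights (so typically not on shared hosting), something along these lines should work using the Microsoft.Web.Administration API; the pool name here is a placeholder:

using Microsoft.Web.Administration;   // ships with IIS 7+ (in %windir%\System32\inetsrv)

class RecyclePool
{
    static void Main()
    {
        // Recycles a single application pool without restarting IIS itself.
        // "MyAppPool" is a placeholder for your pool's actual name.
        using (ServerManager manager = new ServerManager())
        {
            ApplicationPool pool = manager.ApplicationPools["MyAppPool"];
            pool.Recycle();
        }
    }
}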
Is it possible to update the site DLL for a precompiled site without stopping IIS?
Currently, if I try to just copy the new file over the current file, all users receive runtime errors while the file is being copied. Is there a way to avoid this?
Even if you don't stop IIS, any change to the web.config file, the bin folder, App_Data, or App_Code will force the .NET compiler to recompile the application, and you will lose any session variables held in memory.
What I do is use session state in SQL Server mode. If your system is set up like this, users will remain in the site (after a somewhat longer page reload).
.NET will still invoke the compiler to compile the new set of instructions, but as soon as that is done, all sessions will be read from SQL Server, and because they are still there (not lost in a memory refresh) users will remain in the website with their current credentials.
It is a little bit slower than in-memory session state, but much more reliable, especially with shared hosting :) It is also the way to increase or decrease the minutes in your session timeout, as shared hosts do not allow you to change it, even if you write:
Session.Timeout = 5;
Their machine configuration will override everything you set. With SQL session state you will be able to set your own timeout, as this is all handled by SQL Server.
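As a rough sketch, switching to SQL Server session state is done in web.config along these lines; the connection string and timeout below are placeholders, and the session database itself is normally created beforehand with the aspnet_regsql.exe tool:

<configuration>
  <system.web>
    <!-- Sessions live in SQL Server instead of worker-process memory, so
         they survive app domain recycles, and the timeout set here is the
         one that applies. Connection string and timeout are placeholders. -->
    <sessionState mode="SQLServer"
                  sqlConnectionString="Data Source=MYSQLSERVER;Integrated Security=True"
                  timeout="30" />
  </system.web>
</configuration>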
Feel free to read this article to learn how everything is done.
Hope it helps.