We are getting frequent errors in the Event Viewer, Application section. The source is ASP.NET 4.0.30319.0, category is File Monitoring. The Event ID is 1185. Text says "Failed to start monitoring changes to "file-path-here" because the network BIOS command limit has been reached." Then there is a reference to Microsoft knowledge base article 810886.
The question is: what process or service is doing this file monitoring, and why? We are not aware of how this is running or how it started. The monitoring seems to look at various folders on our web site, some are .NET folders, some are not.
We are looking for explanation of what is causing this monitoring; then we will try to address the errors.
When ASP.NET starts running a site, it monitors one basic file in the root of the web site: app_offline.htm. If it finds that file, it stops the application and serves only that file.
If it finds that other files have changed, it recompiles them where necessary, but it still serves app_offline.htm if it exists and does not run the site.
Once you remove app_offline.htm the pages start running again, but ASP.NET keeps monitoring for that file, whether it exists or not.
So this is the ASP.NET monitoring you are looking for, and it is the default behaviour of ASP.NET. If you have installed other software, or something else on the machine is adding its own monitoring, that is a different matter. Do you perhaps have a very large number of ASP.NET sites on the same server, say 500 or more? If not, start looking for other software that is monitoring your files.
Analysis
To find out for yourself what is happening, download Handle from Sysinternals and run it, redirecting the output to a text file, e.g. handle.exe >> result.txt, then look through the results.
http://technet.microsoft.com/en-us/sysinternals/bb896655
Check whether any suspicious program has a huge number of files open, and which program it is. Monitored files and directories show up like this:
runningprogram.exe pid: 1352 ServerName\User
AC: File (RW-) D:\Monitor1
E8: File (RW-) D:\Monitor2
F8: File (RW-) D:\Monitor3
408: File (RWD) D:\InetPub\MySite
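If the dump is large, a quick way to spot the culprit is to count the File handles per process. This is a minimal PowerShell sketch, assuming result.txt was produced by the handle.exe command above; the regular expressions are rough and only meant for eyeballing the top offenders:

# Count "File" handles per process in the handle.exe dump.
$current = ''
$counts = @{}
Get-Content result.txt | ForEach-Object {
    if ($_ -match '^\S+\.exe\s+pid:\s+\d+') { $current = $_.Trim() }   # process header line
    elseif ($_ -match ':\s+File\s') { $counts[$current]++ }            # one open file handle
}
$counts.GetEnumerator() | Sort-Object Value -Descending | Select-Object -First 10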
More
I checked on my own servers and found that a blog-creation program had added a monitor on every blog directory. I do not know why, but that is the way they built it, monitoring every blog for some reason. Maybe you have something similar that creates a lot of file/directory monitoring.
The monitoring is being done by IIS (or the ASP.NET worker process with IIS 6). It's watching for changes to files so that the site can be recompiled when needed.
You didn't mention your environment, but I used to run into this problem frequently when trying to run websites from Windows XP with the sites located on a remote file share. I think the error comes up due to a limitation in CIFS (the protocol used for network file shares). Windows Server didn't seem to have the same limitation.
So, a few possible fixes:
Switch to Windows Server (or possibly Win 7)
Switch to a Web Application project (precompiled, so no on-the-fly recompiles)
Move your files from a remote share to a local drive
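The knowledge base article referenced in the event (KB 810886) also describes raising the SMB limits in the registry. This is a hedged sketch of what that looks like, assuming the value names the article discusses (MaxCmds on the workstation side, MaxMpxCt on the server side) and an arbitrary new limit of 2048; check the article for the recommended values and reboot afterwards:

# Assumption: these are the values KB 810886 refers to; verify against the article first.
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters' -Name MaxCmds -Value 2048 -Type DWord
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters' -Name MaxMpxCt -Value 2048 -Type DWord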
Related
Our TeamCity agent machines have been struggling for disk space recently. A little snooping on each machine showed that the Temporary ASP.NET Files folder in the .NET installation directory was taking up more than 10GB on each box, made up of subfolders of about 5MB each.
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files
I've done my research on the subject and I know that the files are a byproduct of ASP.NET's dynamic compilation; I also understand how IIS uses them for request optimisation (See: Understanding ASP.NET Dynamic Compilation).
What I don't understand is why no one else is complaining about how these files are taking up disk space on their build servers when they really only need to be used on their web servers.
Surely someone out there has run into this problem before; can anyone offer me a solution other than:
Disabling dynamic compilation (outlined here)
Doing a brute force scheduled job deletion (outlined here)
These files are created when a website is actually run under IIS. They are not created when it is built and should not be on your build server. I am using TeamCity to build websites and do not have this issue. The files are created on the webservers of course, but not the build server.
Are you launching the website (for UI unit testing) perhaps?
In the end I settled for a PowerShell script run once a month as a scheduled task. I followed Bredan's process outlined in this SysAdminSpot post, with only a few modifications.
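This is not the exact script from that post, just a minimal sketch of the idea, assuming PowerShell 3 or later and .NET 4; the path, the 30-day threshold and the assumption that nothing on the agent is using the folders at the time are all mine:

# Delete Temporary ASP.NET Files subfolders untouched for 30 days (adjust the path per framework version).
$temp = 'C:\Windows\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files'
Get-ChildItem $temp -Directory |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Recurse -Force -ErrorAction SilentlyContinue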
I have the following scenario:
I publish a page which contains multiple binaries. It is received by an HTTP Receiver and deployed using an in-process Deployer, all hosted in IIS in a dedicated application pool running as the Local Service user.
The Page is stored in the Broker Database, and the binaries are published to the local file system using a path like "D:\Binaries\Preview".
The preview folder is shared to a domain user as a read-only share at something like \\machinename\PreviewBinaries so that the binaries can be displayed by the web application.
Nine times out of ten everything works fine, but occasionally publishing fails, and it seems to be because the binaries cannot be overwritten: they are locked by another process. I have used Process Monitor and other tools to try to establish what might be locking these files, to no avail. Sometimes I can manually delete the images, and then publishing works again. If I restart IIS on the server I can always delete the files and publish.
Does anyone have any suggestions on what processes could be locking these images? Has anyone seen this issue before? Could publishing to a share be part of the problem? Or could SiteEdit 2009 be locking these files? It only seems to occur on our preview server, and live (no SiteEdit) seems fine.
Thanks in advance
If you're on Windows 2008, you can try to delete the file from disk; it will then tell you which process has the file locked. But given that restarting IIS unlocks the file, it seems quite likely that it is IIS that keeps a lock on them.
I don't see how SiteEdit 2009 could cause a lock on these files. Given that you can have your preview server on another box, SiteEdit only talks to that server through HTTP. It never accesses the files on the preview server directly, not even through a CD API; they are just regular requests to your web server, like any visitor's.
Again, not a direct answer but I wanted to share this anyway:
I've seen a similar situation where I published Pages to the Broker Database and Binaries to the file system. When I changed the Identity of the Application Pool to Network Service this problem disappeared, and I haven't looked into it further.
OK, it seems the offending code was in the presentation framework we are using. The framework used Response.TransmitFile(binaryPath) to asynchronously transmit the binaries to the clients. It seems that this puts a temporary lock on the binaries (even when they are on a read-only share).
We have removed this line of code and modified the application to serve binaries another way (we now rewrite the path so that IIS can transmit the files directly). This seems to have solved the issue and has improved site performance.
Thanks for all your suggestions, it helped me rule out all the things that were not causing the issue, so I was able to find the root cause.
Are there any anti-virus or indexing services running? These tend to take very short-lived locks at just the moment you don't want them to. With anti-virus in particular, this is typically just as one process relinquishes its lock and just before your other process tries to take one. If this is the issue, then setting up some exclusion directories should help.
I see you have used Process Monitor, but have you tried Sysinternals Process Explorer? "Find -> Find Handle or DLL" is pretty useful for this kind of thing. Or if you prefer a command-line tool, Sysinternals also make handle.exe, which dumps everything out for you.
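As a sketch of the handle.exe route: you can pass it a path fragment and it lists every process holding a handle to a matching file (run it from an elevated prompt; the folder below is the one from the question):

handle.exe D:\Binaries\Preview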
We have an asp.net web site that is deployed on several IIS servers. The site is compile-on-demand as opposed to a pre-compiled web application.
Normally deployments go fine, but every now and again we get a 401 for one of the deployed pages on one of the servers. There is nothing special about which page or which server, apart from the fact that it generally happens to the higher-traffic pages.
The only way to rectify this is to deploy the same page again.
The ACLs look fine on the files themselves so the thought is that there is a file locking issue in the Temporary ASP.NET Files folder when the specific page is re-compiled.
Has anyone seen this before or have any suggestions how to avoid this?
Note: this only seems to have happened since we moved to .NET 4.0.
As far as I can tell we are getting a 401.3, "Denied by resource ACL" (http://support.microsoft.com/kb/907273).
But I have not been able to confirm this.
Those kinds of locks have always been a problem with live-site deployment. The reason it's hard to replicate is that you are mid-request while copying/compiling on the server, and this ends up confusing IIS.
We operate a blue/green deployment strategy on a 4-tier architecture, with the web site spread over 4 servers at the top tier. Because of the complexity the architecture introduced for deployments, we needed a way to deploy without disturbing any traffic to the "live" site. Following Fowler's advice, though not quite in the same way, we came up with a solution in which we have 2 sites on each server (a blue and a green, or in our case site A and site B). The live site has the appropriate host header, and once we have deployed to and tested the non-live site, we flip the host headers of the 2 sites so that what was live becomes the non-live site, and vice versa. The result is a robust deployment that can be done in hours and with the highest level of confidence.
This does complicate your configuration and deployment slightly, but it's worth the effort. It probably goes without saying that you want to script both the deployment and the host-header swapping.
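As a rough sketch of the host-header flip on IIS 7+, assuming the WebAdministration module and placeholder site and host names (not the poster's actual configuration); swapping through a temporary name avoids having two sites claim the same binding at the same moment:

Import-Module WebAdministration
# SiteA currently answers www, SiteB answers staging; swap them via a temporary host name.
Set-WebBinding -Name 'SiteB' -BindingInformation '*:80:staging.example.com' -PropertyName HostHeader -Value 'temp.example.com'
Set-WebBinding -Name 'SiteA' -BindingInformation '*:80:www.example.com' -PropertyName HostHeader -Value 'staging.example.com'
Set-WebBinding -Name 'SiteB' -BindingInformation '*:80:temp.example.com' -PropertyName HostHeader -Value 'www.example.com'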
When I deploy to a server I bring the site down for a minute (or however long the deployment takes); it may be down anyway during this time as pages are recompiled, so it is not too much of a hit. You can do this by creating a file in the root of the app called app_offline.htm (it needs to be at least 512 bytes long). Once that file is created you can copy the resources to the folder knowing there will not be any locking issues, and when the copy is complete you remove the app_offline file.
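A minimal PowerShell sketch of that sequence; the site path, the maintenance text and the 600-character padding are placeholders of mine (the file needs to be over 512 bytes or some browsers replace it with a "friendly" error page):

$site = 'D:\InetPub\MySite'
$msg  = '<html><body>Site under maintenance, back shortly.</body></html>'
Set-Content "$site\app_offline.htm" $msg.PadRight(600)    # take the app offline
Copy-Item '.\deploy\*' $site -Recurse -Force              # copy the new build while it is stopped
Remove-Item "$site\app_offline.htm"                       # bring the site back online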
For those who want to achieve a .NET website deployment without these issues, one option is to copy the new website files into a new folder first (not the active website folder), and then change IIS to point at the new folder after all copying is complete.
This can be done in a single-server environment, for those of us with more limited resources and without multiple servers per website.
At my work we write PowerShell scripts to deploy websites. The PowerShell script creates a new directory with a timestamp, copies the new deployment there, then tells IIS to point the website at the new directory (leaving the old directories "orphaned" but still there).
If we really messed something up, we can simply revert by pointing IIS back at the previous date-stamped directory. Otherwise, if everything tests OK, we can delete the old folder.
This technique works well because you are never writing over a file while it is in use, and it results in zero downtime. The only effect you will see is the normal .NET "warm up" that occurs any time you change the code-behinds or assemblies.
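A hedged sketch of that approach on IIS 7+, assuming the WebAdministration module; the site name, the build source and the target root are placeholders, not the poster's actual script:

Import-Module WebAdministration
$stamp  = Get-Date -Format 'yyyyMMdd-HHmmss'
$target = "D:\Sites\MySite\$stamp"
New-Item -ItemType Directory -Path $target | Out-Null
Copy-Item '.\build\*' -Destination $target -Recurse -Force
# Repoint the site at the new folder; rolling back is just setting physicalPath
# back to the previous timestamped directory.
Set-ItemProperty 'IIS:\Sites\MySite' -Name physicalPath -Value $target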
I had several answers suggesting a new environment to deploy to. This is something we have been considering for the long term, but it's hard to justify the extra work when we regularly deploy only one or two files without a problem. I was really more interested in finding out what is actually happening and why.
In terms of a workaround, and this might sound obvious after the fact, a simple app-pool recycle solves the permissions issue and is much easier than testing for the issue and redeploying the file until the problem goes away.
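For completeness, a one-liner sketch of the recycle, assuming IIS 7+ and a placeholder pool name:

Import-Module WebAdministration
Restart-WebAppPool 'MyAppPool'   # the pool name is a placeholder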
I have a web application project that I publish via Visual Studio 2010 to my server. My problem is that it can take a very long time, maybe 10 minutes, before it actually publishes. It doesn't happen every time, but very often.
Here's a summary of what I have in the Output -> Build console when I try to publish:
The project is compiling - OK
Connecting to C:\Users\{user}\Desktop\MyProjectTest... (this is where it can take up to 10 minutes)
The files are publishing...
Process Explorer
When I open Process Explorer, I see that devenv.exe is taking all the CPU. When I open this process, I see that the item consuming all the CPU is clr.dll!StrongNameSignatureVerification+0x11ee1. As soon as this finishes, after 10 minutes, the publishing task finishes quickly.
Process Monitor
With Process Monitor, I monitored the TID of clr.dll!StrongNameSignatureVerification and got MANY redundant events. For over 5 minutes the task tries to access a file that I don't have on my computer: it is searching for Microsoft.Build.Task.resources.dll. It's as if the publishing task were trying, again and again, something that doesn't exist. For your information, I'm using Windows 7 French with Visual Studio 2010 English. On the screenshot you see about 10 of over 2000 events of the same thing!
More info on my setup
Here is some info that can help to identify the problem:
My application is built with MVC3
I have a few third-party DLLs. Some of them are signed.
I'm publishing with the File System method.
I tried publishing on my local computer and the problem is still there, so it's not a network problem between my computer and my server.
I have tested on Windows 7 x86 & x64 French edition
My Visual Studio 2010 SP1 is the English edition
UPDATE 2011-09-23
I now know how to solve the problem BUT I don't know what is causing it. If I delete the .suo file (at the same level as the .sln file) and reopen Visual Studio, publishing is really fast. So reinitialising the .suo file seems to solve the problem each time publishing gets slow.
Just to run another test, I made a backup of the .suo file while publishing was slow and then deleted it; publishing was fast again. If I copy the .suo file back to its place and reopen Visual Studio, publishing is slow again. So everything seems to point to that file.
Any idea on this one?
Try it this way.
In order to deploy the release to the development or production server, follow these steps.
Install the Web Deployment MSI.
Right-click your project in Solution Explorer and add a Web Deployment Project (I am not using Convert to Web Application or Publish here).
Then compile. This creates a folder in your project directory containing the files required for deployment to the server.
Take a backup of your virtual directory, then remove the virtual directory as well as the files from inetpub.
Go to IIS Manager: type inetmgr in the Run dialog and hit Enter.
Under the default website, create a virtual directory, put the deployed files in inetpub and browse the files (a scripted equivalent is sketched after these steps).
Allow appropriate access such as read, run script and browse. That's all.
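If you prefer to script the virtual directory instead of clicking through inetmgr, here is a hedged sketch using the IIS 7+ WebAdministration module; the directory name and physical path are placeholders:

Import-Module WebAdministration
New-WebVirtualDirectory -Site 'Default Web Site' -Name 'MyApp' -PhysicalPath 'C:\inetpub\MyApp'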
Mark it as the answer if you found it useful; otherwise let me know ...
I am not sure if it's the .suo file that is causing it, but for me this solved the problem.
After compiling, the publish will call aspnet_compiler, which takes longer because it generates assemblies for all of the code.
But check your VSPackages: is there any package installed that hooks into the process and might be interrupting your publish?
I just experienced the same problem publishing to a network share and discovered that copying the files in Windows Explorer was also extremely slow. When I zipped the build folder and copied that across, it took a few seconds. I conclude that the VPN, antivirus or firewall at one end or the other is adding overhead to every file transfer.
Windows is far faster transferring one giant file than transferring thousands of tiny files, even if the net size is the same. So try this:
Publish to a local folder (not directly to the IIS file share).
Zip the local files (these files compress well).
Using File Explorer, delete the files on the IIS file share.
Copy/paste the local zipped file to the network share.
Unzip the files on the server using File Explorer.
(You will not have to remote into the server to do this.)
This accomplishes a couple of things. 1) It's one giant file, not thousands of tiny files. 2) The zipped file will be compressed by 50% to 80%, so the data transferred over the wire is that much smaller.
If you need a backup, it's the same process in reverse, but without the publish step. I typically use 7-Zip, but the built-in Windows zip will work. I don't know why Visual Studio cannot do this programmatically. A scripted version of these steps is sketched below.
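This is a hedged PowerShell sketch of steps 1-5, assuming PowerShell 5+ for Compress-Archive/Expand-Archive; all paths are placeholders and the share must be reachable from the build machine:

$local = 'C:\Publish\MySite'                 # step 1: publish here from Visual Studio
$zip   = 'C:\Publish\MySite.zip'
$share = '\\webserver\wwwroot\MySite'
Compress-Archive -Path "$local\*" -DestinationPath $zip -Force            # step 2: zip locally
Remove-Item "$share\*" -Recurse -Force                                    # step 3: clear the share
Copy-Item $zip $share                                                     # step 4: one big copy
Expand-Archive -Path "$share\MySite.zip" -DestinationPath $share -Force   # step 5: unzip in place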
Maybe I'm totally outdated, but for the last four years I've been using the simple FTP upload feature when putting up a new website, without even building it in Visual Studio: just the bunch of ASPX and CS files, as they are in Visual Studio.
I do understand that compiling the project gives me some security benefit, so that people who have access to the server can't read those files in a text editor, and that it avoids the first-time compilation, but is that so important?
I mean, if you have access to the server you can always do a lot of harm, whether you are reading CS files or DLLs.
First-time compilation usually takes no more than a minute, and just looking for the compiled version of the site would take as much time.
Now I'm watching a video on Pluralsight which explains the new MSDeploy tool available for ASP.NET, and I can't see any good reason to use it.
So what's wrong with the old-fashioned way of just sending files via FTP, without compiling or using fancy tools?
I did a speed test, and with MSDeploy I can deploy a website twice as fast as with old-fashioned FTPing. So instead of 4 minutes it takes 2.
Now from another perspective: suppose I already have a live project on the web and have to change Default.aspx because of a typo in some HTML tag. Deployment via MSDeploy will take 10 times longer than uploading that one file.
Maybe I'm missing something?
MSDeploy does things that FTPing to a site can't do. Need to change a machine.config? You're unlikely to have FTP write access to the folder which contains it. Want to change a server setting in a server-version-independent manner? FTP won't do that. And so on. FTP works fine for copying files to folders in which you have write access, but that's all it can do.
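For illustration, a hedged example of what a basic content sync looks like, invoked from PowerShell; the install path, site name, server URL and credentials are all placeholders, and the Web Management Service (msdeploy.axd) is assumed to be listening on the server:

& 'C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe' -verb:sync `
    '-source:contentPath=C:\Publish\MySite' `
    '-dest:contentPath=Default Web Site/MySite,computerName=https://webserver:8172/msdeploy.axd,userName=deployUser,password=secret,authType=Basic'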
When you deploy a project you can do a lot of things with it.
You can set up a job in your deployment that packages all your JavaScript into one file and all your CSS into one file.
You can set up a job in your deployment that changes a bunch of config settings to match your production server settings (rather than your development settings).
The idea of deployment is that you take your current development website and transform it into a production website without having to do any of that manually.
The most important thing is that when you can only deploy your website, rather than edit it in place, you will never forget to package your JS or to remove some debugging code, because you can't just sneakily update a single file.