We currently have a web application running on IIS6 on a 32-bit machine.
This application runs smoothly and stably. It is built with the target "Any CPU".
Now we are copying this exact application to a 64-bit IIS7 machine. We only changed the web.config, according to the IIS7 format.
On the application pool we have set "Enable 32-bit Applications" to True.
When we put some load on this new server, the application behaves entirely differently from its old environment.
We see high CPU and high memory usage, and the memory (private bytes) goes up but does not seem to be reclaimed when the load ends.
This is not what we expected.
Did anyone encounter this same behavior?
We expect it to be some mistake in the server or application configuration. Any suggestions on which settings to check?
IIS 7 introduces the new pipeline mode "integrated" for application pools. Running your application in this mode can change the application's behaviour. If you use this mode, change it to "classic" and try again.
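If you want to try that switch from the command line, an appcmd one-liner along these lines should do it (the pool name "MyAppPool" is only a placeholder for your own pool):
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /managedPipelineMode:Classic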
I have a curious problem.
A 64-bit Windows 7 Pro box has IIS 7.5/64 installed. An application pool has been configured to enable 32-bit applications, and a site has been created using this application pool. However, when the worker process for the site starts, it is always started as a 64-bit process. The "bitness" of the worker process is evident in Task Manager, where the w3wp.exe process carries no "*32" tag.
I've tried deleting and recreating the application pool, creating a new application pool, and re-registering the 32-bit framework, restarting IIS each time, all to no avail. I'm sure I'm overlooking something trivial, in "forest for the trees" mode, so a push in the right direction to clear up the fog would be appreciated.
I've read through several posts on issues like this, but most center on ensuring the app pool is configured to "Enable 32-bit apps," which was one of the first steps taken.
Have you set enable32BitAppOnWin64 to true?
This is exposed as a property on the ApplicationPool class that can be set programmatically: ApplicationPool.Enable32BitAppOnWin64.
Also, here is a SO question about setting the property on IIS 7 (not sure if you are on Azure or not, but I believe the idea should be transferable to a non-Azure environment).
How to change property Enable32BitAppOnWin64 of Application Pool on IIS 7 on Windows Azure?
Finally, you could set it up in your Application Pool Defaults: http://www.iis.net/configreference/system.applicationhost/applicationpools/applicationpooldefaults
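If you prefer the command line, you can also set the flag on a single pool with appcmd and then list the pool's full configuration to confirm what is actually stored (the pool name here is just an example):
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /enable32BitAppOnWin64:true
%windir%\system32\inetsrv\appcmd list apppool "MyAppPool" /text:*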
Well, after a couple of days of head scratching, I finally solved this problem.
After verifying through the IIS management tool that each application pool on the box had "Enable 32-bit Applications" explicitly set to True, I went into the machine's applicationHost.config, and those same pools had this property set to False. I manually edited these entries to True, restarted IIS, then re-registered the 32-bit version of ASP.NET, and the machine now operates properly.
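For reference, the entries in question live under <applicationPools> in %windir%\System32\inetsrv\config\applicationHost.config and look roughly like this (the pool name and runtime version here are only examples):
<add name="MyAppPool" managedRuntimeVersion="v4.0" enable32BitAppOnWin64="true" />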
Thanks for the suggestions. I'm not entirely sure why the IIS admin tool was essentially lying to me about the Enable 32-bit Applications setting, but oh, well... :)
I have a .NET 4.0 website that was running just fine on physical Windows Server 2003 boxes with 32-bit IIS6. We have migrated to new virtual servers running 32-bit Windows Server 2008 with IIS7. The application pool is running in classic mode.
Since the move, I randomly get a situation where the application hangs. The request queue rockets and then I get 503 errors. If the application pool is recycled, the error goes away until the next time it occurs.
There are no entries in the event logs relating to it, except a note that the application pool took too long to shut down during the recycle process. I have reporting in my .NET application that logs to a DB and sends me errors, but it sends me nothing while the application is hanging.
What tools can I use to diagnose the problem and figure out what is causing it?
In my opinion, whatever the structure of the application is, if you're pretty sure there is no bug, changing a couple of properties on the application pool might solve the problem.
First, in the 'Advanced Settings...' menu of the application pool, change 'Enable 32-Bit Applications' to the relevant value. If the platform used for building the application is x86, the value should be 'True'.
Second, if your application needs to access disk resources, you should also set the relevant security context for running the application pool in the 'Identity' property. The identity should have access to all the directories that your application needs to change or list.
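If you'd rather script those two changes than click through Advanced Settings, the appcmd equivalents look roughly like this (the pool name, account and password are placeholders):
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /enable32BitAppOnWin64:true
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /processModel.identityType:SpecificUser /processModel.userName:"DOMAIN\AppPoolUser" /processModel.password:"password"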
If the problem persists after doing all of this, let me know what platform your application is built with. Is it .NET or ISAPI?
Cheers
I have referenced some 32-bit and some 64-bit DLLs in my ASP.NET MVC 3 project.
The projects compile but I get runtime errors.
It's because I am running the web project as 64 bit.
How do I "enable 32 bit" in my local IIS (just how I can do it in IIS 7.5 Pro)?
I am using .NET 4.0
The error I get is:
Retrieving the COM class factory for component with CLSID {A6775dfd2-1dfF-421C-A187-4D55F4DDFBFF} failed due to the following error: 80040154 Class not registered (Exception from HRESULT: 0x80040154 (REGDB_E_CLASSNOTREG)).
If you don't require the 64-bit component (not sure what is running there, or if it can be excluded since you simply wanted to know how to run in 32-bit mode), this may help:
http://learn.iis.net/page.aspx/201/32-bit-mode-worker-processes/
You can set it at the server level via:
%windir%\system32\inetsrv\appcmd set config -section:applicationPools -applicationPoolDefaults.enable32BitAppOnWin64:true
Or, to set your particular app pool (more recommended IMHO), you can try the following; the appcmd equivalent is shown after the quoted steps. Sorry, the page this came from no longer seems to be active and only Google's cache is showing it now:
Force IIS to create a 32-bit app pool worker process
If your application is running as a web app, meaning it is running in
an IIS app pool worker process, you’ll want that worker process
(w3wp.exe) to be a 32-bit process. That can be specified in the
advanced settings of the app pool:
Select the app pool for your web app. Click Advanced Settings… under
Edit Application Pool on the right. Change the value of Enable 32-Bit
Applications under (General) to True.
Note that the word “Enable” in the setting doesn’t mean “allow” as in
“allow either 32- or 64-bit applications.” It actually means “force”
as in “force the worker process to run in 32-bit mode so that 32-bit
applications are supported.” This means that the app pool worker
process will always be launched as a 32-bit process when this setting
has a value of True. Setting it to False will launch a 64-bit app
pool worker process.
Note that when an app pool worker process is started, it shows up in
the Image Name column on the Processes tab in Task Manager as
w3wp.exe. When the Enable 32-Bit Applications setting has a value of
True, it will be listed as w3wp.exe *32.
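For completeness, the appcmd equivalent of those UI steps would be something along these lines (replace the pool name with your own):
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /enable32BitAppOnWin64:true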
IIS Express 7.5 (as used by Visual Studio 2010 if you install it) is 32 bit only:
http://learn.iis.net/page.aspx/1265/iis-75-express-readme/
To quote:
Both 32-bit and 64-bit systems are supported, however only a 32-bit build of IIS 7.5 Express exists.
So I can't imagine that your problems would be related to the usual 32-bit / 64-bit pool mode issues that can arise if all of your DLLs are 32-bit.
However, if you're trying to load a 64-bit COM DLL then this will fail; 64-bit binaries can't be loaded into a 32-bit process and vice versa.
Another gotcha is forgetting to tick the Use IIS Express checkbox when choosing which web server to debug with:
If you don't tick that checkbox then you'll run your site in a child application in the DefaultWebSite on the version of IIS7 that ships with Windows.
The DefaultWebSite runs in the DefaultAppPool, which in 64-bit versions of Windows runs as a 64-bit process. So you need to change the DefaultAppPool to run as 32-bit if you want to use this instead and consume 32-bit binaries.
You need to do this using IIS7's MMC snap-in or by running the appcmd.exe tool from the command line.
Set your compile target to x86 instead of AnyCPU or x64. Your DLL will then always run in 32-bit mode without you needing to mess with the IIS server settings.
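If you go that route, the setting ends up in the project file; a minimal sketch of the relevant .csproj fragment (the surrounding PropertyGroup and its configuration conditions vary per project) would be:
<PropertyGroup>
  <PlatformTarget>x86</PlatformTarget>
</PropertyGroup>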
I have a web service which loads a 32-bit COM component. I am running this web service with IIS on my local machine.
When I load the test page from Visual Studio it succeeds; on the other hand, when loading it through IIS, it displays the following error:
Retrieving the COM class factory for component XXX failed due to the following error: 80070005.
I tried changing the web service's platform to x86 from Any CPU, but that didn't help. I am running this on Windows Server 2008 R2 - 64-bit.
I had to set "Enable 32-Bit Applications" to True in the application pool's settings.
Check permissions on that COM component. It may be that when you're running tests from VS you're running as yourself (an admin), while the user running the website's app pool is totally different. That user needs to be granted read+execute (or local activation) permissions.
Maybe also see this: Retrieving the COM class factory for component error while generating word document
Sarat, this cannot be right. The "Enable 32-bit Applications" under Application Pools Defaults is not for running 32-bit applications or to solve your problem. It is there to enforce running 32-bit applications UNDER 32-bit processes only, which is not necessary in this case. Most 32-bit applications run fine on 64-bit processes. That's why you can run MS Office 2010 (which is still a 32-bit application) on Windows 7 64-bit machines.
You must have other settings tried and true after nearly a full year answering the original problem.
In IIS6 it's possible to have more than one ASP.NET application running in the same application pool. This is fine, except that there is nothing in IIS6 that prevents you from running multiple .NET versions in the same pool.
When you create application pools in IIS7 you must explicitly state which .NET version will run in that pool. Running multiple .NET versions in the same application pool is impossible in IIS7.
How can I enforce such rules on my IIS6 server in order to prevent my deployment team from creating such problems?
What I do:
Step 1. Create the following application pools:
.NET 1.1 Apps
.NET 2.0 Apps
Step 2. Disable the "Default App Pool"
Now, any time a new application is configured in IIS, it will not work right away because the default app pool is disabled. This forces the person configuring the application to select an app pool that is appropriate for the .NET framework version of the app.
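If you want to script that setup on IIS6 rather than click through IIS Manager, adsutil.vbs can create the pools and assign an application to one; a rough sketch, assuming the default AdminScripts location and using example names for the pool, site ID and application path:
cscript %systemdrive%\inetpub\adminscripts\adsutil.vbs CREATE "W3SVC/AppPools/.NET 2.0 Apps" IIsApplicationPool
cscript %systemdrive%\inetpub\adminscripts\adsutil.vbs SET "W3SVC/1/ROOT/MyApp/AppPoolId" ".NET 2.0 Apps"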
We tend to use one application pool per site, so that each application is isolated in its own process space. Application pool recycles will only affect a single application, and each worker process ends up with its own 4 GB memory space. Badly programmed applications have no chance of affecting other applications, resulting in a highly isolated deployment model.
We've also standardized on x64 OS builds running 32-bit application pools. While there is overhead with this technique, since each application pool contains a separate copy of the .NET framework, we feel that the added granularity of the application space adds stability to our deployments. You also get the ability to run each application as its own domain identity, allowing for further memory space isolation and eliminating any need for identity impersonation in web.configs.
With IIS 7, you have the ability to run each application pool as either 32-bit or 64-bit, so you can run large-memory applications in 64-bit application pools. IIS 7 application pool security is also much simpler.
I don't think you can. What I do is name my app pools in IIS 6 so that they show what .Net version they host. That way it's easy to pick the correct app pool when creating a new application.
If I remember correctly, you can also set up application pools in IIS6 (Windows 2003). Create one application pool per framework version in use.
I am not aware of any possibility to enforce the version of the .NET framework being used by an application. If you have setup an application pool to use .NET 1.1 and you have a .NET 2.0 application running in that application pool, you will get an exception in the application (yellow screen of death), since it will not find some referenced assemblies and classes.