Upgraded to VS2015 yesterday and noticed when running our web application locally that performance is awful: Chrome dev tools shows requests for scripts, images, fonts, etc. taking upwards of 60 seconds to complete, which makes debugging a nightmare.
Has anyone else had similar issues with IIS Express 10? I've tried disabling failed request tracing as suggested here (it's an old thread though) but it made zero difference to the speed.
This is definitely an IIS Express 10 issue as I've just uninstalled it and reverted to version 8 and the problem has gone away. For now we'll keep using 8 but as I couldn't find anything online about this issue I wanted to raise it and see if I'm the only one.
So I eventually worked this one out: in our organization we have home drives that are mapped to network locations. With IIS Express 10 the default location for logging is set to %IIS_USER_HOME%, which for me resolved to the network folder.
By default, trace logging is enabled, which results in a ~500 KB log file for each resource on the page (images, scripts, stylesheets, etc.), and this is what caused the slow page loads: all of that data was being written across the network.
The fix is fairly straightforward: within the solution directory, open the .vs\config folder and edit the applicationhost.config file. Find the <sites> collection and update the folder location for logFile and traceFailedRequestsLogging to a local path, for example:
<siteDefaults>
<logFile logFormat="W3C" directory="c:\IISExpress\Logs" />
<traceFailedRequestsLogging directory="c:\IISExpress\TraceLogFiles" enabled="true" maxLogFileSizeKB="1024" />
</siteDefaults>
If required, you can also disable trace logging altogether by setting enabled="false" in the above snippet.
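For example, the same element with tracing switched off (the directory value is just whatever local path you prefer):
<traceFailedRequestsLogging directory="c:\IISExpress\TraceLogFiles" enabled="false" maxLogFileSizeKB="1024" />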
With these changes made, IIS Express 10 was back to normal and running just as quickly as previous versions.
I'm trying to learn ASP.NET and .NET/C# for work. I haven't been able to find out how to make something go live... the right way. Every guide I find forgets to cover how to actually make an app live.
Taking it into my own hands, I've tried many things, and I was able to publish my files online to my GoDaddy hosting account (Plesk), but I kept getting 403/404/500 errors.
Eventually after a few days and trying a few things, I was able to get a basic test page up by editing web.config and as an example, adding something like:
<defaultDocument enabled="true">
<files>
<add value="MYHOMEPAGEFILEHERE.html" />
</files>
</defaultDocument>
This worked because it's so basic and it points to a plain HTML file. The problem I'm having, though, is that I'm trying to launch a site that has Razor, established routes, classes and other C# stuff (no DB yet). On my local machine the site/app works, but when I upload it, it always breaks and I get 403s, 500s, etc.
I thought to myself, "maybe add a default document path/value pointing to one of the starting C# files, like I would do in a MEAN app (via app.js)".
But when I looked on the server, none of those files are there. Am I right to assume that all the C# files are compiled into DLL files?
If that's the case, how do I go about instructing the server to start at a particular file? Essentially, what do I need to do to upload my mini app (which works locally) to an online server and have it just work?
I've followed everything I could find, whether it was allowing user access, adding permissions to directories, etc.
I have a .NET project that started in Visual Studio 2008, was upgraded to VS 2012, and is now in VS 2015. I'm having an issue where I update basic HTML in an ASPX page and, when I refresh my browser, none of the changes come across. This happens every time the project is opened: the browser only gets the current version of the file on the first build. If I open the project, build it and view it, any changes after that point are never displayed in the browser unless I completely close out of VS.
I did think it might be the browser but I've loaded a completely different browser (after clearing my local cache) that I hadn't used before on this machine and the page is still displaying the old information. So I feel like there is a setting somewhere in the project that got carried over from a previous version of VS that is causing the local IIS Express to not pull the latest ASPX file from the hard drive.
I tried to delete the .vs directory while VS was closed and then start it up but I had the same result.
How can I get my changes in VS to be visible to IIS Express, and then to whatever browser I'm using to view it?
Update: I recently updated a CSS file and those changes were immediately seen in the browser. So, it's only the ASPX files that are not updating when saved.
After some time and digging around, I found that this happened on my DEV server as well. I looked through the web.config file and this setting had somehow been added:
<system.web>
<httpRuntime fcnMode="Disabled" />
</system.web>
fcnMode is a file change notification setting; it controls how ASP.NET is told that a file has been updated so that the old version isn't served up again. I removed the setting entirely (reverting it to the default) and it's working as expected now in all environments.
Some more info on this setting if you'd like to read the MSDN docs:
https://msdn.microsoft.com/en-us/library/system.web.configuration.fcnmode
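For reference, a minimal sketch of the corrected section, with the attribute simply removed (being explicit with any value other than "Disabled", e.g. fcnMode="Default", should also restore change detection):
<system.web>
  <!-- fcnMode removed; the runtime default watches files for changes again -->
  <httpRuntime />
</system.web>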
I set up IIS to handle .exe files with the ASP.NET 2.0 ISAPI filter to enable dynamic URL replacement.
For this I set up the extension in IIS and added the following line to web.config. It works fine so far.
<add path="*.exe" verb="*" type="System.Web.StaticFileHandler" />
The problem is that from that point on, the w3wp process has several handles open on these files, most likely because someone is downloading them at that moment.
Is there a way to tell IIS/ASP.NET not to put an exclusive handle on the file? I want to be able to replace the files even while the site is running, which works as long as ASP.NET does not handle them.
I don't think there is a way to do this. You are telling IIS to handle this file type, so the server assumes a certain exclusivity. This behavior is intended, IMO, because it prevents corrupted files in a production environment.
Maybe it helps to limit the requests to selected HTTP verbs (POST/GET), but I don't see any other options.
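For instance, the mapping from the question restricted to GET only (just a sketch based on the original line; whether it makes the handles any less of a problem is untested):
<add path="*.exe" verb="GET" type="System.Web.StaticFileHandler" />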
And the other question is: why do you want to replace a file that is currently being downloaded by a user? It will corrupt the download, forcing them to start all over again.
I have installed static and dynamic compression for IIS7, and I have also set the two web.config values at my application's virtual folder level. As I understand it, I no longer need to enable compression at the server or site level, and I can manage it on a per-folder basis using my web.config file.
I have two settings in my .config file that I have set to customize gzip for my app:
<httpCompression dynamicCompressionDisableCpuUsage="90"
dynamicCompressionEnableCpuUsage="0">
<scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
<dynamicTypes>
<remove mimeType="*/*"/>
<add mimeType="*/*" enabled="true" />
</dynamicTypes>
</httpCompression>
<urlCompression doDynamicCompression="true"
dynamicCompressionBeforeCache="true" />
However, when I run the application, I can clearly see that gzip is not used, because my page sizes are the same. I am also using YSlow for Firefox, which confirms that my pages are not being gzipped.
What am I missing here? In IIS6 it was a simple matter of specifying the file types and setting the compression level between 0-10. I can't find anywhere that documents a need to specify the file types or compression level, since the defaults seem to cover the file types, and I'm not seeing a level setting anywhere.
There was a thread on forums.iis.net about this during the IIS 7 beta. It turned out the poster didn't have the modules installed, but it sounds like you've ruled that out in your opening sentence.
Microsoft's key advice to him was to enable failed request tracing to find out what was going wrong. This is possibly one of the most under-appreciated features of IIS7, but certainly one of the most powerful.
Open IIS Manager.
Go to your site, and on the actions pane (the very far right), click 'Failed Request Tracing...' under the 'Configure' section.
Click 'enable'.
Then, in the features view, click 'Failed Request Tracing Rules'. Click Add, then Next, enter 200 for the status code, click Next again, and then Finish.
If you don't see "Failed Request Tracing" in the actions pane, you'll need to add the feature to the server - either using the "Add Role Services" wizard (Health and Diagnostics\Tracing) or through the Web Platform Installer (Products\Server\IIS: Tracing), and then close and re-open IIS Manager.
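If you prefer to define the rule in web.config rather than through the UI, something like the following should be equivalent (a sketch; it assumes the tracing sections aren't locked at the server level, and the per-site 'Enable' step above is still needed):
<system.webServer>
  <tracing>
    <traceFailedRequests>
      <add path="*">
        <traceAreas>
          <!-- narrow 'areas' (e.g. to Compression) if the full trace is too noisy -->
          <add provider="WWW Server" areas="Compression" verbosity="Verbose" />
        </traceAreas>
        <failureDefinitions statusCodes="200" />
      </add>
    </traceFailedRequests>
  </tracing>
</system.webServer>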
Next, rerun your test. This will generate some log info for us to examine.
Look in c:\inetpub\logs\FailedReqLogFiles\w3svcx. You will see a bunch of files named fr000xx.xml. Open up any one of them in your browser. (By the way, if you copy these files anywhere, make sure freb.xsl is there. Also, don't delete freb.xsl - if you do, just delete the whole directory or copy it from another location, as IIS only creates it once per folder.)
Click the 'request details' tab and select 'complete request trace'. Search the page for 'compress' - you should find it in several areas; once for static content, and once for dynamic content.
If you don't find either of them, IIS isn't configured correctly. If you do find them, you should see them followed by a compression_success and a compression_do entry. Success is self-explanatory; the 'do' entry indicates what it did - in my case, it showed "OriginalSize 1462784 CompressedSize 179482".
Since yours isn't working, hopefully you will see something different that helps you solve the problem.
Make sure you turn this off when you're done by disabling failed request tracing in the actions pane for your website.
We had a similar problem, and it turns out that IIS7 does some dynamic CPU-based throttling here:
http://www.iis.net/ConfigReference/system.webServer/httpCompression
dynamicCompressionDisableCpuUsage
Optional uint attribute.
Specifies the percentage of CPU utilization at which dynamic compression will be disabled.
Note: This attribute acts as an upper CPU limit at which dynamic compression is turned off. When CPU utilization falls below the value specified in the dynamicCompressionEnableCpuUsage attribute, dynamic compression will be reenabled.
The default value is 90.
dynamicCompressionEnableCpuUsage
Optional uint attribute.
Specifies the percentage of CPU utilization below which dynamic compression will be enabled. The value must be between 0 and 100. Average CPU utilization is calculated every 30 seconds.
Note: This attribute acts as a lower CPU limit below which dynamic compression is turned on. When CPU utilization rises above the value specified in the dynamicCompressionDisableCpuUsage attribute, dynamic compression will be disabled.
The default value is 50.
Note the defaults -- if your IIS7 hits 90% CPU usage, it will disable all dynamic gzipped content until CPU usage dips back below 50%!
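If you want dynamic compression to stay on regardless of CPU load (while testing, say), a sketch of the relevant attributes is below; note that httpCompression normally has to be set in applicationHost.config rather than a site's web.config, and the rest of the section (schemes, mime types) is omitted here:
<httpCompression dynamicCompressionDisableCpuUsage="100"
                 dynamicCompressionEnableCpuUsage="100">
  <!-- with both thresholds at 100, compression is only cut off at full CPU saturation -->
</httpCompression>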
There are also some great recommendations and benchmarks on the real CPU cost of gzip here:
http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx
Long story short, unless you regularly have dynamic pages well in excess of 200kb, it's a non-issue.
Following the excellent advice of JohnW, I too enabled logging to find the culprit, though the reason for the failure turned out to be different:
STATIC_COMPRESSION_NOT_SUCCESS
Reason 14
Reason NOT_FREQUENTLY_HIT
In short, it appears that if you don't hit the page frequently enough then IIS7 will not deem it worthy of compressing, which seems a little odd to me. Nonetheless, it makes sense in this case, because I was just trying to test it on a local machine.
According to this page, the default appears to be that a page has to be hit 2 times within 10 seconds to count as a "frequent hit". If you really want to, you can override the default in applicationHost.config (%systemroot%\System32\inetsrv\config). At least for me it's a locked attribute, so you won't be able to override it in your own web.config.
<serverRuntime frequentHitThreshold="1" />
Also, I note now that SO already had this answer here: In IIS7, gzipped files do not stay that way.
I solved my problem by installing dynamic compression via Add/Remove Programs.
In the system.webServer section of your Web.config file, add the following (note that the remove/mimeMap elements belong inside staticContent):
<staticContent>
  <remove fileExtension=".js" />
  <mimeMap fileExtension=".js" mimeType="application/x-javascript" />
</staticContent>
The compression scheme in IIS7 is enabled by default, but it maps only a single JavaScript MIME type to be compressed, application/x-javascript. Adding the lines above tells IIS to give all your .js files that MIME type, which in turn makes the compression work.
Turn on static compression. Dynamic compression is for dynamic pages like ASP, PHP, ASPX, etc.
Here's a link to the IIS config reference for compression: http://www.iis.net/ConfigReference/system.webServer/httpCompression
For me it turned out to be the noCompressionForProxies setting, as we are behind a proxy here. I took myself off the proxy and, voila, compression.
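If staying off the proxy isn't an option, the attribute itself can be turned off; a sketch for applicationHost.config (the default is true, and the section typically can't be overridden in a site's web.config):
<httpCompression noCompressionForProxies="false">
  <!-- existing scheme and mime type settings stay as they are -->
</httpCompression>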
Platform: IIS 6, ASP.NET 2.0 (.NET 3.5), Server 2003.
I'm building an application that accepts files from a user, processes them, and returns a result. The file is uploaded using HTTP POST to an ASP.Net web form. The application is expecting some large files (hundreds of MB).
I'm using SWFUpload to accomplish the upload with a nice progress bar, but that's not contributing to the issue, because when I bypass it using a standard HTML form pointing at my upload accepter page, I get the exact same error. When using the progress bar, the upload continues to 100%, then fails. With a standard form, the behavior appears to be the same.
I'm having a problem right now uploading a file that's about 150MB. I've changed every setting I can find, but still no luck.
Here's a summary of what I've changed so far:
In Web.config:
Added this inside system.web:
<httpRuntime executionTimeout="3600" maxRequestLength="1536000"/>
In machine.config:
Inside system.web, changed:
<processModel autoConfig="true" />
to:
<processModel autoConfig="true" responseDeadlockInterval="00:30:00" responseRestartDeadlockInterval="00:30:00" />
and in MetaBase.xml:
Changed:
AspMaxRequestEntityAllowed="204800"
to:
AspMaxRequestEntityAllowed="200000000"
When the upload fails, I get a 404 error from IIS. My web form does not begin processing, or at least, it doesn't make it to the Page_Load event. I threw an exception at the beginning of that handler, and it doesn't execute at all on large files.
Everything works fine with smaller files (I've tested up to about 5.5MB). I'm not exactly sure what file size is the limit, but I know that my limit needs to be higher than 150MB, since this is not the largest file that the client will need to upload.
Can anyone help?
It turned out that UrlScan was active on all websites, and it has its own request entity length limit. I wasn't aware that UrlScan was running on our server because it was installed as a global ISAPI filter, not on my individual website.
Note: to locate global ISAPI filters, right-click the Web Sites folder in IIS Manager, click Properties, and then look at the ISAPI Filters tab.
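The limit itself lives in UrlScan.ini (typically under %windir%\system32\inetsrv\urlscan); a sketch of the relevant setting, assuming UrlScan 2.5/3.x where the value is in bytes:
[RequestLimits]
; default is 30000000 (~30 MB); raise it above your largest expected upload
MaxAllowedContentLength=209715200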
(A note for googlers):
For IIS7, add the following to web.config (I added it above <system.serviceModel>):
<system.webServer>
<security>
<requestFiltering>
<!-- maxAllowedContentLength is in bytes; it defaults to 30,000,000 -->
<requestLimits maxAllowedContentLength="262144000" />
</requestFiltering>
</security>
</system.webServer>
When we ran into this issue we had to increase the buffer size limit according to this KB article:
http://support.microsoft.com/kb/944886/en-us
I know this mentions ASP, but I believe it worked for ASP.NET as well.
Edit: Here is a link that might be more relevant to your issue and provide other options:
http://weblogs.asp.net/jgalloway/archive/2008/01/08/large-file-uploads-in-asp-net.aspx
Regarding the 404 and the missing Page_Load: IIS can only process the request once the complete POST is on the server, so if the POST fails (due to its size), the page's events never fire.
You might try NeatUpload http://www.brettle.com/neatupload.
From the Manual: "By default, NeatUpload does not directly limit the size of uploads."
You can also try Velodoc XP Edition which has several advantages over NeatUpload including the fact that it uses ASP.NET Ajax extensions. See also the Velodoc web site for more information.
You say maxRequestLength="1536000", but 1536000 is only 1.5MB?