Azure WebJob timeout configuration settings - ASP.NET

We have an Azure web site with a daily scheduled job. In our case, the scheduled job runs a stored procedure that takes about 10 minutes, so there is no local processing. Our web job is terminating after about 4 minutes with this error:
Command 'cmd /c ...' aborted due to no output and CPU activity for 121 seconds.
You may increase SCM_COMMAND_IDLE_TIMEOUT setting to solve the issue.
We've tried adding the following app settings to the web job's app.config file:
<appSettings>
  <add key="SCM_COMMAND_IDLE_TIMEOUT" value="100000" />
  <add key="WEBJOBS_IDLE_TIMEOUT" value="100000" />
</appSettings>
Those settings are referenced in this document (https://github.com/projectkudu/kudu/wiki/Web-jobs), but they don't seem to have any effect; the job still times out after adding the settings to the app.config file.
Are we adding the configuration settings to the right place?
Thanks for the help.

Log in to the Microsoft Azure portal.
Go to App Services.
Select your website.
Go to Settings --> Application settings.
Create a key "WEBJOBS_IDLE_TIMEOUT" under the App settings section and set it to your desired timeout in seconds.
Save.

You need to set SCM_COMMAND_IDLE_TIMEOUT from the portal to your desired timeout value in seconds; for example, set it to 3600 for a one-hour timeout.
Go to your website's configuration section, find the app settings section, and add the setting there.
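If you prefer scripting over the portal, the same app settings can be applied with the Azure CLI; a minimal sketch, assuming the CLI is installed and logged in (the resource group and app names below are placeholders):
az webapp config appsettings set --resource-group MyResourceGroup --name my-web-app --settings SCM_COMMAND_IDLE_TIMEOUT=3600 WEBJOBS_IDLE_TIMEOUT=3600
Either way, the key point from both answers is that these are App Service application settings (environment-level), not entries in the web job's own app.config.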

As this is the first Stack Overflow result that comes up when searching for this timeout problem, I want to update it from my perspective.
If you expect a CONTINUOUS WebJob, that is not what you get by default (even though the job appears to run normally until it times out). Continuous mode has to be selected in the Visual Studio publish properties, as detailed here:
https://learn.microsoft.com/en-us/azure/app-service/webjobs-dotnet-deploy-vs
Once continuous is selected and the WebJob is published, this error goes away.
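For reference, Visual Studio records the run mode in Properties/webjob-publish-settings.json inside the WebJob project; a sketch of what a continuous job's file might look like (the webJobName is a placeholder):
{
  "$schema": "http://schemastore.org/schemas/json/webjob-publish-settings.json",
  "webJobName": "MyContinuousJob",
  "runMode": "Continuous"
}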

Related

IIS 10 Express performance issues

I upgraded to VS2015 yesterday and noticed when running our web application locally that the performance is awful: Chrome dev tools shows requests for scripts, images, fonts, etc. taking upwards of 60 seconds to complete, which makes debugging a nightmare.
Has anyone else had similar issues with IIS Express 10? I've tried disabling failed request tracing as suggested here (it's an old thread though) but it made zero difference to the speed.
This is definitely an IIS Express 10 issue as I've just uninstalled it and reverted to version 8 and the problem has gone away. For now we'll keep using 8 but as I couldn't find anything online about this issue I wanted to raise it and see if I'm the only one.
So I eventually worked this one out: in our organization we have home drives that are mapped to network locations. With IIS Express 10 the default location for logging is set to %IIS_USER_HOME% which for me was the network folder.
By default, trace logging is enabled which results in a ~500kb log file for each resource on the page (images, scripts, style-sheets etc.) and this is what caused the slow page loading due to the amount of data being saved across the network.
The fix is fairly straightforward: within the solution directory, open the .vs\config folder and edit the applicationhost.config file. Find the <sites> collection and update the folder location for logFile and traceFailedRequestsLogging to a local path, for example:
<siteDefaults>
  <logFile logFormat="W3C" directory="c:\IISExpress\Logs" />
  <traceFailedRequestsLogging directory="c:\IISExpress\TraceLogFiles" enabled="true" maxLogFileSizeKB="1024" />
</siteDefaults>
If required you can also disable trace logging by setting enabled="false" in the above snippet.
With these changes made, IIS Express 10 was back to normal and running just as quickly as previous versions.

ASP.Net Prevent Timeout for Long Running Process

I have an ASP.net page that uploads a CSV file and then does some processing on it. This can possibly take up to 10 minutes to complete for a large file. However, the process ends up timing out.
I have added the following to the web.config:
<httpRuntime executionTimeout="1200" maxRequestLength="104856" />
Also, I have gone into IIS and set the Connection Timeout to 1,200 seconds, and set the ASP script timeout to 1,200 seconds as well.
However, after approximately 2 minutes the web log file stops getting updated.
Any ideas on what is causing this to stop processing? What other timeout settings am I missing?
Thanks!
I usually try to avoid long running requests. Are you sure this is the best way to do this? In the past I have either:
Uploaded the document through the web app, but not acted on it. Basically upload it to a watched folder and then process it through a separate process.
Use an alternate method to upload the document (ftp usually). Again, process the file with a separate process.
Probably not the answer you were looking for, but it might be a better solution to your problem?
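As an illustration of the watched-folder idea, here is a minimal C# sketch; the folder path and the ProcessCsv routine are hypothetical, and in practice this would run as a Windows service or scheduled task rather than a console app:
using System;
using System.IO;

class CsvFolderWatcher
{
    static void Main()
    {
        // Watch the folder the web page saves uploads into (placeholder path).
        FileSystemWatcher watcher = new FileSystemWatcher(@"C:\Uploads", "*.csv");
        watcher.Created += (sender, e) =>
        {
            // The long-running work happens here, outside any HTTP request,
            // so no request, connection, or script timeout applies.
            Console.WriteLine("Processing " + e.FullPath + "...");
            // ProcessCsv(e.FullPath); // hypothetical processing routine
        };
        watcher.EnableRaisingEvents = true;
        Console.WriteLine("Watching for uploads. Press Enter to exit.");
        Console.ReadLine();
    }
}
Because the processing runs in its own process, the upload page can return immediately and the user can poll for status, which is why this sidesteps every timeout setting mentioned in the question.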

It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level

I want to manage two web.config files in one application: one for the front-end users and a second for the back-end (admin) users. For the admin section I have created a folder named admin in the same website. The following settings are in admin/web.config:
When I try to run the application, I get the following error message:
It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS
The same problem has been discussed in these questions:
ASP.NET What causes: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application...?
It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level
Please give me some suggestions to solve this problem.
Thanks in advance.
I faced the same issue when I had a web application inside another web application, resulting in two web.config files. I deleted one and the issue was solved.
Because configuration settings at the child level can override those at the parent level, this error can occur when you have two web.config files and the child redefines settings that cannot be overridden below application level, such as authentication or session state. The issue happens when authentication or session state is set both in the lower-level web.config and in the higher-level web.config.
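For example, a child web.config like this hypothetical one will trigger the error whenever its folder is not marked as an IIS application, because authentication may only be defined at application level:
<!-- admin/web.config: this section is NOT allowed below application level -->
<configuration>
  <system.web>
    <authentication mode="Forms" />
  </system.web>
</configuration>
The fix is to move such sections to the root web.config (or convert the admin folder into an IIS application); sections like authorization remain legal in subdirectory web.config files.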
There is an enormous amount of information on getting this error in IIS. I have spent hours on issues like this, because there has never been a CLEAR ANSWER.
You will hear things like: it means that if you have an application, you should not configure the MachineToApplication settings in web.config files in the subdirectories. All you have to do is set the MachineToApplication settings in your application's root, and they will be inherited by the subdirectories.
Solution: configure all MachineToApplication settings in your application's root, and remove all MachineToApplication settings from your application's subdirectories.
What all of those answers FAIL TO MENTION is that there is a myriad of file permissions, security, etc. going on with any file in IIS. It used to be easier, but it is getting much, much more complicated.
I have one solution which I have used which got my IIS to work:
1st. Test a simple HTML page in a new web app. If you can get that to work, then you should be good for the next steps.
2nd. When you are thinking about the issue that started this, you have a Machine.config file in the E:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\CONFIG directory for your Framework. This is the file that sets the type of allowed definitions (look for allowDefinition="MachineToApplication" or allowDefinition="Machine" in that config file). When someone says you have to put your web.config in your application's "root" folder, how your application is set up will dictate where to put your web.config. To get mine to work, I put it in the E:\Inetpub\wwwroot folder.
3rd. You will have to test which sections to include in the root folder's config, and which ones you can put in your various web "application" folders.
4th. This is a work in progress, but I wanted to get this out there while I was working on it and it was fresh in my mind. I'll post more later when I iron out more of the issues. I plan to post an extremely detailed explanation later on, once I decipher all of the cryptic, purposefully undocumented, sometimes misleading information on the matter of configs, etc. to get IIS running properly. That is, I'm tired of hacking around trying to get it to behave properly; I want to thoroughly test the system out and document EXACTLY what needs to be done, and why, to get applications up and running quickly.
Thanks for reading this.
You need to set it up as an application in IIS.
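On IIS7 and later this can be done in IIS Manager (right-click the folder and choose "Convert to Application") or scripted with appcmd; a sketch assuming the site is "Default Web Site" and the folder is admin:
%windir%\system32\inetsrv\appcmd add app /site.name:"Default Web Site" /path:/admin /physicalPath:"C:\inetpub\wwwroot\admin"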
I had the same issue when using a Kentico installation. Admittedly this is no silver bullet, but in this case it was because one of the base templates, CorporateSiteAspx, had its own web.config file containing configuration settings that used requirePermission="false" allowDefinition="MachineToApplication".
On excluding this file (which wasn't used anyway), the application built successfully.

How can I get gzip compression in IIS7 working?

I have installed Static and dynamic compression for IIS7, as well as setting the two web.config values at my application Virtual Folder level. As I understand it, I don't need to enable compression at the server, or site level anymore, and I can manage it on a per folder basis using my web.config file.
I have two settings in my .config file that I have set to customize gzip for my app:
<httpCompression dynamicCompressionDisableCpuUsage="90"
                 dynamicCompressionEnableCpuUsage="0">
  <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
  <dynamicTypes>
    <remove mimeType="*/*" />
    <add mimeType="*/*" enabled="true" />
  </dynamicTypes>
</httpCompression>
<urlCompression doDynamicCompression="true"
                dynamicCompressionBeforeCache="true" />
However, when I run the application, I can clearly see that gzip is not used, because my page sizes are the same. I am also using YSlow for Firefox, which also confirms that my pages are not being gzipped.
What am I missing here? In IIS6 it was a simple matter of specifying the file types and setting the compression level between 0 and 10. I don't see any documented need to specify the file types or compression level here, since the defaults seem to cover the file types, and I'm not seeing the level anywhere.
There was a thread on forums.iis.net about this during the IIS 7 beta. It turned out the guy didn't have the modules installed, but it sounds like you've ruled that out from your opening sentence.
Microsoft's key advice for him was to enable failed request tracing to find out what was going wrong. This is possibly one of the most under-appreciated features of IIS7, but certainly one of the most powerful.
Open IIS Manager.
Go to your site, and on the actions pane (the very far right), click 'Failed Request Tracing...' under the 'Configure' section.
Click 'enable'.
Then, in the features view, click 'Failed Request Tracing Rules'. Click Add, then Next; enter 200 for the status code; click Next, then Finish.
If you don't see "Failed Request Tracing" in the actions pane, you'll need to add the feature to the server - either using the "Add Role Services" wizard (Health and Diagnostics\Tracing) or through the Web Platform Installer (Products\Server\IIS: Tracing), and then close and re-open IIS Manager.
Next, rerun your test. This will generate some log info for us to examine.
Look in c:\inetpub\logs\FailedReqLogFiles\w3svcx. You will see a bunch of files named fr000xx.xml. Open up any one of them in your browser. (By the way, if you copy these files anywhere, make sure freb.xsl is there. Also, don't delete freb.xsl - if you do, just delete the whole directory or copy it from another location, as IIS only creates it once per folder.)
Click the 'request details' tab and select 'complete request trace'. Search the page for 'compress' - you should find it in several areas; once for static content, and once for dynamic content.
If you don't find either of them, IIS isn't configured correctly. If you do find them, you should see them followed by a compression_success and a compression_do. Success is self explanatory; the 'do' indicates what it did - in my case, it showed "OriginalSize 1462784 CompressedSize 179482"
Since yours isn't working, hopefully you will see something different that helps you solve the problem.
Make sure you turn this off when you're done by disabling failed request tracing in the actions pane for your website.
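If you prefer configuration over clicking through the UI, the same rule can be expressed in web.config; a sketch that traces status-200 requests and narrows the trace to the compression area (the tracing feature itself must still be enabled on the site):
<system.webServer>
  <tracing>
    <traceFailedRequests>
      <add path="*">
        <traceAreas>
          <add provider="WWW Server" areas="Compression" verbosity="Verbose" />
        </traceAreas>
        <failureDefinitions statusCodes="200" />
      </add>
    </traceFailedRequests>
  </tracing>
</system.webServer>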
We had a similar problem, and it turns out that IIS7 does some dynamic CPU-based throttling here:
http://www.iis.net/ConfigReference/system.webServer/httpCompression
dynamicCompressionDisableCpuUsage
Optional uint attribute.
Specifies the percentage of CPU utilization at which dynamic compression will be disabled.
Note: This attribute acts as an upper CPU limit at which dynamic compression is turned off. When CPU utilization falls below the value specified in the dynamicCompressionEnableCpuUsage attribute, dynamic compression will be reenabled.
The default value is 90.
dynamicCompressionEnableCpuUsage
Optional uint attribute.
Specifies the percentage of CPU utilization below which dynamic compression will be enabled. The value must be between 0 and 100. Average CPU utilization is calculated every 30 seconds.
Note: This attribute acts as a lower CPU limit below which dynamic compression is turned on. When CPU utilization rises above the value specified in the dynamicCompressionDisableCpuUsage attribute, dynamic compression will be disabled.
The default value is 50.
Note the defaults -- if your IIS7 hits 90% CPU usage, it will disable all dynamic gzipped content until CPU usage dips back below 50%!
Also, some great recommendations and benchmarks here on the real CPU cost of GZIP.
http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx
Long story short, unless you regularly have dynamic pages well in excess of 200kb, it's a non-issue.
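If you do want dynamic compression to stay on regardless of load, both thresholds can be pinned in applicationHost.config; a sketch (use with care, since the throttle exists to protect a busy CPU):
<httpCompression dynamicCompressionEnableCpuUsage="100"
                 dynamicCompressionDisableCpuUsage="100" />
With the enable threshold at 100, compression is re-enabled whenever CPU usage is below 100%, which effectively disables the throttling described above.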
Following the excellent advice of JohnW, I too enabled logging to find the culprit, though the reason for the failure turned out to be different:
STATIC_COMPRESSION_NOT_SUCCESS
Reason 14
Reason NOT_FREQUENTLY_HIT
In short, it appears that if you don't hit the page frequently enough, IIS7 will not deem it worthy of compressing, which seems a little odd to me. Nonetheless, it makes sense in this case, because I was just trying to test it on a local machine.
According to this page, the default appears to be that a page has to be hit 2 times within 10 seconds to be a "frequent hit". If you really want to, you can override the default in applicationHost.config (%systemroot%\System32\inetsrv\config). At least for me it's a locked attribute, so you won't be able to override it in your own web.config.
<serverRuntime frequentHitThreshold="1" />
Also, I note now that SO already had this answer here: In IIS7, gzipped files do not stay that way.
I solved my problem by installing dynamic compression via Add/Remove Programs.
In the system.webServer section of your Web.config file, add the following lines inside the staticContent element:
<staticContent>
  <remove fileExtension=".js" />
  <mimeMap fileExtension=".js" mimeType="application/x-javascript" />
</staticContent>
The compression scheme in IIS7 is enabled by default, but it maps only a single JavaScript MIME type to be compressed, application/x-javascript. Adding the lines above tells IIS to give all your .js files that MIME type, which in turn makes the compression work.
Turn on static compression. Dynamic compression is for dynamic pages like ASP, PHP, ASPX, etc.
Here's a link to the IIS config reference for compression: http://www.iis.net/ConfigReference/system.webServer/httpCompression
For me it turned out to be the setting noCompressionForProxies, as we are behind a proxy here. I took myself off the proxy and voilà, compression.

How do I configure IIS to handle really large file uploads?

Platform: IIS 6, ASP.Net 2.0 (.Net 3.5), Server 2003.
I'm building an application that accepts files from a user, processes them, and returns a result. The file is uploaded using HTTP POST to an ASP.Net web form. The application is expecting some large files (hundreds of MB).
I'm using SWFUpload to accomplish the upload with a nice progress bar, but that's not contributing to the issue, because when I bypass it using a standard HTML form pointing at my upload accepter page, I get the exact same error. When using the progress bar, the upload continues to 100%, then fails. With a standard form, the behavior appears to be the same.
I'm having a problem right now uploading a file that's about 150MB. I've changed every setting I can find, but still no luck.
Here's a summary of what I've changed so far:
In Web.config:
Added this inside system.web:
<httpRuntime executionTimeout="3600" maxRequestLength="1536000"/>
In machine.config:
Inside system.web, changed:
<processModel autoConfig="true" />
to:
<processModel autoConfig="true" responseDeadlockInterval="00:30:00" responseRestartDeadlockInterval="00:30:00" />
and in MetaBase.xml:
Changed:
AspMaxRequestEntityAllowed="204800"
to:
AspMaxRequestEntityAllowed="200000000"
When the upload fails, I get a 404 error from IIS. My web form does not begin processing, or at least, it doesn't make it to the Page_Load event. I threw an exception at the beginning of that handler, and it doesn't execute at all on large files.
Everything works fine with smaller files (I've tested up to about 5.5MB). I'm not exactly sure what file size is the limit, but I know that my limit needs to be higher than 150MB, since this is not the largest file that the client will need to upload.
Can anyone help?
UrlScan was active on all websites, and it has its own request entity length limit. I wasn't aware that UrlScan was running on our server because it was installed as a global ISAPI filter, not on my individual website.
Note: to locate global ISAPI filters, right click on the Web Sites folder in IIS Admin and click Properties, then on the ISAPI Filters tab.
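If you need to keep UrlScan installed, its limit can be raised instead of removing the filter; a sketch assuming UrlScan 3.x, where the setting lives in UrlScan.ini and the value is in bytes:
[RequestLimits]
; 209715200 bytes = 200 MB
MaxAllowedContentLength=209715200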
(A note for googlers):
For IIS7, add the following to web.config (I added it above <system.serviceModel>):
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes. Defaults to 30,000,000. -->
      <requestLimits maxAllowedContentLength="262144000" />
    </requestFiltering>
  </security>
</system.webServer>
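Note that on IIS7 two limits apply to the same upload: requestFiltering's maxAllowedContentLength (in bytes) and ASP.NET's maxRequestLength (in KB); the smaller one wins, so it is worth setting both together. A sketch allowing 250 MB in both units:
<system.web>
  <!-- 256000 KB = 262,144,000 bytes = 250 MB -->
  <httpRuntime executionTimeout="3600" maxRequestLength="256000" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="262144000" />
    </requestFiltering>
  </security>
</system.webServer>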
When we ran into this issue we had to increase the buffer size limit according to this KB article:
http://support.microsoft.com/kb/944886/en-us
I know this mentions ASP, but I believe it worked for ASP.NET as well.
Edit: Here is a link that might be more relevant to your issue and provide other options:
http://weblogs.asp.net/jgalloway/archive/2008/01/08/large-file-uploads-in-asp-net.aspx
Regarding the 404 and the missing Page_Load: IIS can only process the request once the complete POST is on the server. Therefore, if the POST fails (due to its size), it cannot fire the page's events.
You might try NeatUpload http://www.brettle.com/neatupload.
From the Manual: "By default, NeatUpload does not directly limit the size of uploads."
You can also try Velodoc XP Edition which has several advantages over NeatUpload including the fact that it uses ASP.NET Ajax extensions. See also the Velodoc web site for more information.
You say:
<httpRuntime executionTimeout="3600" maxRequestLength="1536000"/>
Note that maxRequestLength is specified in KB, so 1536000 is roughly 1.5 GB, not 1.5 MB.
