Operation timed out error when sending files larger than 2 MB as e-mail attachments (using the SMTP client) in ASP.NET 2.0
It would be good if you could provide more context/info with your question, but nonetheless I will take a guess - I'm assuming you want to know why it's timing out and how to resolve it.
Well, it's simple: the SMTP transaction has a timeout value in ASP.NET (like most things), and you will need to change it to satisfy your needs.
You need to set the Timeout property on the SmtpClient object to a higher value.
More info here
The property is in milliseconds (the default is 100,000 ms, i.e. 100 seconds).
You will also need to increase the ASP.NET maxRequestLength setting (on the <httpRuntime> element) in web.config.
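A minimal sketch of both changes; the server name, file path and values here are only illustrative, not tuned recommendations:

using System.Net.Mail;

MailMessage message = new MailMessage("from@example.com", "to@example.com", "Report", "See attachment");
message.Attachments.Add(new Attachment(@"C:\reports\big-report.pdf"));

SmtpClient client = new SmtpClient("smtp.example.com");
client.Timeout = 300000; // milliseconds; the default is 100000 (100 seconds)
client.Send(message);

And in web.config (maxRequestLength is in KB, executionTimeout in seconds):

<system.web>
  <httpRuntime maxRequestLength="10240" executionTimeout="300" />
</system.web>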
I came across a scenario where my manager is asking me to increase the session timeout, since a particular user is taking a lot of time providing details to make an order and thus loses the session state by the time they place the order.
I am just wondering if there are going to be any repercussions to increasing the session timeout, since this is an application-level setting, not a page-level setting.
Session timeout configuration depends on the kind of website you are running. Unless you are handling payments, or the same product is used by multiple users, you can set the session timeout to whatever suits you best.
It may consume more memory, but it is fine to increase.
At a customer of ours, candidates take tests with our software. When their test is finished, some calculations are done on the server. Now, sometimes 200 candidates can end their test at the same time, so 200 calculations run concurrently. The calculations all seem to go fine, but some calls to the IIS7 server get back an HTTP error...
In Flex, this is the error:
code = "NetConnection.Call.Failed"
description = "HTTP: Status 200"
details = "http://servername/weborb.aspx"
level = "error"
Isn't status 200 OK? So what's wrong here? Is it even an IIS7 problem? Of the 200 candidates, 20 got this message. When they restarted their test, everything worked fine.
I have found this on the subject, but I wonder if it has anything to do with my problem (next week our customer will do some stress tests, and I have already asked them to test whether the solution in that post works).
Some questions:
Can it be that IIS7 blocks certain HTTP calls when the load is too high?
How can you tell that IIS7 blocked those calls because of too much load?
Is it possible to configure these things?
Technically, in the future I would like to queue the calculations, but for now, there isn't time nor budget for that.
Application: Flex, WebORB, ASP.NET, IIS7 and SQL Server 2008. The server is Windows Server 2008.
This problem seems very familiar to me. We have a bunch of Flex widgets connected to a single server side, and it sometimes also returns "NetConnection.Call.Failed". For us, it seems that IIS (and the MSSQL server behind it) cannot process all the requests in time, hence some of them time out.
Try to check how much time each request (and all requests together) takes, then check your timeout settings.
There are plenty of things you can do to fine tune the performance of both your server and IIS.
To answer your questions:
A maximum concurrent connections limit (plus other settings) in IIS 7 can be configured by selecting your website in IIS Manager and selecting 'Advanced Settings' in the Actions Pane on the right. Though by default this is a number much higher than 200.
Looking in the IIS log files, specifically the return status codes can give you an indication of what went wrong. Equally the Windows event log should also tell you of any exceptions that have occurred.
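If you prefer editing configuration directly, those same limits live under the site's <limits> element in applicationHost.config; roughly like this (the values shown are only illustrative):

<sites>
  <site name="Default Web Site" id="1">
    <!-- corresponds to the Limits entries under Advanced Settings in IIS Manager -->
    <limits connectionTimeout="00:02:00" maxConnections="4294967295" />
  </site>
</sites>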
I suggest you turn on load balancing between instances of IIS, or consider using nginx for load balancing.
Also set the limit of 200 users higher. In IIS, each user connecting to your application counts as one user instance, so at some point you will use up the 200 user slots. This is the default setting, and you can set it to a much higher number.
Also set your timeout to a higher value.
Also look at Comet if you are trying to push continuously updating data (stock prices, weather, chat, shoutbox).
Technically, in the future I would like to queue the calculations, but for now, there isn't time nor budget for that.
A queue isn't that hard to put together with a batch-processing script running off Windows' scheduled tasks. Just dump results into a SQL DB, or if you're really lazy, insert rows in SQL with a serialized array, then have them "come back" to see their results. "Please wait, your results are still processing."
It'd take you less time than waiting around on SO for a silver-bullet answer in my opinion.
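For what it's worth, a rough sketch of that batch idea; every name below (the PendingCalculations table, its columns, the connection string) is invented purely for illustration, and the program would be run from a scheduled task every minute or so:

using System.Collections.Generic;
using System.Data.SqlClient;

class CalculationWorker
{
    static void Main()
    {
        List<KeyValuePair<int, string>> pending = new List<KeyValuePair<int, string>>();
        using (SqlConnection conn = new SqlConnection("...your connection string..."))
        {
            conn.Open();

            // Grab a batch of queued work.
            using (SqlCommand cmd = new SqlCommand(
                "SELECT TOP 50 Id, Payload FROM PendingCalculations WHERE Status = 'Pending'", conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    pending.Add(new KeyValuePair<int, string>(reader.GetInt32(0), reader.GetString(1)));
            }

            foreach (KeyValuePair<int, string> item in pending)
            {
                // ... run the actual calculation on item.Value here ...

                using (SqlCommand update = new SqlCommand(
                    "UPDATE PendingCalculations SET Status = 'Done' WHERE Id = @id", conn))
                {
                    update.Parameters.AddWithValue("@id", item.Key);
                    update.ExecuteNonQuery();
                }
            }
        }
    }
}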
I have been experiencing an error that I believe is caused by the database timing out due to a large amount of data being processed and written to the database.
I keep getting this error message:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
I timed how long it takes to timeout and it is constantly around 60 seconds. Hence, I thought that it might have something to do with the transaction timeout limit (default 60s) set in Component Services (Windows XP). I increased it to 300 seconds.
When that didn't work, I edited the machine.config file by adding:
<system.transactions>
<machineSettings maxTimeout="02:00:00" />
</system.transactions>
This did not work either.
I don't believe it has anything to do with my data. It is read from an Excel spreadsheet, and it runs fine when I cut the spreadsheet into two separate files.
Hopefully, I'm just missing something simple like another max timeout setting somewhere.
Hopefully, somebody has run into this before!
EDIT: I am using SQL Server and Linq2SQL.
Andrew, I don't think this is caused by a timeout. Otherwise you would receive a specific timeout error. More than likely this is a programming error. I've encountered this myself, and almost always it was my poorly crafted code causing the issue.
I think you have another issue. It's not a good idea to have transactions running for this long. If your company has DBA's, they will likely, for good reason, throw fits over this. You're locking a lot of resources for a long period of time. Something is going to suffer for this.
BTW, if you're concerned about timeouts, check the timeout setting on your connection string.
Randy
I'm not sure why, but I fixed the problem by going into web.config and adding:
<system.transactions>
<defaultSettings timeout="02:00:00"/>
</system.transactions>
I thought that this setting would be inherited from machine.config. Perhaps they are two different timeout settings? I don't know.
If anyone has additional clarification, please comment!
EDIT 1: Also, if anyone is using ASP.NET Ajax controls, be sure to increase the script manager's AsyncPostBackTimeout property to accommodate a longer period as well.
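For example (the value is in seconds; the default is 90):

<asp:ScriptManager ID="ScriptManager1" runat="server" AsyncPostBackTimeout="7200" />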
EDIT 2: I removed the lines I added to the machine.config and reset the distributed transaction timeout setting to its default. This appeared to have no effect, and my program ran fine with just the changes to the web.config file and the script manager.
To change the timeout, you can set the CommandTimeout property in your data context:
var db = new YourDataContext();
db.CommandTimeout = 300; // the value is in seconds
Having said that, any time you have a distributed transaction, it's worth taking a careful look at the reasons why, and trying to avoid them if at all possible -- your issue may well be related to that rather than a timeout....
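If the work really does need a distributed transaction, you can also set the timeout explicitly on the TransactionScope itself rather than machine-wide; a minimal sketch (requires a reference to System.Transactions, and the 10-minute value is just an example - it is still capped by the machine.config maxTimeout):

using (var scope = new System.Transactions.TransactionScope(
    System.Transactions.TransactionScopeOption.Required, TimeSpan.FromMinutes(10)))
{
    var db = new YourDataContext();
    db.CommandTimeout = 600; // seconds
    // ... queries / SubmitChanges here ...
    scope.Complete();
}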
I am trying to use ChannelFactory and cache it in my ASP.NET MVC application. I am using the PerSession instance mode as I need to keep state. Because of this I cannot close the proxy immediately, and I don't want to reopen and close the proxy every time.
If I leave the proxy open, it times out on the 12th call. I can increase the concurrent session timeout, but I want to know if that is the right approach to take.
I am new to WCF, so pardon me if my question is stupid.
Thanks in advance,
Pratt
The answer may be to activate the slidingExpiration property in the forms authentication element, although by default this is turned on. With this, after each call the timer is reset to the timeout value, so the session stays active whilst it's in use.
See this MSDN Link: Forms Authentication & slidingExpiration property
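For reference, the relevant web.config section looks roughly like this (timeout is in minutes):

<system.web>
  <authentication mode="Forms">
    <forms loginUrl="Login.aspx" timeout="30" slidingExpiration="true" />
  </authentication>
</system.web>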
EDIT - response to comment:
Yes, when the session timeout is reached you will need to reauthenticate before being able to access the services again. You should set the timeout value to the length of inactivity (in minutes) after which you would consider the user no longer active (the default is 30 minutes); the sliding expiration will then reset this value as long as the user keeps calling. I'd try doing some simple tests with the timeout set to 1 minute with different scenarios to prove it to yourself.
By default the session expiry seems to be 20 minutes.
Update: I do not want the session to expire until the browser is closed.
Update2: This is my scenario. User logs into site. Plays around the site. Leaves computer to go for a shower (>20 mins ;)). Comes back to computer and should be able to play around. He closes browser, which deletes session cookie. The next time he comes to the site from a new browser instance, he would need to login again.
In PHP I can set session.cookie_lifetime in php.ini to zero to achieve this.
If you want to extend the session beyond 20 minutes, you change the default using the IIS admin or you can set it in the web.config file. For example, to set the timeout to 60 minutes in web.config:
<configuration>
<system.web>
<sessionState timeout="60" />
... other elements omitted ...
</system.web>
... other elements omitted ....
</configuration>
You can do the same for a particular user in code with:
Session.Timeout = 60
Whichever method you choose, you can change the timeout to whatever value you think is reasonable to allow your users to do other things and still maintain their session.
There are downsides of course: for the user, there is the possible security issue of leaving their browser unattended and having it still logged in when someone else starts to use it. For you there is the issue of memory usage on the server - the longer sessions last, the more memory you'll be using at any one time. Whether or not that matters depends on the load on your server.
If you don't want to guesstimate a reasonable extended timeout, you'll need to use one of the other techniques already suggested, requiring some JavaScript running in the browser to ping the server periodically and/or abandon the session when a page is unloaded (provided the user isn't going to another page on your site, of course).
You could set a short session timeout (eg 5 mins) and then get the page to poll the server periodically, either by using Javascript to fire an XmlHttpRequest every 2 minutes, or by having a hidden iframe which points to a page which refreshes itself every 2 minutes.
Once the browser closes, the session would timeout pretty quickly afterwards as there would be nothing to keep it alive.
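On the server side, the page being polled only needs to touch the session so that ASP.NET resets its timer; a minimal sketch of such a keep-alive handler (the KeepAlive.ashx name is just an example):

<%@ WebHandler Language="C#" Class="KeepAlive" %>

using System;
using System.Web;
using System.Web.SessionState;

public class KeepAlive : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        // Touching the session is enough to keep it alive.
        context.Session["LastPing"] = DateTime.Now;
        context.Response.ContentType = "text/plain";
        context.Response.Write("OK");
    }

    public bool IsReusable { get { return true; } }
}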
This is not a new problem; there are several scenarios that must be handled if you want to catch all the ways a session can end. Here are general examples of some of them:
The browser instance or tab is closed.
User navigates away from your website using the same browser instance or tab.
The user loses their connection to the internet (this could include power loss to the user's computer or any other means).
User walks away from the computer (or in some other way stops interacting with your site).
The server loses power/reboots.
The first two items must be handled by the client sending information to the server; generally you would use JavaScript to navigate to a logout page that quickly expires the session (a small sketch of such a page follows below).
The third and fourth items are normally handled by setting the session state timeout (it can be any amount of time). The amount of time you use is based on finding a value that allows the users to use your site without overwhelming the server. A very rough rule of thumb could be 30 minutes plus or minus 10 minutes. However the appropriate value would probably have to be the subject of another post.
The fifth item is handled based on how you are storing your sessions. Sessions stored in-process will not survive a reboot, since they are in the server's RAM. Sessions stored in a database or cookie would survive the reboot. You can handle this as you see fit.
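For the first two items, the server side of that quick logout can be as simple as a page whose only job is to abandon the session; a sketch (the page name and redirect target are just examples):

// Logout.aspx.cs - a hypothetical page the client navigates to (or pings) on unload
using System;

public partial class Logout : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Session.Abandon();                 // end the server-side session immediately
        Response.Redirect("Default.aspx"); // or return nothing if it's an async ping
    }
}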
In my limited experience when this issue has come up before, it's been determined that just setting the session timeout to an acceptable value is all that's needed. However it can be done.
This is the default. When you have a session, it is stored using a "session cookie", which is automatically deleted when the browser is closed.
If you want to keep the session across two browser sessions, you have to set Cookie.Expires to a date in the future.
Because the session you are talking about is stored by the server, and not the client, you can't do what you want.
But consider not using ASP.NET server side session, and instead only rely on cookies.
Unfortunately, due to the disconnected nature of the web and the fact that there is no permanent link between a web server and a user's browser, it is impossible to tell when a user has closed their browser. There are events and JavaScript you can use (e.g. onunload) to place a call back to the server, which in turn could 'kill' the session - Session.Abandon();
You can set the timeout length of a session in web.config; remember this timeout is based on the time since the last call to the server was placed by the user's browser.
(The case where the browser itself times out was not covered above.)
There's no way to explicitly clear the session if you don't communicate in some way between the client and the server at the point of window closing, so I would expect sending a special URI request to clear the session at the point of receiving a window close message.
My Javascript is not good enough to give you the actual instructions to do that; sorry :(
You can't, as you cannot control how the HTML client responds.
Actually, why do you need to do so? As long as no one can pick up the session to use it again, it will expire after those 20 minutes. If resources matter, set a more aggressive session expiry (most hosting companies do that, which is horribly annoying) or put fewer objects in the session. Try to avoid storing any kind of object; instead just store the keys for retrieving them (a small sketch follows below). That is an important design point, as it helps you scale your session to a state server when you get big.
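A tiny sketch of that "store keys, not objects" idea, written as if inside a page's code-behind; the Order and OrderRepository names are invented purely for illustration:

// Instead of Session["CurrentOrder"] = bigOrderObject; keep only the key:
public void RememberOrder(Order order)
{
    Session["CurrentOrderId"] = order.Id;
}

public Order GetCurrentOrder()
{
    int orderId = (int)Session["CurrentOrderId"];
    return OrderRepository.GetById(orderId); // re-load from the database or a cache
}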
Correct me if I am misreading the intent of your question, but the underlying question seems to be less about how to force the session to end when a user closes the browser and more about how to prevent a session from ending until the browser is closed.
I think the real answer to this is to re-evaluate what you are using sessions to do. If you are using them to maintain state, I agree with the other responses that you may be out of luck.
However, a preferred approach is to use a persistent state mechanism with the same scope as the browser session such as a cookie that expires when the browser is closed. That cookie could contain just enough information to re-initiate the session on the server if it has expired since the last request. Combined with a relatively short (5-10 min) session timeout, I think this gives you the best balance between server resource usage and not making the user continually "re-boot" the site.
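A rough sketch of that approach, assuming a hypothetical opaque token cookie and a UserStore helper that can rebuild the session from it (all names invented for illustration):

using System.Web;

public static class SessionBootstrap
{
    // Call this early in each request, e.g. from a base page or Global.asax.
    public static void EnsureSession(HttpContext context)
    {
        if (context.Session["UserId"] != null)
            return; // session is still alive, nothing to do

        HttpCookie cookie = context.Request.Cookies["UserToken"];
        if (cookie != null)
        {
            // The session expired, but the browser-session cookie is still present:
            // rebuild just enough state to carry on without a new login.
            context.Session["UserId"] = UserStore.ResolveToken(cookie.Value); // hypothetical lookup
        }
    }

    // When the user signs in: no Expires is set, so the cookie
    // disappears when the browser closes.
    public static void IssueCookie(HttpContext context, string token)
    {
        context.Response.Cookies.Add(new HttpCookie("UserToken", token));
    }
}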
Oh you have rewritten the question.
That one is absolutely feasible, as long as JavaScript is alive. Any timed Ajax call will do. Check the Prototype library (http://www.prototypejs.org) PeriodicalExecuter, or jQuery with an Ajax + timer plugin. Set up a dummy page which your executor calls from time to time, so your session is always alive unless the user logs out (kill the Ajax timer at the same time) or closes the browser (which means the executor is killed anyway).