Our company has an order page with code in the code-behind that sends an email confirmation of an order using the SmtpClient class. It works fine. The problem is that when there is a failure with our Exchange server, the email send errors. We catch it, log it, and notify the customer another way. It is usually caused by our backup server running, which interferes with email sends from time to time. How could we have the email try again every 30 minutes? I should note that all of our code is in our pages, so we were thinking of creating a separate email class that the page could call; it would process the request in the background while the page was allowed to finish loading. Any advice would be appreciated.
You could have a timer that starts each time an email fails to send, trying to send the email again every thirty minutes; once it succeeds, the timer stops. However, this could be taxing on your system if there are a lot of emails.
EDIT: You could add the failed emails to a list or to a database, and then add a function to your code (or make a new app) that checks each failed email's last attempted send time. If it is >= 30 minutes ago, attempt to send those emails again.
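A minimal sketch of that second (database) approach, using Python and SQLite for brevity; the names `queue_failed_email` and `retry_pending` are invented, and on an ASP.NET site the same shape would live in a small service or scheduled task rather than the page:

```python
import sqlite3
import time

RETRY_INTERVAL = 30 * 60  # seconds between attempts

def init_db(conn):
    conn.execute("""CREATE TABLE IF NOT EXISTS failed_emails (
        id INTEGER PRIMARY KEY,
        recipient TEXT,
        subject TEXT,
        body TEXT,
        last_attempt REAL,
        sent INTEGER DEFAULT 0)""")

def queue_failed_email(conn, recipient, subject, body):
    # Called from the page's catch block when the initial send fails.
    conn.execute(
        "INSERT INTO failed_emails (recipient, subject, body, last_attempt) "
        "VALUES (?, ?, ?, ?)",
        (recipient, subject, body, time.time()))

def retry_pending(conn, send, now=None):
    """One pass of the retry job: resend every queued email whose last
    attempt is >= 30 minutes old. `send` is whatever actually delivers
    the mail; it should raise on failure."""
    now = time.time() if now is None else now
    cutoff = now - RETRY_INTERVAL
    rows = conn.execute(
        "SELECT id, recipient, subject, body FROM failed_emails "
        "WHERE sent = 0 AND last_attempt <= ?", (cutoff,)).fetchall()
    for row_id, recipient, subject, body in rows:
        try:
            send(recipient, subject, body)
            conn.execute("UPDATE failed_emails SET sent = 1 WHERE id = ?",
                         (row_id,))
        except Exception:
            # Still failing: record the attempt so we wait another 30 minutes.
            conn.execute("UPDATE failed_emails SET last_attempt = ? WHERE id = ?",
                         (now, row_id))
```

Running `retry_pending` from a scheduled task every few minutes gives the 30-minute retry behavior without keeping any timers alive inside the web app.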
Background
I have a batch processing system that needs to send out messages (via SMS/Email) to groups of people.
As our message publishing system is fairly slow, when the user hits the "send" button, the system posts all the message information into a database with a "batch ID" and then makes an asynchronous call (WebRequest.BeginGetRequest) to a "ProcessBatch" ASHX handler, with the batch ID as a URL request parameter.
This releases the front-end page back to the user to do the next batch of messages, as the users don't actually need any feedback; however, the recording in the database is subsequently used in a reporting module.
In the meantime, the batch process handler simply cycles through the records from the database for the given batch ID and posts the messages to our (slow) message publisher sequentially.
The Problem
The problem is that during the batch processing, ASP.NET throws a
System.Threading.ThreadAbortException: Thread was being aborted.
halfway through, and the remaining messages are not sent.
I have checked IIS and the app pool recycle interval is set at the default 1740 minutes, so is there anything else that would cause this?
Or is there a more appropriate way to approach this?
Have you tried increasing executionTimeout under httpRuntime in web.config?
The default value is 90 or 110 seconds (depending on the .NET version).
Perhaps your ASHX handler requires more time to finish its job:
http://msdn.microsoft.com/en-us/library/vstudio/e1f13641(v=vs.100).aspx
Edit: in general it's not a good idea to set a very long executionTimeout. As other users suggested, consider developing a Windows Service to do the long jobs.
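For reference, the fragment might look something like this (600 seconds = 10 minutes; note that executionTimeout only takes effect when debug="false"):

```xml
<!-- Hypothetical web.config fragment: raises the request timeout to 10 minutes. -->
<system.web>
  <httpRuntime executionTimeout="600" />
</system.web>
```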
I have tried a lot of variations to return the response to the user quickly and send the email in the background, but failed. Here is one sample that I have tried:
ThreadPool.QueueUserWorkItem(state =>
{
    // The try/catch is important: an unhandled exception on a
    // thread-pool thread would crash the whole process.
    try { client.SendAsync(message, message); }
    catch (Exception ex) { /* log the failure */ }
});
When I put a breakpoint on client.SendAsync, no response is sent to the user until this line executes. So what is the proper way to send email in the background without delaying the response?
Update: I think I found the issue. It's actually the VS debugger that suspends the ASP.NET thread until you step through. The above code works for my scenario.
You could place the email in a Message Queue and process the actual sending of it in a separate service that reads from that Message Queue.
Depending upon the number and size of the emails, you might also consider writing them to a pickup folder. You can use a local IIS SMTP server to relay the mail through a smart host. In practice, this has worked very well for us.
Send Email Asynchronously with ASP.NET http://www.asp.net/web-forms/videos/how-do-i/how-do-i-send-email-asynchronously-with-aspnet
I am using OpenPop.NET in my application. What this application does is download mails from a POP3 account, save all the attachments (CSV files), and process them. This processing takes a lot of time. I am getting this exception which I am not able to figure out:
Exception message: OpenPop.Pop3.Exceptions.PopServerException: The stream used to retrieve responses from was closed
at OpenPop.Pop3.Pop3Client.IsOkResponse(String response)
at OpenPop.Pop3.Pop3Client.SendCommand(String command)
at OpenPop.Pop3.Pop3Client.DeleteMessage(Int32 messageNumber)
At the end of processing the CSVs, the mails are deleted from the POP3 account. I believe this is where this exception is happening.
You really have two issues here.
One is that you are doing a lot of processing while staying connected to the POP3 server. When you are idle for too long, the server will simply disconnect you to save resources.
What you should do is fetch one email, process the attachments, and then reconnect to fetch the next. You could also fetch all the attachments first and then process them offline.
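The fetch-one-then-reconnect pattern might be sketched like this, using Python's poplib for illustration (`process_one_per_connection` is a made-up name; the connection factory is injected so each session stays short and the slow CSV work happens between sessions):

```python
def process_one_per_connection(connect, handle):
    """Drain a mailbox one message per POP3 session. Each session only
    fetches and deletes a single message, so the connection is never
    left idle during slow attachment processing.

    Note the trade-off: the message is deleted before it is processed;
    to be safer, save the raw bytes to disk first."""
    while True:
        client = connect()  # e.g. a logged-in poplib.POP3_SSL client
        try:
            if len(client.list()[1]) == 0:
                return
            # Always take the first remaining message; after QUIT commits
            # the delete, the next session sees the following message as 1.
            raw = b"\n".join(client.retr(1)[1])
            client.dele(1)
        finally:
            client.quit()
        handle(raw)  # slow CSV work runs offline, between sessions
```

The same shape works with OpenPop.NET: connect, Pop3Client.GetMessage, DeleteMessage, Disconnect, then process.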
Second, I guess you are connecting to a Gmail account. Gmail has some weird characteristics, and there is a thread that tries to pin these down. One of them is that once you have fetched an email, it will not be available in the next POP3 session with the server. You can connect using a special username, prepending recent: to your normal username. This will show you the emails received in the last 30 days, even if they have already been shown to an earlier POP3 session.
Hope it helps.
It sounds like something is trying to read a stream that has already been closed. Are you handling the streams at all, or is this done completely internally by the API? If you are handling them, there is a chance you are closing the streams yourself (this often happens when someone uses a StreamReader; most people don't realize that closing the StreamReader also closes the underlying stream).
I have an ASP.NET MVC application that utilizes NHibernate and SQL Server 2008 on the backend. There are requirements for sending out both event-driven notifications on a daily/weekly basis AND general notifications on a weekly basis.
Here is an example of how the event-driven workflow needs to work:
Employee(s) create a number of purchase orders.
An e-mail notification is sent daily to any supervisor with a subordinate employee that has made a purchase order, listing all purchase orders created by subordinates that require his approval. The supervisor should only get this once (e.g. if Employee A creates a PO, his supervisor should not get an e-mail EVERY DAY until he approves it). Also, the list of purchase orders should ONLY include those which the supervisor has NOT taken action against. If no purchase orders need approval by a given supervisor, they should not get an e-mail.
An e-mail notification is sent daily to Dept. Managers with a list of all purchase orders APPROVED by subordinate supervisors in a similar fashion to #2 above.
Any time any action is taken with regard to approving a PO by a supervisor or dept. manager, the employee should get a daily e-mail notification listing ALL such changes. If there are none for a given employee, they should not get an e-mail at all.
So given such a workflow:
What is the best way to schedule such notifications to happen daily, weekly or even immediately after an event occurs?
How would you ensure that such event-driven notifications ONLY get delivered once?
How would you handle exceptions to ensure that failed attempts to send e-mail are logged and so that an attempt could be made to send the following day?
Thanks!
I would have all your emails, notifications, etc. saved to a DB table first, and then have a service that polls for new entries in that table and handles the actual sending of the email/notification.
To address your specific situations, you could have controllers write to the DB when an email/notification is required, and have a service that does your interval/event-specific checks also write to the DB to create new email. This way your application and service don't really care about how or when these notifications happen; they are just saying, "Hey, do something," and the emailer/notification service does the actual implementation.
The advantage to this is that if your email provider is down you don't lose any emails, and you have a history of all emails sent, with details of when, to whom, etc. You can also rip out or change the emailer to do more, like sending to Twitter or a phone text message. This effectively decouples your notifications from your application.
All applications I have made recently use this type of model, and it has stopped emails from being lost due to service failures and other reasons. It has also made it possible to look up all emails that have gone through the system and gather metrics, which lets me optimize the need to send emails by storing extra information in the email record, like the reason it was sent, whether it reports an error, etc. Additions such as routing notifications (e.g. go to text message instead of email) based on time of day or user have been possible with no changes to the primary application.
Your customer might think all they need is email today, but you should make sure your solution is flexible enough to allow for more than just email in the future with just minor tweaks.
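A toy version of this outbox pattern, with Python/SQLite standing in for the real stack (the table and function names are invented): the web app only ever inserts rows, and a separate service owns the status transitions, which is what guarantees each notification is delivered only once.

```python
import sqlite3

def make_outbox(conn):
    conn.execute("""CREATE TABLE IF NOT EXISTS outbox (
        id INTEGER PRIMARY KEY,
        recipient TEXT NOT NULL,
        subject TEXT NOT NULL,
        body TEXT NOT NULL,
        reason TEXT,                          -- why it was sent, for metrics
        status TEXT NOT NULL DEFAULT 'pending')""")

def enqueue(conn, recipient, subject, body, reason):
    # Called by controllers (or the daily/weekly check job); nothing is
    # actually sent here, so the request stays fast.
    conn.execute(
        "INSERT INTO outbox (recipient, subject, body, reason) "
        "VALUES (?, ?, ?, ?)", (recipient, subject, body, reason))

def poll_and_send(conn, send):
    """One polling pass of the mailer service. Flipping the row to 'sent'
    in the same pass is what makes delivery once-only."""
    for row_id, recipient, subject, body in conn.execute(
            "SELECT id, recipient, subject, body FROM outbox "
            "WHERE status = 'pending'").fetchall():
        try:
            send(recipient, subject, body)
            conn.execute("UPDATE outbox SET status = 'sent' WHERE id = ?",
                         (row_id,))
        except Exception:
            # Mark it 'failed' so a separate retry pass (e.g. the next day)
            # can pick it up, and this pass doesn't loop on it.
            conn.execute("UPDATE outbox SET status = 'failed' WHERE id = ?",
                         (row_id,))
```

The `reason` column is what later lets you report on which notifications were sent, to whom, and why.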
You can add a normal action in a controller:
Function SendEmails() As ActionResult
    Dim result As String = ""
    ' Big timeout to handle the load: ten minutes
    HttpContext.Server.ScriptTimeout = 60 * 10
    result = DoTheActualWork()
    ' Returns text/plain
    Return Content(result)
End Function
And then call the page from a scheduled task. It can be a scheduled task on the server or on any machine. Use a .vbs file for this:
SendEmails.vbs:
' Continue past errors instead of aborting the script.
On Error Resume Next
' Declare variables
Dim objRequest
Dim URL
Set objRequest = CreateObject("Microsoft.XMLHTTP")
' Put together the URL link, appending the variables.
URL = "http://www.mysite.com/system/sendemails"
' Open the HTTP request and pass the URL to the objRequest object
objRequest.open "POST", URL, False
' Send the HTTP request
objRequest.Send
' Set the object to nothing
Set objRequest = Nothing
You can host a Windows Workflow or a Windows Service, and set up a message queue to process these events. You might just use the database for your message queue, or you could use MSMQ, or use triggers in the database. But this kind of functionality really shouldn't be a responsibility of your front-end web app. If push comes to shove, you could spawn off another thread in your ASP.NET application to process this queue.
This can be done using SQL Server Agent; read more about it here:
http://msdn.microsoft.com/en-us/library/ms189237.aspx
Sounds like a job for a service or a scheduled job.
You don't want to do it in ASP.NET because you'd have to configure IIS to keep your app alive all the time, which may not be the best idea.
A scheduled task is fine, but you'll have to program it with all the logic for parsing out your data, which is not the best for separation of concerns. Also, you'll have to update two code bases if something changes.
A service isn't ideal, as it would only truly be doing something once a day. But you could set up a WCF service and have the website queue up emails using the service.
After clicking a button I want to fire an Ajax call and then redirect to another page without waiting for the Ajax result. Is it possible? My code is as follows:
$('button').click(function(){
$.get('mailer.php', function(){
window.location.replace("result.php");
});
});
The code above waits for the Ajax result before redirecting. Can I just let the Ajax call run in the background and move on to other pages?
It looks like the script you're AJAX-ing to is sending an e-mail. I can see how waiting for that to complete would result in a lag on the user's end. Here's how I typically solve this problem:
Instead of having your mailer.php script immediately send the e-mail, store the message data in your DB. Don't send the e-mail during the AJAX call. This makes the process very fast for the user.
Then you can set up a CRON job that points to a script you have created specially for sending out the e-mails that have been stored in the DB.
Make sense?
If you move to another page, the connection to the server might be dropped (there is no rule preventing the browser from doing so) and the action may not be completed. In short: you have to wait.
If you leave immediately, the request will be abandoned. You would need to include logic on the server to keep whatever process going that was started by the jquery ajax request.
For example, in PHP there exists the ignore_user_abort() function, which, when set to true, will continue a process even after the user has aborted the request.