Emails via Rules are sent multiple times - Drupal

I created a rule to send an email to the node's author after the node is saved, both for new nodes and for changes to existing ones.
But the emails are sent multiple times, sometimes all at once and sometimes spread over about 3 hours. Sometimes it's 10 emails, sometimes 20.
I don't know where to look for the cause.

You might want to turn on debugging for Rules (admin/config/workflow/rules/settings) and check the logs after the emails are sent (admin/reports/dblog).
After hours of trying to solve a similar problem, I found that some triggers had been defined automatically for my workflow (admin/structure/trigger/workflow); they were saving and publishing my node again, creating a recursion. Drupal stops after a few iterations of this, which is why a seemingly random number of messages gets sent. My server was sending over 40 emails every time I changed workflow states.
Also, please look at your Rules page and make sure you don't have any contradictory rules in your workflow that make it run the same actions all over again.

Related

Azure Service Bus - Renew message lock automatically when using ServiceBusReceiver

Having spent long hours searching for documentation and help on this without result, I have decided to reach out to the community.
I would like to read messages from a topic subscription. Each message is used to populate a UI for a human to work on. Processing a message takes approximately 15 minutes, and each client can work on only one message at a time. After processing a message, the client can either stop processing messages or request a new one.
With the max lock time set at 5 minutes on the subscription, I need to be able to automatically renew my lock for up to 15 minutes.
The first approach I tried was to use CreateReceiver, fetch a message, read it, and complete it when done. The issue is that I have not been able to figure out how to automatically renew the lock for 15 minutes. I see the RenewLockAsync function, but I would like the renewal to be automatic rather than having to run a background timer to track the expiring lock.
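For reference, a minimal sketch of what that manual-renewal loop could look like, assuming the current Azure.Messaging.ServiceBus package (where the receiver method is named RenewMessageLockAsync); the connection string, entity names, and UI helper are all hypothetical:

    using System;
    using System.Threading;
    using System.Threading.Tasks;
    using Azure.Messaging.ServiceBus;

    class ManualLockRenewal
    {
        static async Task Main()
        {
            // Hypothetical connection and entity names.
            await using var client = new ServiceBusClient("<connection-string>");
            ServiceBusReceiver receiver = client.CreateReceiver("<topic>", "<subscription>");

            ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();

            // Budget 15 minutes for the human to finish; cancel renewal after that.
            using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(15));

            // Renew the lock every 4 minutes, safely below the 5-minute lock duration.
            Task renewal = Task.Run(async () =>
            {
                try
                {
                    while (true)
                    {
                        await Task.Delay(TimeSpan.FromMinutes(4), cts.Token);
                        await receiver.RenewMessageLockAsync(message, cts.Token);
                    }
                }
                catch (OperationCanceledException)
                {
                    // Renewal stopped: either the work finished or the budget ran out.
                }
            });

            try
            {
                await PopulateUiAndWaitForHumanAsync(message); // hypothetical UI round-trip
                await receiver.CompleteMessageAsync(message);
            }
            finally
            {
                cts.Cancel();
                await renewal;
            }
        }

        static Task PopulateUiAndWaitForHumanAsync(ServiceBusReceivedMessage message)
            => Task.CompletedTask;
    }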
The second approach I tried was ServiceBusClient.CreateProcessor() with options to set the auto-lock-renewal timespan. The issue here is that the processor runs event-driven in the background. Since I need to populate a UI, I need to stop the processor after a message has been read, return it via the callback, and complete the message once the human interaction is done. I have been unable to find a way to do this.
What would be a good approach to achieve this? The subscription acts as a work queue that multiple people pull items from and work individually. Any help in proposing an approach is appreciated.

Delay Queue for amount of time per session

I'd like to create a system that 'appends' emails to each other.
Situation: every time an entity is changed, I'd like to send an email to that entity's subscribers.
But when the entity is changed 10 times within a short window (say 5-10 minutes), the subscribers shouldn't be spammed with 10 separate emails.
So I was thinking of creating a 'queue', and more precisely I was thinking of using Azure Service Bus.
After searching through the documentation, I found two interesting properties:
SessionId => would be the Id of the entity
BatchFlushInterval (client-side batching) => 'If the client sends additional messages during this time period, it transmits the messages in a single batch.'
This sounded perfect.
That way I would receive all the changes to the entity in a single batch and could construct a single email to send.
But I can't seem to find this option anymore in the new Azure Service Bus NuGet package.
Now that I've searched for alternatives, I get the feeling this is not a common practice.
Does someone have experience in this field?
I would like to avoid using a cron job, but if that is the best solution, please let me know.
I know this is a really broad question and more a request for information, so even comments with links would make me really happy.
Thanks in advance
Brecht
I don't think Message Sessions or BatchFlushInterval is the approach to take here. What you're looking for is to buffer messages so you can create a single notification rather than multiple ones. I'd personally receive a batch from Azure Service Bus and process the batch to "append" the notifications.
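A minimal sketch of that batch-and-digest idea, assuming the current Azure.Messaging.ServiceBus package; the entity-id application property and the email helper are hypothetical:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Threading.Tasks;
    using Azure.Messaging.ServiceBus;

    class DigestNotifier
    {
        static async Task Main()
        {
            await using var client = new ServiceBusClient("<connection-string>");
            ServiceBusReceiver receiver = client.CreateReceiver("<queue>");

            // Drain whatever has accumulated, up to 100 messages. Note that this
            // call can return early with fewer messages as soon as any are
            // available, so a real implementation might sleep for the buffering
            // window first, or keep receiving in a loop until the window closes.
            IReadOnlyList<ServiceBusReceivedMessage> batch =
                await receiver.ReceiveMessagesAsync(maxMessages: 100,
                                                    maxWaitTime: TimeSpan.FromMinutes(5));

            // Group the changes by entity so subscribers get one digest per entity.
            foreach (var group in batch.GroupBy(m => m.ApplicationProperties["EntityId"]))
            {
                SendDigestEmail(group.Key, group.ToList()); // hypothetical helper
            }

            foreach (ServiceBusReceivedMessage message in batch)
            {
                await receiver.CompleteMessageAsync(message);
            }
        }

        static void SendDigestEmail(object entityId, List<ServiceBusReceivedMessage> changes)
        {
            // Build one email summarizing all the changes for this entity.
        }
    }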

Trigger a series of SMS alerts over time using Twilio/ASP.NET

I didn't see a situation quite like mine, so here goes:
Scenario highlights: the user wants a system that includes custom SMS alerts. A component of the functionality is to identify a starting point based on user input, then send SMS messages with a personalized message at a pre-defined interval after the trigger. I've never used Twilio before and am noodling around with the implementation.
First pass solution: using my Twilio account, I designated the .aspx page that receives the inbound triggering alert/SMS via GET. The receiving page declares and instantiates my SMSAlerter object in Page_Load, which responds immediately with a first SMS and kicks off a System.Timers.Timer. Elementary, and functional up to a point.
Problem: the alerts only keep coming if the timer interval is a short time span. I tested at a one-minute interval and it was successful. When I went to 10 minutes, the immediate SMS is sent and the first message 10 minutes later is sent, but nothing after that.
My observation: since there is no interaction with the resource after the inbound text, the Session times out at the default 20 minutes. Increasing the Session timeout doesn't work, and even if it did, it doesn't seem correct, since the interval will be on the order of hours, not minutes.
Using the Cache to store each new SMSAlerter might be the way to go. Each SMSAlerter's schedule is used for roughly 12 hours and is then replaced with a new SMSAlerter object when the same user notifies the system the following day. Is there a better way? Am I over- or under-simplifying? I am not anticipating heavy traffic now (tens of users), but the user is thinking big.
Thank you for comments, suggestions. I didn't include the code, because the question is about design, not syntax.
I think your timer is going out of scope about 20 minutes after the original request, which kills the timer. I have a feeling that if you kept refreshing the .aspx page it wouldn't happen, but obviously that doesn't help much.
You could launch a new thread holding the System.Timers.Timer object so it stays alive and doesn't go out of scope when there are no follow-up requests to the server. This isn't a great idea, to be honest, although it might help with understanding the issue.
Ultimately, you'll need some sort of continuously running service, as you don't want to depend on the app pool for this. I'd suggest a Windows Service running in the background to handle it, which will be suitable as a long-term solution.
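For illustration, a minimal sketch of such a Windows Service owning the timer; the alert-sending logic and schedule storage are hypothetical:

    using System.ServiceProcess;
    using System.Timers;

    // The service owns the timer, so alert scheduling survives independently
    // of any web request or app-pool recycle.
    public class SmsAlertService : ServiceBase
    {
        private Timer _timer;

        protected override void OnStart(string[] args)
        {
            _timer = new Timer(60000); // check for due alerts every minute
            _timer.Elapsed += (sender, e) => SendDueAlerts();
            _timer.AutoReset = true;
            _timer.Start();
        }

        protected override void OnStop()
        {
            _timer.Stop();
            _timer.Dispose();
        }

        private void SendDueAlerts()
        {
            // Hypothetical: read persisted alert schedules (e.g., a database
            // table written by the .aspx page) and call the Twilio REST API
            // for any alert whose send time has passed.
        }

        public static void Main()
        {
            Run(new SmsAlertService());
        }
    }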
Hope this helps!

How to time-delay email deliveries?

I'm currently learning about the Drupal email functions, such as drupal_mail(), hook_mail(), and hook_mail_alter(), and I have a problem where I need to queue emails for delayed delivery. For example, an event signup notification that needs to wait for an hour after the user registered for the event. And that email contains user-specific data, so it can't just be a generic template email.
I'm using the MailQ module, mainly for its logging capabilities, but I wonder if it (or something else) could be modified to add a configurable delay?
Any ideas?
======================
Some additional thoughts:
There are two modules in D6 that are useful for queuing emails: MailQ and Job queue (in Nikit's list below). Both provide mail queue functionality, which can be very useful, and from a cursory investigation they have different approaches, strengths, and weaknesses. E.g., MailQ has a great logging function, with a very useful admin interface for dealing with categories of mail such as queued, sending, sent, and failed, and you can set how many emails go out with each cron run.
While Job Queue doesn't have those categories or the logging/archiving as far as I can tell, it does allow prioritizing different jobs, which can be very useful if you have different email types going out simultaneously, such as contact confirmations, event signup confirmations, and invite-a-friend emails.
So my ideal queue would integrate both of those modules (they don't get along very well), with an additional delay setting for each job type.
But writing such a thing is way beyond my meager talents, so I'll just write a little custom module, triggered by cron, that looks for flagged contacts with a creation date at least an hour in the past and writes the relevant info directly into the MailQ DB table.
There seems to be a collision when both my new function and the MailQ function run on the same cron run, so I'm trying to work out how to give MailQ cron priority over my function, so that my function adds the table data after MailQ has run.
It's not the most elegant solution, but I did something similar in D6, before I knew of transactional email services, using hook_mail_alter: I loaded the email parameters into a database table and set the message value to null so it didn't send after the hook. A cron job running every 5 minutes would then check the table and send a couple of messages at a time if any were waiting; once a message was sent, I flipped its status to sent. You will get a watchdog error saying the email failed to send.
With this method, you could either call something like Mailgun/Mandrill and schedule an email from within the hook, or put the email in a database table with a timestamp field like strtotime('+2 weeks') and have a cron job check the table every x minutes and send when the timestamp condition is met. hook_mail_alter gives you a field to identify the message, so you could have a switch that catches only specific emails if you want.
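This thread is a Drupal/PHP context, but the store-now, send-later pattern described above is generic; here is a rough sketch of it in C#, with the table and mail transport as hypothetical stand-ins:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Each queued email carries a "send after" timestamp; a periodic job
    // (cron, scheduled task, etc.) sends whatever has come due.
    public class QueuedEmail
    {
        public string To;
        public string Subject;
        public string Body;
        public DateTime SendAfter;
        public bool Sent;
    }

    public class DelayedMailQueue
    {
        private readonly List<QueuedEmail> _table =
            new List<QueuedEmail>(); // stands in for the DB table

        // Called from the signup handler: queue instead of sending immediately.
        public void Enqueue(string to, string subject, string body, TimeSpan delay)
        {
            _table.Add(new QueuedEmail
            {
                To = to, Subject = subject, Body = body,
                SendAfter = DateTime.UtcNow + delay
            });
        }

        // Called by the periodic job, e.g. every 5 minutes; sends a few at a time.
        public void SendDue(int batchSize = 10)
        {
            var due = _table.Where(m => !m.Sent && m.SendAfter <= DateTime.UtcNow)
                            .Take(batchSize);
            foreach (var mail in due)
            {
                Send(mail);       // hypothetical SMTP/Mailgun/Mandrill call
                mail.Sent = true; // flip status so it is not sent twice
            }
        }

        private void Send(QueuedEmail mail) { /* transport-specific */ }
    }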
If you're comfortable with custom modules, take a look at the Mandrill module. It likely has a more elegant solution for taking over Drupal's email than what I've described above.

Recommended Way to Control Making HTTP requests?

Hypothetically, if the user clicks "save, save, save, save" a bunch of times on a text file, making single-character changes each time and managing to resave 5 times before the first save is processed, what is best practice in that situation? Assume we don't have a "batch process" option. Or maybe they push save on 4 files, one after the next, before they've all been processed.
Should you queue up the requests and process them one at a time? Or should you just let it happen and it will work itself out somehow?
Thanks!
We usually send down a GUID in the page when a form is initialized and send it along with the save request. The server checks a shared short-term memory cache: if it's a miss, we store the GUID and process the save; if it's a hit, we fail the request as a duplicate. This allows one save per page load (unless you do something to reinitialize the GUID on a successful save).
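A minimal sketch of that GUID check, assuming a single-server in-memory cache (a shared cache such as Redis would replace the dictionary across multiple servers); all names are hypothetical:

    using System;
    using System.Collections.Concurrent;

    public class DuplicateSaveGuard
    {
        // Token -> expiry time. Stands in for the shared short-term cache.
        private readonly ConcurrentDictionary<Guid, DateTime> _seen =
            new ConcurrentDictionary<Guid, DateTime>();

        // Returns true only for the first save carrying this token.
        public bool TryBeginSave(Guid formToken, TimeSpan ttl)
        {
            PruneExpired();
            return _seen.TryAdd(formToken, DateTime.UtcNow + ttl);
        }

        private void PruneExpired()
        {
            foreach (var entry in _seen)
            {
                if (entry.Value < DateTime.UtcNow)
                    _seen.TryRemove(entry.Key, out _);
            }
        }
    }

    // Usage: embed Guid.NewGuid() in the page at render time; on save,
    // reject the request when TryBeginSave(...) returns false.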
If you make your save operation light enough, it shouldn't matter how many times the user hits save. The amount of traffic a single user can generate is usually quite light compared to the load of thousands of users.
I'd suggest watching the HTTP traffic when you compose a Gmail message or work in Google docs. Both are very chatty, and frequently send updates back to the server.
