The slot function QTimer::start takes an interval in milliseconds:
Starts or restarts the timer with a timeout interval of msec milliseconds.
But msec is an int, so it can't hold a very long interval. What is an alternative to QTimer that can take a long interval?
In this case you can use something like an hourly timer connected to a function or lambda that checks the current timepoint against a target...
/*
 * We want to trigger some event one year from now.
 */
auto endpoint = QDateTime::currentDateTime().addYears(1);
QTimer hourly_timer;
QObject::connect(&hourly_timer, &QTimer::timeout,
                 [endpoint]()
                 {
                     if (QDateTime::currentDateTime() >= endpoint) {
                         /*
                          * Target time reached. Do whatever...
                          */
                     }
                 });
hourly_timer.start(3600 * 1000);
I noticed a weird behavior of my application on an ESP32.
After some "debugging" I think the issue is due to this function:
bool getLocalTime(struct tm * info, uint32_t ms)
{
    uint32_t start = millis();
    time_t now;
    while ((millis() - start) <= ms) {
        time(&now);
        localtime_r(&now, info);
        if (info->tm_year > (2016 - 1900)) {
            return true;
        }
        delay(10);
    }
    return false;
}
By the way, ms defaults to 5000.
In my code I use getLocalTime in a finite-state machine in order to execute actions at specific times, e.g.:
void Irrigation::fsm()
{
    time_t now = timing.getTimestamp();
    switch (state)
    {
        case Running:
            if (now - lastExecution)
            {
                // do something
            }
            break;
    }
}
where:
time_t Timing::getTimestamp()
{
    struct tm tm;
    getLocalTime(&tm);
    return mktime(&tm);
}
It seems to me the application hangs (the FSM is called every second).
Actually, looking at the getLocalTime implementation, I don't understand what it does. Why does it need a while loop with a 10 ms delay per iteration, just to retrieve the current time?
I'm looking for the epoch time in seconds. Is my approach wrong?
Thank you for raising this. I just ran into the same issue when using the ESP32Time library by fbiego; as you identified (thanks), it is that delay when fetching the time.
Looking at the code, it always incurs that 5000 ms delay while the clock is set to a date before 2016.
I fixed mine by setting the RTC to the start of 2022.
This gets rid of the delay for me until the GPS I am using gets a lock and supplies the correct time.
How to fix it:
Just pass the second parameter to the function; it limits the delay:
getLocalTime(&timeinfo, 5);
It will then finish in about 10 ms at most (if the current year is < 2016).
Why it is implemented that way:
Just sharing my guess:
The function returns immediately if the evaluated year is > 2016, and in that case it returns true, i.e. success.
If the evaluated year is not past 2016, the result is considered a failure, so the function waits 10 ms and makes another attempt.
It keeps retrying in that loop for 5 s (5000 ms).
I think the assumption is that the correct date/time might be retrieved from NTP during this delay.
(It surely won't happen automatically, but a request to the NTP server might already be running on the other core.)
I am trying to make it so that when you click it will show a different cursor_sprite for 0.25 seconds. I currently need some way to add a delay to this. Here is my code so far:
In create event:
/// #description Set cursor
cursor_sprite = spr_cursor;
In step event:
/// #description If click change cursor
if mouse_check_button_pressed(mb_left)
{
cursor_sprite = spr_cursor2;
// I want to add the delay here.
}
You could use the built-in Alarms for this, but I don't like them much once things get nested with parent objects.
So instead of Alarms, this is the way I would do it:
Create Event:
cursor_sprite = spr_cursor;
timer = 0;
timermax = 0.25;
I create two variables: timer will be used to count down, and timermax to reset its duration.
Step Event:
if (timer > 0)
{
    timer -= 1/room_speed; //decrease in seconds
}
else
{
    cursor_sprite = spr_cursor;
}
if mouse_check_button_pressed(mb_left)
{
    cursor_sprite = spr_cursor2;
    timer = timermax;
}
For each timer, I count it down in the Step Event by 1/room_speed; that way the value decreases in real-time seconds.
You (re)start the timer with timer = timermax.
Once the timer reaches zero, the else branch runs the follow-up action.
Bear in mind this is in the Step Event, so after the timer reaches zero the else branch runs every step unless some other condition prevents it. I usually use the else branch to change a condition so the timer code isn't re-entered repeatedly.
#Steven:
This is useful as far as it goes, but I think you mixed up the starting values for timer and timermax. If timer is counting down, then it obviously can't start at 0.
Also, starting timer at your intended duration completely obviates the need for even having a second variable (timermax).
So it could go:
Create Event:
cursor_sprite = spr_cursor;
timer = 0.25;
Step Event:
if mouse_check_button_pressed(mb_left)
{
    cursor_sprite = spr_cursor2;
    timer = 0.25;
}
if (timer > 0)
{
    timer -= 1/room_speed; //decrease in seconds
}
else
{
    cursor_sprite = spr_cursor;
}
I'm facing a programming problem: I want to trigger some code whenever a capacitive touch sensor has been touched for 100 ms (to filter out false positives in my prototype). I read the sensor with:
if (digitalRead(touchPin))
Whenever it has been held for 100 ms, I want some other code (for instance, lighting an LED) to run. I can't seem to find a solution, because my startTime = millis() variable keeps getting reset.
Does anyone know how to tackle this problem?
You need a bool variable to store the last state (true if touched, false if not).
You also need to store the time when it changed to true; the time can be taken from the millis() function.
While your bool variable is true, check whether more than 100 ms have passed.
So:
// In your global scope:
...
// Last touch state
bool isTouched = false;
// Time when the last touch started
unsigned long touched_t = 0;

// In your loop:
...
bool isTouchedNow = (digitalRead(touchPin) == HIGH);
// Touch state has changed since the last measurement:
if (isTouchedNow != isTouched)
{
    // Record the new state
    isTouched = isTouchedNow;
    // If it has just been touched, store the current time (else zero):
    touched_t = isTouched ? millis() : 0;
}
else // Touch state is unchanged since last time:
{
    // If it was touched, is still touched, and 100 ms have passed:
    if (isTouched && touched_t > 0 && millis() - touched_t > 100)
    {
        // Call the function that should run when the sensor
        // has been touched for 100 ms (activate an LED or something)
        DOTHESTUFF();
    }
}
...
I'm currently working on recreating a quad-copter controller.
I'm trying to get data from my gyro sensor, and to do that I'm using an ISR triggered by a timer interrupt.
My problem is: when I call my function "gyro.getX" from the main program, it works.
But when I call this function from my ISR, it doesn't.
I think I found the cause of the bug: the function I'm using comes from the "Adafruit_LSM9DS0" library (for ST's LSM9DS0 chip), and it uses a "timestamp".
I suspect the current time seen from my ISR differs from the current time in my main program, but I don't know how to fix it.
Here is an excerpt of my program:
void loop() {
    /* main prog */
}

/*
 * Timer 2 overflow interrupt service routine (ISR), registered with Arduino
 */
ISR(TIMER2_OVF_vect)
{
    TCNT2 = 256 - 250;         // 250 x 16 µs = 4 ms
    if (varCompteur++ > 25)    // 25 * 4 ms = 100 ms (half-period)
    {
        varCompteur = 0;
        SensorGet(pX, pY);     // Feed gyro circular buffers
    }
}

void SensorGet(float * pRollX, float * pPitchY)
{
    lsm.getEvent(&accel, &mag, &gyro, &temp);
    GiroX_Feed(pX, gyro.gyro.x);
    GiroY_Feed(pPitchY, gyro.gyro.y);
}
bool Adafruit_LSM9DS0::getEvent(sensors_event_t *accelEvent,
                                sensors_event_t *magEvent,
                                sensors_event_t *gyroEvent,
                                sensors_event_t *tempEvent)
{
    /* Grab new sensor reading and timestamp. */
    read();
    uint32_t timestamp = millis();

    /* Update appropriate sensor events. */
    if (accelEvent) getAccelEvent(accelEvent, timestamp);
    if (magEvent)   getMagEvent(magEvent, timestamp);
    if (gyroEvent)  getGyroEvent(gyroEvent, timestamp);
    if (tempEvent)  getTempEvent(tempEvent, timestamp);

    return true;
}
The problem isn't the time. The problem is likely that your sensor uses I2C, which is disabled during an interrupt routine, or some other communication protocol that relies on interrupts to function and is therefore unavailable inside your ISR.
You are really abusing the interrupt; this is not the kind of thing interrupts are for. An ISR should be extremely short, with no time for communications. So the real question is: why do you think you need an interrupt for this?
I have two databases that store temperature data from an Arduino.
I want to send data to the first database once a minute,
and send data to the second database once for every ten sends to the first (i.e. every ten minutes).
My code below:
int count = 0;
for (int a = 1; a <= 10; a++) {
    Cayenne.run();
    delay(60000);
    count = count + 1;
}
if (count == 10) {
    ToPostWStemp();
    count = 0;
}
But it doesn't send anything, and I don't know how to fix it.
Many people have told me it's much better to use the millis() function, but I don't know how to code that on my Arduino.
P.S. Cayenne.run() posts to the first server, and ToPostWStemp() posts to the second one.
Thank you!
If I'm understanding the question correctly, it sounds like you want Cayenne.run() to be called once every minute, and ToPostWStemp() to be called once every 10 minutes.
To do that using millis(), you can simply keep track of the last time each function was called, and compare that against the current value of millis(), calling each function only when the elapsed time exceeds the desired interval. Something like this:
unsigned long cayenneTime = 0;
unsigned long postWSTime = 0;

void loop()
{
    if (millis() - cayenneTime >= 60000)
    {
        // Do this every 60 seconds
        Cayenne.run();
        // Keep track of the last time this code ran, so we know
        // when to run it next time
        cayenneTime = millis();
    }
    if (millis() - postWSTime >= 60000UL * 10)
    {
        // Every 10 minutes, do this
        ToPostWStemp();
        // Keep track of the last time this code ran, so we know
        // when to run it next time
        postWSTime = millis();
    }
    // Do other stuff here
}
Note that millis() overflows and resets to 0 every 2^32 milliseconds (about 49.7 days). Because the comparisons above use unsigned subtraction (millis() - lastTime), the elapsed-time arithmetic remains correct across the rollover, so no special adjustment is needed as long as each interval is well under 49 days.