I am working on an Arduino board with an Ethernet Shield (v2).
I have taken a look at this example program: File > Examples > Ethernet > WebServer
There is something very strange for me in the loop function: when the server finishes printing data to the client, I see a delay:
// give the web browser time to receive the data
delay(1);
// close the connection:
client.stop();
What happens if it takes more than one millisecond to send the data to the client?
What happens if the data is sent immediately and a second client tries to connect to the server during the delay?
Is there a clean way to do this, for example waiting for the data to be flushed instead of waiting a fixed delay?
Thanks
The following code comes from the Adafruit MQTT documentation:
// Adjust as necessary, in seconds. Default to 5 minutes (300 seconds).
#define MQTT_CONN_KEEPALIVE 300

// ping the server to keep the mqtt connection alive
// NOT required if you are publishing once every KEEPALIVE seconds
if (!mqtt.ping()) {
  mqtt.disconnect();
}
What does MQTT_CONN_KEEPALIVE actually do? I cannot figure it out. If I write the code as shown above and put it in my loop, then the ping is executed constantly and all packets are rejected... I was expecting the MQTT_CONN_KEEPALIVE variable to be used inside the ping() function, so that the ping would only be executed once the 300 seconds had passed, but it does not seem to work like that.
How am I supposed to write the code in order to ping only once every few minutes?
MQTT Keep Alive is part of MQTT protocol to maintain a connection between broker and clients. You can read more about it in the documentation.
MQTT uses a TCP/IP connection that is normally left open by the client so that it can send and receive data at any time. To detect a connection failure, MQTT uses a ping mechanism: if no other messages have been exchanged within a pre-determined period (the keep-alive), the client sends a ping message to the broker.
Specific to the Adafruit_MQTT implementation: if you publish data, and you are sure you will publish again within the period set by MQTT_CONN_KEEPALIVE, then you are good to go.
If the broker does not receive any data or a PINGREQ from the client within MQTT_CONN_KEEPALIVE plus an extra 50% of MQTT_CONN_KEEPALIVE (i.e. 1.5 times the keep-alive), the broker will close the connection (timeout) and the client will have to re-establish it.
So if an MQTT client only subscribes to a topic without publishing, the client must send a ping (PINGREQ) to the broker at least once every MQTT_CONN_KEEPALIVE seconds. However, you don't want to ping the server constantly; one way to handle this is to send mqtt.ping() only once every MQTT_CONN_KEEPALIVE seconds.
#define MQTT_KEEP_ALIVE 300   // seconds

unsigned long previousTime = 0;

void loop() {
  // Multiply with an unsigned long literal: on AVR, int is 16 bits,
  // so 300 * 1000 would overflow without the UL suffix.
  if (millis() - previousTime > MQTT_KEEP_ALIVE * 1000UL) {
    previousTime = millis();
    if (!mqtt.ping()) {
      mqtt.disconnect();
    }
  }
  // do something else
}
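A side note on the `millis() - previousTime` pattern: because both operands are unsigned, the subtraction stays correct even when `millis()` rolls over (after about 49.7 days). A minimal plain-C++ demonstration of the modular arithmetic:

```cpp
#include <cassert>
#include <cstdint>

// millis() returns a 32-bit unsigned value on AVR Arduinos.
// Unsigned subtraction wraps modulo 2^32, so the elapsed time
// is correct even across a rollover of the counter.
uint32_t elapsedMs(uint32_t now, uint32_t previous) {
    return now - previous;   // well-defined modular arithmetic
}
```

For example, with previous = 0xFFFFFF00 (just before rollover) and now = 0x00000100 (just after), the elapsed time comes out as 0x200 = 512 ms, as expected.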
I'm trying to sample TI CC2650STK sensor data at ~120 Hz with my Raspberry Pi 3B. When comparing the signal trace to a wired MPU6050, I see a ~50 ms phase shift in the resulting signal. In the image below, orange is the data received over BLE and blue is the data received over I2C from the other sensor (the MPU6050):
The firmware on the sensor side doesn't seem to have any big buffers:
(50 ms / 8 ms per sample ≈ 6 samples, where each sample is 18 bytes long, so a buffer of only about 6 × 18 = 108 bytes would be required, I guess.)
On the RPi side I use BlueZ with the bluepy library, and again I see no buffers that could cause such a delay. For test purposes the sensor is lying right next to my Pi, so surely over-the-air transmission cannot be taking 40-50 ms. Moreover, timing my code that handles the incoming notifications shows that the whole handling chain (my high-level code + the bluepy library + the BlueZ stack) takes less than 1-2 ms.
Is it normal to see such huge propagation delay or would you say I'm missing something in my code?
Looks legit to me.
BLE is time-slotted. The peripheral cannot transmit whenever it wants; it has to wait for the next connection event to send its payload. If the next connection event falls right after the sensor data update, the message gets sent with little latency. If the sensor data is generated right after a connection event, the peripheral's stack has to wait a full connection interval for the next connection event.
The connection interval is an amount of time, a multiple of 1.25 ms between 7.5 ms and 4 s, set by the Master of the connection (your Pi's HCI) upon connection. It can be updated by the Master arbitrarily. The Slave can kindly ask the Master to modify the parameters, but the Master can do whatever it wants (most Master implementations do try to respect the Slave's constraints, though).
If you measure an average delay of 50 ms, you are probably using a connection interval of 100 ms (probably a little less, because of constant delays elsewhere in the chain).
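The arithmetic behind that estimate can be sketched in a few lines of plain C++ (the 1.25 ms unit is how connection-interval values are expressed at the HCI level):

```cpp
#include <cassert>

// HCI connection-interval values are expressed in units of 1.25 ms.
double intervalMs(unsigned units) { return units * 1.25; }

// If a sample is generated at a uniformly random moment within the
// interval, its average wait for the next connection event is half
// the interval.
double meanLatencyMs(double interval) { return interval / 2.0; }
```

With an interval value of 80 (100 ms), the expected mean added latency is 50 ms, matching the measurement described above.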
BlueZ provides an hcitool lecup command that can change the connection parameters of a given connection.
I need to communicate with a device from an Arduino through RS232, and I have everything set up and working. My problem is that the company that made the device I need to communicate with told me the following:
1) The data sent through serial has one start and stop bit, which is the hex value 7E.
2) At the end of the message a CRC hash is appended.
The device uses a request/response protocol, so the Arduino board must send data in order to receive something. I've set the start and stop bytes as ASCII characters, put the data and CRC in between, and simply called Serial.write(data). I set up two Arduino boards communicating this way, and everything went fine.
But with the device itself it's not working. The device sends back a response like "CE 0F" with spaces in between. So I started reading more about serial communication, found out about start and stop bits and parity, and began to wonder whether I'm sending and receiving data the way the device expects.
To send and receive I used the basic tutorials, like:
if (Serial.available() > 0) {
  // read the incoming byte:
  incomingByte = Serial.read();
  // say what you got:
  Serial.print("I received: ");
  Serial.println(incomingByte, DEC);
}
So far I don't understand this type of communication very well. I want to know why, for example, some resources say that the start bit is 0 and the stop bit is 1, while in this case the "start and stop bit" is 7E.
I suspect this has something to do with:
https://en.wikipedia.org/wiki/High-Level_Data_Link_Control
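If the device does use HDLC-style framing, then 0x7E is not a UART start/stop bit at all but a flag byte that delimits each frame, and the appended CRC is typically the HDLC frame check sequence, CRC-16/X-25. As a sketch under that assumption (the vendor would need to confirm the exact CRC variant), here is a plain C++ implementation of that checksum:

```cpp
#include <cstddef>
#include <cstdint>

// CRC-16/X-25, the FCS used by HDLC-style protocols:
// reflected polynomial 0x8408 (0x1021 bit-reversed),
// initial value 0xFFFF, final XOR 0xFFFF.
uint16_t crc16_x25(const uint8_t *data, size_t len) {
    uint16_t crc = 0xFFFF;
    for (size_t i = 0; i < len; ++i) {
        crc ^= data[i];
        for (int bit = 0; bit < 8; ++bit)
            crc = (crc & 1) ? (crc >> 1) ^ 0x8408 : crc >> 1;
    }
    return crc ^ 0xFFFF;
}
```

Note that in HDLC, any 0x7E (or 0x7D) byte occurring inside the payload is byte-stuffed (escaped as 0x7D followed by the byte XOR 0x20) so it can't be mistaken for a flag; if the device follows that convention, the Arduino code must do the same on both send and receive.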
I am using an Arduino Pro Mini 328P (3.3 V, 8 MHz) with an XBee Series 1. I have set the clock frequency to 1 MHz and the baud rate to 9600, and I have set the baud rate to 9600 in the XBee as well. I have also verified that at this baud rate the XBee sends data properly in a normal scenario.
Now what I have done in my project:
I have registered my XBee with the gateway, after which it goes to sleep (I have used pin hibernate mode). It is then woken up by a digital pin of the Pro Mini. I have put in a delay of 19 ms, after which the XBee tries to send data. After sending the data it goes back to sleep.
The problem is that it behaves randomly when sending data to the gateway (which has the same XBee Series 1). Sometimes it sends the data perfectly, sometimes sending fails. I have also enabled RR to retry 6 times in case the XBee fails to send the data on the first attempt.
I have no idea how to solve this problem because of the randomness in sending the data.
I have put the two XBees near each other (I have two nodes with the same hardware and the same code). There is an interval of around 4 minutes between them. So when one XBee sends its data perfectly, the other one fails to send its data 4 minutes later (the time difference between the two RTCs on the different nodes). What can I conclude from this?
As a side note, the XBee tries to send data every hour. To time that hour I use an RTC, which seems to work fine (I am sure because I have taken logs; the RTC never fails to generate an interrupt).
So I am wondering what the possible reason could be, and how I can fix this problem (ideally without restarting anything; nothing would be better than that).
Restarting my controller is not an option.
How do I debug this?
A few things. First, if possible, increase your baud rate so you spend less time transferring data to/from the XBee. If you have a limited power budget, faster baud rates save time and energy. I don't know how the UARTs work on the Arduino, so I can't say whether 115,200 bps is possible with a 1 MHz CPU clock.
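To put numbers on that first point: with common 8N1 framing each byte costs 10 bit times, so the transmit time scales directly with the baud rate. A small plain-C++ sketch of the arithmetic:

```cpp
#include <cassert>

// Time to shift n bytes out of a UART at the given baud rate,
// assuming 8N1 framing (start bit + 8 data bits + stop bit = 10 bits/byte).
double txTimeMs(unsigned n, unsigned baud) {
    return n * 10.0 * 1000.0 / baud;
}
```

A 100-byte payload takes about 104 ms at 9600 baud but under 9 ms at 115,200 baud, which is a real difference if the radio and CPU must stay awake for the whole transfer.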
Second, make sure you wait for the XBee to assert CTS back to the Arduino after you wake it up. Never send to the XBee unless it's "clear to send".
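That CTS wait could be sketched like this (hedged: the pin number is hypothetical, and the Arduino millis()/digitalRead() calls are stubbed here with desktop stand-ins so the logic is self-contained; on a real board you would drop the stubs and use the core functions):

```cpp
// Desktop stand-ins for the Arduino core, so the wait logic below is
// runnable anywhere; on a real board you'd use the genuine functions.
static unsigned long fakeNow = 0;
static unsigned long ctsAssertTime = 30;          // CTS asserts (goes LOW) at t = 30
unsigned long millis() { return fakeNow++; }      // each call advances "time" by 1 ms
int digitalRead(int /*pin*/) { return fakeNow < ctsAssertTime ? 1 : 0; }

const int XBEE_CTS_PIN = 7;                       // hypothetical wiring

// Block until the XBee asserts CTS (active low), or give up after timeoutMs.
bool waitForCts(unsigned long timeoutMs) {
    unsigned long start = millis();
    while (millis() - start < timeoutMs) {
        if (digitalRead(XBEE_CTS_PIN) == 0) return true;   // LOW = clear to send
    }
    return false;
}
```

Calling waitForCts() right after waking the XBee, and only writing to the serial port once it returns true, avoids losing the first bytes of the frame.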
Third, if you use API mode, you can watch for a "Transmit Status" frame from the local XBee back to the Arduino, which will let you know when the module has successfully sent the frame and it's safe to put it back to sleep.
I am trying to send sensor data from an Arduino UNO to my server running on a local Wi-Fi network. My server is an HTTP server running on Node.js.
I need the Arduino to send data to the server as fast as possible, ideally one request every 100-250 ms. The code from the WiFiClientRepeating example sends data to the server properly every second or so. If I shorten the interval to 500 ms or less, the server does not seem to receive anything.
Are there any limits on how many requests can be made in a given period of time?
EDIT: I am using the official Arduino WiFi Shield.