I want to send five potentiometer values (bytes) through ZigBee using Arduino. I stored the potentiometer values in five different variables (bytes) and used
Serial.print(pot1);
Serial.print(pot2);
.
.
Serial.print(pot5);
The problem is that, when I vary the potentiometer values, the receiver gets values like 49, 55, 57, etc. (always changing), instead of constant values from 0-255. (I am using the Serial.read() function five times.)
How can I fix this problem?
Well, when you are varying the potentiometers, it stands to reason that you will read varying values: the change is happening while your program is reading and sending them.
What I understand from your question is that you want only ONE value sent after a change. I interpret this to mean that while you are changing the potentiometers, the values should not be sent; instead, only the final value should be sent.
What is the final value? Only you can decide, but one way to define it is: if the value read one second ago is the same as the value read now, then send that value. The timing can be anything you decide: 1 second, 1/2 second, etc.
I would like to help you with more specific code, but you posted very little of yours and I am not sure I understood what you meant. Please clarify your requirements.
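For illustration, here is a rough sketch of that "send only once the value has settled" idea, assuming the five readings come from analog pins and are scaled down to one byte each (the pin numbers and the one-second window are placeholders):

// Sketch of the "send only when stable" idea: a value is transmitted
// only after it has stopped changing for a full settle window.
const unsigned long SETTLE_MS = 1000;           // placeholder: 1-second settle time
const int potPins[5] = {A0, A1, A2, A3, A4};    // placeholder pin assignment

byte lastSent[5];
byte candidate[5];
unsigned long stableSince[5];

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 5; i++) {
    candidate[i] = lastSent[i] = analogRead(potPins[i]) >> 2;  // scale 0-1023 to 0-255
    stableSince[i] = millis();
  }
}

void loop() {
  for (int i = 0; i < 5; i++) {
    byte value = analogRead(potPins[i]) >> 2;   // scale 0-1023 down to 0-255
    if (value != candidate[i]) {
      candidate[i] = value;                     // still moving: restart the settle timer
      stableSince[i] = millis();
    } else if (value != lastSent[i] && millis() - stableSince[i] >= SETTLE_MS) {
      Serial.write(value);                      // write() sends the raw byte, not ASCII text
      lastSent[i] = value;
    }
  }
}

Note that if only settled values are sent, the receiver can no longer rely on getting exactly five bytes in a fixed order, so you would also need some way of telling it which potentiometer a byte belongs to (for example, by sending an index byte before each value).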
Related
I have a task where I need to read 2 parameters from a BLE Beacon. The documentation was seriously lacking and after a fair amount of effort, I managed to get some basic information about reading the data from the BLE Beacon.
The parameters to read are
1) Battery Voltage of the sensor
2) Temperature - the beacon has a built-in temperature sensor.
I think I have tried almost every popular Python BLE library out there, but I just can't seem to get the temperature reading out of the beacon. "I think" I am able to read the voltage. The reason I say "I think" is that the value seems to match what was provided in the minimal documentation, and when I put the beacon into the charger I can see the value go up - an indication that it is the voltage reading. I could not read the temperature (the value at the UUIDs mentioned in the document doesn't seem to change). I have tried enabling the sensor in every possible way and method described - by writing 01:00, etc. I spent a fair amount of time trying to reverse engineer the thing. I ran a packet sniffer and managed to capture the data being transferred between the beacon and the mobile app (they have a mobile app). But then again I am not able to figure out how the temperature readings are being communicated between the beacon and the app. Let me break the whole thing into smaller blocks.
Hardware: BLE beacon from which voltage and temperature can be read. The temperature sensor is built into the beacon. The beacon itself is from Texas Instruments, but the temperature/voltage sensing part is done by a third party. They provided us with some minimal information, and it was difficult to make sense of some of the sentences as they have trouble communicating in English.
The sequence to get the data goes like this
Scan for beacons
When the beacon is found then connect to it
Enable notification
Set notification interval
Get the voltage and temperature reading.
I have been able to do the first four real fast, and "half" of No. 5, i.e. getting the voltage part. When I say real fast, I mean I got that working with nearly no documentation available at the time.
As per the info that I have, the data resides in the following characteristics/UUIDs. Also please note that the UUIDs are not standard 128-bit ones, and this caused me issues when using certain libraries. But after some tries I got to read/write to them using handles, etc. The handles and other values I printed are ones I read using PYGATT (a Python wrapper for gatttool).
The UUIDs are marked as the 1st, 2nd, 3rd and 4th parameters, and the documentation has the following to say about them:
- A: 1 byte (2nd Param)
- B: Maj + Min values, 4 bytes (4th Param)
- C: 4 bytes (3rd Param)
- D: Enable/disable notification ( I have been able to turn this on )
- E: Set notification interval ( I have been able to set this and can notice the change in notification interval )
This is minimal so as to not have a large file. All it does is this: the mobile app connects to the beacon, then the notifications start, and the temperature readings are retrieved by the mobile app. Like I mentioned, I don't seem to have a problem reading the voltage; it's only the temperature that I am getting stuck at. I have been at it for a week now. I think I have tried nearly everything that I could think of. I even enumerated all the writable characteristics and tried writing numbers like 1 (enables the sensor?). I could have offered a bounty for this straight away if it were possible. I rarely get stuck for so long with a problem. This is driving me a little crazy. I am getting close to my wits' end - I guess it's time for a superhero - anyone out there? :) I can provide every bit of information needed if someone can indicate what is wrong. I even wrote a Cordova app ... and tried a bunch of stuff from my Android phone. I can connect, write to characteristics, read stuff, etc., but the temperature reading just won't budge. All I get is the same set of values (I used JSON.stringify to display A, B and C). I can worry about the byte order later; I guess that is a smaller problem.
The communication between the beacon and the third-party mobile app is fine; it is able to read the temperature info without any problem.
I have been looking at the Wireshark data and I am fairly sure that the temperature data is being communicated at this stage. But when I decode the "value", it looks like it's the voltage. It mentions L2CAP, but I am not sure how that is being used here to send the temperature readings (if it is being used for that in the first place).
Update: I wrote to every writable characteristic. I wrote values like 1, 0100, 2, 7 to every writable characteristic. At the same time I was reading every readable characteristic (in a loop) and doing a comparison (just true/false) with the previous set of values. This seemed like a quick and easy way to know if something changed. I didn't want to take chances with converting the hex to a float; I can figure out the byte order later.
From the sniffed data (Wireshark) I can only see 3 writes happening on the beacon.
I am not fully sure, even after a long discussion, but it seems that the four bytes of the notification are used for the voltage as well as the temperature, since the temperature can most probably be derived from the voltage.
From the values it seems that those four bytes represent the voltage in float (if you ignore the absurd factor of 10^-38 that comes in because only 4 bytes instead of 8 bytes are used).
Since the temperature T is typically derived from a resistance measurement, where the resistance R is proportional to the voltage U (if the current is constant), you can in principle calculate the temperature T from the voltage U.
The problem is that T(R) is relatively linear, but not perfectly so (in contrast to U(R), which is assumed to be U = R·I). So you may need to plot the values of T(U) to find out the curve that they are using.
To add to the confusion, I got the best results when only using the first five bits of the third byte and the eight bits of the fourth byte. I am not aware why this is the case, and it might point to some trouble still.
The best option is to ask for their function T(U) that they are using. If they can and will provide it for you...
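If you want to experiment with the interpretation of those four bytes yourself, a small test program along these lines may help. It simply reinterprets the bytes as an IEEE-754 float in both byte orders; the example bytes are placeholders, and any scale factor or temperature curve T(U) would still have to come from the vendor:

#include <cstdint>
#include <cstdio>
#include <cstring>

// Reinterpret 4 raw notification bytes as an IEEE-754 float, in the given byte order.
// NB: the reinterpretation assumes a little-endian host CPU (x86/ARM), the common case.
static float bytesToFloat(const uint8_t b[4], bool littleEndian) {
    uint8_t tmp[4];
    for (int i = 0; i < 4; ++i)
        tmp[i] = littleEndian ? b[i] : b[3 - i];
    float f;
    std::memcpy(&f, tmp, sizeof f);   // memcpy avoids strict-aliasing problems
    return f;
}

int main() {
    // Placeholder: substitute the four bytes of one captured notification here.
    uint8_t notification[4] = {0x00, 0x00, 0x80, 0x3f};

    std::printf("as little-endian float: %g\n", bytesToFloat(notification, true));
    std::printf("as big-endian float:    %g\n", bytesToFloat(notification, false));
    // Whichever interpretation rises when the beacon is put on the charger is
    // probably the voltage; the temperature would then be T(U) for the vendor's curve.
    return 0;
}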
I have a question. I want to change the time value of a packet in a pcap. When we open a pcap in Wireshark, we see a timestamp value in the 2nd column, after the packet serial numbers.
I want to change the time value of the packets. I am able to do this, but I face a problem like the one below.
Let's say the current time value shown for the packet in Wireshark is 0.960727.
I want to add 100000 to this timestamp.
Now the new stamp for the packet becomes 0.1060727, which ideally should be 1.060727.
If you open any pcap file in Wireshark, you will never find a time value with more than 6 digits after the decimal point.
But when I add this value, I get 7 digits after the decimal point.
Could anyone please let me know how I can make the time value 1.060727 instead of 0.1060727?
Thank you for your suggestions here.
Regards,
Som
It's not completely clear to me what you're trying to do, but I'm going to take a guess that you are attempting to manually modify the timestamp of a single packet by editing the binary capture file. Assuming the file format is a .pcap file, then I suppose you're attempting to add 100000 microseconds to the timestamp of one particular packet?
Assuming this is the case, then you need to locate the packet header's ts_sec and ts_usec values and add 0x000186a0 microseconds to the current value of the ts_usec value, but if this value exceeds 0x000f423f (i.e., it's greater than or equal to 1 second), then you should add 0x00000001 to the ts_sec value and subtract 0x000f4240 from the newly computed ts_usec value.
One important thing to keep in mind is whether the .pcap file is written in big-endian or little-endian format. This is determined by the so-called magic number (0xa1b2c3d4 implying big-endian and 0xd4c3b2a1 implying little-endian). Make sure you perform the addition/subtraction using the correct byte order of the ts_usec and ts_sec values, and make sure you write the bytes back to the ts_usec and ts_sec fields in the expected byte order of the file; otherwise your resulting timestamp will not be correct.
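As a rough sketch of that arithmetic (not a complete pcap editor), assuming the ts_sec and ts_usec fields have already been converted to host byte order and will be converted back before being written out:

#include <cstdint>

// Add a microsecond offset to a pcap per-packet header timestamp,
// carrying over into ts_sec when ts_usec reaches one full second.
void addMicroseconds(uint32_t &ts_sec, uint32_t &ts_usec, uint32_t delta_usec) {
    ts_usec += delta_usec;            // e.g. delta_usec = 100000 (0x000186a0)
    ts_sec  += ts_usec / 1000000;     // carry whole seconds into ts_sec
    ts_usec %= 1000000;               // keep the microsecond part below 10^6
}

// Example: 0.960727 s + 100000 us
//   uint32_t sec = 0, usec = 960727;
//   addMicroseconds(sec, usec, 100000);   // sec == 1, usec == 60727 -> 1.060727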
If this is not what you're attempting to do, then please clarify exactly what it is that you are attempting to do.
I have a problem getting clean, non-jumping values from the MPU9050 DMP. I used Jeff Rowberg's code. The problem is that when I use his code on its own, everything is perfect and YPR is very smooth. But when I use it in my program with a delay, I get jumping values over time. Depending on the delay, the jumping varies.
I use a delay because I'm reading the serial values from Unity, and Unity needs a little delay on the Arduino side to read the data. Can someone please tell me what the problem is and how I can fix it?
Thanks a lot.
It is likely that the FIFO buffer is overflowing, leading to incorrect data. This would happen if you put in a delay that lasts longer than the DMP sample period. One strategy you could use is to read the data from the DMP as fast as you can, but only send data over the serial port every second or third reading, depending on what kind of delay you need between readings.
If you edit your question with your DMP frequency and your desired serial frequency, I could try to help more.
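As an illustration of that strategy, here is a rough Arduino sketch; readYawPitchRoll() is a placeholder for however your program currently pulls a packet out of the DMP FIFO, and N would be chosen from the ratio of the DMP rate to the serial rate Unity needs:

// Decimation sketch: drain the DMP FIFO on every loop() pass, but only forward
// every Nth reading to Unity instead of delay()-ing between reads.
const int N = 5;                      // placeholder: send 1 out of every 5 DMP readings
int packetCount = 0;

bool readYawPitchRoll(float ypr[3]);  // placeholder for your existing DMP read code

void setup() {
  Serial.begin(115200);
  // ... your existing MPU9050/DMP initialization goes here ...
}

void loop() {
  float ypr[3];
  if (readYawPitchRoll(ypr)) {        // keep reading every packet so the FIFO never overflows
    packetCount++;
    if (packetCount >= N) {           // but only print every Nth packet for Unity
      packetCount = 0;
      Serial.print(ypr[0]); Serial.print(',');
      Serial.print(ypr[1]); Serial.print(',');
      Serial.println(ypr[2]);
    }
  }
  // No delay() here; the effective serial rate is the DMP rate divided by N.
}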
I am a newbie to serial port analysis and I would appreciate some help on this. My specific question is:
If I have raw data from a serial port analyzer program, how can I locate measures like temperature, pressure, energy, etc.?
What should I look for in the raw data that will help me identify these units of measure?
What is the best way to extract relevant data from this raw data?
I would be very grateful if you can provide me any help with respect to this. I am unable to figure out how to do this.
Thanks a lot.
The best way that I know of to do this is to find the "reset" identifier, also called the "End of Stream" identifier or sequence. I am assuming that the data is a continuous flow, not a one-time transmission.
If the data is continuously cycling, you need to find where the transmission begins (or ends) and then start metering your capture from there. Most devices will have an associated manual or documentation that gives you the end sequence (or optionally the start sequence) and then the method by which they identify their data.
For instance, the device may end a message by sending 4 all zero bytes in a row, then begin again by sending one byte that identifies the sensor, and another two bytes with the data, followed by the next sensor etc.
You would then watch the stream for 4 zero byte entries, and then start capturing 3 bytes at a time, one for the sensor and two for the data, until you saw 4 zero byte entries in a row again.
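A sketch of that kind of parser, using the made-up framing from the example above (four zero bytes as the end-of-stream marker, then repeated three-byte records of one sensor-ID byte plus two data bytes):

#include <cstdint>
#include <cstdio>
#include <vector>

// Returns true if a run of four zero bytes (the end-of-stream marker) starts at index i.
bool isMarker(const std::vector<uint8_t> &s, size_t i) {
    return i + 4 <= s.size() && s[i] == 0 && s[i+1] == 0 && s[i+2] == 0 && s[i+3] == 0;
}

// Parse a captured byte stream using the hypothetical framing described above:
// skip to the marker, then read three-byte records (sensor ID + two data bytes)
// until the next marker.
void parseStream(const std::vector<uint8_t> &stream) {
    size_t i = 0;
    while (i < stream.size() && !isMarker(stream, i))   // find the end-of-stream marker
        ++i;
    i += 4;                                              // skip past the marker

    while (i + 3 <= stream.size() && !isMarker(stream, i)) {
        uint8_t sensorId = stream[i];
        uint16_t value   = (stream[i+1] << 8) | stream[i+2];   // assumed big-endian data
        std::printf("sensor %u = %u\n", sensorId, value);
        i += 3;
    }
}

int main() {
    // Example: marker, then sensor 1 = 0x0102 and sensor 2 = 0x0304, then marker again.
    std::vector<uint8_t> demo = {0,0,0,0, 1,0x01,0x02, 2,0x03,0x04, 0,0,0,0};
    parseStream(demo);
    return 0;
}

The marker, the record layout and the byte order of the two data bytes are all assumptions taken from the example above; the real values would come from the device's documentation.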
The Ethernet II frame format does not contain a length field, and I'd like to understand how the end of a frame can be detected without it.
Unfortunately, I have no idea of physics, but the following sounds reasonable to me: we assume that Layer 1 (the physical layer) provides us with a way of transmitting raw bits such that it is possible to distinguish between the situation where bits are being sent and the situation where nothing is sent (if digital data were coded into analog signals via phase modulation, this would be true, for example - but I don't know if this is really what's done). In this case, an Ethernet card could simply wait until a certain time interval occurs during which no more bits are transmitted, and then decide that the frame transmission must be finished.
Is this really what's happening?
If yes: where can I find these things, and what are common values for the length of this "certain time interval"? Why does IEEE 802.3 have a length field?
If not: how is it done instead?
Thank you for your help!
Hanno
Your assumption is right. The length field inside the frame is not needed by Layer 1.
Layer 1 uses other means to detect the end of a frame, which vary depending on the type of physical layer:
- With 10Base-T, a frame is followed by a TP_IDL waveform; the lack of further Manchester-coded data bits can be detected.
- With 100Base-T, a frame ends with an End of Stream Delimiter bit pattern that cannot occur in payload data (because of its 4B/5B encoding).
A rough description can be found, for example, here:
http://ww1.microchip.com/downloads/en/AppNotes/01120a.pdf "Ethernet Theory of Operation"