I am working on a custom controller for an aquarium light. I was able to figure out how to adjust the light's internal clock, and while capturing some of the communication I found this timecode: 545f0d31574d52565951607631, which translated from hex to ASCII becomes T_ 1WMRVYQ`v1. I know for sure it's the timecode, because it works as expected.
Does anyone know what it is? Is it BLE specific? Does anyone know how to alter it?
I'm pretty sure the first 4 hex digits are not part of the code, but an indicator for the device.
Edit:
It is BLE; I should have been clearer. It does most of the transmission on service UUID 1000, with the characteristic UUID being 1001. The device doesn't have a built-in clock that I can see. It turns on and off at the times I specify in the developer's app. After a power failure, it "resets" to midnight. I know that value is the timecode, because when I input it using GATT tools, I can see the light react accordingly. I added a photo of it updating.
You hint that this is a Bluetooth Low Energy (BLE) device.
If it is BLE, then the UUID of the characteristic might be listed in the 16-bit UUID Numbers document. If it is a custom characteristic, it will not be. Official characteristics use the base UUID 0000xxxx-0000-1000-8000-00805F9B34FB, and only the four missing hex digits (the xxxx part) are documented.
The specification for how time can be shared over BLE is documented in the GATT Specification Supplement if it is a Bluetooth SIG adopted characteristic.
It might be helpful if you update the question with the value this payload sets on the light's internal clock.
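If you want to experiment with writing your own values, a minimal sketch with the Python bleak library could look like the one below. The device address is a placeholder, the payload is the captured value from your question, and I'm assuming (per the base UUID above) that the 16-bit characteristic 0x1001 sits on the standard Bluetooth base UUID, which may not hold for a vendor-specific device:

```python
# Sketch: write a captured timecode payload to the light over BLE with bleak.
# ADDRESS is a placeholder; CHAR_UUID assumes characteristic 0x1001 uses the
# standard Bluetooth base UUID, which may not be true for a custom device.
import asyncio
from bleak import BleakClient

ADDRESS = "AA:BB:CC:DD:EE:FF"  # placeholder MAC address of the light
CHAR_UUID = "00001001-0000-1000-8000-00805f9b34fb"  # 16-bit UUID 0x1001 on the base UUID

# The payload exactly as sniffed (hex string from the question).
PAYLOAD = bytes.fromhex("545f0d31574d52565951607631")

async def main():
    async with BleakClient(ADDRESS) as client:
        await client.write_gatt_char(CHAR_UUID, PAYLOAD, response=True)
        print("Wrote", PAYLOAD.hex(), "to", CHAR_UUID)

asyncio.run(main())
```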
The output string is:
▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒~▒▒ffx▒f▒x▒x`▒x▒x▒x▒`▒x~x▒▒x▒▒x````▒````▒x~xx▒x▒f`▒x▒
And I know that over RS232 the output should look similar to:
ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿSITE NAME,24/07/18,13:15:00,60,0.000,0.000,2.911,2585,
The time may change, as well as the last two numbers, but the rest of the string should be consistent. Is there a way to figure out the character set that was used?
The Wikipedia article on RS-485 puts it as follows:
Protocols
RS-485 is not a protocol; it's simply an electrical interface. Although many applications use RS-485 signal levels, the speed, format, and protocol of the data transmission is not specified by RS-485. Interoperability of even similar devices from different manufacturers is not assured by compliance with the signal levels alone.
If the specifications of the device you are trying to connect to are not documented, your only option is to examine the signal with a measuring instrument such as an oscilloscope.
Your problem might be that you have selected the wrong baud rate. Check your manual!
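If you want to test the baud-rate theory quickly, here is a rough sketch (using pyserial) that reads a burst of data at several common rates and reports how much of it is printable ASCII; the port name is a placeholder:

```python
# Sketch: try common baud rates and see which one yields mostly printable
# ASCII (the expected "SITE NAME,..." string). The port name is a placeholder.
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # placeholder; e.g. "COM3" on Windows
RATES = [1200, 2400, 4800, 9600, 19200, 38400, 57600, 115200]

for rate in RATES:
    with serial.Serial(PORT, rate, timeout=2) as port:
        data = port.read(200)
    printable = sum(32 <= b < 127 for b in data)
    ratio = printable / len(data) if data else 0.0
    print(f"{rate:>7} baud: {len(data)} bytes, {ratio:.0%} printable, sample: {data[:40]!r}")
```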
I have a task where I need to read 2 parameters from a BLE Beacon. The documentation was seriously lacking and after a fair amount of effort, I managed to get some basic information about reading the data from the BLE Beacon.
The parameters to read are
1) Battery Voltage of the sensor
2) Temperature - the beacon has a built-in temperature sensor.
I think I have tried almost every popular Python BLE library out there, but I just can't seem to get the temperature reading out of the beacon. "I think" I am able to read the voltage. The reason I say "I think" is that the value seems to match what was provided in the minimal document, and when I put the beacon into the charger I can see the value go up - an indication that it is the voltage reading. I could not read the temperature, because the value at the UUIDs mentioned in the document doesn't seem to change. I have tried enabling the sensor in every possible way and method described - by writing 01:00, etc.
I spent a fair amount of time reverse engineering the thing. I ran a packet sniffer and managed to capture the data being transferred between the beacon and the mobile app (they have a mobile app). But again, I am not able to figure out how the temperature readings are communicated between the beacon and the app. Let me break the whole thing into smaller blocks.
Hardware: a BLE beacon from which voltage and temperature can be read. The temperature sensor is built into the beacon. The beacon itself is from Texas Instruments, but the temperature and voltage sensing part is done by a third party. They provided us with some minimal information, and it was difficult to make sense of some of the sentences as they have trouble communicating in English.
The sequence to get the data goes like this:
1. Scan for beacons
2. When the beacon is found then connect to it
3. Enable notification
4. Set notification interval
5. Get the voltage and temperature reading
I have been able to do the first 4 real fast, and "half" of No. 5, i.e. the voltage part. When I say real fast, I mean I got that far with nearly no documentation available at the time.
As per the info that I have, the data resides in the characteristics/UUIDs below. Also note that the UUIDs are not standard 128-bit ones, and this caused me issues when using certain libraries. But after some tries I got to read/write to them using handles, etc. The handles and other values I printed are ones I read using PYGATT (a Python wrapper for gatttool).
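For illustration, the kind of PYGATT flow I'm using looks roughly like the sketch below; the MAC address, UUIDs and handles have been replaced with placeholders:

```python
# Sketch of the connect -> subscribe -> enable notifications -> read flow with pygatt.
# All addresses, UUIDs and handles are placeholders, not the beacon's real ones.
import binascii
import pygatt

adapter = pygatt.GATTToolBackend()
try:
    adapter.start()
    device = adapter.connect("AA:BB:CC:DD:EE:FF")  # placeholder MAC

    def on_notify(handle, value):
        print("notification from handle 0x%04x: %s" % (handle, binascii.hexlify(value)))

    # Subscribe to the characteristic that should carry the sensor data.
    device.subscribe("0000aaa1-0000-1000-8000-00805f9b34fb", callback=on_notify)  # placeholder UUID

    # Enable notifications ("D") and set the notification interval ("E") by handle.
    device.char_write_handle(0x000e, bytearray([0x01, 0x00]))  # placeholder handle for "D"
    device.char_write_handle(0x0011, bytearray([0x05]))        # placeholder handle/interval for "E"

    # Read one of the data characteristics directly.
    value = device.char_read("0000aaa2-0000-1000-8000-00805f9b34fb")  # placeholder UUID
    print("raw value:", binascii.hexlify(value))
finally:
    adapter.stop()
```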
The UUIDs are marked as the 1st, 2nd, 3rd and 4th parameters, and the document has the following to say about them:
- A: 1 byte (2nd Param)
- B: Maj + Min values, 4 bytes (4th Param)
- C: 4 bytes (3rd Param)
- D: Enable/disable notification ( I have been able to turn this on )
- E: Set notification interval ( I have been able to set this and can notice the change in notification interval )
This is minimal so as to not have a large file. All it does is this: the mobile app connects to the beacon, the notifications start, and the temperature readings are retrieved by the mobile app. Like I mentioned, I don't seem to have a problem reading the voltage; it's only the temperature that I am stuck at. I have been at it for a week now, and I think I have tried nearly everything I could think of. I even enumerated all the writable characteristics and tried writing numbers like 1 (enables the sensor?). I could have offered a bounty for this straight away if it were possible. I rarely get stuck for so long with a problem; this is driving me a little crazy and I am getting close to my wits' end. I guess it's time for a superhero - anyone out there? :) I can provide every bit of information needed if someone can indicate what is wrong. I even wrote a Cordova app and tried a bunch of stuff from my Android phone. I can connect, write to characteristics, read values, etc., but the temperature reading just won't budge. All I get is the same set of values (I used JSON.stringify to display A, B and C). I can worry about the byte order later; I guess that is a smaller problem.
The communication between the beacon and the third-party mobile app is fine; that app is able to read the temperature info just fine.
I have been looking at the Wireshark data and I am fairly sure that the temperature data is being communicated at this stage. But when I decode the "value", it looks like it's the voltage. It mentions L2CAP, but I am not sure how that is being used here to send the temperature readings (if it is being used for that at all).
Update: I wrote to every writable characteristic, with values like 1, 0100, 2, 7. At the same time I was reading every readable characteristic (in a loop) and doing a comparison (just true/false) with the previous set of values. This seemed like a quick and easy way to know if something changed; I didn't want to take chances with converting the hex to a float, and I can figure out the byte order later.
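The compare loop was along these lines (again with a placeholder MAC address and placeholder UUIDs):

```python
# Sketch: poll every readable characteristic and report which values change.
# The MAC address and UUIDs are placeholders for the ones enumerated earlier.
import time
import pygatt

READABLE_UUIDS = [
    "0000aaa1-0000-1000-8000-00805f9b34fb",  # placeholder
    "0000aaa2-0000-1000-8000-00805f9b34fb",  # placeholder
]

adapter = pygatt.GATTToolBackend()
adapter.start()
device = adapter.connect("AA:BB:CC:DD:EE:FF")  # placeholder MAC

previous = {}
try:
    while True:
        for uuid in READABLE_UUIDS:
            value = bytes(device.char_read(uuid))
            if uuid in previous and previous[uuid] != value:
                print("changed:", uuid, previous[uuid].hex(), "->", value.hex())
            previous[uuid] = value
        time.sleep(1)
finally:
    adapter.stop()
```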
From the sniffed data (Wireshark) I can only see 3 writes happening on the beacon.
I am not fully sure, even after a long discussion, but it seems that the four bytes of the notification are used for both the voltage and the temperature, since the temperature can most probably be derived from the voltage.
From the values, it seems that those four bytes represent the voltage as a float (if you ignore the absurd factor of 10^-38 that comes in because only 4 bytes instead of 8 bytes are used).
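To illustrate, here is a small sketch that decodes a four-byte payload as a 32-bit float in both byte orders; the payload below is made up, but one order typically gives a plausible value while the other gives one of those vanishingly small 10^-38-style numbers:

```python
# Sketch: interpret four notification bytes as a 32-bit float in both byte
# orders. The payload here is a made-up example, not real beacon data.
import struct

payload = bytes.fromhex("0ad72341")  # placeholder notification bytes

little, = struct.unpack("<f", payload)  # little-endian interpretation
big, = struct.unpack(">f", payload)     # big-endian interpretation
print("little-endian:", little)
print("big-endian:   ", big)
```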
Since the temperature T is typically derived from a resistance measurement, where the resistance R is proportional to the voltage U (if the current is constant), you can in principle calculate the temperature T from the voltage U.
The problem is that T(R) is roughly linear, but not perfectly so (in contrast to U(R), which is assumed to be U = RI). So you may need to plot the values of T(U) to find out which curve they are using.
To add to the confusion, I got the best results when using only the first five bits of the third byte and the eight bits of the fourth byte. I am not sure why this is the case, and it might point to some remaining trouble.
The best option is to ask for their function T(U) that they are using. If they can and will provide it for you...
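If it helps, here is a sketch of the bit-slicing described above plus a hypothetical linear calibration. Whether "first five bits" means the high or low bits of the third byte is an assumption (the low bits are used here), and the constants are placeholders that only the vendor's T(U) function can really supply:

```python
# Sketch: combine five bits of the third byte with the fourth byte into a raw
# reading, then apply a hypothetical linear calibration T = a*raw + b.
# The payload and the constants a, b are placeholders.
payload = bytes.fromhex("00004a7f")  # placeholder notification bytes

raw = ((payload[2] & 0x1F) << 8) | payload[3]  # low five bits of byte 3 + byte 4
print("13-bit raw value:", raw)

a, b = 0.01, -20.0  # placeholder calibration constants; ask the vendor for T(U)
print("temperature guess:", a * raw + b)
```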
I'm working with the SmartThings Zigbee motion sensor, and I know this is an IAS Zone device.
I read a question and answer where they said, "Before you get the information from the sensor, you need to enroll first."
(zigbee motion detect sensor usage)
So I tried to send a 'write attribute' command to the sensor to enroll first, in my Python code, like this:
data='\x00' + '\xaa' + '\x02' + '\x00\x10'+'\xf0'+'my MAC address'
('02' means the write attribute command, '0010' means the attribute for the zone setting, 'f0' means the data type - IEEE address)
This raw data format is from the Zigbee Cluster Library document.
But the sensor gave me status 0x86, which means UNSUPPORTED_ATTRIBUTE.
Well, I think the command is wrong, and my assumptions are:
- the format is wrong.
- the values I used are wrong.
- or both.
If you have any idea or even a little hint, it would help me. Thanks for reading!
For most Zigbee security devices (IAS) you need to use these steps:
1. You must advertise that you support the IAS cluster client when receiving a MatchDescriptorRequest (this one depends on the product)
2. Write your IEEE address to the IAS CIE Address attribute (cluster 0x0500, attribute 0x0010)
3. Send a ZoneEnrollResponse with status ENROLLED to the device (some devices may require that you "trip" them and wait for them to send the ZoneEnrollRequest first)
For your packet format, note that Zigbee ZCL fields are little-endian, so your attributeId should be \x10\x00. The same byte order applies to the IEEE address as well.
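As a sketch only, the ZCL Write Attributes payload for step 2 could be built like this in Python, with every multi-byte field packed little-endian; the sequence number and the coordinator's IEEE address are placeholders:

```python
# Sketch: ZCL "Write Attributes" payload for the IAS CIE Address attribute
# (cluster 0x0500, attribute 0x0010, type 0xF0 = IEEE address), all fields
# little-endian. The sequence number and IEEE address are placeholders.
import struct

frame_control = 0x00                 # general command, no manufacturer code
seq_num = 0xAA                       # placeholder transaction sequence number
command_id = 0x02                    # Write Attributes
attr_id = 0x0010                     # IAS CIE Address
data_type = 0xF0                     # IEEE address (EUI-64)
ieee_addr = 0x0011223344556677       # placeholder: your coordinator's EUI-64

payload = struct.pack("<BBBHBQ", frame_control, seq_num, command_id,
                      attr_id, data_type, ieee_addr)
print(payload.hex())  # -> 00aa021000f07766554433221100
```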
I have several Microsoft Bands, to be used as part of a group health initiative. I intend to develop a single app on a tablet that will pull the data from the bands. This will be a manual process; there will not be a constant connection to the tablet and no connection to Microsoft Health.
Does anyone know if this is possible?
Thanks
Emma
The general answer is no: Historical sensor values are not stored or buffered on the Band itself.
It does, however, depend on which sensors you are interested in. The sensor values are not buffered, so you can only read the current (real-time) value of each sensor.
But sensors such as the pedometer and distance increment over time, so these values will still make sense even though you only connect once in a while. Whereas for, e.g., the heart rate and skin temperature, you will only get the current (real-time) value.
So it depends on your use case.
I'm trying to pick up some serial comms for a new job I am starting. I have done some reading, which has helped a lot; however, a lot of the reading tells you about the specification of serial comms and what everything is, but not when it is best to use particular options.
My searches for this information so far only seem to pull in the spec; perhaps as a novice I am searching for the wrong terms.
My questions then!
Baud Rate - I have read this is signal changes per second and is often mislabelled as bits per second. Is this essentially bits per second including the frame data if asynchronous, and actually bits per second if synchronous?
Parity - Even/Odd. Is there any difference at all between the two? I'm thinking in terms of efficiency or similar. Does this still exist only for compatibility's sake?
Stop Bits - I have read so far you can have 1 or 2 stop bits. In C# there seems to be an option for 1.5 too. I can't find anything on why you would want/need more than 1.
If anyone can advise on these points, or point me to some recommended reading material I would be very grateful.
Thanks for reading.
edit: typo
You very rarely have a choice; you must make your settings compatible with those the device uses. If you don't know them, then you need to look in a manual or pick up a phone. Do keep in mind that it is increasingly rare to work with a real serial port device, one that uses a UART. Most commonly you actually talk to an emulated serial port, implemented by a USB or Bluetooth device driver. The settings you use don't matter in such a case, since the actual signaling is implemented by the underlying bus.
If you can configure the device then basic guidelines are:
Baudrate is directly related to the length of the cable and the amount of electrical interference that's present. You have to go slower when you get bit errors. The RS-232 spec only allows for a maximum of 50 ft at 9600 baud.
Parity ought to be used when you don't use an error-correcting protocol. It does not matter whether you pick Odd or Even. Odd people pick odd, it's their prerogative.
Stopbits is usually 1. Picking 1.5 or 2 helps a bit to relieve pressure on a device whose interrupt response times are poor, which shows up as data loss.
Databits is almost always 8, sometimes 7 if the device only handles ASCII codes.
Handshaking is an important setting that never stops causing trouble, since many programmers just overlook it. Modern computers are almost always fast enough not to need it, but that's not necessarily true for devices. The most basic stay-out-of-trouble configuration is to turn DTR on when you open the port and to tell the device driver to take care of RTS/CTS handshaking. Xon/Xoff handshaking is sometimes used; it depends on the device.
A good 90% of the battle is won by implementing solid error checking. It is almost always skimped on, which is a bad idea. It is especially important for serial port devices since they have no error-correcting capabilities themselves and only very weak error detection. Always make sure that you can detect and properly report overrun, parity and framing errors, and test them by getting the settings intentionally wrong.
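As an illustration of these guidelines (shown in Python with pyserial, though the same options exist on the C# SerialPort class), a stay-out-of-trouble setup could look like the sketch below; the port name and settings are placeholders that must match whatever the device actually expects:

```python
# Sketch: a conservative serial port setup following the guidelines above
# (8 data bits, parity enabled, 1 stop bit, hardware RTS/CTS handshaking,
# DTR asserted). Port name and settings are placeholders.
import serial  # pyserial

port = serial.Serial(
    "/dev/ttyUSB0",                  # placeholder; e.g. "COM3" on Windows
    baudrate=9600,
    bytesize=serial.EIGHTBITS,
    parity=serial.PARITY_EVEN,       # use PARITY_NONE if a higher-level protocol checks errors
    stopbits=serial.STOPBITS_ONE,
    rtscts=True,                     # let the driver handle RTS/CTS handshaking
    timeout=1,
)
port.dtr = True                      # make sure DTR is asserted once the port is open

data = port.read(64)
print(len(data), "bytes received:", data)
port.close()
```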