Say I have a BLE device that is both a server (has info) and a peripheral (needs to access outside info), and that, upon receiving or generating its own data, has to share it with other server/peripherals in its vicinity.
Would it be beneficial to only attempt to connect to the other devices over BLE when there is data to be transferred (even though it would periodically connect to each server sequentially to see if it can), or would it be better to keep the connections open, use callbacks to determine when connected, and simply transfer data when required (though from what I understand the devices I use can only process one GATT operation at a time, meaning having 4 connections to quickly transfer data is irrelevant)?
In other words, is it beneficial to continually reconnect and disconnect a peripheral and server, or simply to hold a connection to as many servers as I need (even though apparently I can only perform one GATT operation at a time, i.e. one characteristic read/write)?
If possible, instead of reading to see whether there's data available, I would stay connected to the devices and have them use BLE notifications to push data when it becomes available.
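If the Central side happens to be an Android device, a minimal sketch of subscribing to notifications as the GATT client could look like the following (the CCCD UUID is the standard one; the surrounding connection code is assumed to exist already):

```kotlin
import android.bluetooth.BluetoothGatt
import android.bluetooth.BluetoothGattCharacteristic
import android.bluetooth.BluetoothGattDescriptor
import java.util.UUID

// Standard Client Characteristic Configuration Descriptor (CCCD) UUID.
val CCCD_UUID: UUID = UUID.fromString("00002902-0000-1000-8000-00805f9b34fb")

fun enableNotifications(gatt: BluetoothGatt, characteristic: BluetoothGattCharacteristic) {
    // Ask the local stack to forward notifications for this characteristic.
    gatt.setCharacteristicNotification(characteristic, true)
    // Tell the remote GATT server to start notifying by writing the CCCD.
    val cccd = characteristic.getDescriptor(CCCD_UUID)
    cccd.value = BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE
    gatt.writeDescriptor(cccd)
}
```

Data then arrives in BluetoothGattCallback.onCharacteristicChanged() without any polling.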
Alternatively you could consider having the Peripherals advertise only when they have data, as an invitation for the Central-mode device to connect. The Central would need to be scanning periodically to detect this, however.
Advantages/disadvantages depend on your priorities and the nature of the two types of device.
FYI, you're mixing terminology a little. When discovering devices you have a GAP Peripheral, which advertises, and a GAP Central, which scans. After the Central connects to the Peripheral you have a GATT Client and a GATT Server. Usually the GAP Peripheral becomes the GATT Server, but it does not need to be this way; the GAP Peripheral can just as easily become the GATT Client. It's the GATT Server that holds the state data in an attribute table, in the form of Services, Characteristics and Descriptors.
I want two BLE devices to connect to each other, and exchange data bi-directionally.
These devices are exactly equivalent - this is basically a serial cable between two peers.
There is not one 'store of data', or hierarchy between the two.
Both ends basically transmit data to each other (there is no polling for 'read' data), and no acknowledgement is required.
BLE documentation refers to the Characteristic Properties as:
A Client can send data to the server with 'write'
A Server uses a 'notify' characteristic (which the Client subscribes to)
So basically I have just one question:
Can I just have one Characteristic, with two properties
'Notify' (writing to Client) and
'Write' (writing to Server)
That the Peripheral advertises, and this will enable two-way write behavior?
First a few things to make everything clear, in case somebody wonders:
BLE has the concept of Central and Peripheral. Peripherals advertise their presence. Centrals scan and discover peripherals which they then connect to. Once the connection is created, these link layer roles do not really matter in terms of GATT and L2CAP Connection oriented channels, which are both used to transfer data.
Every device (including centrals) supporting connections over BLE must have a GATT server. While it is far more common for the peripheral to use only the GATT server role, it can also act as a GATT client, and centrals can likewise act as GATT servers. Furthermore, both GATT roles can be supported simultaneously. Both sides could, for example, expose an identical GATT server characteristic with the write property. Not sure if this is a common approach though.
Now, back to your question. You can definitely have one characteristic with both the "write" and "notify" properties. When you write, I would suggest using "Write Without Response", since it has better throughput than "Write With Response": it does not require an acknowledgement between every write. Over the link layer, all kinds of packets are still sent "reliably", meaning there will be no packet drops unless the whole connection drops.
I would, however, suggest having one characteristic per direction. The reason is that some Bluetooth stacks cannot send and receive data on a single characteristic in a thread-safe way, due to the design of their APIs. In particular, Android's API has one method to set data on a characteristic and another method to send the characteristic's current data. If a notification arrives in between these calls (whereupon the Bluetooth stack internally assigns the new data to the characteristic), the write operation will send the data that was just notified instead of the intended data.
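As a rough illustration of the one-characteristic-per-direction layout, here is a minimal sketch using Android's peripheral-mode GATT server API; the service and characteristic UUIDs are placeholders I made up:

```kotlin
import android.bluetooth.BluetoothGattCharacteristic
import android.bluetooth.BluetoothGattDescriptor
import android.bluetooth.BluetoothGattService
import java.util.UUID

// Placeholder UUIDs; only the CCCD UUID (0x2902) is standard.
val SERVICE_UUID: UUID = UUID.fromString("0000aaaa-0000-1000-8000-00805f9b34fb")
val RX_UUID: UUID = UUID.fromString("0000aaab-0000-1000-8000-00805f9b34fb") // client -> server
val TX_UUID: UUID = UUID.fromString("0000aaac-0000-1000-8000-00805f9b34fb") // server -> client
val CCCD_UUID: UUID = UUID.fromString("00002902-0000-1000-8000-00805f9b34fb")

fun buildService(): BluetoothGattService {
    val service = BluetoothGattService(SERVICE_UUID, BluetoothGattService.SERVICE_TYPE_PRIMARY)

    // The client writes here, using "write without response" for throughput.
    val rx = BluetoothGattCharacteristic(
        RX_UUID,
        BluetoothGattCharacteristic.PROPERTY_WRITE_NO_RESPONSE,
        BluetoothGattCharacteristic.PERMISSION_WRITE
    )

    // The server notifies the client here; the CCCD lets the client subscribe.
    val tx = BluetoothGattCharacteristic(TX_UUID, BluetoothGattCharacteristic.PROPERTY_NOTIFY, 0)
    tx.addDescriptor(
        BluetoothGattDescriptor(
            CCCD_UUID,
            BluetoothGattDescriptor.PERMISSION_READ or BluetoothGattDescriptor.PERMISSION_WRITE
        )
    )

    service.addCharacteristic(rx)
    service.addCharacteristic(tx)
    return service
}
```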
You could also check out L2CAP Connection-oriented Channels (introduced in Bluetooth v4.2), which are a good way to transfer raw data packets when the GATT structure is not appropriate.
I am planning to develop a small project with an nRF52 (or another BLE chip, if that matters). As a preliminary question, I would like to know whether I can broadcast data without "abusing" the advertising bytes.
Scenario: Two smartphones connect to my device and enable some notify characteristic over which I would like to receive data at a potentially high frequency (up to 100 Hz maybe) on both devices. (I know 100 Hz is already close to the 7.5 ms minimum connection interval that BLE supports; just to say I want to reach that limit and receive data as fast as possible.)
So: if I connect two central devices, will they receive the same notify signal, or will I have to send one for each central device, essentially lowering the maximum frequency at which I can receive data?
In the latter case, is the best way to broadcast BLE data to multiple devices via the advertising bytes?
Kind regards, have a good one
When you use GATT notifications over BLE, the notifications are individual per connection. So if you want to send the same notification to two connected clients, the data is duplicated over the air. In general, all GATT traffic is individual per connection.
If you send one packet per 10 ms to two devices, that should be fine. Note, though, that if one packet is lost it will be resent during the next connection event, and hence two packets will then be sent to that device (assuming you produce an additional packet after 10 ms as usual).
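To make the "one notification per connection" point concrete, here is a minimal sketch using Android's peripheral-mode GATT server API (an nRF52 stack behaves the same way, it just spells the call differently); the list of subscribed centrals is assumed to be tracked elsewhere:

```kotlin
import android.bluetooth.BluetoothDevice
import android.bluetooth.BluetoothGattCharacteristic
import android.bluetooth.BluetoothGattServer

fun notifySubscribers(
    server: BluetoothGattServer,
    subscribers: List<BluetoothDevice>,          // centrals that enabled notifications
    characteristic: BluetoothGattCharacteristic,
    payload: ByteArray
) {
    characteristic.value = payload
    // The same payload goes over the air once per connection.
    // In practice, wait for BluetoothGattServerCallback.onNotificationSent()
    // before notifying the next device, as only one notification can be
    // outstanding at a time.
    for (device in subscribers) {
        server.notifyCharacteristicChanged(device, characteristic, /* confirm = */ false)
    }
}
```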
You can use advertising instead to broadcast data. Every device that scans can see your data; data you send in ADV_IND can be seen by an unlimited number of scanners.
Whether it's better to use advertisements or GATT to send data to multiple devices depends on a lot of factors. You should experiment to find what works best for you.
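For the advertising route, here is a minimal sketch using Android's advertiser API (whatever device does the broadcasting would use its own stack's equivalent); the 0xFFFF manufacturer ID is just a placeholder, and the payload has to fit within the legacy 31-byte advertising PDU minus header overhead:

```kotlin
import android.bluetooth.BluetoothAdapter
import android.bluetooth.le.AdvertiseCallback
import android.bluetooth.le.AdvertiseData
import android.bluetooth.le.AdvertiseSettings

fun broadcast(adapter: BluetoothAdapter, payload: ByteArray, callback: AdvertiseCallback) {
    val settings = AdvertiseSettings.Builder()
        .setAdvertiseMode(AdvertiseSettings.ADVERTISE_MODE_LOW_LATENCY)
        .setConnectable(false)                    // pure broadcast, no connections needed
        .build()
    val data = AdvertiseData.Builder()
        .addManufacturerData(0xFFFF, payload)     // placeholder company ID
        .build()
    adapter.bluetoothLeAdvertiser.startAdvertising(settings, data, callback)
}
```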
https://developer.android.com/about/versions/10/features#bluetooth-le-coc
Android 10 has BLE CoC, which will enable apps to transfer larger data streams. So what's the maximum throughput of BLE CoC?
Is it better than BLE with Data Length Extension (DLE)?
CoC has nothing to do with LE Data Length Extension (it's like asking whether WebSockets are faster than IPv6). It is just a more convenient way of sending a stream of data than, for example, GATT. You will get more or less the same throughput as with GATT notifications/write without response.
A genuinely good thing about CoC is that the remote device can decrease the incoming throughput, via the credit system, if it is not capable of handling the incoming data in time. With GATT notifications or write without response that is not possible, and if you follow the GATT standard strictly, packets will be dropped if they can't be handled in time.
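A minimal sketch of the Android 10 CoC API from the link above (the socket calls are blocking, so run them off the main thread; the PSM would typically be exchanged via a GATT characteristic, and error handling is omitted):

```kotlin
import android.bluetooth.BluetoothAdapter
import android.bluetooth.BluetoothDevice

fun cocServer(adapter: BluetoothAdapter) {
    val serverSocket = adapter.listenUsingInsecureL2capChannel()
    val psm = serverSocket.psm                  // expose this PSM to the peer, e.g. via GATT
    println("Listening on PSM $psm")
    val socket = serverSocket.accept()          // blocks until a peer connects
    socket.inputStream.use { input ->
        val buffer = ByteArray(1024)
        val read = input.read(buffer)           // credit-based flow control is handled by the stack
        println("Received $read bytes")
    }
}

fun cocClient(device: BluetoothDevice, psm: Int) {
    val socket = device.createInsecureL2capChannel(psm)
    socket.connect()                            // blocks until connected
    socket.outputStream.write(byteArrayOf(0x01, 0x02, 0x03))
    socket.close()
}
```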
My goal is this: I have a bunch of sensors out in a field connected in a sort of P2P network. On one side of the field I have a device that provides a BLE server to bridge data between a controller (phone or laptop) and all the devices out in the field.
One of the requirements is a sort of network visualization and management service. The gotcha with this is that there are a variable number of devices out in the field.
I have a plan to have the bridge device send a broadcast out to the network to get all the devices connected. My only problem is that I'm relatively new to BLE and GATT in general and I'm not certain what the standard is for showing a list of data with a dynamic length.
Is there such a standard? Do any of you have any tips to help me wrap my head around how to organize this into a GATT?
Thanks for your help
To the best of my knowledge, BLE and GATT don't have any best practice or pattern that would fit your requirement. So you have to roll your own.
An option would be to implement a request–response protocol: The controller sends a request to the BLE server (e.g. requesting the data for sensor 17) and the server responds with the data.
In GATT terms, the server provides a service with two characteristics:
The request characteristic (writable)
The response characteristic (readable with notifications)
For communication with the server, the controller connects to the server and activates notifications for the response characteristic. Then it writes the request to the request characteristic and waits for an update on the response characteristic.
As BLE has a low bandwidth, you should use a compact, binary protocol (and not JSON or XML).
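For illustration, a minimal sketch of what such a compact binary format could look like; the opcode and field layout are entirely made up:

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder

const val OP_GET_SENSOR: Byte = 0x01

// Request written to the (writable) request characteristic: [opcode][sensorId:u16]
fun encodeGetSensor(sensorId: Int): ByteArray =
    ByteBuffer.allocate(3)
        .order(ByteOrder.LITTLE_ENDIAN)
        .put(OP_GET_SENSOR)
        .putShort(sensorId.toShort())
        .array()

// Response notified on the response characteristic: [opcode][sensorId:u16][value:f32]
data class SensorReading(val sensorId: Int, val value: Float)

fun decodeSensorResponse(payload: ByteArray): SensorReading {
    val buf = ByteBuffer.wrap(payload).order(ByteOrder.LITTLE_ENDIAN)
    require(buf.get() == OP_GET_SENSOR) { "unexpected opcode" }
    return SensorReading(buf.short.toInt() and 0xFFFF, buf.float)
}
```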
I have a remote sensor that talks to the world via a low-bandwidth TCP connection over GSM. It can often be in a location with extremely patchy GSM connectivity, though. At the moment, the remote sensor waits for a GPRS network, then initiates a TCP connection with the server and listens for commands (which are only a dozen or so bytes every hour or so).
Is an SMS more or less likely to get through to the remote sensor than being able to complete a TCP connection? I guess I'm wondering how likely it is that the network signal strength is sufficient for SMS but not for TCP.
Using standard text messaging services for M2M is a good idea if you can fit your data in 140 bytes. For short transmissions, opening a GPRS/1xRTT (2G) IP session, transmitting the data to a server and closing the session is less efficient and more likely to go wrong than sending an SMS.
As a side note, you can also use SMS (MT-SMS) to bring an IoT device online (a mechanism called "shoulder taps").
Compared to data (or voice), SMS uses only the signalling part of the mobile network. It's quite a cheap operation in terms of network resource reservation, so you'll be more likely to get through with an SMS than with a data connection in a weak-signal environment. An additional advantage is the much lower power consumption of your terminal, which might be important in an M2M context.
As stated by #Jarek previously, you need to be able to pack your data into 140 bytes, or to forge "long" SMSes, which are a kind of concatenation of "simple" 140-byte SMSes.