Bluetooth Low Energy Client scan range - bluetooth-lowenergy

I understand that a Bluetooth Low Energy (BLE) client scans for BLE peripherals. I want to know how far, in meters, a BLE client can scan/discover a BLE peripheral.

An iPhone 5 will detect a RadBeacon USB (BLE Beacon) set at maximum transmit power about 40 meters away. At greater distances of up to 50 meters, it might be intermittently detected, but detections are not reliable. Outdoors with clear line of sight, it can sometimes be detected at even greater distances of up to 100 meters, but again this is not reliable.
This is typical, but just an example. The specifics depend on the transmitting device, the receiving device, any physical obstructions, and how much radio noise is in the area.
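The figures above are consistent with a simple log-distance path-loss model. As a rough sketch (the -59 dBm reference RSSI at 1 m and the free-space exponent of 2 are assumptions, not measurements of any particular device):

```python
import math

def expected_rssi(measured_power_dbm, distance_m, path_loss_exponent=2.0):
    """Expected RSSI at distance_m, given the RSSI measured at 1 m
    (measured_power_dbm) and a log-distance path-loss exponent
    (~2 in free space, often 2.7-4 indoors)."""
    return measured_power_dbm - 10 * path_loss_exponent * math.log10(distance_m)

# With a typical -59 dBm at 1 m and clear line of sight:
print(round(expected_rssi(-59, 40)))   # ~ -91 dBm: near many phones' sensitivity floor
print(round(expected_rssi(-59, 100)))  # ~ -99 dBm: only intermittently detectable
```

This is why detection around 40 m is plausible but detection at 100 m is marginal: the signal at that range sits right at, or below, the receiver's sensitivity limit, so most packets are lost.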

Related

ESP32 reduce BLE connection distance

I need to connect my phone to the ESP32 with BLE (Bluetooth Low Energy), but only when the phone is within about 1 meter. So, how can I reduce the range? And if the phone moves farther than 1 meter away, it should disconnect. I would be happy if you could answer...
Unfortunately this would be very challenging to implement, because you have to rely on the RSSI to approximate the distance, which is not always reliable. Have a look at these links below for more information:
Things you should know about Bluetooth range
Using BLE for indoor positioning
Fundamentals of beacon ranging
Connect if the RSSI is greater than -40 dBm, keep updating the RSSI for 2 seconds and take the average; if the average is greater than -30 dBm, you are very close to the device.
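The averaging-and-threshold idea above could be sketched as follows. This is only an illustration: the thresholds, window size, and hysteresis gap are assumptions that would need calibration for each phone/peripheral pair, and as noted above RSSI is too noisy for reliable sub-meter gating.

```python
from collections import deque

class ProximityGate:
    """Sketch of RSSI-based proximity gating: 'connect' when the averaged
    RSSI rises above connect_dbm, 'disconnect' when it falls below
    disconnect_dbm. The gap between the two thresholds (hysteresis)
    prevents flapping on noisy readings."""

    def __init__(self, connect_dbm=-40, disconnect_dbm=-55, window=10):
        self.connect_dbm = connect_dbm
        self.disconnect_dbm = disconnect_dbm
        self.samples = deque(maxlen=window)  # rolling window smooths RSSI noise
        self.connected = False

    def update(self, rssi_dbm):
        self.samples.append(rssi_dbm)
        avg = sum(self.samples) / len(self.samples)
        if not self.connected and avg > self.connect_dbm:
            self.connected = True    # averaged RSSI strong: treat as "close"
        elif self.connected and avg < self.disconnect_dbm:
            self.connected = False   # averaged RSSI weak: treat as "moved away"
        return self.connected
```

In a real app the `update()` calls would be driven by scan callbacks reporting the peripheral's RSSI; the class itself is transport-agnostic.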

Kontakt beacon has garbage response time at 6 metres

I'm reading so much propaganda about BLE beacons (Kontakt.io, in my case) being accurate to the centimetre, readable at 70 metres etc etc, but my experience has been nothing like that.
I have 3 beacons. If they're in the next room over (door open, around 6 or 7 metres), it'll detect maybe one or two, after around 20 seconds. Even then I often need to restart my app over and over to detect it.
Move them to the same room, and they're pretty much okay. Everything's default, scanMode is 'LOW_LATENCY', scanPeriod is 'RANGING', I'm not sure what else I can do.
Do these results sound way off, or are they just not that good?
A few tips about Bluetooth beacons in general, not specifically Kontakt beacons:
When you need to restart your app to detect beacons, that clearly means the issue is something on the phone, not the beacons themselves. That issue may be the app, the SDK, the Bluetooth stack on the phone, or the phone's Bluetooth hardware. Try an off-the-shelf detector app like BeaconLocate for iOS or Android, and also test with a different phone.
The range of a beacon is dependent on its output transmitter power, typically measured at 1 meter. This output power is adjustable on many hardware beacons and is often set lower than the maximum to save battery on battery-powered models. For best detection results, set the output power to the maximum that configuration allows. An output power at one meter should be at least -59 dBm for best results. Less negative numbers mean more power. Because some phone models have poor sensitivity and measure RSSI inaccurately, you may want to measure with different models. In general iOS models are more predictable receivers.
The range of a beacon between rooms varies greatly depending on materials in walls, furnishings, and local geometry. A beacon with an output power of -59 dBm at one meter can be reliably detected by a phone with a sensitive receiver at 40 meters away, but only with clear line of sight conditions (typically outdoors). Intermittently, I have seen such beacons be detected outdoors at over 100 meters away. Intermittently means that most packets (perhaps 99%) are lost and only a small percentage are successfully received.
Always be skeptical of marketing claims from companies trying to sell you something. The above points should tell you what is achievable from an independent engineering perspective.

bluetooth - tx power and rssi

I am experimenting with two Bluetooth 4 low energy devices. I am getting UUID, TX power level and RSSI values in the Android app that I downloaded.
I noticed that one of the two is sending 0 for the TX power level, but the other is sending 4, and I see different RSSI values in the Android app even though I put them in the same spot. That means the distance between my Android phone and the two Bluetooth devices is the same. If the difference were +/- 5, I would understand, but the difference is +/- 15. Is this because of the TX power level?
And so, do I need to take the TX power level into consideration to calculate the proximity between the BLE 4 device and my Android app?
You cannot directly relate RSSI to absolute distance between a BLE central and peripheral. Of course RSSI is affected by distance, but not only by distance. There are other significant factors such as interference, the transmission medium, etc. If your two BLE peripherals are two different models, the values may vary even more.
RSSI fluctuations of around +/-15 dB are very normal for BLE connections, and nearly impossible to eliminate in practical cases. So you basically cannot rely on RSSI alone for calculating distance if you want the error to be less than several meters.
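To see why a +/-15 dB swing rules out meter-level accuracy, you can invert the usual log-distance path-loss model. In this sketch the -59 dBm reference at 1 m and the path-loss exponent of 2 are assumed values; real environments need their own calibration:

```python
import math

def distance_from_rssi(rssi_dbm, measured_power_dbm=-59, n=2.0):
    """Estimate distance by inverting the log-distance path-loss model.
    measured_power_dbm is the RSSI expected at 1 m; n is the
    environment-dependent path-loss exponent."""
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * n))

# A +/-15 dB swing around a "true" -75 dBm reading spans a huge range:
print(round(distance_from_rssi(-75), 1))  # ~6.3 m  (the nominal reading)
print(round(distance_from_rssi(-60), 2))  # ~1.12 m (15 dB stronger)
print(round(distance_from_rssi(-90)))     # ~35 m   (15 dB weaker)
```

Because distance is exponential in the dB error, a 15 dB fluctuation turns a 6 m estimate into anything from about 1 m to about 35 m, which is why RSSI-only ranging cannot deliver errors below several meters.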

Why does GSM use full-duplex while cell phones have only one antenna?

According to this website:
Although GSM operates in duplex (separate frequencies for transmit and receive), the mobile station does not transmit and receive at the same time. A switch is used to toggle the antenna between the transmitter and receiver.
What, then, is the advantage of using separate channels for two-way communication? Communications can never go both ways at the same time; why have them separated (broadcast channels aside)?
One reason is to help with radio planning: if powerful cell antennas (i.e. the base station antennas) broadcast on the same frequency they receive on, then they would have a hard time 'hearing' the relatively weak broadcasts from the phones over the relatively strong signals from neighboring cell antennas.
"...GSM operates in duplex (separate frequencies for transmit and receive), the mobile station does not transmit and receive at the same time."
That's called half-duplex. It's done to save bandwidth and battery power, and to make cell phone conversations more difficult.

HM-10 battery uses too much energy on ibeacon mode

I'm using an HM-10 to deploy an iBeacon with a CR2450 battery.
But in several configurations, the CR2450 battery drains too fast.
I need the AT+MODE2
and AT+ADTY0 commands for OTA and battery level.
In AT+ADTY3 mode the device cannot send the battery level in the advertising data.
Is it possible to get battery level data working in AT+ADTY3 mode, or to run AT+ADTY3 mode with low power consumption?
Thanks
There used to be a firmware bug in the HM-10 that caused AT+BATC1 to draw excessive current when in sleep mode with AT+SLEEP or AT+PWRM0. It would not show up at first, but usually after about 3 hours the bug arises and the device starts to draw more current.
Advertising at a 1284 ms interval with AT+BATC0: 1.6 uA, with radio peaks.
Advertising at a 1284 ms interval with AT+BATC1: 600 uA, with radio peaks, once the bug occurs.
I have not found a solution except turning off battery level reporting. The bug might be fixed in later firmware; I have not tested that.
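As a sanity check on those current figures, a back-of-the-envelope battery-life estimate shows how dramatic the bug is. The 620 mAh CR2450 capacity is a typical datasheet value assumed here; check your specific cell, and note this ignores radio peaks, self-discharge and voltage sag, so real-world life will be shorter:

```python
def battery_life_days(capacity_mah, avg_current_ua):
    """Rough battery-life estimate: capacity divided by average draw.
    capacity_mah is in mAh, avg_current_ua in microamps."""
    hours = (capacity_mah * 1000) / avg_current_ua  # mAh -> uAh, then / uA
    return hours / 24

CR2450_MAH = 620  # typical datasheet capacity, assumed

print(round(battery_life_days(CR2450_MAH, 1.6)))  # ~16000 days with AT+BATC0
print(round(battery_life_days(CR2450_MAH, 600)))  # ~43 days with the AT+BATC1 bug
```

In other words, the buggy 600 uA draw cuts the theoretical lifetime from years to about a month and a half, which matches the "drains too fast" symptom in the question.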
Kind regards
