I read a paper discussing a method to reduce a beacon's measurement error by up to 10%.
I was wondering if it's possible. Basically, the self-correcting beacon system is composed of two beacons: a target beacon and a self-correcting beacon, positioned one meter apart. The measuring device (the device we want to calculate the distance to) is positioned d meters from the target beacon. The target beacon broadcasts a message; the self-correcting beacon receives it, measures the RSSI, saves it as srcPower (the RSSI at a 1 m distance), and broadcasts it in turn. The measuring device then receives both the original message from the target beacon and the message from the self-correcting beacon, and uses this more recent reference RSSI at d0 (srcPower) to calculate the distance. Has anybody tried to implement this, and did it work?
The link to the scientific paper is this one:
https://www.ronpub.com/OJIOT_2015v1i2n03_Cho.pdf
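As I understand it, the idea is to plug the freshly measured srcPower into the usual log-distance path-loss model in place of a fixed, factory-calibrated reference power. A minimal sketch of that model (the function name and the path-loss exponent value are my own illustration, not taken from the paper):

#include <cmath>

// Log-distance path-loss model: rssi = srcPower - 10 * n * log10(d)
// => d = 10 ^ ((srcPower - rssi) / (10 * n))
// srcPower: reference RSSI at 1 m, here refreshed by the self-correcting beacon
// n:        path-loss exponent (~2.0 in free space, larger indoors; must be calibrated)
double estimateDistanceMeters(double rssi, double srcPower, double n = 2.0) {
    return std::pow(10.0, (srcPower - rssi) / (10.0 * n));
}

Whether refreshing srcPower on every broadcast actually reduces the error in practice is exactly the paper's claim, and something you would need to verify experimentally.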
I'm reading so much propaganda about BLE beacons (Kontakt.io, in my case) being accurate to the centimetre, readable at 70 metres, etc., but my experience has been nothing like that.
I have 3 beacons. If they're in the next room over (door open, around 6 or 7 metres away), it'll detect maybe one or two after around 20 seconds. Even then I often need to restart my app over and over to detect them.
Move them to the same room, and they're pretty much okay. Everything's default: scanMode is 'LOW_LATENCY', scanPeriod is 'RANGING'. I'm not sure what else I can do.
Do these results sound way off, or are they just not that good?
A few tips about Bluetooth beacons in general, not specifically Kontakt beacons:
When you need to restart your app to detect beacons, that clearly means the issue is something on the phone, not the beacons themselves. That issue may be the app, the SDK, the Bluetooth stack on the phone, or the phone's Bluetooth hardware. Try an off-the-shelf detector app like BeaconLocate for iOS or Android, and also test with a different phone.
The range of a beacon depends on its output transmitter power, typically measured at 1 meter. This output power is adjustable on many hardware beacons and is often set lower than the maximum to save battery on battery-powered models. For best detection results, set the output power to the maximum the configuration allows. The output power measured at one meter should be at least -59 dBm for best results; less negative numbers mean more power. Because some phone models have poor sensitivity and measure RSSI inaccurately, you may want to measure with different models. In general, iOS models are more predictable receivers.
The range of a beacon between rooms varies greatly depending on the materials in the walls, the furnishings, and the local geometry. A beacon with an output power of -59 dBm at one meter can be reliably detected by a phone with a sensitive receiver 40 meters away, but only under clear line-of-sight conditions (typically outdoors). Intermittently, I have seen such beacons be detected outdoors at over 100 meters away. Intermittently means that over 99% of packets are lost and only a small percentage are successfully received.
Always be skeptical of marketing claims from companies trying to sell you something. The above points should tell you what is achievable from an independent engineering perspective.
I am trying to use a beacon (HM-10) to broadcast my sensor's data, but there is a problem: I use a loop to write AT commands, and after a while the module stops responding.
Here is the part of the code:
String pre = "AT+MARJ0x";
int sensorData = 0;

void loop() {
  sensorData = getSensorData();         // always returns 100 ~ 180
  String atCommand = pre + sensorData;  // e.g. AT+MARJ0x100
  BTSerial.print(atCommand);
  delay(200);
}
It initially works successfully for about 3 minutes, and then it stops working and no more AT commands can be sent.
Can anybody help me fix this problem?
What you are trying is not possible with an iBeacon.
All you are doing is setting up the major number of the iBeacon data in your HM-10 device over and over again with sensor data.
The major number is part of the iBeacon data spec:
(source: https://developer.mbed.org/blog/entry/BLE-Beacons-URIBeacon-AltBeacons-iBeacon/)
Data Spec:
iBeacons broadcast four pieces of information:
A UUID that identifies the beacon.
A Major number identifying a subset of beacons within a large group.
A Minor number identifying a specific beacon.
A TX power level in two's complement, indicating the signal strength one meter from the device.
This number must be calibrated for each device by the user or manufacturer.
A scanning application reads the UUID, major number and minor number and references them against a database
to get information about the beacon;
the beacon itself carries no descriptive information - it requires this external database to be useful.
The TX power field is used with the measured signal strength to determine how far away the beacon is from the smart phone.
Please note that TxPower must be calibrated on a beacon-by-beacon basis by the user to be accurate.
For an HM-10 device, AT commands are normally only used to set up the device, not for sending data.
Google some examples and learn how to set up communication between BLE devices.
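For reference, a minimal sketch of how those four fields are laid out in the manufacturer-specific data of an iBeacon advertisement (offsets follow the widely documented iBeacon frame format; the struct and function names here are my own, not part of any SDK):

#include <cstdint>
#include <cstddef>

struct IBeaconFields {
    uint8_t  uuid[16];
    uint16_t major;
    uint16_t minor;
    int8_t   txPower;   // signal strength at 1 m, two's-complement dBm
};

// adData: the manufacturer-specific AD payload, starting with the Apple
// company ID 0x4C 0x00, then 0x02 0x15, UUID, major, minor, txPower.
// Returns false if the payload is not a 25-byte iBeacon frame.
bool parseIBeacon(const uint8_t* adData, size_t len, IBeaconFields& out) {
    if (len < 25 || adData[0] != 0x4C || adData[1] != 0x00 ||
        adData[2] != 0x02 || adData[3] != 0x15)
        return false;
    for (int i = 0; i < 16; ++i) out.uuid[i] = adData[4 + i];
    out.major   = (uint16_t)((adData[20] << 8) | adData[21]); // big-endian
    out.minor   = (uint16_t)((adData[22] << 8) | adData[23]);
    out.txPower = (int8_t)adData[24];
    return true;
}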
I want to ask about iBeacon advertising, especially TX power.
I used two BLE modules, an HM-10 and an HM-11. I set up one as an iBeacon (the HM-10) and used the other one to connect and listen to the HM-10's broadcasts.
I used an ATmega32 AVR MCU tied to the HM-11, and I used the scanf function to read the broadcast. I want to extract the last byte (TX power) and measure the distance with AVR programming.
Could you tell me the algorithm?
The formula Apple uses to calculate a distance estimate to an iBeacon is not published. There are a number of alternative formulas, including this one based on a best-fit power curve, which we wrote for the Android Beacon Library.
Further research we have done shows that the formula above basically works, but it has two main imperfections:
It does not work well for weaker beacon transmitters. With weaker broadcasts, the distance is underestimated.
It does not account for varying signal gains in receivers. Different receiving devices have different antennas and receiver hardware, which measure the same signal differently.
There is an ongoing discussion of the best formula here.
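For illustration, a sketch of a best-fit power-curve model of the kind described above. The coefficients shown are the ones commonly quoted from the Android Beacon Library's calibration against one particular Android device; treat them as illustrative only, since they really need to be re-fitted for your receiver:

#include <cmath>

// Power-curve distance model: distance ≈ A * (rssi / txPower)^B + C
// rssi:    measured signal strength in dBm
// txPower: calibrated RSSI at 1 m (the iBeacon "measured power" field)
// The coefficients below were fitted for one specific phone model and are
// shown only as an example; re-calibrate them for your own receiver.
double estimateDistance(double rssi, double txPower) {
    if (rssi == 0) return -1.0;          // no valid reading
    double ratio = rssi / txPower;
    if (ratio < 1.0)                     // stronger than the 1 m reference
        return std::pow(ratio, 10.0);
    return 0.89976 * std::pow(ratio, 7.7095) + 0.111;
}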
A bit late, but hopefully useful to others. I have given up on Apple's "Accuracy" number; as @davidyoung points out, different devices will have different signal gains. Now I am not an engineer but more of a math and statistics person, so I have gone down the route of "fingerprinting" an indoor space instead. Essentially I read the RSSI from all beacons installed in a certain "venue". Some might not be within reach, and in such cases I just assume an RSSI of -95 dBm (which seems to be the floor past which a signal is not read any more). The array constituted this way has the same beacons in the same positions at all times (even across app launches). I compute a 5-second moving average for each beacon (so I use 5 arrays to do that). The resulting average array is then shifted up by 95 units and normalised so that the sum of all of its values is one. If you want to tag an indoor "point", you collect many of these normalised average arrays on that specific spot. I go ahead and construct a database of "spots". To forecast your proximity to any spot in the database, you simply compute the quadratic distance between your current reading and all of the fingerprints in the database.
Which beacons to use? At least class 2 in power. How many? At least a couple per room (put them in two adjacent corners, on the ceiling or high up).
The last step you need to do is match the fingerprints with an x,y coordinate on your map. I never did this step, because I am mainly interested in proximity applications and not in fully fingerprinting an indoor space.
Perhaps the discussion above will serve you as a guidance on a technique that is used by many indoor location companies.
Disclosure: I have recently open sourced my code doing the above calculations.
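A minimal sketch of the normalisation and matching described above (this is not the author's open-sourced code; the -95 dBm floor and the quadratic distance follow the description, while the function names and everything else are my own assumptions):

#include <vector>
#include <cstddef>
#include <limits>

// Turn one set of per-beacon RSSI averages (same beacon order every time,
// missing beacons already filled in with -95) into a normalised fingerprint:
// shift up by 95 so values are >= 0, then scale so they sum to 1.
std::vector<double> normalise(std::vector<double> rssi) {
    double sum = 0.0;
    for (double& r : rssi) { r += 95.0; sum += r; }
    if (sum > 0.0) for (double& r : rssi) r /= sum;
    return rssi;
}

// Quadratic (squared Euclidean) distance between two normalised fingerprints.
double quadraticDistance(const std::vector<double>& a, const std::vector<double>& b) {
    double d = 0.0;
    for (size_t i = 0; i < a.size(); ++i) d += (a[i] - b[i]) * (a[i] - b[i]);
    return d;
}

// Index of the stored "spot" whose fingerprint is closest to the current reading.
size_t closestSpot(const std::vector<std::vector<double>>& spots,
                   const std::vector<double>& current) {
    size_t best = 0;
    double bestDist = std::numeric_limits<double>::max();
    for (size_t i = 0; i < spots.size(); ++i) {
        double d = quadraticDistance(spots[i], current);
        if (d < bestDist) { bestDist = d; best = i; }
    }
    return best;
}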
iBeacon question. Is this possible...
Can I have 4 iBeacons placed at the corners of a stage that is, say, 10 ft by 25 ft? This is used so I can detect this stage area in the app. (Are 4 iBeacons needed to do this, or can it be done with 3?)
I then need the app to detect 2 things...
The app needs to detect whether it is inside or outside of the area, how close it is to the area if it is outside, and on which side of the area it is.
If a 5th iBeacon is brought inside the area, can the app detect that there is a separate iBeacon within the stage area set up by the previous 4 iBeacons?
E.g. your app tells you there is a stage in front and where you are in relation to the stage by using the 4 iBeacons. The app then tells you that another, 5th iBeacon has entered the stage and where it is on the stage. The app can then detect that the 5th iBeacon has left the stage.
Is this possible???
Yes you can, as long as each beacon has a unique identifier. Most likely I would configure the beacons to:
Have the same UUID
Have the same Major
Have a unique Minor
If you use Core Location's locationManager:didRangeBeacons:inRegion: delegate method, you will get a callback with all the beacons it can see at the current time. You can use this to track your own list of beacons and see if a new beacon has appeared.
Example code might look like this:
- (void)locationManager:(CLLocationManager *)manager
didRangeBeacons:(NSArray *)beacons
inRegion:(CLBeaconRegion *)region
{
for (CLBeacon *beacon in beacons)
{
// TODO: Handle each beacon logic here
}
// TODO: Clean up any stale beacons here (e.g. remove old beacons)
}
csexton's answer is correct in terms of ranging multiple beacons, but you may have difficulty using multiple beacons to accurately determine a position in a 10'x25' stage.
iBeacon distance calculations are based on received signal strength and are affected considerably by things that absorb the radio signal, such as people. The iPhone doesn't have a directional Bluetooth antenna, so it can't triangulate the received signals. This means that if the beacon on, say, the left edge of the stage is being received, the app won't know whether the device is on the stage, to the right of that iBeacon, or off the stage in the audience.
I guess all you can do is try.
When using a non-beaconing Zigbee network, I know that the 802.15.4 spec defines the use of CSMA-CA to control when two devices get access to a channel, to make sure no two nodes "step on each other's toes", so to speak. My understanding is that, very simply, it requires each node to "listen before talking". Is that correct? Is there more information on the Zigbee implementation of this? In other words, where do I go to learn more about how to program a Zigbee chip to implement the same?
Also, if I have 20 end nodes sending data asynchronously to one coordinator, is the channel access mechanism enough to ensure that they do not broadcast at the same time and flood the coordinator? If five nodes (for example) attempt to broadcast at the same time, how will mutual exclusion be ensured? Where can I get some details on that?
Thanks
Rishi
The maximum size of an 802.15.4 packet is 127 bytes (roughly 1000 bits) of payload. So the maximum duration of a frame (running at the standard 250 kbps rate on the 2.4 GHz band) is about 5 ms when you take the preamble etc. into account. If your end devices are polling at 1 poll/second, it should easily manage 20 end nodes, I think. If it gets to be too much, the exponential backoff should ease the collision rate.
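A rough back-of-the-envelope check of that claim (a sketch assuming the standard 127-byte maximum PHY payload plus a 6-byte synchronisation/PHY header at 250 kbps; the figures are approximate and ignore MAC-level overhead such as ACKs and backoff):

#include <cstdio>

int main() {
    const double bitrate = 250000.0;   // 2.4 GHz PHY data rate, bits per second
    const int maxPayloadBytes = 127;   // maximum 802.15.4 PHY payload
    const int shrPhrBytes = 6;         // 4-byte preamble + 1-byte SFD + 1-byte PHY header

    double frameBits = (maxPayloadBytes + shrPhrBytes) * 8.0;
    double airtimeMs = frameBits / bitrate * 1000.0;   // ~4.3 ms per max-size frame

    const int nodes = 20;
    const double pollsPerSecond = 1.0;
    double utilisation = nodes * pollsPerSecond * (airtimeMs / 1000.0); // fraction of airtime used

    std::printf("Max frame airtime: %.2f ms\n", airtimeMs);
    std::printf("Channel utilisation for %d nodes at %.0f poll/s: %.1f%%\n",
                nodes, pollsPerSecond, utilisation * 100.0);
    return 0;
}

With these assumptions, 20 nodes at 1 poll/second occupy well under 10% of the channel, which is consistent with the statement above.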
I'm sure you've seen these when searching, but just in case:
http://www.prismmodelchecker.org/casestudies/zigbee.php
http://www.dagstuhl.de/Materials/Files/07/07101/07101.FruthMatthias.Slides.pdf
http://www-public.it-sudparis.eu/~gauthier/Tools/802_15_4_MAC_PHY_Usage.pdf
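For completeness, the "exponential backoff" mentioned above is the unslotted CSMA-CA algorithm used in non-beacon-enabled 802.15.4 networks. A minimal sketch, assuming the spec's default constants; the radio hooks are placeholders, not a real driver API:

#include <cstdlib>
#include <algorithm>

// Placeholder radio hooks: in real firmware these would drive the transceiver.
bool performCCA() { return true; }   // clear channel assessment (stub: channel always idle)
void waitBackoffPeriods(int) {}      // one backoff period = 20 symbols (320 us at 250 kbps)

// Unslotted CSMA-CA: wait a random number of backoff periods, listen, and
// allow transmission only if the channel is clear; otherwise widen the
// backoff window and retry, giving up after a maximum number of attempts.
bool csmaCaChannelAccess() {
    const int macMinBE = 3, macMaxBE = 5, macMaxCSMABackoffs = 4; // 802.15.4 defaults
    int nb = 0;            // backoff attempts so far
    int be = macMinBE;     // current backoff exponent
    while (true) {
        waitBackoffPeriods(std::rand() % (1 << be)); // random delay in [0, 2^BE - 1]
        if (performCCA())
            return true;   // channel idle: the frame may be transmitted now
        ++nb;
        be = std::min(be + 1, macMaxBE);
        if (nb > macMaxCSMABackoffs)
            return false;  // channel access failure, reported to the upper layer
    }
}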