Using BLE for indoor positioning - bluetooth-lowenergy
I have a project in which I need to know whether some smartphones are in the same room. To implement this I checked several approaches and found BLE to be the best.
The idea is that I have a master smartphone that I know is in the room, and I want to validate all other smartphones using the RSSI that this smartphone reads from them. Does anybody have experience with this approach? In particular, what should the RSSI be for me to be reasonably sure that another smartphone is in the same room as the master smartphone?
Unfortunately, RSSI does not map reliably to distance, and in your use case the mapping depends heavily on the structure of the room and the architecture of the building. There have been many attempts to derive distance from RSSI, and most of them have been fruitless. Please see the links below for more information:
https://stackoverflow.com/a/16927330/2215147
https://stackoverflow.com/a/32900428/2215147
https://stackoverflow.com/a/27943540/2215147
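For background, the model usually used to turn RSSI into distance is the log-distance path loss model:

    RSSI(d) = RSSI(d0) - 10 * n * log10(d / d0)

where d0 is a reference distance (typically 1 m) and n is the path loss exponent, roughly 2 in free space but anywhere from about 1.6 to 4 or more indoors depending on walls, furniture and the devices themselves. Because n (and the noise around the model) varies so much with the environment, the same RSSI can correspond to very different distances, which is why a fixed RSSI-to-distance mapping tends to fail.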
If you still want to use RSSI, the best mechanism is trial and error. If your project will only ever run in one room, run tests there with multiple devices both inside and outside the room, and collect the RSSI values you get in each case. You can then write an app that decides whether a device is in the room or outside based on the data you have already collected; a rough sketch of that check is included after the links. The links below can help you get started:
https://stackoverflow.com/a/13724027/2215147
https://developer.radiusnetworks.com/2014/12/04/fundamentals-of-beacon-ranging.html
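As a very rough sketch of the decision step, assuming you have already logged calibration readings from phones known to be inside and outside the room: the threshold (-70 dBm) and smoothing window (10 samples) below are made-up numbers you would have to tune from your own measurements.

#include <cstdio>
#include <deque>
#include <numeric>

// Decide "same room" from smoothed RSSI. The -70 dBm cut-off and 10-sample
// window are placeholders - derive them from your own inside/outside logs.
class RoomClassifier {
public:
    RoomClassifier(double thresholdDbm, size_t window)
        : threshold_(thresholdDbm), window_(window) {}

    // Feed one RSSI reading (dBm) from the candidate phone; returns current verdict.
    bool addReading(double rssi) {
        samples_.push_back(rssi);
        if (samples_.size() > window_) samples_.pop_front();
        double avg = std::accumulate(samples_.begin(), samples_.end(), 0.0) / samples_.size();
        return avg > threshold_;   // stronger (less negative) than threshold => likely same room
    }

private:
    double threshold_;
    size_t window_;
    std::deque<double> samples_;
};

int main() {
    RoomClassifier c(-70.0, 10);                     // made-up calibration values
    double readings[] = {-62, -65, -71, -60, -58, -66};
    for (double r : readings)
        std::printf("rssi=%.0f -> %s\n", r, c.addReading(r) ? "in room" : "not in room");
}

Smoothing over a window matters because single RSSI readings fluctuate by several dB even when nobody moves; averaging a handful of readings makes the threshold check much more stable.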
I hope this helps.
Related
Reading data from car's ECU ABS wheel speed sensors
I'm currently working on a project that requires gathering data from a car's wheel speed sensors (4 Hall-effect speed sensors). Those sensors are connected to the car's ECU responsible for ABS/ESP/stability control etc. In order to extract the data from the ECU I need to make a request with a specific PID (parameter ID) and know how to decode/compute the answer in order to extract any meaningful data. Unfortunately, vehicle manufacturers don't seem to make such information public.
So far I've ordered an Arduino CAN bus shield and an OBD2-to-RS232 cable in order to make the physical connection. I have tried using specialized hardware/software (costing more than 1500 euro) capable of extracting those parameters, but unfortunately it lacks logging functions. I also tried using Wireshark to sniff the PIDs being called, but had no luck there either.
If you have any ideas, questions or suggestions, please write them down. I'm open to criticism and know that I might be missing something important. Thanks.
P.S. This is a university project I'm working on. I need data samples from the wheel speed sensors; further processing of the samples is done to research car safety and behavior in dynamic road scenarios.
You can only read the OBD data from the OBD port. The standard OBD PIDs are generalized in SAE J1979 / ISO 15031-5; you can also find (less reliable) descriptions on Wikipedia. To get at the other, manufacturer-specific PIDs, you should know that those data are kept tightly under control by the car manufacturers and you essentially have to reverse engineer them. One way to find them (though you are unlikely to succeed) is trial and error: tap into the main CAN bus wires, buy a connector/interface device so you can sniff the packets, record all the traffic, make one small change to the car's state, record again and compare the two captures. With a lot of luck you might identify some non-safety features this way, but finding safety functionality like ABS data is very doubtful, unless you are some sort of genius hacker who can do weird stuff! If you can do it, call the manufacturer and show them what you have; you would likely get a nice job offer. I once saw a YouTube clip in which a guy controlled a Toyota (if I remember correctly) from a laptop. You may also be able to buy such information on the dark web, which I strongly advise against.
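To get started with that sniff-and-compare approach, a minimal Arduino sketch along these lines just dumps every frame on the bus so you can log and diff captures. It assumes the common MCP2515-based Seeed CAN-Bus shield and its mcp_can library; the chip-select pin, bit rate and exact begin() signature vary between shield revisions and library forks, so treat this as a rough sketch rather than working code for your exact setup.

#include <SPI.h>
#include <mcp_can.h>

const int SPI_CS_PIN = 9;          // chip-select pin used by many Seeed shield revisions - check yours
MCP_CAN CAN(SPI_CS_PIN);

void setup() {
  Serial.begin(115200);
  // 500 kbit/s is typical for powertrain/chassis buses, but this is a guess - adapt to your car
  while (CAN.begin(CAN_500KBPS) != CAN_OK) {
    Serial.println("CAN init failed, retrying...");
    delay(100);
  }
  Serial.println("CAN init ok");
}

void loop() {
  unsigned char len = 0;
  unsigned char buf[8];
  if (CAN.checkReceive() == CAN_MSGAVAIL) {
    CAN.readMsgBuf(&len, buf);               // read the next data frame
    unsigned long id = CAN.getCanId();       // identifier of the frame just read
    Serial.print(millis());
    Serial.print(" ID 0x");
    Serial.print(id, HEX);
    Serial.print(" :");
    for (int i = 0; i < len; i++) {
      Serial.print(' ');
      Serial.print(buf[i], HEX);
    }
    Serial.println();
  }
}

With two such logs (one before and one after changing something in the car) you can diff the IDs and payloads to narrow down which frames carry which signals.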
Sub Second iBeacon Monitoring
I have no hands-on experience with BLE and beacons at this point, and am having a hard time figuring out the viability of using them in a particular manner. Wondering if anyone can provide some high-level feedback about the viability of this use case: the goal is to use beacons to track a running race. Runners with their smartphones would be able to log times when they hit various beacons spread throughout an indoor course. Pretty simple scenario. The problems that I foresee are 1) the ability to continuously scan for beacons at sub-second intervals, and 2) the ability to then determine closest range to the beacon at sub-second intervals. I've tried parsing through the Estimote and kontakt.io SDKs and am not certain that what I want to do is entirely possible or feasible with these particular beacons (or any, for that matter). Further, would there be any device-specific (smartphone) limitations that would apply? Thanks!
If you are using the Estimote SDK you can control the scan interval on BeaconManager. See BeaconManager#setForegroundScanPeriod in the SDK docs.
Decode IR (RC5) steps
I have captured the IR signal ( I believe RC5) of a HVAC remote control, like this one.... (using Saleae) This gave me a sequence of pulses of different width that I can make the Arduino reproduce and the HVAC recognize the request. An example is: unsigned int power_ON[180] = {2888,3918,1911,1049,907,1992,903,989,1936,1023,907,1049,903,989,903,1049,903,1049,907,1992,1851,1992,1915,1049,928,963,928,1023,903,1049,907,1049,928,963,928,1023,903,1053,928,1023,928,963,928,1023,928,1027,928,1023,928,963,928,1023,907,1049,928,1023,928,1906,1941,959,2940,3866,1962,997,932,1967,929,963,1962,997,933,1019,959,933,933,1023,954,997,928,1971,1902,1941,1941,1019,958,933,958,997,954,997,933,1019,959,933,959,997,954,997,928,1023,958,933,958,997,954,997,933,1019,958,933,958,997,954,997,933,1019,958,1881,1962,937,2940,3862,1966,993,958,1941,933,959,1966,993,958,997,954,937,954,997,933,1023,954,1941,1880,1966,1962,997,954,937,928,1023,933,1023,954,997,928,963,928,1023,933,1023,929,1023,928,963,929,1023,928,1027,928,1023,928,963,928,1023,928,1027,928,1023,928,1910,1911,989,3832}; Could anyone guide me on the steps to decode the message? or understand the different pulse width? I guess there must be certain defined pulse widths? Each meaning something different? My initial though is that I need to: 1) Decode raw data by converting pulses to digital 1,0 2) Identify from digital data each section of the code, I think all the configuration is send on every key press, so identify the section of the code where it states the temperature, fan speed, hvac mode, clock, etc 3) Be able to put together a full IR code based on wanted setup, instead of just saving the whole code and reproducing it. Any hint or guideline on how to do this? Am I on the right track? edit: I have tried analysing one same mode and try to figure out which pulses change, but I cant figure it out as the number of pulses varies. Here you can see Cooling mode and maximum fan speed with changing temperature setting. here is the excel file for anyone really into helping: http://www.filedropper.com/analysiscoolingmodefanspeedmaximum and the end of the message
So I put your pulse widths (?) into a diagram: http://i.imgur.com/C9k64qB.jpg Without knowing more about what this actually represents, it does not really help much, I guess. What buttons did you press while recording this? How did you record it? I would try to visualize all the data you can get: record all buttons, put what you get into diagrams, then stare at them and maybe you will find some logic hidden in there. Also, open the remote, look at which ICs are inside and look up their datasheets. Maybe you will find the protocol there and you won't have to do any reverse engineering at all. Keep us updated!
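If you want to make a start on the pulse-to-bits step yourself, a small sketch like the one below can help you eyeball the structure. The cut-offs are just guessed from the clusters in your power_ON capture (short pulses around 900-1050 us, long ones around 1850-2000 us, and a few ~2900/3900 us bursts that look like header marks), and the real protocol may well encode bits in mark/space pairs rather than single pulses, so treat this purely as an exploration tool.

#include <cstdio>
#include <vector>

// First few pulse widths (microseconds) from the power_ON capture in the question;
// paste in the full 180-entry array for a real run.
std::vector<unsigned int> pulses = {2888,3918,1911,1049,907,1992,903,989,1936,1023};

int main() {
    // Rough, eyeballed thresholds - they may not match the real protocol.
    const unsigned int HEADER_MIN = 2500;   // very long bursts: header / frame separator
    const unsigned int LONG_MIN   = 1500;   // long pulse -> tentative '1'

    for (unsigned int p : pulses) {
        if (p >= HEADER_MIN)      std::printf(" H ");
        else if (p >= LONG_MIN)   std::printf("1");
        else                      std::printf("0");
    }
    std::printf("\n");
    return 0;
}

Running this over captures of different settings and lining up the resulting bit strings should make it much easier to spot which positions change with temperature, fan speed and mode.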
DirectShow - Order of invocation of IAMStreamConfig::SetFormat and ICaptureGraphBuilder2::RenderStream creates issues in some video cameras
I have to configure my video camera's display resolution before capturing and processing the data. Initially I did it as follows:
1. Created all necessary interfaces.
2. Added the camera and renderer filters.
3. Did RenderStream with the Capture and Preview pin categories.
4. Looped through the AM_MEDIA_TYPE structures and set the parameters.
This worked for a lot of cameras, but a few cameras failed. Then I changed the order of steps 3 and 4 above, i.e. I set the parameters before calling RenderStream. This time the failing cases went through, but a few on-board cameras (in a SONY VAIO laptop, etc.) seem to fail.
Now, my questions are:
1. Which is the correct and optimal method of getting and setting the AM_MEDIA_TYPE parameters and running the graph?
2. If different cameras need different orders, can I get an indication of which order is best for a particular camera by going through the camera's DirectShow interfaces? That would also serve my purpose.
Please help me with this at the earliest. Thanks and regards, Shiju
IAMStreamConfig::SetFormat needs to be called to set the capture format before the pin is connected and rendered. This way the downstream chain of filters is built with the proper media types.
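A rough sketch of that order (error handling trimmed; the 640x480 target is just a placeholder for whichever capability you pick from GetStreamCaps, and you still need to link with strmiids.lib):

#include <dshow.h>

HRESULT ConfigureAndRender(ICaptureGraphBuilder2 *pBuilder, IBaseFilter *pCap)
{
    // 1) Get IAMStreamConfig for the capture pin BEFORE anything is connected.
    IAMStreamConfig *pConfig = NULL;
    HRESULT hr = pBuilder->FindInterface(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                                         pCap, IID_IAMStreamConfig, (void**)&pConfig);
    if (FAILED(hr)) return hr;

    // 2) Pick a capability and set it as the pin's format.
    int count = 0, size = 0;
    pConfig->GetNumberOfCapabilities(&count, &size);
    for (int i = 0; i < count; ++i) {
        VIDEO_STREAM_CONFIG_CAPS caps;
        AM_MEDIA_TYPE *pmt = NULL;
        if (SUCCEEDED(pConfig->GetStreamCaps(i, &pmt, (BYTE*)&caps))) {
            if (pmt->formattype == FORMAT_VideoInfo && pmt->cbFormat >= sizeof(VIDEOINFOHEADER)) {
                VIDEOINFOHEADER *vih = (VIDEOINFOHEADER*)pmt->pbFormat;
                if (vih->bmiHeader.biWidth == 640 && vih->bmiHeader.biHeight == 480) {
                    hr = pConfig->SetFormat(pmt);   // set the format BEFORE connecting the pin
                    // free the media type returned by GetStreamCaps
                    if (pmt->cbFormat) CoTaskMemFree(pmt->pbFormat);
                    if (pmt->pUnk) pmt->pUnk->Release();
                    CoTaskMemFree(pmt);
                    break;
                }
            }
            if (pmt->cbFormat) CoTaskMemFree(pmt->pbFormat);
            if (pmt->pUnk) pmt->pUnk->Release();
            CoTaskMemFree(pmt);
        }
    }
    pConfig->Release();
    if (FAILED(hr)) return hr;

    // 3) Only now build/connect the graph, so it is negotiated with the chosen format.
    return pBuilder->RenderStream(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video, pCap, NULL, NULL);
}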
Are IO Control codes determined by the hardware or ...?
I have a small project (for my cell phone) on the go, and I believe I have found IO control codes for what I want to accomplish (there's nothing at a higher level unless I can reverse engineer the DLLs and call them). However, the codes are from a different device from a different manufacturer (the board is the same, a Snapdragon 8650). Will those control codes be likely to work on my device, or is that going to be dependent on something manufacturer-specific? Am I likely to be able to do permanent damage to my phone by trying them?
The answer itself is manufacturer-dependent. Having the same board, chances are at least some of the codes are the same. And the likelihood of causing damage is low (unless you hit FLASH memory). I'd give it a go, if it were my phone.
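If you do give it a go, probing a single code from user mode usually looks something like the sketch below. This assumes a Windows CE / Windows Mobile device (you mention DLLs); the device name "VIB1:" and the control code value are placeholders, not real values for your phone, and passing no input buffer keeps the probe as read-only as possible.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    // "VIB1:" is a made-up stream-driver name - replace with the driver you are targeting.
    HANDLE h = CreateFileW(L"VIB1:", GENERIC_READ | GENERIC_WRITE, 0, NULL,
                           OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        printf("CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    // Control code taken from the other device's code - it may mean something
    // completely different (or nothing at all) on this hardware.
    DWORD code = 0x22201C;          /* placeholder IOCTL value */
    BYTE  out[64] = {0};
    DWORD returned = 0;
    if (DeviceIoControl(h, code, NULL, 0, out, sizeof(out), &returned, NULL))
        printf("IOCTL ok, %lu bytes returned\n", returned);
    else
        printf("IOCTL failed: %lu\n", GetLastError());

    CloseHandle(h);
    return 0;
}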