I am trying to detect a dangerous level of an LPG leak using an MQ5 gas sensor, but I don't know what the value the analog read prints to the serial monitor actually represents. Is the output of the gas sensor in ppm?
You need to provide more information for people to help answer the question: a circuit schematic, how you are interfacing the sensor to the microcontroller, etc.
However, it is unlikely you are getting a PPM reading from the sensor unless you have calibrated it and developed a function to convert voltage (or resistance) to a gas concentration. First, you typically need to calibrate the sensor by exposing it to a known concentration of the analyte (gas) and measuring the voltage / ADC reading. Second, those types of sensors usually have a non-linear response to gas concentration, so developing a calibration function is difficult (you need to characterize the sensor's response at different gas concentrations to develop an algorithm). There are many ways to do this, but it can be an expensive and time-consuming process.

What compounds the problem is that different sensors typically exhibit different responses to the same gas concentration between batches, or even between two sensors from the same batch. The sensors are also sensitive to temperature, relative humidity and other contaminants, which can skew the readings. These sensors can be useful for generally detecting the presence of the analyte (gas), but typically aren't very useful or accurate for estimating PPM within a reasonable error margin. There are also many variables to control with that sensor, such as the heater voltage, heating pulse duration, load resistance, etc. These sensors have MANY cross-sensitivities and responses to analytes other than LPG.
If you have implemented the sensor as per the manufacturer's datasheet, converted the resistance into a usable voltage, and are reading the values from an ADC, you may be able to interpret the ADC readings as a general indication of the presence of the gas (but maybe not; you could just be measuring humidity fluctuations). In any case, you would likely need to calibrate the sensor and develop an algorithm to estimate the PPM.
It appears from looking at a couple of websites that this sensor gives you the value of Rs (the sensing resistance).
It looks like you will have to calculate the ratio Rs/Ro (Ro being the sensor resistance in "clean" air) and then derive the ppm from the corresponding graph in the MQ5 datasheet.
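As a rough sketch of that conversion: compute the sensor resistance Rs from the ADC reading and your voltage divider, take the Rs/Ro ratio, and map it to ppm with a power-law fit of the datasheet's log-log LPG curve. Every constant below (load resistance, Ro, fit coefficients) is an assumed placeholder that you would have to replace with values from your own circuit and calibration:

    #include <cmath>
    #include <cstdio>

    // All constants below are assumptions for illustration only; derive your own
    // from your circuit and from the LPG curve in the MQ5 datasheet.
    const double VCC     = 5.0;     // supply voltage of the sensor divider
    const double ADC_MAX = 1023.0;  // 10-bit ADC (e.g. Arduino UNO)
    const double R_LOAD  = 10000.0; // load resistor in ohms (check your module)
    const double R0      = 6500.0;  // sensor resistance in clean air, found by calibration
    const double COEFF_A = 177.0;   // power-law fit coefficients for the LPG curve:
    const double COEFF_B = -2.56;   //   ppm ~= A * (Rs/R0)^B  (example values, not from the datasheet)

    // Convert a raw ADC reading into an estimated LPG concentration in ppm.
    double adcToPpm(int adcReading) {
        double vOut  = adcReading * VCC / ADC_MAX;    // voltage across the load resistor
        double rs    = R_LOAD * (VCC - vOut) / vOut;  // sensor resistance from the divider
        double ratio = rs / R0;                       // Rs/Ro, the value the datasheet graph uses
        return COEFF_A * std::pow(ratio, COEFF_B);    // map the ratio to ppm via the fitted curve
    }

    int main() {
        // Example: a mid-scale ADC reading
        std::printf("~%.0f ppm (rough estimate)\n", adcToPpm(512));
        return 0;
    }

The raw ADC value itself is not in ppm; without the calibration step that determines Ro and the fit coefficients, the result of a sketch like this is only a guess.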
Is it possible to calculate the distance between two BLE devices, or between a beacon and a BLE device, using the time (T) taken for a packet to arrive at the receiver, together with the measured power or RSSI value?
Is there any formula for that?
You can get a formula for estimating distance from time-of-flight measurements in this paper.
However, commercially available Bluetooth chipsets do not provide accurate timers capable of measuring time of flight. Further, smartphones do not provide access to such time-of-flight data. As a result, such a formula is of little practical value for most use cases.
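To see why the timer precision matters, the underlying time-of-flight relation is simply distance = speed of light x flight time (this is just the physics, not the formula from the paper):

    #include <cstdio>

    int main() {
        const double c = 299792458.0;   // speed of light in m/s
        // Basic time-of-flight relation: distance = c * t_flight (one-way).
        // With a round-trip measurement, divide the measured time by 2 first.
        double t_flight_ns = 10.0;      // hypothetical one-way flight time in nanoseconds
        double distance_m  = c * t_flight_ns * 1e-9;
        std::printf("10 ns of flight time ~= %.2f m\n", distance_m); // ~3 m
        // i.e. ~1 ns of timing error already corresponds to ~0.3 m of distance
        // error, which is why ordinary BLE chipset timers are not precise enough.
        return 0;
    }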
I need to acquire some analog signals and read a digital signal at fixed sampling frequency.
What is the correct way to do this?
Note that this is not trivial: during the acquisition process, at a given sampling instant a transition of the digital signal can be missed, because the digital signal is fast with respect to the sampling rate.
For example, think of a digital square wave with a 50 Hz frequency and a sampling rate of 100 Hz.
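To make the concern concrete, here is a minimal simulation of that example (plus a slightly faster, hypothetical 60 Hz signal, added purely as an illustration), counting how many transitions the sampled data actually shows:

    #include <cmath>
    #include <cstdio>

    // Level of an ideal square wave of frequency f (Hz) at time t (s):
    // high for the first half of each period, low for the second half.
    int squareLevel(double f, double t) {
        double phase = t * f - std::floor(t * f);
        return phase < 0.5 ? 1 : 0;
    }

    int main() {
        const double fSample = 100.0;          // fixed sampling rate from the example
        const double offset  = 0.004;          // arbitrary sampling phase offset (s)
        const double signals[] = {50.0, 60.0}; // 50 Hz from the example, 60 Hz hypothetical

        for (double fSignal : signals) {
            int observed = 0;
            int previous = squareLevel(fSignal, offset);
            for (int k = 1; k < 100; ++k) {    // one second of samples
                int level = squareLevel(fSignal, offset + k / fSample);
                if (level != previous) ++observed;
                previous = level;
            }
            // The real signal toggles 2 * fSignal times per second; once that
            // exceeds the sampling rate, transitions are inevitably lost, and even
            // at exactly half the sampling rate the edge positions are only known
            // to within one sample period.
            std::printf("%.0f Hz square: actual %.0f transitions/s, observed %d\n",
                        fSignal, 2 * fSignal, observed);
        }
        return 0;
    }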
You can look for a suitable data acquisition system. For example, Dewesoft data acquisition systems are really easy to use and have highly flexible software that requires zero programming to acquire, store, and analyse the data.
They offer interfaces for any analog signal and sensor, and also support digital data buses like CAN bus, CAN FD, XCP, EtherCAT, Ethernet, video, and others.
I'm using Bluetooth LE to stream pressure sensor data along with inertial measurement unit (IMU) data.
The IMU needs self-calibration to provide useful data. Examples of the calibration are moving it in a figure-eight path, or laying it down still for about 1 second. The IMU provides data along with the calibration level (uncalibrated -- partially calibrated -- fully calibrated).
I currently stream pressure sensor + IMU data through a single service. Where should I put the IMU calibration data? In a different service, or a different characteristic?
Ideally, I want to be able to check the calibration level, perform the self-calibration, and then start recording real data.
I would suggest you use one service with different characteristics.
Actually, the Bluetooth SIG lists many similar BLE profiles that may match your requirements; you may refer to them.
There was also a profile in progress named TPMS, but it has not been adopted yet.
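As a concrete sketch of the one-service / several-characteristics layout (every UUID, name and property choice below is a made-up placeholder, not an adopted Bluetooth SIG assignment):

    #include <cstdio>

    // Hypothetical GATT layout: one custom service exposing separate
    // characteristics for the two data streams, the calibration level, and an
    // optional control point to trigger self-calibration.
    struct Characteristic {
        const char* uuid;        // placeholder 128-bit UUID
        const char* name;
        const char* properties;
    };

    int main() {
        const char* serviceUuid = "c0de0000-1111-2222-3333-444455556666"; // placeholder
        const Characteristic characteristics[] = {
            {"c0de0001-1111-2222-3333-444455556666", "Pressure data",          "notify"},
            {"c0de0002-1111-2222-3333-444455556666", "IMU data",               "notify"},
            {"c0de0003-1111-2222-3333-444455556666", "IMU calibration level",  "read, notify"},
            {"c0de0004-1111-2222-3333-444455556666", "Start self-calibration", "write"},
        };

        std::printf("Service %s\n", serviceUuid);
        for (const auto& c : characteristics)
            std::printf("  %-24s %s (%s)\n", c.name, c.uuid, c.properties);
        return 0;
    }

With a split like this, the client can subscribe to the calibration-level characteristic, wait until it reports fully calibrated (optionally kicking off the procedure through the control point), and only then start recording the pressure and IMU streams, which matches the workflow described in the question.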
I am experimenting with two Bluetooth 4 low energy devices. I am getting the UUID, TX power level, and RSSI values on the Android app that I downloaded.
I noticed that one of the two is sending 0 for the TX power level while the other is sending 4, and I see different RSSI values on the Android app even though I put them in the same spot, meaning the distance between my Android phone and each of the two Bluetooth devices is the same. If the difference were +/- 5 I would understand, but the difference is +/- 15. Is this because of the TX power level?
And so, do I need to take the TX power level into consideration to calculate the proximity between the BLE 4 device and my Android app?
You cannot directly relate RSSI to the absolute distance between a BLE central and peripheral. Of course RSSI is affected by distance, but not only by distance; there are other significant factors such as interference, the transmission medium, etc. If your two BLE peripherals are two different models, the values may vary even more.
RSSI fluctuations of around +/-15 are very normal for BLE connections and nearly impossible to eliminate in practical cases. So basically you cannot rely on RSSI alone for calculating distance if you want the error to be less than several meters.
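If you still want a rough proximity figure, the usual approach is to average the RSSI over a short window and feed it, together with the advertised TX power (interpreted as the expected RSSI at 1 m), into a log-distance path-loss model. A minimal sketch, where the path-loss exponent and the example readings are assumptions you would tune for your environment:

    #include <cmath>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    // Log-distance path-loss model: RSSI(d) = txPower - 10 * n * log10(d),
    // where txPower is the expected RSSI at 1 m and n is the path-loss exponent
    // (~2 in free space, larger indoors). Solving for d:
    double estimateDistanceMeters(double rssiDbm, double txPowerDbm,
                                  double pathLossExponent = 2.0) {
        return std::pow(10.0, (txPowerDbm - rssiDbm) / (10.0 * pathLossExponent));
    }

    int main() {
        // A window of recent RSSI readings; averaging damps the +/-15 fluctuation a bit.
        std::vector<double> window = {-68, -75, -62, -71, -66};
        double avgRssi = std::accumulate(window.begin(), window.end(), 0.0) / window.size();

        double txPower = -59.0; // assumed measured power at 1 m; device specific
        std::printf("avg RSSI %.1f dBm -> ~%.1f m (very rough)\n",
                    avgRssi, estimateDistanceMeters(avgRssi, txPower));
        return 0;
    }

So yes, the TX power level matters: two devices transmitting at different power levels will show different RSSI at the same distance, which is consistent with what you observed.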
I have been working on an MP3 decoder on an ARM Cortex-A8 board.
While doing this, I have a requirement saying the MP3 decoder solution I am developing should consume 50 milliwatts of power. This generated a few questions in my mind when I thought about it:
1.) I recall that there is some relation between the applied core voltage (V), the clock frequency (f) of a processor, and the power consumed (P): something like the dynamic power being proportional to the frequency times the square of the voltage (P ~ V^2 * f). But what is the exact relation? Given the operating clock frequency and voltage of a processor, how can we calculate the power it consumes?
2.) Now, if I get the power consumed from step 1.) at some clock frequency, and I am told that the decoder solution I am delivering can consume only 50 milliwatts, how can I get the maximum limit on MCPS, which will be the upper bound on the MCPS of my decoder solution running on that hardware board?
Can I deduce that if the power obtained as in step 1.), say P, is consumed at frequency F, then for a 50 milliwatt budget I can solve for the corresponding clock frequency, and then call that frequency my code's MHz (MCPS) upper bound?
Basically, how does one map (is there any equation?) the power consumed by a piece of software to the MCPS it consumes?
I hope this is relevant here, or should it go to superuser?
Thank you.
-AD.
It really depends on the architecture.
From ARM's own page:
Core area, frequency range and power consumption are dependent on process, libraries and optimizations.
Power with cache (mW/MHz): <0.59 to <0.45 (depending on process)
Basically, it states that you can't accurately calculate the power consumption, so your best bet would be to do some measurements yourself. Try to run an application at full CPU usage and measure the power consumption. That will give you some idea of the maximum load, which is a good starting point (to know how much you need to optimize your code and insert idle points).
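For a very rough first-order ballpark only, using the mW/MHz figure quoted above (which will not match your actual board), the arithmetic behind the question's idea looks like this; dynamic power scales roughly as C * V^2 * f, so at a fixed core voltage it scales roughly linearly with clock frequency:

    #include <cstdio>

    int main() {
        // Rough dynamic-power model: P ~= C * V^2 * f, so at a fixed core voltage
        // power scales roughly linearly with clock frequency.
        const double powerPerMHz_mW = 0.59;  // "power with cache" figure quoted above (mW/MHz)
        const double budget_mW      = 50.0;  // the 50 mW requirement from the question

        // Frequency (and hence cycles per second available to the decoder) that the
        // budget allows, ignoring leakage, memory, and peripheral power.
        double maxMHz = budget_mW / powerPerMHz_mW;
        std::printf("~%.0f MHz -> roughly %.0f MCPS upper bound (ballpark only)\n",
                    maxMHz, maxMHz);
        return 0;
    }

Treat the result (around 85 MHz / 85 MCPS with that figure) purely as a sanity-check number; the measured figure from your own board is what should drive the real MCPS budget.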