simulate a real world car on SUMO - traffic-simulation

Aoa, is there any way to simulate the latitude and longitude of Android hardware by using the GSM values of an Android device in SUMO?

The fcd-output can be used to generate lon-lat traces for vehicles and pedestrians. Afterwards the tool traceExporter.py can be used to convert the trace to various trace file formats. See http://sumo.dlr.de/wiki/Simulation/Output/FCDOutput
and http://sumo.dlr.de/wiki/Tools/TraceExporter
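
As a rough sketch of that workflow (the scenario and file names are placeholders, and the option names are taken from the linked wiki pages, so double-check them against your SUMO version):

# Hypothetical sketch: run SUMO with FCD output, then convert the trace.
import subprocess

# 1) Run the simulation and write floating car data; with --fcd-output.geo
#    the positions are written as lon/lat instead of network x/y.
subprocess.run(["sumo", "-c", "myScenario.sumocfg",
                "--fcd-output", "fcd.xml",
                "--fcd-output.geo", "true"])

# 2) Convert the FCD trace into another trace format, e.g. ns2 mobility
#    (traceExporter.py lives in SUMO's tools directory).
subprocess.run(["python", "traceExporter.py",
                "--fcd-input", "fcd.xml",
                "--ns2mobility-output", "mobility.tcl"])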

Related

Labview 13 - Waveform graph not accepting all data

I am having trouble plotting three different sets of data on a waveform graph. A waveform chart has no problem accepting and displaying all three sets of data; however, I need a history of the data that I can export to an Excel document and examine.
The circuit is set up as follows:
An NI DAQ 6001 takes a temperature reading from an LM35 that is measuring a brass block. Separate circuitry drives a current through a Peltier device to maintain a specific temperature on this brass block. It is fundamentally controlled by PID control so that an operator can choose the temperature at which the brass block is held. To tune the system properly I need to make a set of step changes, record the data, and be able to graph it at a later date to determine characteristics such as linearity/non-linearity, oscillation, and stability.
Unfortunately I do not know how to upload my program, but I have attached a screenshot.
It looks like you have a 1D array wired directly to the waveform graph. You should combine all the waveforms using the Build Array function to form a 2D array, which will then display as separate plots.
A proper way of displaying waveforms in a graph includes a time component. You would need to build a waveform (Block Diagram --> Functions Palette --> Programming --> Waveform --> Build Waveform) with a start time and delta time for each 1D array. You can then bundle these waveforms into a 1D array to display multiple plots.
Sharing your VI would help solve this quickly. Just select your whole block diagram using Ctrl+A, save it as a VI snippet using Edit --> Create VI Snippet from Selection, then attach the generated image to your Stack Overflow question.

Photosphere viewer explanation

I am trying to build my custom Photosphere viewer to run using SDL2 and a custom IMU I purchased. So far, I have managed to read IMU values, open the .jpg and display it using SDL2.
My issue is how to make sense of the IMU data to read parts of the jpg appropriately. Basically, I do not want to display the whole jpg but just parts of it based on the IMU data (I receive Euler angles or quaternions). Right now, I am just using a single mono photosphere (I am not concerned with stereo yet), which is stored as an equirectangular projection, and I need to use the IMU to map it to a polar projection (I believe?).
I am not sure how to index the jpg based on IMU data to create a working photosphere viewer, and I cannot seem to find a good explanation of how to address the jpg. Can anyone point me in the right direction? Thanks!
I was able to find a really great OpenGL-based simple Python photosphere viewer here. I then just needed to create a rotation matrix from the sensor IMU. There are good tutorials on converting from a quaternion to a matrix, like this one.
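For reference, a minimal sketch of the quaternion-to-matrix step (assuming a unit quaternion in (w, x, y, z) order; check your IMU's convention), which can then be used as the view rotation in the OpenGL viewer:

import numpy as np

def quat_to_matrix(w, x, y, z):
    # Standard rotation matrix for a unit quaternion (w, x, y, z)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# Example: the identity quaternion gives the identity matrix
print(quat_to_matrix(1.0, 0.0, 0.0, 0.0))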

Convert lat/lon to zipcode / neighborhood name

I have a large collection of pictures with GPS locations, encoded as lat/lon coordinates, mostly in Los Angeles. I would like to convert these to (1) zipcodes, and (2) neighborhood names. Are there any free web services or databases to do so?
The best I can come up with so far is to scrape the neighborhood polygons from the LA Times page and find out in which polygon each coordinate lies. However, this might be quite a lot of work, and not all of my coordinates are in LA. As for the zipcodes, this 2004 database is the best I can find; however, zipcodes are encoded as single coordinates instead of polygons, so the best I can do is find the minimum distance from a given coordinate to the zipcode coordinates, which is not optimal.
I was under the impression that Google Maps or OpenStreetMap should be able to do this (as they seem to 'know' exactly where every neighborhood and zipcode is), however I cannot find any APIs to do the lookups/queries.
You can now do this directly within R itself thanks to the rather awesome ggmap package.
Like others mention, you'll be reverse geocoding using the Google Maps API (and therefore limited to 2,500 queries daily), but it's as simple as:
library("ggmap")
# generate a single example address
lonlat_sample <- as.numeric(geocode("the hollywood bowl"))
lonlat_sample # note the order is longitude, latitude
res <- revgeocode(lonlat_sample, output="more")
# can then access zip and neighborhood where populated
res$postal_code
res$neighborhood
Use Reverse Geocoding to convert your lat/lon to addresses. It has some limit on the number of queries per day though.
Here is a nice blog post with examples of how to geocode and reverse geocode using Google Maps.
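If you prefer to call the web service directly, a minimal sketch along these lines should work (you need your own API key, the daily quota applies, and the example coordinates are placeholders):

import requests

lat, lon = 34.1016, -118.3387
resp = requests.get(
    "https://maps.googleapis.com/maps/api/geocode/json",
    params={"latlng": f"{lat},{lon}", "key": "YOUR_API_KEY"},
)
# Scan the returned address components for zip code and neighborhood
for result in resp.json().get("results", []):
    for comp in result["address_components"]:
        if "postal_code" in comp["types"]:
            print("zip:", comp["long_name"])
        if "neighborhood" in comp["types"]:
            print("neighborhood:", comp["long_name"])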
Try this one:
http://www.usnaviguide.com/zip.htm
There is some limit as to how many queries per day you can do on the site, but they also sell the complete database, which changes every few months.
Sorry that I don't know of any free resources.
As others suggested, geocoding them into street addresses should work fine for the zip code. I am not too sure about the neighborhood, because you may have to check whether the street number is odd or even to see which side of the road it is on, which can determine the neighborhood.
An alternative way is to prepare GIS polygon features (an ESRI shapefile, for example) and test each point against this set of polygons to see which one it intersects.
Zip codes are very straightforward; you can download a shapefile from the census:
http://www.census.gov/cgi-bin/geo/shapefiles2010/main
Neighborhoods are harder, I'd guess. In another part of the US I had to create my own shapefile by combining definitions from the municipal government, real-estate websites, newspapers, etc., so that it matched what people consider the neighborhoods of the city to be, without any overlaps or gaps. It can take some time to compose such a set of polygons. You may grab census "block groups", or even census "blocks", from the above page and merge them.
Once you have prepared the polygon features, there are a couple of GIS tools in different environments (stand-alone executables, GUI programs, C/Python/SQL APIs, probably R as well) to intersect polygons and points.
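A sketch of the point-in-polygon approach, assuming geopandas/shapely are installed and "zip_polygons.shp" is a ZCTA shapefile downloaded from the census page above (the file name and zip-code column name are assumptions; they depend on the shapefile vintage):

import geopandas as gpd
from shapely.geometry import Point

zips = gpd.read_file("zip_polygons.shp")    # one polygon per zip code
photo_point = Point(-118.3387, 34.1016)     # lon, lat of one picture

# Keep only the polygon(s) containing the point
match = zips[zips.geometry.contains(photo_point)]
print(match["ZCTA5CE10"].tolist())          # zip code column name varies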

Determining the position/direction of an aircraft

I'm working on a project that involves gyroscopes...
I'm using Arduino and an ITG 3200 to read the data from the gyroscope. I get 3 values in deg/s for each axis (x,y,z).
My question is: how can I know the actual (physical) position or direction of the device (let's say an airplane)? There has to be a math formula or something like that.
Using only the gyroscope signal (which you have to integrate numerically), you'll eventually run into trouble due to drift. What's normally done is combining an accelerometer (for low-frequency signals, i.e. drift) with a gyroscope (for high-frequency signals); a minimal complementary-filter sketch follows the links below. Here are a few links showing more or less exactly what you want:
http://www.starlino.com/imu_guide.html
http://www.instructables.com/id/Accelerometer-Gyro-Tutorial
http://www.starlino.com/quadcopter_acc_gyro.html
Also, see these StackOverflow questions:
Combine Gyroscope and Accelerometer Data
Integrating gyro and accelerometer readings
gyro, accelerometer, magnetometer and Kalman filter
How to determine relative position using accelerometer and gyro data
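As mentioned above, here is a minimal one-axis complementary-filter sketch of that accelerometer/gyroscope combination (the sample rate, filter weight, and sensor values are illustrative assumptions, not tuned numbers):

import math

ALPHA = 0.98   # weight on the integrated gyro term
DT = 0.01      # sample period in seconds (assumed 100 Hz loop)

def update(angle, gyro_rate_dps, accel_x, accel_z):
    # gyro_rate_dps in deg/s, accel_* in g; returns the new angle estimate.
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))
    return ALPHA * (angle + gyro_rate_dps * DT) + (1 - ALPHA) * accel_angle

# Example: stationary sensor with a small gyro bias of 0.5 deg/s.
angle = 0.0
for _ in range(1000):  # 10 seconds of samples
    angle = update(angle, 0.5, 0.0, 1.0)
print(round(angle, 2))  # settles near 0.25 deg instead of drifting toward 5 deg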
We are working on a similar problem.
We found this video on YouTube especially helpful, as it came with a paper as well as an implementation (which runs on Arduino):
http://www.youtube.com/watch?v=fOSTOnQzZCI
The paper and source code:
http://code.google.com/p/imumargalgorithm30042010sohm/
In our case (getting the orientation of a remote-controlled ball), we also had to include an accelerometer and a magnetometer.

Amplitude of Audio Tracks

I want to develop an audio editor using Qt.
For this, I need to plot a waveform of the music track, which I think should be a plot of the peak amplitude of the sound versus time (please correct me if I am wrong).
Currently, I have been using a Phonon::AudioOutput class object as an audio sink and connected it to my Phonon::MediaObject class object to play the audio file.
Now, to draw the waveform I need to know the amplitude of the audio track at every second (or so) from this AudioOutput object, so that I can draw a line (using QPainter) of length proportional to the amplitude at different times and hence obtain my waveform.
So, please help me with how to obtain the amplitude of an audio track at different times.
Secondly, am I using the correct way of plotting waveforms of audio tracks - plotting the amplitude of the sound against time by drawing lines with a QPainter object on a widget?
Thanks.
There is code which does both of the things you ask about (calculating peak amplitude, and plotting audio waveforms) in the Spectrum Analyzer example which ships with Qt (in the demos/spectrum directory).
Screenshot of Spectrum Analyzer demo running on Symbian http://labs.trolltech.com/blogs/wp-content/uploads/2010/05/spectrum.png
This demo also calculates and displays a frequency spectrum. As another commenter points out, this is distinct from a waveform plot: the spectrum is a plot of amplitude against frequency, whereas the waveform plots amplitude against time.
The demo uses QtMultimedia rather than Phonon to capture and render audio. If you are only interested in playing audio, and don't need to record it, Phonon may be sufficient, but be aware that streaming support (i.e. Phonon::MediaSource(QIODevice *)) is not available on all platforms. QAudioInput and QAudioOutput, on the other hand, are well supported, at least for PCM audio data, on all the main platforms targeted by Qt.
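The underlying "peak amplitude per window" idea is language-agnostic; here is a short sketch (shown in Python for brevity, with an assumed mono signal normalized to [-1, 1]): split the PCM samples into fixed-size windows, take the maximum absolute sample in each window, and draw those peaks against time.

import numpy as np

def peak_envelope(samples, sample_rate, window_seconds=0.05):
    # Return (times, peaks) for a mono array of PCM samples in [-1, 1].
    window = max(1, int(sample_rate * window_seconds))
    n_windows = len(samples) // window
    trimmed = samples[: n_windows * window].reshape(n_windows, window)
    peaks = np.abs(trimmed).max(axis=1)
    times = np.arange(n_windows) * window_seconds
    return times, peaks

# Example: 1 second of a 440 Hz sine at 8 kHz; peaks are ~1.0 throughout
sr = 8000
t = np.arange(sr) / sr
times, peaks = peak_envelope(np.sin(2 * np.pi * 440 * t), sr)
print(times[:3], peaks[:3])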
