The .bin (or converted .pcd) files of the KITTI dataset come from a 64-layer lidar.
But I will do 3D detection with my 3 VLP-16 lidars.
For training, I want to reduce the 64 layers in the KITTI dataset to 16 or 32.
The KITTI dataset consists of .bin files, which can be converted to .pcd files.
Can I post-process these files the way I want?
Yes, you can do it. The approach is to use a range image. First, read the point cloud from the file. Then compute the vertical angle of each point and store the points as a 2D image (e.g. with numpy), one row per laser ring. Create a new array and select only the rows you want to keep (e.g. for 64 channels -> 16 channels you would keep rows 0, 4, 8, etc.). Then convert back to point cloud data.
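A minimal sketch of the ring-selection idea in pure Python (the field-of-view values are HDL-64E-like assumptions; adjust them for your sensor, and note the file name in the comment is hypothetical). Each point is binned into one of 64 rings by its vertical angle, and only every 4th ring is kept:

```python
import math
import struct

def downsample_channels(points, n_src=64, n_keep=16,
                        fov_up=2.0, fov_down=-24.8):
    """Keep every (n_src // n_keep)-th laser ring of a point cloud.

    points   : iterable of (x, y, z, intensity) tuples
    fov_up   : upper bound of the vertical field of view, in degrees
    fov_down : lower bound, in degrees (HDL-64E-like defaults; adjust
               for your sensor)
    """
    step = n_src // n_keep          # 64 -> 16 keeps every 4th ring
    fov = fov_up - fov_down
    kept = []
    for x, y, z, i in points:
        r = math.sqrt(x * x + y * y + z * z)
        if r == 0.0:
            continue
        pitch = math.degrees(math.asin(z / r))       # vertical angle
        ring = int((fov_up - pitch) / fov * n_src)   # 0 = top ring
        ring = min(max(ring, 0), n_src - 1)
        if ring % step == 0:
            kept.append((x, y, z, i))
    return kept

# Reading a KITTI .bin file (float32 x, y, z, intensity) into tuples:
# with open("000000.bin", "rb") as f:
#     points = list(struct.iter_unpack("ffff", f.read()))
```

The same binning can of course be done vectorised with numpy, building the full 2D range image first, which also makes it easy to keep 32 rings instead (step 2).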
I'm trying to project some point data in ArcMap.
I created the point data in excel, then summarised it in R and exported it as a csv. Then added it through 'add XY data' tool in ArcMap, and exported it from an events file to a shapefile.
The points appear in the correct place relative to each other, but when I try to overlay them on the world imagery basemap, the whole layer simply sits down near Antarctica.
I've checked and re-checked the projections of the imported layer and the dataframe (I'd like to work in GDA_1994_MGA_Zone_56), but have also tried changing to a geographic coordinate system, which doesn't change anything.
In fact, when I try to change the projection to a gcs, using the 'project - data management' tool, this error comes up.
invalid extent for output coordinate system
Failed to execute (Project).
I can't for the life of me figure out what is going wrong. There's probably a simple explanation, but I'm at my wits' end!
It helps to have them stored originally as numeric data, but since it's actually coming in as a CSV, I think they are importing as numeric -- as you note, they are correct relative to each other.
If your points are showing up on a map but in the wrong location, then their initial projection definition is incorrect. Looking at the Add XY Data dialog, you need to specify an initial coordinate system. If nothing is specified, ArcMap assumes that it is latitude/longitude decimal degrees.
Once it's imported with the wrong coordinate system, attempts to reproject will just carry the error forward. But you do not necessarily need to delete the points and re-import from scratch. Try using the Define Projection tool (which modifies only the metadata to the desired projection) instead of Project (which mathematically recalculates the coordinates from the current projection into a new one).
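If you prefer to script the fix, the same step can be sketched with arcpy (an untested sketch that requires ArcGIS; the shapefile path is hypothetical). The point is that Define Projection only rewrites the coordinate-system metadata, leaving the stored coordinates untouched:

```python
import arcpy

# Hypothetical path to the shapefile with the wrong projection definition
shp = r"C:\data\my_points.shp"

# The coordinate system the data is actually in
sr = arcpy.SpatialReference("GDA 1994 MGA Zone 56")

# Rewrites only the .prj metadata; the coordinates are left untouched
arcpy.management.DefineProjection(shp, sr)
```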
When I load a TIFF image in R as a RasterBrick, the details of the image show the expected number of pixels (rows and columns). But I expect 44 layers. I thought I could set the number of layers with the nlayers argument; unfortunately, this does not work, and the number of layers remains at 4.
sample_brick = brick("file_example_TIFF_1MB.tiff", nlayers = 44)
Can someone tell me how to change the number of layers?
Here you can find an example TIFF image:
https://file-examples.com/wp-content/uploads/2017/10/file_example_TIFF_1MB.tiff
The brick function creates a RasterBrick that has all the layers that are in your file.
Your expectation is probably wrong.
You cannot add imaginary layers when creating an object from file.
I am trying to convert a raster with a cell size of 50x50 into polygons, without success.
The raster has some nonzero values, but most cells are zero. When I convert it with the Raster to Polygon tool, the result is one big polygon covering all the zero values, instead of a polygon for each cell with a zero value.
Does anyone know how to create an independent polygon for each zero-value cell?
I attached an image of my results.
Read this article ... it may solve your problem:
https://support.esri.com/en/technical-article/000012696
Steps:
1) Use the Raster to Point tool to convert each pixel to a point (found in Conversion Tools > From Raster). These points are later used to label polygons.
2) Navigate to ArcToolbox > Data Management Tools > Feature Class > Create Fishnet. The values entered into this tool are taken from the properties of the raster. To access the properties of the raster, right-click the raster in the Table Of Contents in ArcMap > Properties > Source tab.
3) Navigate to ArcToolbox > Data Management Tools > Features > Feature To Polygon. Run the Feature to Polygon tool on the fishnet output from step 2, with the Raster to Points output specified in the Label Features field. This gives the final polygon which has a field named GRID_CODE with the cell values of the raster.
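The three steps above can also be scripted with arcpy (an untested sketch that requires ArcGIS; all paths are hypothetical, and the 50 m cell size comes from the question):

```python
import arcpy

# Hypothetical paths; adjust to your workspace
raster = r"C:\data\myraster.tif"
points = r"C:\data\cell_points.shp"
fishnet = r"C:\data\fishnet.shp"
polys = r"C:\data\cell_polygons.shp"

# 1) One point per cell; these label the polygons later
arcpy.conversion.RasterToPoint(raster, points, "VALUE")

# 2) A fishnet matching the raster's extent and 50 m cell size
ext = arcpy.Describe(raster).extent
arcpy.management.CreateFishnet(
    fishnet,
    "{} {}".format(ext.XMin, ext.YMin),   # origin
    "{} {}".format(ext.XMin, ext.YMax),   # y-axis orientation
    50, 50,                               # cell width / height
    0, 0,                                 # rows/cols derived from corner
    "{} {}".format(ext.XMax, ext.YMax),   # opposite corner
    "NO_LABELS", raster, "POLYGON")

# 3) Polygons from the fishnet, labelled with the point values
arcpy.management.FeatureToPolygon(fishnet, polys, "", "ATTRIBUTES", points)
```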
I have a square mesh of latitudes and longitudes for a geographical region. However, I only know the latitude and longitude values of the 4 corners of this mesh. Using these, I need to calculate the lat/long values at all the crosshairs. So I separately created nested-loop programs in R for latitudes and longitudes.
tllong<-67.481961
sink("output_long.txt")
for (i in c(1:11447)) {
for (j in c(1:10335)) {
tllong<- 67.481961 + (j-1)*0.0030769
print(tllong)
}
}
sink()
The above program calculates the longitudes. tllong is the longitude of the top-left corner of the mesh; there are 11447 latitudes and 10335 longitudes.
Similarly I created a program for calculating latitudes.
tllat<-36.348639
sink("output_lat_again.txt")
for (i in c(1:11447)) {
for (j in c(1:10335)) {
print(tllat)
}
tllat<- tllat - (i-1)*0.002508
}
sink()
tllat is the value of latitude of the top left corner mesh square.
As you can see, the loop first calculates all the lat/long values for the first row, then the second row, then the third, and so on. However, the exported text files from both programs contain a single column with all the values, which is not much use to me. I also tried to export the output in xlsx format using sink("output_long.xlsx"), but the resulting file (after 4-5 hours of the loop running) fails to open; the error message says the file is either corrupted or in a different format. I have tried this 3-4 times, in vain.
So how do I export the results of these two programs to a file in matrix form rather than a single column, i.e. so that the lat/long value in each cell corresponds to the crosshair at that position in the mesh?
Also, it would be nice if someone could tell me how to run the two programs together, so that I get the lat/long values in a single run and a single file.
It seems you want to create 10335*11447 = 118304745 pairs of lat/long values. That is a pretty big number; is it correct? I will show the procedure applied to a smaller example. Try this:
#setting the values of the parameters
tllong<-67.481961
tllat<-36.348639
deltalong<-0.0030769
deltalat<-0.002508
#small example: you can set the following to the real values
nlong<-10
nlat<-10
#create the vectors of values without loops
#latitude decreases going down from the top-left corner, hence -deltalat
lat<-seq(tllat,by=-deltalat,length.out=nlat)
lon<-seq(tllong,by=deltalong,length.out=nlong)
#now we build every possible pair of lat/lon values
latlong<-expand.grid(lon=lon,lat=lat)
#we export it to a csv file
write.csv(latlong,"somefile.csv",row.names=FALSE,quote=FALSE)
At the end, the somefile.csv will be created. Keep in mind that, with your values, the created file will be very big.
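If you want the matrix layout the question asks for (one file row per mesh row, with each cell holding the lat/long of that crosshair), here is a sketch of the same idea in Python using only the stdlib csv module (the output file name is made up); in R you could build the strings with outer() and write them with write.table():

```python
import csv

def mesh_rows(tllat, tllong, deltalat, deltalong, nlat, nlong):
    """One list per mesh row; each cell is the 'lat long' of a crosshair."""
    lons = [tllong + j * deltalong for j in range(nlong)]
    rows = []
    for i in range(nlat):
        lat = tllat - i * deltalat   # latitude decreases going down the mesh
        rows.append(["%.6f %.6f" % (lat, lon) for lon in lons])
    return rows

# small example; use nlat=11447, nlong=10335 for the real mesh
rows = mesh_rows(36.348639, 67.481961, 0.002508, 0.0030769, 5, 4)
with open("mesh.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

Since both latitude and longitude end up in the same cell, a single run produces one file, which also answers the "run both programs together" part.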
I am a GIS/Spatial newbie and this might be an easy question.
I have two lists of longitudes/latitudes. I want to create a buffer around each point in the first list. I know that it is possible in R to plot points and their buffers. However, is it possible in R to find:
1) How many points from the second list fall within the buffer?
and
2) Which points fall within that buffer?
Note: Suggestions of other common open source software to complete this application would be welcome.
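To sketch the idea outside R (a pure-Python example, assuming the "buffer" is simply a fixed radius in metres around each point of the first list; in R the sf package's st_buffer and st_intersects cover the general case):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def points_in_buffers(buffer_pts, query_pts, radius_m):
    """Return the query points within radius_m of any buffer point."""
    return [q for q in query_pts
            if any(haversine_m(b[0], b[1], q[0], q[1]) <= radius_m
                   for b in buffer_pts)]
```

`points_in_buffers` answers question 2 directly, and `len()` of its result answers question 1; for large lists you would want a spatial index instead of this brute-force scan.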