How to compare similarities in point clouds using descriptors? - point-cloud-library

I have two .pcd files and I have extracted features from each using the FPFH estimator. Now I want to compare these descriptors. Would that be possible? If so, how can I do that?
Any help would be appreciated!
Thank you.
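In PCL itself you would typically match descriptors via nearest-neighbour search over the 33-dimensional FPFH space, but at its core this is just a histogram distance. A minimal sketch in Python, using made-up 33-bin descriptors and a chi-square distance (one common choice among several, not the only valid one):

```python
# Sketch: an FPFH descriptor is a 33-bin histogram, so two descriptors can be
# compared with any histogram distance. The descriptors below are made-up
# values for illustration, not real PCL output.

def chi_square_distance(h1, h2):
    """Symmetric chi-square distance between two histograms."""
    assert len(h1) == len(h2)
    total = 0.0
    for a, b in zip(h1, h2):
        if a + b > 0:  # skip empty bin pairs to avoid division by zero
            total += (a - b) ** 2 / (a + b)
    return total

# Two toy 33-bin descriptors: uniform vs. slightly perturbed in the last bin.
d1 = [1.0] * 33
d2 = [1.0] * 32 + [2.0]

print(chi_square_distance(d1, d1))  # identical descriptors -> 0.0
print(chi_square_distance(d1, d2))  # small distance for a small perturbation
```

Lower values mean more similar descriptors; to compare whole clouds you would aggregate such distances over matched point pairs.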

Related

Spatstat - how to compare two temperature spatial domains?

I would like to compare a satellite pseudo-image generated by the WRF model with a real satellite image using the R package spatstat. However, I do not know where to begin. I have read that it is possible to carry out a spatial pattern comparison, but I do not know which function I should use. I have two temperature images on a predefined domain, and I would like to know whether the comparison is carried out point-by-point over those images or whether I have to provide the model output and satellite data in some other form. In that case, how should I do that? Is there any available script?
Thanks in advance.
Kind regards, Lara.
The spatstat package is designed mainly to analyse spatial patterns of point locations (like the pattern of locations of accidents that occurred in a given year). It does have some basic facilities for comparing pixel images. If A and B are two pixel images, you can do arithmetic by typing e.g. mean(A-B) or mean(abs(A-B)). There are also some image metrics (distances between images) available. Please visit the spatstat.org website for more information.
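To make the suggested arithmetic concrete, here is the same idea as mean(abs(A-B)) sketched in plain Python on two made-up temperature grids (in spatstat the images would be im objects and the one-liner above does all of this):

```python
# Pixel-wise comparison of two same-sized images, as in mean(abs(A - B)).
# A and B are toy 2x3 temperature grids, not real WRF or satellite data.

A = [[10.0, 12.0, 11.0],
     [ 9.0, 13.0, 10.0]]
B = [[11.0, 12.0, 10.0],
     [ 9.0, 12.0, 12.0]]

def mean_abs_diff(img1, img2):
    """Mean absolute pixel-wise difference between two same-sized images."""
    diffs = [abs(a - b) for r1, r2 in zip(img1, img2) for a, b in zip(r1, r2)]
    return sum(diffs) / len(diffs)

print(mean_abs_diff(A, B))  # average temperature discrepancy per pixel
```

A result of 0 means the images agree everywhere; larger values mean larger average disagreement.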

Measuring deviation in a music waveform in R?

I have posted this question in the R tag but I am open to solutions in other languages.
Let's say you have some waveforms. The first is just a bar: it is completely horizontal, so it has no deviation. The other waveforms look like these:
Now, I am able to get these waveforms into uniform boxes so that they are all the same pixel size and resolution. My first idea was to quantify the amount of whitespace the waveform used up within one of these uniform boxes, using the code found here:
Measuring whitespace in a jpeg
Now however I want to measure the deviation between waveforms. That is, how could I quantify how "jumpy" a waveform is? Looking at the picture above, the second waveform seems the most homogeneous, and the third waveform seems to display the most variation, but I am unsure about how to quantify this. Any suggestions would be greatly appreciated.
I would recommend starting by getting familiar with the packages tuneR and seewave; you can import sounds and extract a lot of parameters with these two packages. In particular, you could use the function acoustat from seewave. Here is a worked example with data from the package:
data(tico)
note <- cutw(tico, from=0.5, to=0.9, output="Wave")
a <- acoustat(note)
a will give you 10 acoustic parameters from the sound. You could also use other packages such as soundecology, which extracts some further variables; in particular, its function acoustic_diversity measures sound complexity.
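Independently of those packages, the "jumpiness" the question asks about can be quantified with something as simple as the standard deviation of the waveform's successive differences: a flat bar scores exactly zero, and a rapidly oscillating signal scores high. A sketch in Python with made-up sample values:

```python
# One package-free deviation metric: the standard deviation of first
# differences of the sampled waveform. The three sample waveforms below
# are made up for illustration.
import math

def jumpiness(samples):
    """Standard deviation of the successive differences of a sequence."""
    diffs = [b - a for a, b in zip(samples, samples[1:])]
    mean = sum(diffs) / len(diffs)
    var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
    return math.sqrt(var)

flat = [1.0] * 10                              # horizontal bar
smooth = [math.sin(i / 3.0) for i in range(10)]  # slowly varying
noisy = [(-1.0) ** i for i in range(10)]         # alternating spikes

print(jumpiness(flat))                        # 0.0
print(jumpiness(smooth) < jumpiness(noisy))   # True
```

If you are working from images rather than sample values, you would first extract one amplitude value per pixel column and apply the same metric to that sequence.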

GEOSoft data analysis and processing

Is there any way to analyze the data within a GEOSoft file from NCBI in R? I've tried converting it to an ExpressionSet, fitting it with lmFit(), then using eBayes(), but when I use topTable() to look at the results, the adjusted p-values are constant throughout the column.
Any help, or even pointing in the right direction would be greatly appreciated.
Thank you so much in advance

Write igraph clustering to file

I am currently testing various community detection algorithms in the igraph package to compare against my implementation.
I am able to run the algorithms on different graphs, but I was wondering if there is a way for me to write the clustering to a file, where all nodes in one community are written on one line, and so on. I can obtain the membership of each node using membership(communities_object) and write that to a file with dput(), but I don't know how to write it in the format I want.
This is the first time I am working with R as well. I apologize if this has been asked before.
This does not have much to do with igraph; the clustering is given by a simple numeric vector. See ?write.
write(membership(communities_object), file="myfile", ncolumns=1)
write(communities_object$membership, file="myfile", ncolumns=1) also works.
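The answer above writes one node per line, while the question asks for one line per community. You can get that by grouping the membership vector yourself; in R, something like writeLines(sapply(split(seq_along(m), m), paste, collapse=" ")) would do it. Here is the same grouping sketched in Python with a hypothetical membership vector:

```python
# Group a membership vector (node i belongs to community membership[i])
# into one line per community. The vector below is made up for illustration.

def communities_to_lines(membership):
    """Return one string per community: its member node indices, space-separated."""
    groups = {}
    for node, comm in enumerate(membership):
        groups.setdefault(comm, []).append(node)
    return [" ".join(str(n) for n in groups[c]) for c in sorted(groups)]

membership = [1, 1, 2, 1, 2, 3]  # hypothetical community labels
for line in communities_to_lines(membership):
    print(line)
# community 1 -> "0 1 3", community 2 -> "2 4", community 3 -> "5"
```

Writing the resulting lines to a file then gives exactly the one-community-per-line format requested.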

Counting objects in image

I want to count the number of objects in an image using OpenCV. I have an image of soybeans and I want to count how many beans it contains. If possible, please point me to suitable counting algorithms.
Thanks, and I look forward to hearing from you.
Regards,
Sumon
Sumon,
There is no single algorithm for counting objects; it depends greatly on the image itself. Depending on the contrast of the beans against the background, it may be possible to use a simple threshold followed by a labeling algorithm, or even just finding contours.
The threshold function in OpenCV is cvThreshold. The contour-finding function is cvFindContours; using it, you could count the number of contours found.
The blob library also has many facilities for this type of machine-vision application, including connected-component labeling, which is basically what you need here. I believe the library is already included in OpenCV; its description can be found here.
I could provide some more assistance if I knew a little more about the image.
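To make the threshold-then-label route concrete, here is a toy connected-component count in Python. The binary grid stands in for an already-thresholded bean image; with a real photo you would threshold first (e.g. with OpenCV's threshold function) and then label components or find contours:

```python
# Toy connected-component labeling (4-connectivity flood fill) on a binary
# grid. Each component of 1-pixels is one "object". The grid is made up to
# stand in for a thresholded soybean image.

def count_objects(grid):
    """Count 4-connected components of 1-pixels in a binary grid."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                count += 1                 # found a new, unvisited component
                stack = [(r, c)]           # flood-fill everything connected to it
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and grid[y][x] == 1 and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return count

beans = [[0, 1, 1, 0, 0],
         [0, 1, 0, 0, 1],
         [0, 0, 0, 0, 1],
         [1, 0, 0, 0, 0]]
print(count_objects(beans))  # 3 separate "beans"
```

In practice beans that touch each other will merge into one component, which is why more robust pipelines add a watershed or distance-transform step before counting.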
