I have a SPOT image (bands 1 to 4) in DN and I need to transform them to reflectance values. Does anyone know any package in R that can help with that?
Thank you all..
Nora
It would be interesting to know what application you are planning for the SPOT images. Atmospheric and radiometric correction may be unnecessary in a number of circumstances.
There is no specific package for dealing with SPOT images, but you could adapt the functions provided by the landsat package.
There are good references dealing with SPOT applications and the transformation of raw DNs to accommodate atmospheric and radiometric corrections. I'd say Song et al. (2001) is fundamental reading, but see also Clark's dissertation.
You'll find the SPOT reference tables, similar to those available for Landsat.
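For reference, the conversion those tables support is usually done in two generic steps: DN to at-sensor radiance via the band's absolute calibration coefficient, then radiance to top-of-atmosphere reflectance. A minimal sketch (all function and parameter names here are illustrative, not from any particular package; plug in the gain, ESUN, solar elevation and Earth-Sun distance from your scene metadata):

```python
import math

def dn_to_toa_reflectance(dn, gain, esun, solar_elev_deg, earth_sun_dist=1.0):
    """Convert a SPOT DN to top-of-atmosphere reflectance.

    gain: absolute calibration coefficient for the band, so that
          radiance L = dn / gain  [W m-2 sr-1 um-1]
    esun: mean solar exoatmospheric irradiance for the band [W m-2 um-1]
    solar_elev_deg: solar elevation angle from the scene metadata
    earth_sun_dist: Earth-Sun distance in astronomical units
    """
    radiance = dn / gain
    solar_zenith = math.radians(90.0 - solar_elev_deg)
    return (math.pi * radiance * earth_sun_dist ** 2) / (esun * math.cos(solar_zenith))
```

Applied per band over the raster, this gives TOA reflectance only; atmospheric correction is a separate step on top of it.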
Let us know if you find a solution.
Hi, very new here: a postgraduate student who is tearing their hair out.
I have inherited a dataset of secondary data collected from research papers on species populations and their genetic diversity, and I have been adding more appropriate data to this sheet in preparation for some analyses. Part of the analysis will involve subsetting the data by biome type to compare the biomes, so I've been cleaning up and trying to add this information to the data I've added. I have lat/long coordinates for each population (in decimal degrees), and it appears that the person working on this before me was able to use these to determine the biome for each point, specifically following the Olson et al. (2001)/WWF 14-biome categorisation, but at this point I'll take anything.
However, I have no idea how this was achieved and truly can't find anything to help. After googling just about every combination of "r package biomes WWF latitude longitude species assign derive convert" that you can think of, the only packages I have located are non-functional in my version of RStudio (e.g. biomeara, ggbiome), leaving me with no idea whether they'd even work, and all the other pages I have dragged up seem to already have biome data included with their dataset. Other research papers I have found describe assigning biomes based on lat/long coords but give no steps on how to actually achieve this. Is it possible in R? Am I losing my mind? Does anyone know of a way to do this, whether in R or not, that preferably doesn't take forever, as I have over 8,000 populations to assess? Many thanks!
I am quite new to NLP. My question is: can I combine words with the same meaning into one using NLP? For example, consider the following rows:
1. It’s too noisy here
2. Come on people whats up with all the chatter
3. Why are people shouting like crazy
4. Shut up people, why are you making so much noise
As one can notice, the common aspect here is that the people are complaining about the noise.
noisy, chatter, shouting, noise -> Noise
Is it possible to group the words under a common entity using NLP? I am using R to come up with a solution to this problem.
I have used a sample Twitter data set, and my expected output would be a table which contains:
Noise
It’s too noisy here
Come on people whats up with all the chatter
Why are people shouting like crazy
Shut up people, why are you making so much noise
I did search the web for references before posting here. Any suggestions or valuable inputs will be of much help.
Thanks
The problem you mention is better known as paraphrasing, and it is not completely solved. If you want a fast solution, you can start by replacing synonyms; WordNet can help with that.
Another idea is to calculate sentence similarity: get a vector representation of each sentence and use cosine distance to measure how similar the sentences are to each other.
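A minimal sketch of that second idea, using plain bag-of-words counts as the vector representation (real systems would use embeddings or WordNet-expanded vocabularies, but the cosine machinery is the same):

```python
from collections import Counter
import math

def bow(sentence):
    """Bag-of-words vector as a Counter of lowercase tokens."""
    return Counter(sentence.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

s1 = bow("shut up people why are you making so much noise")
s2 = bow("why are people shouting like crazy")
s3 = bow("i had pasta for lunch")
```

Here s1 and s2 share "why", "are", "people", so they score higher against each other than either does against s3, which is the signal you would threshold on to group complaints together.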
I think this paper could provide a good introduction for your problem.
I have a river network in a shapefile (class: "SpatialLinesDataFrame"), with some points on it (see picture below).
I would like to compute the distances between points, but along the rivers. I have searched a lot and have not been able to find any function that does this directly.
The closest thing I have found is the function "networkdistance" in the package "secrlinear"; however, I can't manage to transform my shapefile into the format required to use the function (a "linearmask" object).
Any help with this would be extremely appreciated.
Thanks in advance,
Tina.
I know this is an old thread, but just in case someone runs across this in the future: I just released an R package (riverdist) that deals with this issue, and also provides some tools for network editing and data summaries & visualization. It was written with fisheries work in mind, but could probably be applied to what you're working on, or at least that's the hope!
https://cran.r-project.org/web/packages/riverdist/vignettes/riverdist_vignette.html
Sorry this wasn't more timely -
I think we resolved this problem offline: the geographic coordinates (lat/long) of the shapefile needed to be projected before they could be used in secrlinear. That package approximates the linear network and uses igraph functions for distances.
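For anyone landing here later: once the network is built, "distance along the rivers" reduces to a shortest-path query on a graph whose nodes are junctions/points and whose edge weights are segment lengths, which is what those igraph calls do internally. A toy Dijkstra sketch of that reduction (invented node names and segment lengths):

```python
import heapq

def network_distance(edges, start, goal):
    """Shortest along-network distance via Dijkstra.
    edges: {node: [(neighbour, segment_length), ...]}"""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nb, w in edges.get(node, []):
            nd = d + w
            if nd < dist.get(nb, float("inf")):
                dist[nb] = nd
                heapq.heappush(heap, (nd, nb))
    return float("inf")

# Toy river network: tributaries A and B meet at confluence C, which flows to D.
river = {
    "A": [("C", 3.0)],
    "B": [("C", 4.0)],
    "C": [("A", 3.0), ("B", 4.0), ("D", 2.0)],
    "D": [("C", 2.0)],
}
```

The hard part in practice (and what packages like riverdist/secrlinear handle for you) is snapping points to the lines and turning the shapefile geometry into this graph in the first place.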
I am doing a project that involves processing large, sparse graphs. Does anyone know of any publicly available data sets that can be processed into large graphs for testing? I'm looking for something like a Facebook friend network, or something a little smaller with the same flavor.
I found the Stanford Large Network Dataset Collection pretty useful.
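The SNAP files there are plain edge lists: "#"-prefixed comment lines, then one whitespace-separated "u v" pair per line, so loading one into an adjacency structure is a few lines (sketch below uses an inline sample instead of a real download):

```python
from collections import defaultdict
from io import StringIO

def load_edge_list(lines):
    """Build an undirected adjacency list from SNAP-style edge-list lines."""
    adj = defaultdict(set)
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comment header
        u, v = line.split()[:2]
        adj[u].add(v)
        adj[v].add(u)
    return adj

# Stand-in for a downloaded SNAP file.
sample = StringIO("""# Nodes: 4 Edges: 3
0 1
0 2
2 3
""")
graph = load_edge_list(sample)
```

For the genuinely large sets you would stream the file and use integer node IDs rather than strings, but the format itself is this simple.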
If you asked nicely, you might be able to get Brian O'Meara's data set for treetapper. It's a pretty nice example of real-world data in that genre. Particularly, you'd probably be interested in the coauthorship data.
http://www.treetapper.org/
http://www.brianomeara.info/
GitHub's API is nice for building out graphs. I've messed around using the Python library networkx to generate graphs of that network. Here's some sample code if you're interested.
Apologies for the double post, evidently I can only post two links at a time since I have <10 reputation...
DIMACS also has some data sets from their cluster challenge, and there's always the Graph500. The Boost Graph Library has a number of graph generators as well.
Depending on what you consider "large", there's the University of Florida Sparse Matrix Collection as well as some DIMACS Road Networks (mostly planar of course).
A few other ones:
Newman's page
Barabasi's page
Pajek software
Arena's page
Network Science
I want to count the number of objects in an image using OpenCV. I have an image of soybeans and want to count the number of beans in it. If possible, please help me and let me know which counting algorithms to use.
Thanks, and I look forward to hearing from you.
Regards,
Sumon
Sumon,
There is no single algorithm for counting objects; it depends greatly on the image itself. Depending on the contrast of the beans against the background, it may be possible to use a simple threshold followed by a labeling algorithm, or even just finding contours.
The threshold function in OpenCV is cvThreshold, and the contour-finding function is cvFindContours; using the latter, you could count the number of contours found.
The blob library also has many facilities for this type of machine-vision application, including connected-component labeling, which is basically what you need here. I believe the library is already included in OpenCV; its description can be found here.
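To make the labeling step concrete, here is a toy sketch of connected-component counting on an already-thresholded binary image (a plain list-of-lists stands in for the thresholded array; in a real pipeline OpenCV's own functions do this far faster):

```python
from collections import deque

def count_objects(binary):
    """Count 4-connected foreground components in a binary image
    (list of lists of 0/1) via BFS flood fill. Each component found
    is one object; touching beans would merge, which is why real
    pipelines often add a watershed/separation step."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

image = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
    [0, 1, 0, 0],
]
```

The comment in the code is the key caveat for bean counting: two beans that touch become one component, so the threshold/contrast question above matters a lot.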
I could provide some more assistance if I knew a little more about the image.