I have a square mesh of latitudes and longitudes for a geographical region, but I only know the lat-long values at the 4 corners of the mesh. Using these, I need to calculate the lat-long values at all the crosshairs. So I separately created a nested-loop program in R for the latitudes and the longitudes.
tllong <- 67.481961
sink("output_long.txt")
for (i in c(1:11447)) {
  for (j in c(1:10335)) {
    tllong <- 67.481961 + (j - 1) * 0.0030769
    print(tllong)
  }
}
sink()
The above program calculates the longitudes. tllong is the longitude at the top-left corner of the mesh; 11447 is the number of latitudes and 10335 the number of longitudes.
Similarly I created a program for calculating latitudes.
tllat <- 36.348639
sink("output_lat_again.txt")
for (i in c(1:11447)) {
  for (j in c(1:10335)) {
    print(tllat)
  }
  tllat <- tllat - 0.002508  # step down one row at a time, constant spacing
}
sink()
tllat is the latitude of the top-left corner of the mesh.
So, as you can see, the loop first calculates all the lat-long values for the first row, then goes to the second row, then the third, and so on. However, the text files exported by both programs contain a single column with all the values, which is not of much use to me. I tried to export the results in xlsx format using sink("output_long.xlsx"), but when I get the Excel file (after 4-5 hours of the loop constantly running) I fail to open it: the error message says the file is either corrupted or of a different format. I have tried this 3-4 times, but in vain.
So how do I export the results of these two programs to an Excel file such that I do not get all the values in a single column but in an appropriate matrix form (i.e. the lat-long values in each cell correspond to the lat-long values at the corresponding crosshair of the mesh)?
Also, it would be nice if someone could tell me how to run these two programs together, so that I can get the lat-long values in a single run in the same file.
It seems you want to create 10335*11447 = 118,304,745 pairs of lat-long values, which is a pretty big number. Is that correct? I will show the procedure applied to a smaller example. Try this:
#setting the values of the parameters
tllong <- 67.481961
tllat <- 36.348639
deltalong <- 0.0030769
deltalat <- 0.002508
#small example: you can set the following to the real values
nlong <- 10
nlat <- 10
#create the vectors of values without loops
#(latitudes decrease away from the top-left corner, hence the negative step)
lat <- seq(tllat, by = -deltalat, length.out = nlat)
lon <- seq(tllong, by = deltalong, length.out = nlong)
#now we build every possible pair of lat/lon values
latlong <- expand.grid(lon = lon, lat = lat)
#we export it to a csv file, which Excel can open directly
write.csv(latlong, "somefile.csv", row.names = FALSE, quote = FALSE)
At the end, somefile.csv will be created. Keep in mind that, with your real values, the created file will be very big.
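If you want the matrix layout asked for in the question (one cell per crosshair), outer() builds the full coordinate grids directly. A sketch with the same small example values; the file names lat_grid.csv and lon_grid.csv are made up:

```r
# Matrix layout: lat_grid[i, j] and lon_grid[i, j] are the coordinates
# of the crosshair in row i, column j of the mesh.
tllong <- 67.481961; tllat <- 36.348639
deltalong <- 0.0030769; deltalat <- 0.002508
nlong <- 10; nlat <- 10

lat <- seq(tllat, by = -deltalat, length.out = nlat)  # decreasing from the top row
lon <- seq(tllong, by = deltalong, length.out = nlong)

lat_grid <- outer(lat, rep(1, nlong))  # nlat x nlong matrix of latitudes
lon_grid <- outer(rep(1, nlat), lon)   # nlat x nlong matrix of longitudes

# One csv file per coordinate; Excel opens these directly.
write.table(lat_grid, "lat_grid.csv", sep = ",",
            row.names = FALSE, col.names = FALSE)
write.table(lon_grid, "lon_grid.csv", sep = ",",
            row.names = FALSE, col.names = FALSE)
```

With the real grid sizes these matrices exceed Excel's row and column limits, so the csv route above is mainly useful for subsets.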
I am trying to extract the values of pixels in a DSM (CHM) within digitized tree crowns.
First I set my working directory, then read in the shapefile and raster:
TreeCrowns <- shapefile("plot1sag_shape/plot1sag.shp")
CHM <- raster("272280split4.tif")
Then I try to extract the pixel values:
pixel <- raster::extract(CHM, TreeCrowns, method= 'simple', weights=FALSE, fun=NULL)
But I get an empty list with all NULL values for every polygon. I have confirmed that the CHM and the polygons are in the same location. What can I do to fix this?
Since your shapefile consists of polygons, the extract() function needs to know how to summarise the pixel values across each polygon, via the fun= argument. Since you provide fun=NULL, the function returns NULL as the summary of the pixel values.
Try fun=mean or fun=sum (they mean different things, so see which one suits you).
That probably happens because the polygons and the raster do not overlap. Can you show the output of show(CHM) and show(TreeCrowns)? Have you looked at
plot(CHM)
lines(TreeCrowns)
Or are your polygons very small relative to the raster cells? In that case, try the argument small=TRUE.
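Both suggestions can be tried on a tiny self-contained example (a made-up raster and polygon, not the questioner's data), assuming the raster package:

```r
library(raster)

# Toy 10x10 raster over a 10x10 extent, values 1..100 (hypothetical data)
r <- raster(nrows = 10, ncols = 10, xmn = 0, xmx = 10, ymn = 0, ymx = 10)
values(r) <- 1:100

# One square polygon covering part of the raster
p <- spPolygons(rbind(c(2, 2), c(2, 4), c(4, 4), c(4, 2)))

# Summarise the pixels inside the polygon instead of fun = NULL
extract(r, p, fun = mean, na.rm = TRUE)

# For polygons smaller than a raster cell, include the touching cell anyway
extract(r, p, small = TRUE)
```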
I have a large raster (145,927,240 cells) with categorical data. The data can be found here:
https://developers.google.com/earth-engine/datasets/catalog/ESA_GLOBCOVER_L4_200901_200912_V2_3
For each cell I would like to calculate the distance to the nearest neighbor of each class. What is the most efficient (i.e. feasible) way to do this? I've looked for suitable packages, but so far I haven't found one that does what I want with a raster of that size.
To give some context:
I would like to combine several raster files, convert them to a data table to use as input in different models, and then convert the result back to a raster file.
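At a smaller scale, the per-class distances described above can be sketched with raster::distance(), which fills every NA cell with the distance to the nearest non-NA cell. A toy sketch on a made-up 20x20 raster (not the GLOBCOVER data), assuming the raster package; whether this scales to 145 million cells is a separate question:

```r
library(raster)

# Toy categorical raster with classes 1..3 on a projected grid (hypothetical)
r <- raster(nrows = 20, ncols = 20, xmn = 0, xmx = 20, ymn = 0, ymx = 20,
            crs = "+proj=utm +zone=33 +datum=WGS84")
set.seed(1)
values(r) <- sample(1:3, ncell(r), replace = TRUE)

# One layer per class: blank out every other class, then distance()
# gives each NA cell its distance to the nearest cell of that class.
dist_per_class <- stack(lapply(sort(unique(values(r))), function(k) {
  m <- mask(r, r == k, maskvalue = 0)  # cells not in class k become NA
  distance(m)
}))
```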
Please see the image below. It was created by first converting a two-column data frame into a study window (call it study_window) using as.owin, and then plotting another two-column data frame (call it study_points) on top of the window.
It is clear that the points lie inside the window! However, when I call
ppp(study_points[,1],study_points[,2],win = study_window)
it says that most of my points are rejected as lying outside the window. Could someone tell me what is going on?
Thanks!
First you could have taken a step back to check that the window object study_window was what you intended. You could have plotted or printed this object in its own right. A plot of study_window would show (and you can also see this in the plot that you supplied in the question) that the boundary of the window is a disconnected scatter of points, not a joined-up polygon. A printout of study_window would have revealed that it is a binary pixel mask, with a very small area, rather than a polygonal region. The help for as.owin explains that, when as.owin is applied to a dataframe containing columns of x,y coordinates, it interprets them as pixel coordinates of the pixels that lie inside the window.
So, what has happened is that as.owin has created a window consisting of one pixel at each of the (x,y) locations in the data frame. That's not what you wanted; the (x,y) coordinates were meant to be the vertices of a polygonal boundary.
To get the desired window, do something like study_window <- owin(poly=df) where df is the data frame of (x,y) coordinates of vertices.
To do it all in one step, type something like mypattern <- ppp(x, y, poly=df) where x and y are the vectors of coordinates of the points in the window.
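A minimal sketch of the two calls, with a made-up square boundary (vertices listed anticlockwise, as owin requires for an outer boundary), assuming the spatstat package:

```r
library(spatstat)

# Hypothetical boundary: a 10 x 10 square, vertices anticlockwise
df <- data.frame(x = c(0, 10, 10, 0), y = c(0, 0, 10, 10))
study_window <- owin(poly = df)   # a polygonal window, not a pixel mask

# Made-up point coordinates inside the square
x <- runif(20, 0, 10)
y <- runif(20, 0, 10)

pat1 <- ppp(x, y, window = study_window)  # use the window...
pat2 <- ppp(x, y, poly = df)              # ...or build both in one step
```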
So I solved the problem by using owin and specifying the region to be a polygon, instead of using as.owin. I have no idea of the difference between owin and as.owin, but I am just glad it worked...
While I use R quite a bit, I have just started an image analysis project and am using the EBImage package. I need to collect a lot of data from circular/elliptical images. The built-in function computeFeatures gives the maximum and minimum radius, but I need all of the radii it computes.
Here is the code. I have read the image, then thresholded and filled it:
actual.image <- readImage("xxxx")
image <- actual.image[, 2070:4000]
image1 <- thresh(image)
image1 <- fillHull(image1)
As there are several objects in the image, I used the following to label them:
image1 = bwlabel(image1)
I generated features using the built-in function:
features = data.frame(computeFeatures(image1,image))
Now, computeFeatures gives the max radius and min radius, but I need all the radii of all the objects it has computed for my analysis. At least if I get the coordinates of the boundaries of all objects, I can compute the radii with some other code.
I know images are stored as matrices and I could come up with a convoluted way to find the boundaries and then compute the radii, but I was wondering: is there a more elegant method?
You could try extracting each object plus some padding, and plotting the x and y intensity profiles for each object. An intensity profile is simply the sum of the rows / columns, which can be computed using rowSums and colSums in R.
Then you could find where the profile drops off by splitting each intensity profile in half and computing the nearest minimum value.
Maybe an example would help clear things up:
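(The original example is not reproduced here; the following is a small stand-in with a made-up binary object.)

```r
# Stand-in example: a 9x9 binary image containing a filled circle of
# radius 3 centred at (5, 5) (made-up data, not the original example)
n <- 9
xy <- expand.grid(x = 1:n, y = 1:n)
obj <- matrix(as.numeric((xy$x - 5)^2 + (xy$y - 5)^2 <= 9), n, n)

# Intensity profiles: sum over each row / column
row_prof <- rowSums(obj)  # one value per row
col_prof <- colSums(obj)  # one value per column

# The profile drops to 0 at the background, so the extent of the
# object along each axis is the run of non-zero profile values
extent_rows <- sum(row_prof > 0)  # 7: rows 2 .. 8 touch the circle
extent_cols <- sum(col_prof > 0)  # 7: columns 2 .. 8 touch the circle
```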
Hopefully this makes sense
I have been using Matlab 2011b and contourf/contourfm to plot 2D data on a map of North America. I started from the help page for contourfm on the MathWorks website, and it works great if you use their default data called "geoid" and reference vector "geoidrefvec".
Here is some simple code that works with the preset data:
figure
axesm('MapProjection','lambert','maplo',[-175 -45],'mapla',[10 75]);
framem; gridm; axis off; tightmap
load geoid
%geoidrefvec=[1 90 0];
load 'TECvars.mat'
%contourfm(ITEC, geoidrefvec, -120:20:100, 'LineStyle', 'none');
contourfm(geoid, geoidrefvec, -120:20:100, 'LineStyle', 'none');
coast = load('coast');
geoshow(coast.lat, coast.long, 'Color', 'black')
whitebg('w')
title(sprintf('Total Electron Content Units x 10^1^6 m^-^2'),'Fontsize',14,'Color','black')
%axis([-3 -1 0 1.0]);
contourcbar
The problem arises when I try to use my own data. I am quite sure the reference vector determines where the data should be plotted on the globe, but I was not able to find any documentation about how this vector works or how to create one that works with different data.
Here is a .mat file with my data. ITEC is the matrix of values to be plotted. Information about the position of the grid relative to the earth can be found in the cell array called RT, but the basic idea is: ITEC(1,1) refers to Lat = 11, Long = -180, and ITEC(58,39) refers to Lat = 72.5, Long = -53, with evenly spaced data.
Does anyone know how the reference vector defines where the data is placed on the map? Or perhaps there is another way to accomplish this? Thanks in advance!
OK, so I figured it out. Given that there are only three elements in the vector, the spacing between latitude points must be the same as the spacing between longitude points; that is, the horizontal spacing must equal the vertical spacing, for instance 1 degree.
The first value in the reference vector is the number of cells per degree (with 1-degree spacing this is simply 1), and the second and third values are, respectively, the northernmost latitude and westernmost longitude of the grid (this works in my case).
In my case the data were equally spaced in each direction, but not with the same spacing vertically and horizontally, so I simply interpolated the data to a 1x1-degree grid density and set the first value in the vector to 1.
Hopefully this will help someone with the same problem.
Quick question though: since I answered my own question, do I get the bounty? I'd hate to lose 50 'valuable' reputation points, haha.