How to extract a digital elevation model from GEDI data in GEE? - google-earth-engine

Is there any way to find a DEM from the GEDI data available in GEE, like some inbuilt mechanism? I'm working with JavaScript and I can't find a way to extract a DEM from GEDI data within GEE itself.
I tried running code similar to what is used to extract the DEM from SRTM data, but it doesn't work. The SRTM dataset in GEE includes DEM data directly, but the GEDI dataset doesn't.

Related

How can I convert a multiband GeoTIFF to a timeseries NetCDF file in xarray?

I am trying to create a time series object from extracted climate data (NEX-GDDP) using Google Earth Engine (GEE). The data is daily meteorological data, and in the attached file, the data for January 2005 is collected over an area of interest. The images from GEE are stored in the GeoTIFF as bands (numbered 1-31), and now I am struggling to get these individual bands into a dataset and to add a time dimension to the file. GEE will not export more than ten years at a time, so my idea is to create yearly files which, when saved locally, will be merged (concatenated) along the lat/lon and time dimensions.
I am using Python in a Windows environment, so I am a bit limited (for example, I can't use CDO, as it is a Linux-based library). I think that what I would like to do is possible with xarray, but I am missing the (learning) resources to solve this problem with code. Any help and suggestions are more than welcome.
[Screenshot: xarray metadata view of the test data, showing a single band and no time dimension.]
This was resolved in another thread, on GIS Stack Exchange:
https://gis.stackexchange.com/questions/449759/convert-a-multiband-geotiff-to-a-timeseries-netcdf-file-in-xarray
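The core of the accepted approach is to treat the 31 bands as a time dimension. A minimal sketch on a synthetic array, assuming the bands correspond to the days of January 2005 (the variable name `tasmax` and the output filename are made up; with the real export you would read the GeoTIFF into xarray with something like `rioxarray.open_rasterio`):

```python
import numpy as np
import pandas as pd
import xarray as xr

# Synthetic stand-in for a 31-band GeoTIFF already loaded into xarray
# (with real data: da = rioxarray.open_rasterio("jan2005.tif") -- hypothetical file).
da = xr.DataArray(
    np.random.rand(31, 4, 5),
    dims=("band", "lat", "lon"),
    coords={
        "band": np.arange(1, 32),
        "lat": np.linspace(-1.0, 1.0, 4),
        "lon": np.linspace(30.0, 31.0, 5),
    },
    name="tasmax",  # assumed variable name, not from the question
)

# Core step: rename the band dimension to "time" and attach real dates.
da = da.rename(band="time")
da = da.assign_coords(time=pd.date_range("2005-01-01", periods=31, freq="D"))

# Each yearly file written with da.to_dataset().to_netcdf("tasmax_2005.nc")
# can later be concatenated along time, e.g. with
# xr.open_mfdataset("tasmax_*.nc", combine="by_coords").
```

The rename/assign_coords pair is the whole trick; everything else (projection handling, chunking) depends on the actual export.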

Is there an R function/package for determining WWF biomes from latlong coordinates?

Very new here, hi: postgraduate student who is tearing their hair out.
I have inherited a dataset of secondary data collected from research papers on species populations and their genetic diversity, and I have been adding more data to this sheet in preparation for some analyses. Part of the analysis will involve subsetting the data by biome type to compare biomes, so I've been cleaning up the data I've added and trying to attach this information to it. I have lat-long coordinates for each population (in decimal degrees), and it appears that the person working on this before me was able to use these to determine the biome for each point, specifically following the Olson et al. (2001)/WWF 14-biome categorisation, but at this point I'll take anything.
However, I have no idea how this was achieved and truly can't find anything to help. After googling just about every combination of "r package biomes WWF latitude longitude species assign derive convert" that you can think of, the only packages I have located (e.g. biomeara, ggbiome) do not work in my version of RStudio, leaving me with no idea whether they would even do the job, and every other page I have dug up seems to already come with biome data included in its dataset. Other research papers describe assigning biomes from lat-long coordinates but give no steps on how to actually achieve this. Is it possible in R? Am I losing my mind? Does anyone know of a way to do this, whether in R or not, that preferably doesn't take forever, as I have over 8,000 populations to assess? Many thanks!
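One common way to do this (in R or outside it) is a point-in-polygon spatial join against the WWF terrestrial ecoregions shapefile. A hedged sketch with geopandas, using two toy polygons in place of the real WWF data (the filename `wwf_terr_ecos.shp` and the `BIOME` attribute in the comment are assumptions about the WWF download, not verified here):

```python
import geopandas as gpd
from shapely.geometry import Point, box

# Two toy "biome" polygons standing in for the WWF ecoregion shapefile.
# With the real data you would load it instead, roughly:
#   biomes = gpd.read_file("wwf_terr_ecos.shp")   # assumed filename
# where a BIOME attribute is assumed to code the 14 Olson et al. (2001) biomes.
biomes = gpd.GeoDataFrame(
    {"BIOME": [1, 13],
     "geometry": [box(-10, -10, 0, 10), box(0, -10, 10, 10)]},
    crs="EPSG:4326",
)

# Populations as lat-long points (note: x = longitude, y = latitude).
pts = gpd.GeoDataFrame(
    {"pop_id": ["a", "b"]},
    geometry=[Point(-5, 2), Point(5, -3)],
    crs="EPSG:4326",
)

# Point-in-polygon spatial join assigns each population its biome code.
joined = gpd.sjoin(pts, biomes, how="left", predicate="within")
print(joined[["pop_id", "BIOME"]])
```

A spatial join over 8,000 points is effectively instant; the equivalent in R would be `sf::st_join` with the same shapefile.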

How to work with WorldClim data for MaxEnt

I am looking to extract WorldClim climate data (current data and future projections) and convert the GeoTIFFs to .asc in order to run them through MaxEnt and create future climate-change projections.
Problem 1: WorldClim gives me one GeoTIFF from which I need to extract 19 separate variables, each as its own GeoTIFF file.
Problem 2: Converting these GeoTIFF files into .asc to run in MaxEnt.
I have access to free GIS software (QGIS/DIVA-GIS) and R, although I am fairly new to R. Any solutions would be really helpful, thank you.
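The .asc target here is the ESRI ASCII grid format, which is simple enough to write directly: a six-line header followed by the cell values. A minimal sketch on toy data covering both problems at once, splitting 19 bands into 19 .asc files (the WorldClim filename in the comment is a placeholder; reading the bands from the real GeoTIFF would use rasterio, GDAL, or the QGIS translate tool):

```python
import numpy as np

def write_esri_ascii(path, grid, xll, yll, cellsize, nodata=-9999):
    """Write a 2-D array as an ESRI ASCII grid (.asc), the format MaxEnt reads."""
    nrows, ncols = grid.shape
    header = (f"ncols {ncols}\nnrows {nrows}\n"
              f"xllcorner {xll}\nyllcorner {yll}\n"
              f"cellsize {cellsize}\nNODATA_value {nodata}\n")
    with open(path, "w") as f:
        f.write(header)
        # Replace missing cells with the NODATA value before writing rows.
        np.savetxt(f, np.where(np.isnan(grid), nodata, grid), fmt="%.4f")

# Toy stand-in for the 19-band WorldClim GeoTIFF. With the real file you
# would read each band via rasterio (hypothetical filename):
#   with rasterio.open("worldclim_bioc.tif") as src:
#       bands = [src.read(b) for b in range(1, src.count + 1)]
bands = np.random.rand(19, 5, 6)
for i, band in enumerate(bands, start=1):
    write_esri_ascii(f"bio{i}.asc", band, xll=-180, yll=-60, cellsize=0.5)
```

MaxEnt requires all layers to share the same extent, cell size, and NODATA value, which this approach guarantees since the header is written from the source grid.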

Extracting data from NetCDF file

I'm working in R, trying to use the data found here (https://datadryad.org/resource/doi:10.5061/dryad.dk1j0; the two top files) to create a table similar to this: [administrative_name, GDP2010, GDP2011, ..., GDP2015].
As far as I can see, I need to extract the names of the administrative units from the admin_areas_GDP_HDI.nc file and combine them with the annual data in the GDP_per_capita_PPP_1990_2015.nc file.
With the ncdf4 package I've managed to open the archives and to get all the attributes and variables, but I don't know how to access and extract the data.
I've been trying to access the data all day, but I have limited experience with NetCDF archives and have not managed to extract it. Any pointers would help me out!
I like to use the raster package for dealing with NetCDF files. It uses the ncdf4 package to read the files in, but offers some additional tools for processing rasters. You did not mention which data you want to extract, so the example below shows the mean GDP for each administrative unit.
library(raster)
# Read the NetCDF files in as RasterBricks
ad  <- brick('admin_areas_GDP_HDI.nc')
gdp <- brick('GDP_per_capita_PPP_1990_2015_v2.nc')
# Mean GDP within each admin zone (first layer of ad defines the zones)
zoneMean <- zonal(gdp, ad[[1]], fun = 'mean', na.rm = TRUE)

Calculate variogram/GLCM with a raster output in R

I'd like to use geostatistical texture measures to classify my remote sensing data, but I can't find a package that gives the result I want: it should produce a raster-type output, so that I can use it together with spectral data (such as TM) in R. I have searched for days without finding anything useful, so I need your help.
See the glcm package: it will handle a raster input and will output GLCM textures as a RasterStack. There is an example here.
