I'm using xarray and rioxarray in Python to create a Dataset and then export it to NetCDF. After saving, when I check the exported files with GDAL 3.2, I see the correct extent. However, with GDAL >= 3.4 (in a conda env), the extents seem to be lost.
Here is the output with my system-installed GDAL:
micha@RMS:Kinneret$ gdalinfo --version
GDAL 3.2.2, released 2021/03/05
micha@RMS:Kinneret$ gdalinfo NETCDF:"Kinneret_velocity.nc":v | grep -A 4 Corner
Warning 1: dimension #2 (x) is not a Longitude/X dimension.
Warning 1: dimension #1 (y) is not a Latitude/Y dimension.
Warning 1: dimension #0 (z) is not a Time or Vertical dimension.
Corner Coordinates:
Upper Left ( 735758.000, 3644806.000) ( 35d31'15.70"E, 32d54'57.98"N)
Lower Left ( 735758.000, 3621606.000) ( 35d30'54.46"E, 32d42'25.30"N)
Upper Right ( 754558.000, 3644806.000) ( 35d43'18.74"E, 32d54'42.80"N)
Lower Right ( 754558.000, 3621606.000) ( 35d42'55.82"E, 32d42'10.25"N)
However, in my conda environment, with a newer GDAL:
micha@RMS:Kinneret$ conda activate geo
(geo) micha@RMS:Kinneret$ gdalinfo --version
GDAL 3.5.2, released 2022/09/02
(geo) micha@RMS:Kinneret$ gdalinfo NETCDF:"Kinneret_velocity.nc":v | grep -A 4 Corner
Warning 1: dimension #2 (x) is not a Longitude/X dimension.
Warning 1: dimension #1 (y) is not a Latitude/Y dimension.
Warning 1: dimension #0 (z) is not a Time or Vertical dimension.
Corner Coordinates:
Upper Left ( 0.0, 0.0)
Lower Left ( 0.0, 58.0)
Upper Right ( 47.0, 0.0)
Lower Right ( 47.0, 58.0)
What do I need to add to the xarray Dataset so that the newer GDAL gets the extent?
(Note: In both cases all the "grid_mapping" metadata entries point to "spatial_ref", and the full spatial_ref details are available.)
Thanks
A couple of months late to the party, but what I found is that the NetCDF format for geospatial information is quite sensitive to the attributes on each dimension.
Specifically, you're interested in the standard_name attribute, as defined in the CF Conventions, and you should set it on both the x and y coordinates.
For completeness, consider the following (assuming the CRS you're using is metre-based and that your coordinates are named x and y):
ds.x.attrs["axis"] = "X"
ds.x.attrs["long_name"] = "x coordinate of projection"
ds.x.attrs["standard_name"] = "projection_x_coordinate"
ds.x.attrs["units"] = "metre"
ds.y.attrs["axis"] = "Y"
ds.y.attrs["long_name"] = "y coordinate of projection"
ds.y.attrs["standard_name"] = "projection_y_coordinate"
ds.y.attrs["units"] = "metre"
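With those attributes in place, a minimal export sketch looks like this (ds is your Dataset; EPSG:32636 here is just a placeholder for whatever metre-based CRS you actually use):
import rioxarray  # registers the .rio accessor on xarray objects

# Attach the CRS so the spatial_ref grid mapping is written with the data,
# then export. EPSG:32636 is a placeholder for your actual projected CRS.
ds = ds.rio.write_crs("EPSG:32636")
ds.to_netcdf("Kinneret_velocity.nc")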
I am trying to match some points with long/lat coordinates to their associated regions, using a map from GADM and st_join(). So the code is very simple:
x <- st_read(my_map)
y <- st_as_sf(my_points, coords = c('long', 'lat'))
st_crs(y) = 4326
y <- st_join(y,x)
where both x and y are sf objects (x with polygons, y with points).
The issue is that, between my home and office computers, one runs the code without problems and the other throws an error.
My home computer, which runs the code without any issues, has R 4.0.4 and sf 0.9-7 (which I realise is older) on Windows 10 x64.
My office computer runs R 4.0.5 with sf 1.0-2 on Windows 10 x64.
Here's the error:
Error in s2_geography_from_wkb(x, oriented = oriented, check = check) :
Evaluation error: Found 4 features with invalid spherical geometry.
[201] Loop 2 is not valid: Edge 2 crosses edge 4
[935] Loop 16 edge 390 crosses loop 64 edge 6
[2667] Loop 20 edge 835 crosses loop 37 edge 24843
[3401] Loop 1 is not valid: Edge 818 crosses edge 820
Do you know if I can trust the output of the join from the older version of the package? I briefly looked at the issues on GitHub but couldn't find anything alarming; it might just be that they changed one default to FALSE. But I'm wondering if that is something I should be wary of.
I am generating polylines in MeshLab through the 'Compute Planar Section' filter, with the code seen here:
import numpy as np
import pymeshlab

ms = pymeshlab.MeshSet()
ms.load_new_mesh('mesh.ply')  # placeholder: load the source mesh as mesh 0

for z_value in np.arange(0, 5, 1):
    ms.set_current_mesh(0)
    planeoffset = float(z_value)
    ms.compute_planar_section(planeaxis='Z Axis', planeoffset=planeoffset)
    m = ms.current_mesh()
    m.compact()
    print(m.vertex_number(), "vertices in Planar Section Z =", planeoffset)
What I would like to do is obtain the data that connects one point to another. MeshLab holds this data: when I export my polyline to DXF, the edges are present and correctly joined together.
I imagine a list where each edge has a start and end point (potentially vertex IDs), as seen in the DXF, would be the most help.
Any guidance in helping obtain this information would be greatly appreciated.
Update: The pymeshlab developers have since included the method m.edge_matrix() in the current version of pymeshlab to expose edge data. If you have a modern version of pymeshlab, this is now the recommended way to solve your problem.
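For reference, a minimal sketch of that approach, reusing the MeshSet ms from the question's loop (whether edge_matrix()/vertex_matrix() are available depends on your pymeshlab version):
# After compute_planar_section, the section polyline is the current mesh.
m = ms.current_mesh()
m.compact()

verts = m.vertex_matrix()  # (N, 3) array of xyz vertex coordinates
edges = m.edge_matrix()    # (M, 2) array of vertex indices, one row per edge

# Turn index pairs into coordinate segments, mirroring the DXF LINE entries
segments = [(verts[i], verts[j]) for i, j in edges]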
I have to bring bad news. As of today (October 2021), the edge information that you request is stored internally in the VCG meshes but is not exposed to the Python API, so you can't read it using pymeshlab. You can only read the number of edges, using the m.edge_number() method.
If you need to continue with your project, your options are:
1. Open an issue at https://github.com/cnr-isti-vclab/PyMeshLab/issues/ kindly asking the developers to expose the edge information in the pymeshlab API.
2. If your surfaces are convex, you can rebuild the edge data by computing the convex hull of the vertices, or by sorting the vertices by angle around their centroid.
3. If your surfaces are more complex, you can still export the mesh to a DXF file and then parse the DXF to read the information back.
Option 3 seems the easiest to achieve. The DXF files written by MeshLab contain a lot of LINE sections like this one:
LINE
8
0
10
40.243473 -> this is coordinate X of point 1
20
-40.981182 -> this is coordinate Y of point 1
30
0.000000 -> this is coordinate Z of point 1
11
40.887867 -> this is coordinate X of point 2
21
-42.090389 -> this is coordinate Y of point 2
31
0.000000 -> this is coordinate Z of point 2
0
so you can parse the DXF file with this piece of Python code:
edges = []
with open("contour.dxf") as f:
    line = f.readline().strip()
    while line:
        if line == "LINE":
            f.readline()  # group code 8
            f.readline()  # layer name
            f.readline()  # group code 10
            x1 = float(f.readline())
            f.readline()  # group code 20
            y1 = float(f.readline())
            f.readline()  # group code 30
            z1 = float(f.readline())
            f.readline()  # group code 11
            x2 = float(f.readline())
            f.readline()  # group code 21
            y2 = float(f.readline())
            f.readline()  # group code 31
            z2 = float(f.readline())
            print("edge from", (x1, y1, z1), "to", (x2, y2, z2))
            edges.append(((x1, y1, z1), (x2, y2, z2)))
        line = f.readline().strip()
I want to compute the alpha-shape (or even just the concave hull) of a set of points using Julia. In other questions this problem has been solved in Python using Delaunay tessellation (see Boundary enclosing a given set of points).
This Julia package can compute a Delaunay tessellation: https://github.com/JuliaGeometry/VoronoiDelaunay.jl (though I am not sure if it has been updated for Julia v0.7).
I am wondering if there is already an implementation for Julia v0.7 that can get the alpha-shape, or even just the concave hull, of a set of points.
Alternatively, is there a way to efficiently call Python (scipy.spatial.Delaunay) to do the job?
VoronoiDelaunay.jl works with Julia 1.0 and 1.1. It should also work with Julia 0.7.
VoronoiDelaunay.jl has some numerical restrictions on coordinates, i.e. they must lie in (1.0+eps(), 2.0-eps()), so you may need to re-scale your data points.
To create a DelaunayTessellation with your own point type, make sure your type is a subtype of AbstractPoint2D (that is, <: AbstractPoint2D) and defines getx and gety methods.
The following example code, I believe, finds what you call the concave hull of a set of points using DelaunayTessellation and plots the result. It basically uses the same algorithm as this answer. You may easily edit the code to get the alpha-shape.
I did not wrap some code snippets into functions; if you need high performance, please do so. I used === while checking for equality of points, which actually checks whether two points are the same object (i.e. the same address in memory). If you somehow end up with code that breaks this part, you can extend == and use it instead of ===.
using Random, VoronoiDelaunay, Plots
import Base.==

struct MyEdge{T<:AbstractPoint2D}
    _a::T
    _b::T
end

==(e1::MyEdge{T}, e2::MyEdge{T}) where {T<:AbstractPoint2D} = ((e1._a === e2._a) && (e1._b === e2._b)) || ((e1._b === e2._a) && (e2._b === e1._a))
### ==(p1::T, p2::T) where {T<:AbstractPoint2D} = (getx(p1) == getx(p2)) && (gety(p1) == gety(p2))

### Create a Delaunay tessellation from random points
tess = DelaunayTessellation2D(46)
for _ in 1:23
    push!(tess, Point2D(rand()+1, rand()+1))
end

edges = MyEdge[]

function add_edge!(edges, edge)
    i = findfirst(e -> e == edge, edges)
    if isnothing(i) # not found
        push!(edges, edge)
    else # found, so not an outer edge: remove it
        deleteat!(edges, i)
    end
end

for trig in tess
    a, b, c = geta(trig), getb(trig), getc(trig)
    add_edge!(edges, MyEdge(b, c))
    add_edge!(edges, MyEdge(a, b))
    add_edge!(edges, MyEdge(a, c))
end

### PLOT
x, y = Float64[], Float64[] # outer edges
for edge in edges
    push!(x, getx(edge._a))
    push!(x, getx(edge._b))
    push!(x, NaN)
    push!(y, gety(edge._a))
    push!(y, gety(edge._b))
    push!(y, NaN)
end

xall, yall = getplotxy(delaunayedges(tess)) # all the edges
plot(xall, yall, color=:blue, fmt=:svg, size=(400,400))
plot!(x, y, color=:red, linewidth=3, opacity=0.5)
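As for the Python alternative mentioned in the question, here is a minimal sketch of the same once-only boundary-edge idea with scipy.spatial.Delaunay (which you could drive from Julia via PyCall). The random points are placeholders, and as above you would filter triangles by circumradius first to get a true alpha-shape:
import numpy as np
from scipy.spatial import Delaunay

points = np.random.rand(30, 2)  # placeholder point set
tri = Delaunay(points)

# An edge on the outer boundary belongs to exactly one triangle,
# so it appears exactly once across all simplices.
edge_count = {}
for simplex in tri.simplices:
    for i, j in ((0, 1), (1, 2), (0, 2)):
        edge = tuple(sorted((simplex[i], simplex[j])))
        edge_count[edge] = edge_count.get(edge, 0) + 1

boundary = [e for e, n in edge_count.items() if n == 1]
print(boundary)  # pairs of vertex indices into points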
I downloaded a Sentinel-1 product from the ESA Open Access Hub. The downloaded directory contains a tif file with the measurement data. I would now like to crop this tif file to the window that I need.
For this I use the gdal_translate function in R:
gdal_translate('from.tiff', 'to.tiff', projwin = c(4.5, 52.4, 4.7, 52.2))
This returns an error:
ERROR 1: Error: Computed -srcwin 4.5 52 0 0 has negative width and/or height.
I figured it might be a coordinate reference system problem, but when I open the file in QGIS it shows a proper WGS84 extent.
When I read in the file using
raster('from.tiff')
I notice that the raster extent is just the pixel dimensions (as opposed to when I open it in QGIS).
How can I crop this 'from.tiff' file to the desired extent?
It seems you have mixed up your minimum and maximum latitude. The correct command would be:
gdal_translate('from.tiff', 'to.tiff', projwin = c(4.5, 52.2, 4.7, 52.4))
I am trying to execute getNOAA.bathy from the marmap package.
I can successfully execute the following (from here):
library(marmap)
getNOAA.bathy(lon1=-20,lon2=-90,lat1=50,lat2=20, resolution=10) -> a
plot(a, image=TRUE, deep=-6000, shallow=0, step=1000)
However, when I execute the following:
getNOAA.bathy(lon1=-80,lon2=-79.833333,lat1=32.7,lat2=32.833333, resolution=10) -> a
plot(a, image=TRUE, deep=-6000, shallow=0, step=1000)
I get the error:
Error in getNOAA.bathy(lon1 = -80, lon2 = -79.833333, lat1 = 32.7, lat2 = 32.833333, : The NOAA server cannot be reached
Questions:
1. Are there special restrictions to LAT/LON values? Am I miscalculating something here?
2. Are there "better" packages that can support my LAT/LON values?
As stated in the help file for getNOAA.bathy(), the resolution argument is expressed in minutes. So resolution=10 means that the cells of your grid will have a dimension of 10 minutes in longitude by 10 minutes in latitude. The bigger the number, the worse the resolution. So considering your region, you need to use the highest resolution possible for the ETOPO1 dataset (i.e. the database that's fetched by getNOAA.bathy()):
getNOAA.bathy(lon1=-80, lon2=-79.833333, lat1=32.7, lat2=32.833333, res=1)
That's definitely not hi-res (you get a grid of 80 values: 8 latitude values by 10 longitude values), but that's the maximum you can get with getNOAA.bathy().
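As a quick sanity check on that grid size, here is the arithmetic (a plain Python sketch; one grid cell per arc-minute at resolution 1):
# Spans of the requested region, converted from degrees to arc-minutes
lon_span = abs(-79.833333 - (-80)) * 60  # ~10 arc-minutes of longitude
lat_span = abs(32.833333 - 32.7) * 60    # ~8 arc-minutes of latitude
print(round(lon_span), round(lat_span))  # -> 10 8, i.e. about 80 grid values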
If you need a higher resolution, you should try other sources such as the GEBCO database and the readGEBCO() function of marmap. You should also have a look at sections 2 and 3 of the marmap-ImportExport vignette where other sources are listed.
vignette("marmap-ImportExport")