R rgl/lidR slow rendering on Windows 11 64-bit

I am trying to manually identify/correct trees using LiDAR data (a 1.7 GB object) and a tree tops object via the locate_trees function. Part of the problem is:
rgl is rendering very slowly even though the 4 GB Nvidia 3050 should be able to handle it.
The tree tops (red 3D dots) are not even showing in the rgl window. When I close the rgl window, the tree tops start popping up (red dots appear and disappear, resulting in a blank white window) in a new rgl window. And if I close that window, a new tree top window opens up, so I stop the process to prevent this from happening.
Does rgl automatically use the GPU, or does it default to the integrated graphics on the motherboard? Is there a way to speed up the rendering?
My other system specs are a Core i9 (14 threads) and 64 GB RAM. I am using R 4.2.1.
Code:
library(lidR)
# Import LiDAR data
LiDAR_File = readLAS("path/file_name.las")
# Find tree tops
TTops = find_trees(LiDAR_File, lmf(ws = 15, hmin = 5))
# Manually correct tree identification
TTops_Manual = locate_trees(LiDAR_File, manual(TTops)) # This is where rgl rendering becomes too slow if there are too many points involved.

There were two problems here. First, the lidR::manual() function, which is used to select trees, has a loop that draws one sphere for each tree. By default rgl redraws the whole scene after each change; this should be suppressed. The patch in https://github.com/r-lidar/lidR/pull/611 fixes this. You can install a version with this fix with
remotes::install_github("r-lidar/lidR")
Second, rgl was somewhat inefficient in drawing the initial point cloud, duplicating the data unnecessarily. When you have tens of millions of points, this can exhaust all of R's memory, and things slow to a crawl. The development version of rgl fixes this. It's available via
remotes::install_github("dmurdoch/rgl")
The LiDAR images are very big, so you might find you still have problems even with these changes. Getting more regular RAM will help R: you may need this if the time to the first display is too long. After the first display, almost all the work is done in the graphics system; if things are still too slow, you may need a faster graphics card (or more memory for it).

rgl has trouble displaying too many points. The plot function in lidR is convenient and lets you produce ready-to-publish illustrations, but it cannot replace a real point cloud viewer for big point clouds. I don't have a GPU on my computer, and I don't know if or how rgl can take advantage of one.
In the documentation of the lidR function you are talking about you can see:
This is only suitable for small-sized plots
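If you only need a responsive preview, one option (not mentioned in the answers above, but a standard lidR workflow; the target density of 1 point/m² is an arbitrary choice for illustration) is to thin the cloud before plotting:
library(lidR)
# Thin the point cloud before handing it to rgl, so there are far fewer points to render.
LiDAR_File <- readLAS("path/file_name.las")
LiDAR_Thin <- decimate_points(LiDAR_File, random(density = 1))
plot(LiDAR_Thin)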

Related

Cannot use plot() function in RStudio for large objects

I am trying to use the default plot() function in R to plot a shapefile that is about 100 MB, using RStudio. When I try to plot the shapefile, the command doesn't finish executing for around 5 minutes, and when it finally does, the plotting window remains blank. When I execute the same process in VS Code, the plot appears almost instantly, as expected.
I have tried uninstalling and reinstalling RStudio with no success.
I can't speak for what VS Code does, but I can guarantee that plotting 100 MB worth of data points is useless (unless the final plot is going to be maybe 6 by 10 meters in size).
First thing: can you load the source file into R at all? One would hope so since that's not a grossly huge data blob. Then use your choice of reduction algorithms to get a reasonable number of points to plot, e.g. 800 by 1600, which is all a monitor can display anyway.
Next try plotting a small subset to verify the data are in a valid form, etc.
Then consider reducing the data by collapsing maybe each 10x10 region to a single average value, or by using ggplot2::geom_hex (as in the sketch below).
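A minimal sketch of the hex-binning idea, assuming the coordinates end up in a data frame pts with columns x and y (hypothetical names; geom_hex also needs the hexbin package installed):
library(ggplot2)
# Hex-binning collapses millions of points into a few thousand bins,
# which renders almost instantly and is all a screen can resolve anyway.
ggplot(pts, aes(x = x, y = y)) +
  geom_hex(bins = 100)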

Plotting graphs using R in Jupyter is slow

I am plotting heavy graphs in Jupyter using R. It is extremely slow; I expect it is first exporting the plot to EPS and then converting it to a PNG.
If you try to plot on a native R setup (R for Windows, for example), the plotting is nearly instantaneous.
Is there a way to get R in Jupyter to plot more quickly?
I came here looking for a solution to a potentially related issue: the browser window became relatively unresponsive, with lots of lag, when drawing plots with many data points, likely because everything was being rendered as vector graphics.
The fix below also sped up the initial drawing of graphs by an appreciable amount. The solution is to change the Jupyter plot output type to PNG using the command:
options(jupyter.plot_mimetypes = 'image/png')
Now when I plot graphs with tens of thousands of points, the window remains crisply responsive. The downside is that the plots are now bitmaps, but you can always remove the option if you want vector graphics back.

How to save a pdf in R with a lot of points

So I have to save a PDF plot with a lot of points in it. That is not a problem. The problem is that when I open it, it takes forever to draw all those points. How can I save this PDF in such a way that it doesn't have to draw point by point when someone opens it? I'm OK if the quality of the picture goes down a bit.
Here's a sample. I don't think this will crash your computer, but be careful with the length parameter if you have an old machine. I am using many more points than that in my real problem, by the way.
pdf("lots of points.pdf")
x <- seq(0, 100, length.out = 100000)
y <- 0.00001 * x
plot(x, y)
dev.off()
I had a similar problem and there is a sound solution. The drawback is that this solution is not generic and does not involve programming (always bad).
For draft purposes, PNG or any other bitmap format may be sufficient, but for presentation purposes this is often not the case. So the way to go is to combine vector graphics for fonts, axes, etc. with a bitmap for your zillions of points:
1) Save as PDF (huge and nasty).
2) Load it into Illustrator or a similar tool (it must support layers).
3) Separate the points from everything else by dragging the other elements to a new layer; save this as A.
4) Delete the other elements and export the points alone as a bitmap (PNG, JPG); save this as B.
5) Load B into A, scale and move B to exact overlap, delete the vector points layer, and export as a slender PDF.
Done. It takes you about 30 minutes.
As said, this has nothing to do with programming, but there is simply no way around the fact that in vector graphics each and every point (even those that are not visible because they are covered by others) is a separate element, and it's a pain handling PDFs with thousands of elements.
So there is a need for post-processing. I know ImageMagick can do a lot, but AFAIK the above can't be done by an algorithm.
The only programming way to (partly) solve this is to eliminate the points that will not display because they are covered by others. But that's beyond me.
Only go this way if you really and desperately need extreme scalability; otherwise go with @Ben and @inform and use a bitmap in whatever container you need (PNG, PDF, BMP, JPG, TIF, even EPS). A programmatic approximation of the same idea is sketched below.
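That said, steps 2 to 5 can be approximated in R itself: render only the points to a bitmap, then place that bitmap inside a PDF whose axes and labels stay vector. A minimal sketch using the sample data from the question and the png package (the resolution and file names are arbitrary choices):
library(png)

x <- seq(0, 100, length.out = 100000)
y <- 0.00001 * x

# 1) Render just the points (no axes, no margins) to a temporary PNG.
tmp <- tempfile(fileext = ".png")
png(tmp, width = 2000, height = 2000, res = 300)
par(mar = c(0, 0, 0, 0))
plot(x, y, axes = FALSE, xlab = "", ylab = "", xaxs = "i", yaxs = "i")
dev.off()

# 2) Draw vector axes in the PDF and paste the bitmap into the plot region.
pdf("lots_of_points_raster.pdf")
plot(range(x), range(y), type = "n", xlab = "x", ylab = "y",
     xaxs = "i", yaxs = "i")
usr <- par("usr")
rasterImage(readPNG(tmp), usr[1], usr[3], usr[2], usr[4])
dev.off()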

Plot two large Raster Data Sets in a Scatter Plot

I have a problem with plotting two raster data sets in R.
I use two different IRS LISS III scenes (with the same extent), and what I want is to plot the pixel values of both scenes in one scatterplot (x = layer 1 and y = layer 2).
My problem is the handling of this big amount of data. Each scene has about 80,000,000 pixels; through reclassification and other processing I was able to scale the values down to about 12,000,000 per raster. But when I try to import these values, e.g. into a data.frame, or load them from an ASCII file, I always run into memory problems.
Is it possible to plot such an amount of data? If yes, it would be great if someone could help me; I have been trying for two days now and I'm getting desperate.
Many thanks,
Stefan
Use the raster package; there's a good chance it will work out of the box since it has good "out-of-memory" handling. If it doesn't work with the ASCII grids, convert them to something more efficient (like an LZW-compressed and tiled GeoTIFF) with GDAL. And if they are still too big, resize them; that's all the graphics rendering process will do anyway. (You don't say how you resized originally, or give any details on how you are trying to read them.) A sketch of sampling the rasters instead of loading every pixel follows.
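A minimal sketch of that sampling approach with the raster package (the file names are placeholders and 100,000 sampled pixels is an arbitrary figure):
library(raster)

r1 <- raster("scene1.tif")
r2 <- raster("scene2.tif")

# Pull a regular sample of pixel values from each layer instead of loading
# all ~80 million values into memory. Because both scenes share the same
# extent and resolution, the regular samples line up cell by cell.
s1 <- sampleRegular(r1, size = 100000)
s2 <- sampleRegular(r2, size = 100000)

plot(s1, s2, pch = ".", xlab = "Layer 1", ylab = "Layer 2")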

Export igraph Issues in R

I have generated a few network graphs in igraph. I like to use tkplot; however, after resizing the window manually and changing the layout, my machine freezes up when I try to export or even take a screenshot of the graph.
Any ideas? My machine runs Windows XP Pro and has 2 GB of memory.
Many thanks in advance.
I can't reproduce your error on Windows 7. What you can try is to set the canvas size yourself in the tkplot command, and to use the function tkplot.export.postscript to call up the export dialog.
This can be done by:
g <- graph.ring(10)
x <- tkplot(g, canvas.width = 800, canvas.height = 800)
tkplot.export.postscript(x)
Worst-case scenario, you could extract the coordinates using tkplot.getcoords(x) and use the base plot functions to reconstruct the graph. This is a whole lot trickier of course, as you have to keep track of which vertices are connected and which are not.
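A minimal sketch of that fallback, assuming you just want to dump the hand-arranged layout to a normal graphics device (the output file name is arbitrary):
library(igraph)

g <- graph.ring(10)
x <- tkplot(g, canvas.width = 800, canvas.height = 800)

# ... arrange the vertices by hand in the tk window, then:
coords <- tkplot.getcoords(x)
tkplot.close(x)

# The tk canvas y axis runs downward; flip it if the plot looks mirrored.
coords[, 2] <- -coords[, 2]

pdf("network.pdf")
plot(g, layout = coords)   # same layout, ordinary graphics device
dev.off()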
