Is there an issue with raster::writeRaster writing raster stacks in R 4.2?

I am using some older code to write a raster stack with bylayer = T. I haven't bothered to migrate it to terra yet, so I am still using raster. This used to work fine:
raster::writeRaster(stack(rastList3), names(rastList3), bylayer = T, overwrite = TRUE, format = "GTiff")
Now it throws this hard-to-decipher error:
"Error in if (tolower(e) %in% c(".tiff", ".tif")) { :
the condition has length > 1"
Replies to a similar error message suggest it has something to do with R 4.2, but I am not sure that is what is happening here. I can get it to write one layer at a time using
dsn <- here("Clipped_ENVData/Mask2022//")
nameT = paste(dsn, names(rastList3), ".tiff", sep = "")
writeRaster(rastList3[[3]], nameT[[3]], overwrite = TRUE)
but it won't write bylayer from the stack of 10 rasters :(
Does anyone know whether this is something in writeRaster that needs a workaround, or whether something is broken in my code?

That is a bug. It goes away if you update the "raster" package to version 3.6-5. That is currently the development version. You can install that version with
install.packages('raster', repos='https://rspatial.r-universe.dev')
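You can confirm which version is installed (and that the update took effect) with:
packageVersion("raster")  # should report 3.6-5 or later after the update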

OH-KAY. I also found the answer to a similar post, which helped me work out my solution:
lapply(rastList3, function(x) writeRaster(x, filename=paste0(dsn,names(x)), format="GTiff", overwrite = TRUE))
While updating to a development version might fix the bug, I was hesitant because it could possibly create more headaches in other places.
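For reference, once the fixed version is installed, the original bylayer pattern should work again. A minimal sketch, assuming rastList3 is a named list of RasterLayer objects and dsn is the output folder:
library(raster)
s <- stack(rastList3)
# write one GeoTIFF per layer, using the layer names as the file names
writeRaster(s, filename = file.path(dsn, names(s)),
            bylayer = TRUE, overwrite = TRUE, format = "GTiff")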

Related

Command describe unrecognized even though the package psych is loaded

I'm using RStudio 2022.22.1 on macOS Monterey 12.3.1.
I load libraries at the beginning by doing:
knitr::opts_chunk$set(echo = TRUE)
library("tidyverse", "here", "magrittr")
library("pastecs", "psych")
## dlf<-read.delim("data/DownloadFestival(No Outlier).dat", header=TRUE)
dlf<-here::here("data/DownloadFestival(No Outlier).dat") %>% readr::read_delim(col_names = TRUE)
I also checked the tick for the "psych" library in the Packages section of RStudio.
The issue is that, from a certain point (after knitting), I wasn't able to use the describe command; this is the error:
could not find function "describe"
I can bypass this by typing, each time I use the function:
psych::describe
instead of describe alone.
How can I use describe without specifying the psych:: prefix each time?
Your problem is that library("pastecs", "psych") isn't doing what you think: the second argument of library() is help, not another package, so only the first package is actually attached. Weirdly enough, there isn't an obvious idiom for "load a bunch of packages at once": I wish there were an easier way to do this, but try
invisible(lapply(c("psych", "pastecs"), library, character.only = TRUE))
The answers to this question provide a bunch of different ways to load many packages at once (the accepted answer is the same as the one given here).
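For example, one approach from those answers uses the pacman package (assuming it is installed), which attaches each package and installs any that are missing:
# install.packages("pacman")  # once, if not already installed
pacman::p_load(psych, pastecs)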

R curl::has_internet() FALSE even though there is an internet connection

My problem arose when downloading data from EuroSTAT using the R package eurostat:
# Population data by NUTS3
pop_data <- subset(eurostat::get_eurostat("demo_r_pjangrp3", time_format = "num"),
                   (age == "TOTAL") & (sex == "T") &
                   (nchar(trimws(geo)) == 5))[, c("time", "geo", "values")]
# Error in eurostat::get_eurostat("demo_r_pjangrp3", time_format = "num") :
#   You have no internet connection, please reconnect!
Searching, I found out that it is this statement (in the eurostat package code) that causes the problem:
if (!curl::has_internet()) { stop("You have no internet connection, please reconnect!") }
However, I have an internet connection and can, e.g., ping www.eurostat.eu.
I have tried curl::has_internet() on different computers, all with internet connections. On some it works (returns TRUE); on others it doesn't.
I have talked with our IT department, and we checked whether it could be a firewall problem. Removing the firewall did not solve the problem.
Unfortunately, I am ignorant about network settings, so I am lost when trying to read the documentation for the curl package.
Downloading data from EuroSTAT using the command above has worked for at least the last two years; for me the problem arose at the start of 2020 (January 7).
I hope someone can help with this, as downloading population data from EuroSTAT is a mandatory part of much of my/our regular work.
In the special case of curl::has_internet, you don't need to modify the function to return a specific value. It has its own enclosing environment, from which it reads a state variable indicating whether a proxy connection exists. You can modify that state variable instead.
assign("has_internet_via_proxy", TRUE, environment(curl::has_internet))
curl::has_internet() # will always be TRUE
# [1] TRUE
It's difficult to tell without knowing your settings, but there are a couple of things to try. This issue has been noted and possibly addressed in a development version, which you can install with
install.packages("https://github.com/jeroen/curl/archive/master.tar.gz", repos = NULL)
You could also try updating libcurl, which is the C library for which the R package acts as an R interface. The problem you describe seems to be more common with older versions of libcurl.
If all else fails, you could overwrite the curl::has_internet function like this:
remove_has_internet <- function() {
  # temporarily unlock the binding in the curl namespace so it can be overwritten
  unlockBinding(sym = "has_internet", asNamespace("curl"))
  assign("has_internet", function() return(TRUE), envir = asNamespace("curl"))
  lockBinding(sym = "has_internet", asNamespace("curl"))
}
Now if you run remove_has_internet(), any call to curl::has_internet() will return TRUE for the remainder of your R session. However, this will only work if other curl functionality is working properly with your network settings. If it isn't then you will get other strange errors and should abandon this approach.
If, for any reason, you want to restore the functionality of the original curl::has_internet without restarting an R session, you can do this:
restore_has_internet <- function() {
  unlockBinding(sym = "has_internet", asNamespace("curl"))
  # reinstate the original behaviour: a DNS lookup against r-project.org
  # (curl::nslookup is referenced explicitly so this works even if curl is not attached)
  assign("has_internet",
         function() { !is.null(curl::nslookup("r-project.org", error = FALSE)) },
         envir = asNamespace("curl"))
  lockBinding(sym = "has_internet", asNamespace("curl"))
}
I just ran into this problem, so here's an additional solution blending both previous answers. It's reversible and checks whether we actually have internet, to avoid bigger problems later.
# old value
op = get("has_internet_via_proxy", environment(curl::has_internet))
# check for internet
np = !is.null(curl::nslookup("r-project.org", error = FALSE))
assign("has_internet_via_proxy", np, environment(curl::has_internet))
Within a function, this line can be added to automatically revert the process:
on.exit(assign("has_internet_via_proxy", op, environment(curl::has_internet)))
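Putting the pieces together, a sketch of a hypothetical wrapper (get_pop_data is an illustrative name, not part of eurostat):
get_pop_data <- function() {
  # remember the current state and restore it when the function exits
  op <- get("has_internet_via_proxy", environment(curl::has_internet))
  on.exit(assign("has_internet_via_proxy", op, environment(curl::has_internet)))
  # set the state variable based on an actual DNS lookup
  np <- !is.null(curl::nslookup("r-project.org", error = FALSE))
  assign("has_internet_via_proxy", np, environment(curl::has_internet))
  eurostat::get_eurostat("demo_r_pjangrp3", time_format = "num")
}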

Installing pdftotext on Windows (for use with R, 'tm' package)

I am having trouble using R, 'tm' package, to read in .pdf files.
Specifically, I try to run the following code:
library(tm)
filename = "myfile.pdf"
tmp1 <- readPDF(PdftotextOptions="-layout")
doc <- tmp1(elem=list(uri=filename),language="en",id="id1")
doc[1:15]
...which gives me the error:
Error in readPDF(PdftotextOptions = "-layout") :
unused argument (PdftotextOptions = "-layout")
I assume this is because the pdftotext program (part of xpdf, http://www.foolabs.com/xpdf/download.html) has not been installed correctly on my machine, so R cannot access it.
What are the steps to install xpdf/pdftotext correctly such that the above R code can be executed? (I am aware of similar questions already posted; however, they don't address the same issue.)
PdftotextOptions is not a parameter of readPDF. readPDF has a control parameter, which expects a list, so the correct usage would be:
if (all(file.exists(Sys.which(c("pdfinfo", "pdftotext"))))) {
  tmp1 <- readPDF(control = list(text = "-layout"))
  doc <- tmp1(elem = list(uri = filename), language = "en", id = "id1")
}
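Note that in more recent versions of tm (0.6 and later, if I remember correctly) the document text is accessed with content() rather than by indexing the document directly:
content(doc)[1:15]  # replaces doc[1:15] from the question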
Set the working directory to the folder containing the xpdf binaries:
setwd('C:/xpdf/bin64')
It works for me.
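Alternatively, instead of changing the working directory, you can prepend the xpdf bin folder to the PATH for the current R session (the path below assumes xpdf was unpacked to C:/xpdf):
# make pdftotext findable without changing the working directory
Sys.setenv(PATH = paste("C:/xpdf/bin64", Sys.getenv("PATH"),
                        sep = .Platform$path.sep))
Sys.which("pdftotext")  # should now return the full path to the executable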

R: V3.1.1, Platform: x86_64-w64-mingw32/x64 (64-bit), Package: choroplethrMaps

This is my first question to the community. I've read through the guidelines and am doing my best to ask an appropriate question and include a minimal, complete, and verifiable example. That being said, please feel free to suggest ways in which I can ask better questions going forward.
I am having trouble with the choroplethrMaps package, which I have never used in the past. I have had issues installing packages on my work computer before, but have gotten around this issue by pasting the package and its dependencies in my library directory. Part of my issues may stem from that, but I'm not sure.
Here is the code that replicates the issue on my machine.
library(choroplethrMaps)
library(choroplethrAdmin1)
library(choroplethr)
data(state.map)
df<-data.frame(region=unique(x = state.map$region),value=rnorm(n = 51,mean = 500,sd = 45))
debug(state_choropleth)
state_choropleth(df = df,title = "", legend = "", num_colors = 1)
After debugging the state_choropleth function, it looks like the error occurs in the "render" portion of the code. When I execute the above code, I get the following error message:
Error in withCallingHandlers(tryCatch(evalq((function (..., call. = TRUE, :
object '.rcpp_warning_recorder' not found
Note that I am only using state_choropleth because, when running the choroplethr function, I was advised to use state_choropleth instead. It seems as though choroplethr is out of date.

Error in ls(envir = envir, all.names = private)?

The error below keeps coming up inconsistently when I try to read Excel files into R using the 'XLConnect' package.
Error in ls(envir = envir, all.names = private) :
invalid 'envir' argument
I have actually run into this error even while using other packages that read Excel files, such as 'xlsx' and 'xlsReadWrite'. Restarting the R session often solves the problem, which leads me to think that something else I am doing in my session is changing the environment and no longer allowing me to load Excel files. Below is the latest example of code that causes this error. In this case I know that the following coding sequence triggers the error, but why is that happening? And how can I get past it if I need the chron package?
library("XLConnect")
wb2 <- loadWorkbook("excel_file", create = FALSE)
library(chron)
wb2 <- loadWorkbook("excel_file", create = FALSE)
Anyone else run into this issue before? Any help on this issue is greatly appreciated!
Before reopening the workbook, try removing the reference to the previously opened one:
rm(wb2)
wb2 <- loadWorkbook("excel_file", create = FALSE)
Also, make sure that "excel_file" is not open in Excel or any other program while you run the R test.
I've seen the same error come up when using XLConnect and the above seemed to help.
I had this problem a couple of times, and from the call stack it looks like this message is generated when an "OutOfMemory" exception is thrown.
To solve this problem I used:
options( java.parameters = "-Xmx4g" )
to increase the heap size rJava is able to use.
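Note that java.parameters must be set before the JVM is started, i.e. before XLConnect (or any other rJava-based package) is loaded:
# set the heap size first, then load the package that starts the JVM
options(java.parameters = "-Xmx4g")
library(XLConnect)
wb2 <- loadWorkbook("excel_file", create = FALSE)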
Debugging with options(error=utils::recover) helped a lot, because the R error messages are not very specific.
