R marmap getNOAA.bathy Error in if (ncol(x) == 3 & !exists("bathy", inherits = FALSE)) { : argument is of length zero

I've been trying to get bathymetry lines from marmap and recently got the following error message when the getNOAA.bathy function is called:
Querying NOAA database ...This may take seconds to minutes, depending on grid size
Error in if (ncol(x) == 3 & !exists("bathy", inherits = FALSE)) { : argument is of length zero
This happens even with something as simple as
map <- getNOAA.bathy(lon1=10,lon2=19,lat1=67,lat2=71,resolution=10, keep=TRUE)
I updated the package to the latest version (1.0.4) as I read that there could be issues related to server access. I've also tried running the above script in R rather than RStudio, but the error persists...
The function works fine if I use previously downloaded data, but now I need to use another set of coordinates for a new map.
Any help is much appreciated!

You need to (re)install both the rgdal and raster packages. This is already documented on the GitHub Issues page of the marmap package.

For anyone else looking, the reported error can also result from the server being down - R doesn't give any indication of this. You can check https://www.ncei.noaa.gov/alerts for scheduled outages.
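If you want to verify that from R before digging further, a quick connectivity check is possible (a rough sketch assuming the httr package; it tests the NOAA status page, not the exact grid endpoint marmap queries):
# Check that we can reach NOAA at all (assumes 'httr' is installed)
library(httr)
resp <- GET("https://www.ncei.noaa.gov")
http_error(resp)  # TRUE means the server replied with an error status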

For several months now, the same error message seems to have plagued Windows users, even with all packages up to date.
marmap v1.0.9 is now available on GitHub:
remotes::install_github("ericpante/marmap")
This version should solve this infamous error message:
Error in if (ncol(x) == 3 & !exists("bathy", inherits = FALSE)) { :
argument is of length zero
I have been able to confirm that the error was due to a limitation in the length of URLs that the raster package can handle on Windows. The GeoTIFF files from NOAA's servers are now downloaded with utils::download.file() into a temporary file on the user's disk and then imported into R using raster::raster().
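For illustration, a minimal sketch of that download strategy (the URL below is a placeholder, not the real NOAA endpoint):
# Download the GeoTIFF to a temporary file first, then read it locally,
# sidestepping the long-URL limit that raster() hits on Windows.
url <- "https://example.noaa.gov/some/long/query.tiff"  # hypothetical URL
tmp <- tempfile(fileext = ".tiff")
utils::download.file(url, destfile = tmp, mode = "wb")  # mode = "wb" is required on Windows for binary files
bathy_raster <- raster::raster(tmp)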
marmap v1.0.9 will be available on CRAN servers in the next few days.

Related

pathtrackR - S4 vector needs to be a double

I am using R for the first time as part of my dissertation and have been given code from my supervisor who got it from someone else (code attached below with comments from my supervisor). When I try to run it, I get this error and I am unsure how to resolve it.
Error at line 50:
ww<-manualPath("C:/Users/Erin/Document/4th year university/Honours Project/Data files/AOTEA/Jan/508_C5_JAN25_0922",1000,1000,fps=4)
Click once on the top left corner of your arena, followed by clicking once on the bottom right corner of your arena, to define the opposing corners of the entire arena...
Error in as.double(y) :
cannot coerce type 'S4' to vector of type 'double'
My supervisor thinks the issue is that the pathtrackR package relies on other packages that have since been updated while it hasn't been. Those updated packages won't, in turn, run on the older version of R where pathtrackR works, so the two fall out of step.
We know the error occurs because an S4 object is being coerced to a vector of type double, but I don't have enough coding experience to examine the pathtrackR package myself to determine why this happens.
I was wondering if there is a workaround without reworking the code, or if anyone has any other help or advice on what I could try.
Any help is much appreciated.
######### REQUIRES AN OLDER VERSION OF R TO EFFECTIVELY RUN. I AM USING 3.6 FOR THIS
install.packages("devtools")
library(devtools)
find_rtools() #should return TRUE if an appropriate level of package is present (although F doesn't seem to affect functionality of package?)
if (!requireNamespace("BiocManager", quietly = TRUE))
install.packages("BiocManager")
BiocManager::install("EBImage")
library(EBImage)
#ONLY NEED TO REVISIT GITHUB IF IT HASN'T BEEN UPDATED IN A WHILE
install_github("aharmer/pathtrackr", build_vignettes = TRUE)
install.packages("digest", dependencies=TRUE) #seems to resolve issues about not instaling newest versions?
library(pathtrackr)
#####BIG ISSUE SEEMS TO BE ANY PROBLEM REQUIRES CLOSING R AND REOPENING......
#######################################################################
#######################################################################
#MANUAL TRACKING
#seems to be resolved on mac following additional install of XQuartz & Xcode (had to find an old version from
#eg. https://stackoverflow.com/questions/7047735/where-can-i-download-old-versions-of-xcode Need version 9.2 for
#macOS Sierra 10.12.6)
#see also here https://www.r-bloggers.com/installing-r-on-os-x/
library(devtools)
find_rtools() #should return TRUE if an appropriate level of package is present
library(pathtrackr)
################### NB OPENING MORE THAN 2 VIDEOS WITHOUT CLOSING & RESTARTING R LEADS TO
################### SIGNIFICANT SLOWING & EVENTUAL CRASHING OF R
################### PROB A WORKING MEMORY ISSUE?
################### EASIEST 'FIX' IS TO CLOSE & REOPEN EACH TIME
rm(list=ls())
library(devtools)
#library(sp)
find_rtools() #should return TRUE if an appropriate level of package is present (DOESN'T SEEM TO MATTER WHETHER TRUE OR FALSE???)
library(pathtrackr)
####################################################################################################################################
#ww<-manualPath("/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020mp4vids/FirstTimes/710_C2_MAR6_1028",1000,1000,fps=4)
ww<-manualPath("~/Desktop/2020_Data/SLISCALED/806_C2_JAN09_1023",1000,1000,fps=4)
pdf(file='/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/pdf_images/806_Jan09.pdf')
plotPath(ww) #saving image
dev.off()
plotSummary(ww)
pathSummary(ww)
xx<-as.data.frame(ww$movement)
write.csv(xx, "/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/MovementData/sli806_Jan09.csv",row.names=F)
zz<-as.data.frame(ww$position)
write.csv(zz, "/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/PositionData/sli806_Jan09.csv",row.names=F)
####################################################################################################################################
#Second run of same commands. Allows you to realise you forgot to write the csv files without having lost all the work as all files are stored under new names
#pp<-manualPath("/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020mp4vids/FirstTimes/712_A4_MAR6_1045",1000,1000,fps=4)
pp<-manualPath("~/Desktop/2020_Data/SLISCALED/807_C7_JAN10_0910",1000,1000,fps=4)
pdf(file='/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/pdf_images/807_Jan10.pdf')
plotPath(pp)
dev.off()
plotSummary(pp)
pathSummary(pp)
ll<-as.data.frame(pp$movement)
write.csv(ll, "/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/MovementData/sli807_Jan10.csv",row.names=F)
kk<-as.data.frame(pp$position)
write.csv(kk, "/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/PositionData/sli807_Jan10.csv",row.names=F)
####################################################################################################################################
#Third run of same commands. Ibid.
#yy<-manualPath("/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020mp4vids/FirstTimes/713_B4_MAR6_1052",1000,1000,fps=4)
yy<-manualPath("~/Desktop/2020_Data/SLISCALED/807_D5_JAN09_0846",1000,1000,fps=4)
pdf(file='/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/pdf_images/807_Jan09.pdf')
plotPath(yy)
dev.off()
uu<-as.data.frame(yy$movement)
write.csv(uu, "/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/MovementData/sli807_Jan09.csv",row.names=F)
oo<-as.data.frame(yy$position)
write.csv(oo, "/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/PositionData/sli807_Jan09.csv",row.names=F)
####################################################################################################################################
#Fourth run of same commands. Ibid.
#ee<-manualPath("/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020mp4vids/FirstTimes/709_D2_MAR6_1020",1000,1000,fps=4)
ee<-manualPath("~/Desktop/2020_Data/SLISCALED/801_B8_JAN09_0947",1000,1000,fps=4)
pdf(file='/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/pdf_images/801_Jan09.pdf')
plotPath(ee)
dev.off()
rr<-as.data.frame(ee$movement)
write.csv(rr, "/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/MovementData/sli801_jan09.csv",row.names=F)
dd<-as.data.frame(ee$position)
write.csv(dd, "/Volumes/LaCie/MarieCurieFellowship/2020Fieldwork/2020_Data/SLI/PositionData/sli801_jan09.csv",row.names=F)
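One thing worth trying to pinpoint where the coercion fails, without reworking the package, is base R's debugging tools (a general sketch, not pathtrackr-specific):
# Reproduce the error, then print the call stack at the moment it was raised
ww <- manualPath("path/to/video/folder", 1000, 1000, fps = 4)
traceback()
# Or drop into the failing frame interactively:
options(error = recover)  # on the next error, pick a frame to inspect
ww <- manualPath("path/to/video/folder", 1000, 1000, fps = 4)
options(error = NULL)     # restore default error handling afterwards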

STRINGdb R environment; error in plot_network

I'm trying to use STRINGdb in R and I'm getting the following error when I try to plot the network:
Error in if (grepl("The document has moved", res)) { : argument is of length zero
Code:
library(STRINGdb)
#(specify organism)
string_db <- STRINGdb$new( version="10", species=9606, score_threshold=0)
filt_mapped = string_db$map(filt, "GeneID", removeUnmappedRows = TRUE)
head(filt_mapped)
(I have columns titled GeneID, logFC, FDR, and STRING_id, with 156 rows)
filt_mapped_hits = filt_mapped$STRING_id
head(filt_mapped_hits)
(156 observations)
string_db$plot_network(filt_mapped_hits, add_link = FALSE)
Error in if (grepl("The document has moved", res)) { : argument is of length zero
You are using a version of Bioconductor, and by extension the STRINGdb package, that is a few years old.
If you update to the newest one, it will work. However, the updated package only supports the latest version of STRING (currently version 11), so the underlying network may change a bit.
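In code, the recommended update route looks roughly like this (a sketch; the version string is whatever STRING currently supports):
# Update STRINGdb through Bioconductor, then request STRING v11
if (!requireNamespace("BiocManager", quietly = TRUE))
  install.packages("BiocManager")
BiocManager::install("STRINGdb")
library(STRINGdb)
string_db <- STRINGdb$new(version = "11", species = 9606, score_threshold = 0)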
The more detailed reason is this:
STRING's hardware infrastructure recently underwent major changes, which forced a different server setup.
All the old calls are now forwarded to a different URL; however, the cURL call, as it was implemented, does not follow our redirects, which breaks the STRINGdb package functionality.
We cannot update the old Bioconductor package, and our server setup can't really be changed.
That said, the fix for an old version is relatively simple.
In the STRINGdb library there is a script with all the methods, "rstring.r".
In there you'll find the "get_png" method. In it, replace this line:
urlStr = paste("http://string-db.org/version_", version, "/api/image/network", sep="" )
with this line:
urlStr = paste("http://version", version, ".string-db.org/api/image/network", sep="" )
Load the library again and it should create the PNG, as before.

How do I deal with an error message while installing a package?

I am brand new to this so please forgive my inexperience...I'm trying to learn.
I'm attempting to install an R package called "DoubletFinder" using the code specified on the GitHub site.
When I do this, I get this error immediately:
Error in rbind(info, getNamespaceInfo(env, "S3methods")) : number of columns of matrices must match (see arg 2)
Being new to R, I'm not sure what this error means. When I google it, something similar comes up where the individual removed and re-installed ALL of their libraries... that seems crazy. Does anyone have advice on what this could be, how to fix it, or why the package won't install?
Your problem seems fairly similar to this one. It might be that the dependencies (the packages Doublet Finder relies on) are outdated. You can try following these steps to uninstall and reinstall all packages, in the hope that updating them removes any version mismatch.
This code is copied from the website above:
ip <- as.data.frame(installed.packages(lib.loc = .libPaths()[1]),
                    stringsAsFactors = FALSE)
head(ip)
str(ip)
path.lib <- unique(ip$LibPath)
# create a vector with all the names of the packages you want to remove
pkgs.to.remove <- ip[,1]
head(pkgs.to.remove)
str(pkgs.to.remove)
sapply(pkgs.to.remove, remove.packages, lib = path.lib)
sapply(pkgs.to.remove, install.packages, lib = path.lib)
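One caveat worth adding (my own tweak, not from the website above): base and recommended packages ship with R and should not be removed, so it may be safer to filter them out first:
# Keep only user-installed packages; Priority is "base"/"recommended" for core packages
ip <- as.data.frame(installed.packages(lib.loc = .libPaths()[1]), stringsAsFactors = FALSE)
pkgs.to.remove <- ip$Package[is.na(ip$Priority)]
sapply(pkgs.to.remove, remove.packages, lib = unique(ip$LibPath))
sapply(pkgs.to.remove, install.packages, lib = unique(ip$LibPath))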

I am trying to make an interactive map using the leafletR package, following a ZevRoss blog post, but there is an error in the code

The ZevRoss blog post is as follows:
http://zevross.com/blog/2014/04/11/using-r-to-quickly-create-an-interactive-online-map-using-the-leafletr-package/
The code with error is:
# ----- Write data to GeoJSON
leafdat<-paste(downloaddir, "/", filename, ".geojson", sep="")
writeOGR(subdat, leafdat, layer="", driver="GeoJSON")
And the error is:
Error in writeOGR(subdat, leafdat, layer = "", driver = "GeoJSON") :
GDAL Error 3: Cannot open file 'd:/Leaflet/County_2010Census_DP1.geojson'
Because I am new to R, I searched a lot for this problem and didn't find a good answer.
I am using RStudio with R version 3.1.1 (2014-07-10) on Windows 7 32-bit.
My rgdal version is 0.9-1.
The rest of the code in the blog runs successfully; this line seems to be the only sticking point.
You could create the GeoJSON using the leafletR package instead:
library('leafletR')
Your_GeoJSON <- toGeoJSON(data=YourData, dest=getwd())
I've tried to find a solution for this mysterious error for some time.
Eventually I found this post on the GDAL error-ticket site, which clarified the problem and gave a solution.
Basically, the problem is in the interface between rgdal and GDAL (GDAL changed the way it works, and the latest version of rgdal hasn't caught up yet):
writeOGR() calls ogrCheckExists("foo.geojson") to check whether the file exists before creating a new dataset.
In version 1.11, the OGR GeoJSON driver emits an error message when the file doesn't exist, whereas previous versions emitted none.
rgdal catches this error as a fatal one and never reaches the writing step. This should be fixed in rgdal.
Meanwhile you have an easy workaround: add check_exists = FALSE as a parameter to writeOGR().
Therefore the following code will work:
writeOGR(spDf,'foo.geojson','spDf', driver='GeoJSON',check_exists = FALSE)
Of course, if a GeoJSON file with the chosen name already exists at that location, writeOGR will still fail.
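A simple guard against that case, reusing the leafdat path from the question:
# Remove any leftover file from a previous run before writing
if (file.exists(leafdat)) file.remove(leafdat)
writeOGR(subdat, leafdat, layer="", driver="GeoJSON", check_exists = FALSE)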
Even though you already have a 'd:' drive on your computer and permission to write to it, try the following:
leafdat <- paste(downloaddir, "/", ".geojson", sep="")
leafdat
[1] "d:/Leaflet/.geojson"
writeOGR(subdat, leafdat, layer="", driver="GeoJSON")
You should then get a ".geojson" file in "d:/Leaflet". Rename the file from ".geojson" to "County_2010Census_DP1.geojson".

Error with importShapefile with PBSmapping package in R

I am receiving a sporadic error message with importShapefile in PBSmapping (version 2.63.37) in RStudio (0.97.318), running R version 2.15.2, platform: i386-w64-mingw32/i386 (32-bit). I also received the error while running previous versions of R and RStudio.
> ST6 = importShapefile("Data/pvi_stat_2002_utm.shp", projection="UTM", readDBF = TRUE)
Error in 1:nrow(dbf) : argument of length 0
> traceback()
2: cbind(1:nrow(dbf), dbf)
1: importShapefile("Data/pvi_stat_2002_utm.shp", projection = "UTM", readDBF = TRUE)
I only receive this error occasionally - perhaps 1 in every 10 times that I run the code. But once the error occurs in a session, it occurs repeatedly and will not successfully implement the command until I have closed R completely and reopened it. On one occasion I had to reboot the computer for it to work, as successive reopening of R did not help.
I thought it might be a memory issue but sometimes I will get the error when no objects are in the workspace. And usually the code runs fine even if I have large objects loaded. In response to the error I have removed all objects from the workspace and even followed with gc(), but to no avail.
This is the only shapefile with which I have received the error, but as it is the only one that I use regularly, and since I cannot predict when the error will occur, my efforts with other shapefiles are inconclusive. I'm not sure about uploading a shapefile to Stack Overflow; the zipped file is about 9 MB.
Have a look in the folder where your shapefile is. Is there actually a .dbf file? If there is, it sounds like it is empty, corrupted, or misnamed. Are you expecting your shapefile to have polygons with attributes? Can you try importShapefile(..., readDBF = FALSE)? Maybe you can make your data available through a Dropbox link or something?
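A quick way to check for those sidecar files from R (a small sketch using the path from the question):
# A shapefile is really a bundle; all of these should exist alongside the .shp
base <- "Data/pvi_stat_2002_utm"
file.exists(paste0(base, c(".shp", ".shx", ".dbf", ".prj")))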
Alternatively, have you tried rgdal::readOGR() or, my personal favourite, maptools::readShapePoly()? I personally find readShapePoly() to be extremely robust, and there are methods for coercing a SpatialPolygonsDataFrame from sp to a PolySet from PBS.
If you really must use PBS, have you tried...
require( maptools )
require( sp )
myshp <- readShapePoly("Data/pvi_stat_2002_utm")
myshpPBS <- SpatialPolygons2PolySet( myshp )
I am assuming that there is a .prj file with your shapefile, describing the projection information?
I'm using R-3.0.1 and PBS Mapping 2.66.53 with the NAFO Divisions shapefile from http://www.nafo.int/about/overview/gis/Divisions.zip. On Windows 7 x86_64 and OS X Snow Leopard (using MacPorts R built for x86_64), the .dbf is being read properly, but it sometimes fails on RHEL 5.9:
> library("PBSmapping", lib.loc="/home/gwhite/R/x86_64-unknown-linux-gnu-library/3.0")
-----------------------------------------------------------
PBS Mapping 2.66.53 -- Copyright (C) 2003-2013 Fisheries and Oceans Canada
[...]
-----------------------------------------------------------
> library("rgeos", lib.loc="/home/gwhite/R/x86_64-unknown-linux-gnu-library/3.0")
rgeos version: 0.2-19, (SVN revision 394)
GEOS runtime version: 3.3.8-CAPI-1.7.8
Polygon checking: TRUE
> layer='Divisions'
> divs = importShapefile(layer, projection='LL')
Error in 1:nrow(dbf) : argument of length 0
Using readDBF=F does allow the shapefile data to be read:
> divs = importShapefile(layer, projection='LL', readDBF=F)
So far, importShapefile() has been working in a freshly started R session.
