How to fix rsync's #ERROR: chdir failed error on a Mac - rsync

I am using macOS (Catalina) and I am trying to set rsync up, but I am failing at it.
I have the following configuration:
port = 3001
pid file = /path/to/pid.log
lock file = /path/to/rsync.lock
log file = /path/to/rsync.log
[repo]
path = /path/to/dir/
comment = This is a directory
read only = yes
list = yes
use chroot = false # used true, same result
use chdir = false # used true, same result
Attempting to list by running:
rsync -rdt rsync://IPADDR:RsyncPort/
or attempting to copy a file by running:
rsync -rdt rsync://IPADDR:RsyncPort/DirectoryName/File /DestinationDirectory/
Either way, it always leads to the same error:
#ERROR: chdir failed
rsync error: error starting client-server protocol (code 5) at /BuildRoot/Library/Caches/com.apple.xbs/Sources/rsync/rsync-54/rsync/main.c(1402) [receiver=2.6.9]
What am I doing wrong? How can this be fixed?

The problem was that I forgot to create the directory at /path/to/dir/. In my research I found two reasons that could be responsible for this error: either permission issues, or the file/directory does not exist. In my case it was the directory not existing, and rsync does not report a useful, descriptive error message.
I thought of deleting this question, but I guess this can be useful to someone else.
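If anyone else hits this, a quick sanity check from the shell before starting the daemon rules out both causes (a minimal sketch; /path/to/dir/ is the placeholder module path from the config above):
ls -ld /path/to/dir/
mkdir -p /path/to/dir/
The first command shows whether the module path exists and what its permissions are; the second creates it if it is missing.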

Related

R CMD check fails with ubuntu when trying to download file, but function works within R

I am writing an R package and one of its functions downloads and unzips a file from a link (it is not exported to the user, though):
download_f <- function(download_dir) {
  utils::download.file(
    url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php",
    destfile = file.path(download_dir, "fines.rar"),
    mode = 'wb',
    method = 'libcurl'
  )
  utils::unzip(
    zipfile = file.path(download_dir, "fines.rar"),
    exdir = file.path(download_dir)
  )
}
This function works fine for me when I run it within some other function to compile an example in a vignette.
However, with R CMD check in a GitHub Action, it fails consistently on Ubuntu 16.04, release and devel. It [says][1]:
Error: Error: processing vignette 'IBAMA.Rmd' failed with diagnostics:
cannot open URL 'https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php'
--- failed re-building ‘IBAMA.Rmd’
SUMMARY: processing the following file failed:
‘IBAMA.Rmd’
Error: Error: Vignette re-building failed.
Execution halted
Error: Error in proc$get_built_file() : Build process failed
Calls: <Anonymous> ... build_package -> with_envvar -> force -> <Anonymous>
Execution halted
Error: Process completed with exit code 1.
When I run devtools::check() it never finishes, staying at "creating vignettes" forever. I don't know if these problems are related, though, because there are other vignettes in the package.
I pass the R CMD checks on macOS and Windows. I've tried switching the "mode" and "method" arguments of utils::download.file, but to no avail.
Any suggestions?
[1]: https://github.com/datazoompuc/datazoom.amazonia/pull/16/checks?check_run_id=2026865974
The download fails because libcurl tries to verify the webserver's certificate, but can't.
I can reproduce this on my system:
trying URL 'https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php'
Error in utils::download.file(url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php", :
cannot open URL 'https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php'
In addition: Warning message:
In utils::download.file(url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php", :
URL 'https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php': status was 'SSL peer certificate or SSH remote key was not OK'
The server does not allow you to download over plain http but redirects to https, so the only thing to do now is to tell libcurl not to check the certificate and to accept what it is getting.
You can do this by passing the -k argument to curl:
download_f <- function(download_dir) {
  utils::download.file(
    url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php",
    destfile = file.path(download_dir, "fines.rar"),
    mode = 'wb',
    method = 'curl',
    extra = '-k'
  )
  utils::unzip(
    zipfile = file.path(download_dir, "fines.rar"),
    exdir = file.path(download_dir)
  )
}
This also produces a download progress bar; you can silence it by setting extra to '-k -s'.
This does, however, open you up to a machine-in-the-middle attack. (You may already be attacked this way; there is no way to check without verifying the current certificate with someone you know at the other side.)
So you could implement an extra check, e.g. check the sha256sum of the downloaded file and see if it matches what you expect to receive before proceeding.
library(openssl)                    # provides sha256()
myfile <- system.file("fines.rar")  # point this at the downloaded file, e.g. file.path(download_dir, "fines.rar")
hash <- sha256(file(myfile))
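From there, a minimal sketch of the check (the expected_hash value is a placeholder you would pin from a download you have verified out of band):
expected_hash <- "<known-good sha256 hex string>"  # placeholder, not a real hash
if (!identical(paste(as.character(hash), collapse = ""), expected_hash)) {
  stop("sha256 of the downloaded file does not match the expected value; not unzipping it.")
}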

Knitting: Error: pandoc document conversion failed with error 61

Problem
Our end user fails to produce HTML files and gets this error:
Error: pandoc document conversion failed with error 61
Execution halted
Troubleshooting performed
We set up the proxy in response to a previous error message.
This previous error was:
pandoc.exe: Could not fetch \\HHBRUNA01.hq.corp.eurocontrol.int\alazarov$\R\win-library\3.5\rmarkdown\rmd\h\jquery\jquery.min.js
ResponseTimeout
Error: pandoc document conversion failed with error 67
Execution halted
For this we added "self_contained: no" to Rprofile.site.
We also tried "self_contained: yes".
Current Error Message
Could not fetch http://?/UNC/server.contoso.int/username$/R/win-library/3.5/rmarkdown/rmd/h/default.html
HttpExceptionRequest Request {
host = ""
port = 80
secure = False
requestHeaders = []
path = "/"
queryString = "?/UNC/server.contoso.int/username$/R/win-library/3.5/rmarkdown/rmd/h/default.html"
method = "GET"
proxy = Just (Proxy {proxyHost = "pac.contoso.int", proxyPort = 9512})
rawBody = False
redirectCount = 10
responseTimeout = ResponseTimeoutDefault
requestVersion = HTTP/1.1
}
(InvalidDestinationHost "")
Error: pandoc document conversion failed with error 61
Execution halted
I had the same issue on Windows 10, with the user library path located on a network drive.
Could not fetch http://?/UNC/...
Error: pandoc document conversion failed with error 61
The solution was to run R as administrator, remove the package 'rmarkdown', and reinstall it.
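In an R session started with administrator rights, that amounts to something like this (a minimal sketch of the steps described above):
remove.packages("rmarkdown")   # remove the broken installation
install.packages("rmarkdown")  # reinstall from CRAN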
In addition to the answer by Malte: when you do not have administrator rights, you can simply change the library directory to one where you have full rights, C: for example. The default option is your network folder "?/UNC/server.contoso.int/username$/R/win-library/3.5/rmarkdown/rmd/h/default.html", where you do not have sufficient rights, and therefore R cannot knit the markdown file.
In RStudio, click on Tools > Install Packages... Under "Install to library" you can see the default option (in your case it should be the network path above). The second option here should be "C:/Program Files/R/R-3.6.2/library".
To change this order, i.e. to make the "C:/Program Files/R/R-3.6.2/library" folder the default, run the following code (in a new R file):
bothPaths <- .libPaths()                   # extract both paths
bothPaths <- c(bothPaths[2], bothPaths[1]) # swap the order
.libPaths(bothPaths)                       # apply the new order
After that, you might have to install the rmarkdown package again. This time, it will be installed directly into the "C:/Program Files/R/R-3.6.2/library" folder.
Now knitting should work, because R will use the package straight from a folder where you have full rights.
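Note that .libPaths() only changes the search order for the current session. If you want the local library to stay first across sessions, one option (an assumption about your setup, not part of the answer above) is to set R_LIBS_USER in your user .Renviron file and restart R:
# In your user .Renviron file (typically ~/.Renviron), add a line such as:
# R_LIBS_USER=C:/Program Files/R/R-3.6.2/library
# Then restart R and confirm that the local library is listed first:
.libPaths()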
And the issue was resolved. Someone had changed a rule on the server hosting the files without documenting/logging it....

Untar fails when it used to work

I use untar to decompress .tar.gz files, which used to work like a charm. However, on the same files (see the attached file), untar doesn't work anymore.
I use:
file = "C:/TEMP/INCA_TT.tar.gz" # The file attached
untar(tarfile = file, exdir = tempdir())
And I get the error:
tar (child): Cannot connect to C: resolve failed
gzip: stdin: unexpected end of file
/i686/tar: Child returned status 128
/i686/tar: Error is not recoverable: exiting now
Warning messages:
1: running command 'tar.exe -zxf "C:/TEMP/INCA_TT.tar.gz" -C "C:/TEMP"' had status 2
2: In untar(filename, exdir = "C:/TEMP") :
‘tar.exe -zxf "C:/TEMP/INCA_TT.tar.gz" -C "C:/TEMP"’ returned error code 2
What could be the reason for this? If I open the tar.gz file using 7zip it works.
I also faced the same error. Later I found out that I had traversed to the C drive (using ../../mnt/c) and tried to pass the full path from the root (which is a mistake for sure), so please check the following two points:
1: the path (is it an absolute path or a relative path?)
2: the directory/location from which you are executing this command should be able to traverse the path you passed and find the file at that location
This is one possible solution (it will help if your case matches the scenario above).
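If the real culprit is the external tar.exe reading the "C:" drive prefix as a remote host name, another workaround worth trying (a sketch built on the paths from the question, not something confirmed above) is to force R's internal tar implementation, which does not shell out to tar.exe:
file <- "C:/TEMP/INCA_TT.tar.gz"
untar(tarfile = file, exdir = tempdir(), tar = "internal")  # use R's built-in tar reader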

osmar package in R (OpenStreetMap)

The osmar package in R has a demo file called demo("navigator"). It is provided to illustrate package capabilities and functions. When I run the script, I hit the following line and error:
R> muc <- get_osm(muc_bbox, src)
sh: osmosis: command not found
Error in file(con, "r") : cannot open the connection
In addition: Warning message:
In file(con, "r") :
cannot open file '/var/folders/81/4k487q0969q1d8rfd1pyhyr40000gs/T//RtmpdgZSOy/file13a473cb904c': No such file or directory
The command is intended to convert an osmosis data object to an osmar object. I have properly installed osmosis for Mac OS X and updated my PATH definition in the bash shell to point to the osmosis executable.
I'm not sure what the error message means or how best to respond. Any help appreciated.
Brad
Have you restarted R? It looks like osmosis isn't on your path, although you do mention that you set that. Make sure that you can run one of the osmosis commands in Terminal:
osmosis --read-xml SloveniaGarmin.osm --tee 4 --bounding-box left=15 top=46 --write-xml SloveniaGarminSE.osm --bounding-box left=15 bottom=46 --write-xml SloveniaGarminNE.osm --bounding-box right=15 top=46 --write-xml SloveniaGarminSW.osm --bounding-box right=15 bottom=46 --write-xml SloveniaGarminNW.osm
The exact example is irrelevant, as long as it doesn't complain that osmosis was not found.
Also, make sure you have gzip in your path. I am almost certain that it is there by default, but the demo relies on it to run. Just open a Terminal and type gzip to make sure it is there.
Finally, if you need to debug this, then run this:
library(osmar)
download.file("http://osmar.r-forge.r-project.org/muenchen.osm.gz", "muenchen.osm.gz")
system("gzip -d muenchen.osm.gz")
# At this point, check the directory listed by getwd(). It should contain muenchen.osm.
src <- osmsource_osmosis(file = "muenchen.osm", osmosis = "osmosis")
muc_bbox <- center_bbox(11.575278, 48.137222, 3000, 3000)
debug(osmar:::get_osm_data.osmosis)
get_osm(muc_bbox, src)
# Press Enter till you get to
# request <- osm_request(source, what, destination)
# Then type request to get the command it is sending.
After you press Enter once and then type request, you will get the string it is sending to your OS. It should be something like:
osmosis --read-xml enableDateParsing=no file=muenchen.osm --bounding-box top=48.1507120588903 left=11.5551240885889 bottom=48.1237319411097 right=11.5954319114111 --write-xml file=<your path>
Try pasting this into your Terminal. It should work from any directory.
Oh, and type undebug(osmar:::get_osm_data.osmosis) to stop debugging. Type Q to exit the debugger.
Hey, I just got this working. The problem is not with the system path variable for osmosis. It is with the system call the script makes, which uses the "gzip" application to unzip the .gz file it downloaded earlier. So there is an error when gzip is not installed on your machine or is not in the system path variable. Installing gzip and adding it to the path variable will mitigate this error. Alternatively, you can unzip the file manually to the same path and run the script again.
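Incidentally, you can check both requirements from inside R rather than the Terminal (a minimal sketch; Sys.which() returns an empty string when an executable is not on the PATH that R sees):
Sys.which("osmosis")  # should print the full path to the osmosis launcher
Sys.which("gzip")     # an empty result means gzip is not visible to R's system() calls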

CWRsync failing due to spaces in directory names

I'm having trouble rsyncing folders with spaces in their names between two servers.
I have researched this error online, but many of the solutions deal with just one folder with spaces; my problem is that I have two subfolders with spaces in their names, and none of the solutions work.
E.g. on the server I have the directory:
c:/test folder/test er/test.txt
When I run the rsync.cmd on the client the following error appears in the rsyncd.log file on the server
2011/08/09 09:16:01 [440] connect from server(xxx.xx.xx.xx)
2011/08/09 09:16:01 [440] rsync: chdir /cygdrive/c/'test folder'/'test er' failed
: No such file or directory (2)
In the rsyncd.conf folder on the server I have the following:
[TESTER]
path = /cygdrive/c/"test folder"/"test er"
read only = true
transfer logging = yes
This isn't working; however, the following does work:
Folder: c:/test folder/test.txt
with rsyncd.conf code:
[TESTER]
path = /cygdrive/c/"test folder"
read only = true
transfer logging = yes
i.e. it works for only one directory with spaces but not two.
I know it's a syntax issue, but I can't figure out the syntax; I have tried a lot of varieties based on research online, including backslashes and x20 for the spaces.
I'm using cwRsync and the server machine is:
Microsoft Windows Server 2003 R2
Thank you.
Found the answer: the only thing I didn't try was to not have any quotation marks or backslashes at all in the names,
i.e.
[TESTER]
path = /cygdrive/c/test folder/test er
read only = true
transfer logging = yes
The above worked
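For reference, with the module defined this way the client side needs no special quoting either. A hedged example (SERVER stands in for your server's name or address, and the destination path is only an illustration); the first command lists the module, the second copies it:
rsync rsync://SERVER/TESTER/
rsync -rtv rsync://SERVER/TESTER/ /cygdrive/c/destination/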
