Can't merge repositories - svndump

When using: svnrdump load URL < file.dump
I get the following error:
Failed to get lock on destination repos, currently held by x
svnrdump: E000022: Couldn't get lock on destination repos after 10 attempts
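The lock is typically held as the svn:rdump-lock revision property on revision 0 of the destination repository; if a previous svnrdump run died, the stale property can be removed (assuming the destination's pre-revprop-change hook allows revision property changes):
svn propdel svn:rdump-lock --revprop -r0 URL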

How to handle a private remote dependency in a new R package?

I am writing an R package (let's call it new.package) that needs to import functions from another private R package (let's call it remote.private.package). remote.private.package lives in a private GitHub repository (let's assume at github::username/remote_private_package).
I have run the following line which adds a private remote dependency to the new.package DESCRIPTION file:
usethis::use_dev_package("remote.private.package", type = "Imports", remote = c("username/remote_private_package", auth_token = "my_github_auth_token"))
This is the content of the relevant section of the DESCRIPTION file, after running the above line:
...
Remotes:
    username/remote_private_package,
    my_github_auth_token
Imports:
    remote.private.package (>= 0.1.0),
...
However, when I run devtools::load_all() I get the following console message:
ℹ Loading new.package
ℹ The package `remote.private.package` (>= 0.1.0) is required.
✖ Would you like to install it?
1: Yes
2: No
Selection: 1
Error: HTTP error 404.
Not Found
Did you spell the repo owner (`username`) and repo name (`remote_private_package`) correctly?
- If spelling is correct, check that you have the required permissions to access the repo.
I am sure that the info is correct, because when I run:
remotes::install_github("username/remote_private_package", auth_token = "my_github_auth_token")
library(remote.private.package)
The package is installed and loaded correctly.
What are the best practices to handle packages with private remote repository dependencies?

R CMD check fails on Ubuntu when trying to download a file, but the function works within R

I am writing an R package and one of its functions downloads and unzips a file from a link (the function is not exported to the user, though):
download_f <- function(download_dir) {
  utils::download.file(
    url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php",
    destfile = file.path(download_dir, "fines.rar"),
    mode = 'wb',
    method = 'libcurl'
  )
  utils::unzip(
    zipfile = file.path(download_dir, "fines.rar"),
    exdir = file.path(download_dir)
  )
}
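For reference, a minimal call of the function above (tempdir() is just an example target directory):
download_f(tempdir())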
This function works fine for me when I run it within another function to build an example in a vignette.
However, with R CMD check in a GitHub Action, it fails consistently on Ubuntu 16.04, release and devel. It says [1]:
Error: Error: processing vignette 'IBAMA.Rmd' failed with diagnostics:
cannot open URL 'https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php'
--- failed re-building ‘IBAMA.Rmd’
SUMMARY: processing the following file failed:
‘IBAMA.Rmd’
Error: Error: Vignette re-building failed.
Execution halted
Error: Error in proc$get_built_file() : Build process failed
Calls: <Anonymous> ... build_package -> with_envvar -> force -> <Anonymous>
Execution halted
Error: Process completed with exit code 1.
When I run devtools::check() it never finishes, staying at "creating vignettes" forever. I don't know whether these problems are related, though, because the package has other vignettes.
I pass the R CMD checks on macOS and Windows. I've tried switching the "mode" and "method" arguments of utils::download.file, but to no avail.
Any suggestions?
[1]: https://github.com/datazoompuc/datazoom.amazonia/pull/16/checks?check_run_id=2026865974
The download fails because libcurl tries to verify the web server's certificate, but can't.
I can reproduce this on my system:
trying URL 'https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php'
Error in utils::download.file(url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php", :
cannot open URL 'https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php'
In addition: Warning message:
In utils::download.file(url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php", :
URL 'https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php': status was 'SSL peer certificate or SSH remote key was not OK'
The server does not let you download over plain HTTP but redirects to HTTPS, so the only thing to do now is to tell libcurl not to check the certificate and to accept what it gets.
You can do this by passing the -k argument to curl:
download_f <- function(download_dir) {
  utils::download.file(
    url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php",
    destfile = file.path(download_dir, "fines.rar"),
    mode = 'wb',
    method = 'curl',
    extra = '-k'
  )
  utils::unzip(
    zipfile = file.path(download_dir, "fines.rar"),
    exdir = file.path(download_dir)
  )
}
This also produces a download progress bar; you can silence it by setting extra to '-k -s', as in the sketch below.
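The same call with both flags (a sketch; -s is curl's silent flag):
utils::download.file(
  url = "https://servicos.ibama.gov.br/ctf/publico/areasembargadas/downloadListaAreasEmbargadas.php",
  destfile = file.path(download_dir, "fines.rar"),
  mode = 'wb',
  method = 'curl',
  extra = '-k -s'  # -k: skip certificate verification; -s: silence curl's progress output
)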
This now opens you up to a machine-in-the-middle attack. (You may already be under such an attack; there is no way to check without verifying the current certificate with someone you trust on the other side.)
So you could implement an extra check, e.g. compute the SHA-256 checksum of the downloaded file and verify that it matches what you expect before proceeding:
library(openssl)  # provides sha256()
myfile <- file.path(download_dir, "fines.rar")
hash <- sha256(file(myfile))
identical(as.character(hash), expected_hash)  # expected_hash: the checksum you trust

devtools::install_github error: SEC_E_UNTRUSTED_ROOT - The certification chain was issued by an entity that is unreliable

I am trying to install a package from behind a corporate firewall using devtools:
library(devtools)
devtools::install_github("aryoda/tryCatchLog")
I get an error message:
Error: Failed to install 'unknown package' from GitHub: schannel:
next InitializeSecurityContext failed: SEC_E_UNTRUSTED_ROOT (0x80090325)
- The certification chain was issued by an entity that is unreliable.
The reason seems to be the underlying curl package, which produces the same error with:
library(curl)
curl::curl_fetch_memory("https://httpbin.org/get")
How can I fix this?
PS: I am using MS Windows 10
I have found the solution:
The internet connection only works within curl if you set the correct HTTPS_PROXY:
# insert your correct domain name and IP port here
Sys.setenv(https_proxy = "http://httpproxy.mycompany.com:1234")
This devtools issue comment helped me:
https://github.com/r-lib/devtools/issues/1610#issuecomment-333344548
Update 1:
This is a generic solution to set the HTTP(S)_PROXY in R:
library(curl)
library(devtools)
proxy <- curl::ie_get_proxy_for_url("https://www.qwant.com/")
Sys.setenv(https_proxy=proxy)
# Sys.setenv(http_proxy=proxy) # you could also set an HTTP proxy
devtools::install_github("aryoda/tryCatchLog") # should work now
You could add these lines to your Rprofile.site file (in the R/etc folder).
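A minimal sketch of such an Rprofile.site entry (the probe URL is arbitrary; curl::ie_get_proxy_for_url() returns NULL when no proxy is configured):
local({
  # detect the system proxy at startup and export it for curl-based tools
  proxy <- curl::ie_get_proxy_for_url("https://www.qwant.com/")
  if (!is.null(proxy)) {
    Sys.setenv(https_proxy = proxy, http_proxy = proxy)
  }
})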

Unable to use RJDBC on shinyapps.io

I have written a Shiny app which runs perfectly on my local machine. I have used RJDBC to connect to the DB2 database in IBM Cloud. The code is as follows.
#Load RJDBC
dyn.load('/Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk/Contents/Home/lib/server/libjvm.dylib')
# dyn.load('/Users/parthamajumdar/Documents/Solutions/PriceIndex/libjvm.dylib')
library(rJava)
library(RJDBC)
As the path is hard-coded, I copied the file libjvm.dylib to the project directory and pointed to that. When I do this, R gives a fatal error.
I removed the absolute path, replaced it with "./libjvm.dylib", and deployed the application to shinyapps.io. When I run the program, it gives a fatal error.
#Values for you database connection
dsn_driver = "com.ibm.db2.jcc.DB2Driver"
dsn_database = "BLUDB" # e.g. "BLUDB"
dsn_hostname = "dashdb-entry-yp-lon02-01.services.eu-gb.bluemix.net" # e.g. replace <yourhostname> with your hostname, e.g., "Db2 Warehouse01.datascientstworkbench.com"
dsn_port = "50000" # e.g. "50000"
dsn_protocol = "TCPIP" # i.e. "TCPIP"
dsn_uid = "<UID>" # e.g. userid
dsn_pwd = "<PWD>" # e.g. password
#Connect to the Database
#jcc = JDBC("com.ibm.db2.jcc.DB2Driver", "/Users/parthamajumdar/lift-cli/lib/db2jcc4.jar");
jcc = JDBC("com.ibm.db2.jcc.DB2Driver", "db2jcc4.jar");
jdbc_path = paste("jdbc:db2://", dsn_hostname, ":", dsn_port, "/", dsn_database, sep="");
conn = dbConnect(jcc, jdbc_path, user=dsn_uid, password=dsn_pwd)
Similarly, I copied the file "db2jcc4.jar" to my local project directory. If I point to the local project directory for this file on my local machine, the program works. However, when I deploy to shinyapps.io, it gives a fatal error.
Please let me know what I need to do so that the application runs properly on shinyapps.io.
The error is as follows when I run the application from the Shiny server:
Attaching package: ‘lubridate’
The following object is masked from ‘package:base’:
date
Loading required package: nlme
This is mgcv 1.8-23. For overview type 'help("mgcv-package")'.
Error in value[[3L]](cond) :
unable to load shared object '/srv/connect/apps/ExpenseAnalysis/Drivers/libjvm.dylib':
/srv/connect/apps/ExpenseAnalysis/Drivers/libjvm.dylib: invalid ELF header
Calls: local ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
Execution halted
What works for me is the following, and it is independent of the OS. (The "invalid ELF header" error above indicates that a macOS .dylib was being loaded on the Linux-based Shiny server; native libraries are platform-specific, but platform-independent files such as JDBC driver jars can be bundled as described below.)
1. Create your own R package that contains the file you need somewhere in the extdata folder. As an example, your package could be yourpackage and the file would be something like extdata/drivers/mydriver.lib. Typically this would be stored under inst/extdata/drivers. See http://r-pkgs.had.co.nz/inst.html for details.
2. Store this package on GitHub; if you want privacy you will need to work out how to grant an access token.
3. Use the devtools package to install it. The command would be something like devtools::install_github("you/yourpackage", auth_token = "youraccesstoken"). Do this once before deploying to shinyapps.io. Ensure that you also call library(yourpackage). The deployment process will then work out that it needs to fetch the package from GitHub.
4. Use the following R code to find the file: system.file('extdata/drivers/mydriver.lib', package = 'yourpackage'). This gives you the full path to the file; see the sketch below.
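Putting steps 3 and 4 together, a sketch using the DB2 driver jar from the question (you/yourpackage and the access token are placeholders):
# install once before deploying (private repo: supply your access token)
devtools::install_github("you/yourpackage", auth_token = "youraccesstoken")
library(yourpackage)
# resolve the bundled driver at run time and hand it to RJDBC
drv_path <- system.file("extdata/drivers/db2jcc4.jar", package = "yourpackage")
jcc <- RJDBC::JDBC("com.ibm.db2.jcc.DB2Driver", drv_path)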

HTTP 400 - Unable to parse remote repository npm metadata

We have two remote npm registries inside a virtual repository. One of them is the npm registry; the other one is from a software provider. When I add the second repository to the virtual repository, I get HTTP 400 responses at random.
For example: if I want to install a package from the npm registry, I can see in the logs that Artifactory tries to get the package from the other repository (which does not have the package) and tries to parse the response as JSON. That repository returns an HTML file, though, which results in the following error message:
2017-02-23 09:39:05,424 [http-nio-8080-exec-7112] [ERROR]
(o.a.a.n.r.NpmRemoteRepoHandler:362) - Error while parsing the response of a remote npm
JSON query on 'https://repository.domain.com/api/npm/public/file-loader':
Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object,
'true', 'false' or 'null')
at [Source:org.artifactory.storage.db.binstore.service.UsageTrackingBinaryProvider$ReaderTrackingStream#7360bc6c; line: 1, column: 2]
As you can see, Artifactory is trying to get the package from the other repository. The JSON response of our Artifactory, when I try to get the package manually, is:
{
  "errors" : [ {
    "status" : 400,
    "message" : "Unable to parse remote repository npm metadata."
  } ]
}
Any help would be greatly appreciated, since this makes the npm registry completely useless, as some requests return this HTTP 400 error.
FYI: we are using Artifactory Pro 4.5.1.
There are two things you should do to avoid this behavior:
1. Configure the virtual repository resolution order so that the npm registry is approached before the software provider registry. The resolution order is controlled by the order in which the repositories appear in the Selected Repositories list.
2. Use include/exclude patterns to control which packages are resolved from the software provider registry. Assuming there is a way to identify the packages which should come from the software provider, you can define patterns which limit this registry to the resolution of those packages only, as in the example below.
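For example, assuming the provider's packages share a common name prefix, a hypothetical include pattern such as provider-*/** on the software provider repository would keep Artifactory from consulting it for anything else.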
Another thing to check is whether the software provider remote repository is configured properly. Normally it should not return an HTML response for an API call.
