Has anyone had problems with the MetaboAnalystR CrossReferencing error? - r

I am trying to run some of the MetaboAnalystR tutorials and I have a problem with the CrossReferencing function:
mSet<-CrossReferencing(mSet, "name")
[1] "Reading data unsuccessful, attempting to re-download file..."
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 557k 100 557k 0 0 557k 0 0:00:01 0:00:01 --:--:-- 457k
[1] "Loading files from server unsuccessful. Ensure curl is downloaded on your computer."
Error in .read.metaboanalyst.lib("compound_db.rds") :
object 'my.lib' not found
I have already installed curl, but nothing happens.
Has anyone had the same error?
Thank you
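There is no accepted fix above, so a first diagnostic step is to check from a terminal that curl is really available and that the compound library can be fetched at all. This is a sketch: the download URL below is a guess based on the file name in the error message, not confirmed from the MetaboAnalystR sources, so adjust it to whatever location the package reports.

```shell
# Verify curl is on the PATH that R sees.
command -v curl || echo "curl is not installed or not on PATH"

# Try fetching the library file directly. The URL is an assumption based on
# the file name in the error; -f makes curl fail loudly on HTTP errors.
curl -fSL -o compound_db.rds "https://www.metaboanalyst.ca/resources/libs/compound_db.rds" \
  || echo "download failed; check your network, proxy, or firewall settings"
```

If the direct download succeeds, a commonly suggested workaround is reportedly to place the downloaded compound_db.rds in the R working directory before calling CrossReferencing, so the package finds it locally instead of re-downloading.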

Related

Trouble while accessing OpenFOAM

```
-MacBook-Air ~ % sudo curl -o /usr/local/bin/openfoam-macos-file-system http://d1.openfoam.org/docker/openfoam-macos-file-system
Password:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0curl: (6) Could not resolve host: d1.openfoam.org
rahiuppal@rahis-MacBook-Air ~ %
```
While accessing OpenFOAM I am getting this error. Can anyone help me sort this out?
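curl error (6) is a DNS failure, so the machine never reached the OpenFOAM server at all. One thing worth checking (a hedged guess: `d1` with a digit one may be a typo for `dl` with a lowercase L, which is the host the OpenFOAM download instructions use) is whether the host name resolves:

```shell
# Does the host in the URL resolve at all?
nslookup d1.openfoam.org || echo "d1.openfoam.org does not resolve"

# Guess: the intended host may be dl.openfoam.org (lowercase L, not digit one).
nslookup dl.openfoam.org || echo "dl.openfoam.org does not resolve either"
```

If the second name resolves and the first does not, correcting the URL in the original curl command should fix the error.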

Download.file with method curl downloads 0 bytes

When I use the download.file function with method curl, I get a downloaded file of 0 bytes.
For example:
download.file('https://github.com/rstudio/rmarkdown/archive/refs/heads/main.zip', destfile='library', method='curl')
Produces the following output:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
character(0)
How can I get this fixed?
Note:
method='wget' works, but only because my Mac has wget; my Windows laptop has only curl as an option, and that does not work. I specifically want to fix the curl method.
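A likely cause, worth verifying on your machine: GitHub answers that URL with a 302 redirect to codeload.github.com, and curl does not follow redirects unless told to, so it saves the empty redirect body as the file. The equivalent check from a terminal:

```shell
# Without -L curl stops at the 302 redirect and writes an empty file;
# with -L it follows the redirect to codeload.github.com and gets the zip.
curl -fL -o main.zip 'https://github.com/rstudio/rmarkdown/archive/refs/heads/main.zip' \
  || echo "download failed; check network access"
```

In R the hedged equivalent would be passing the flag through download.file's extra argument: download.file(url, destfile = 'main.zip', method = 'curl', extra = '-L').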

Unable to download large files from Sonatype Nexus

Nexus version 3.1.0-04
During a build, I receive the following error downloading an artifact from Nexus.
Download http://10.148.254.17:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.4.1/assertj-core-2.4.1.jar
:collection:extractIncludeTestProto FAILED
FAILURE: Build failed with an exception.
What went wrong:
Could not resolve all dependencies for configuration ':collection:testCompile'.
Could not download assertj-core.jar (org.assertj:assertj-core:2.4.1)
Could not get resource 'http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.4.1/assertj-core-2.4.1.jar'.
Premature end of Content-Length delimited message body (expected: 900718; received: 6862)
This appears to be a problem with large files stored in Nexus.
If I try and download the file via wget or curl, it also fails.
c:>wget http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.5.0/assertj-core-2.5.0.jar
--13:57:06-- http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.5.0/assertj-core-2.5.0.jar
=> `assertj-core-2.5.0.jar'
Resolving proxy.xxxx.com... done.
Connecting to proxy.xxxx.com[xxx.xxx.xxx.xxx]:xxx... connected.
Proxy request sent, awaiting response... 200 OK
Length: 934,446 [application/java-archive]
0% [ ] 6,856 1.44K/s ETA 10:27
13:57:21 (1.44 KB/s) - Connection closed at byte 6856. Retrying.
c:>curl -O http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.5.0/assertj-core-2.5.0.jar
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 912k 0 6862 0 0 613 0 0:25:24 0:00:11 0:25:13 613
curl: (18) transfer closed with 927584 bytes remaining to read
Any ideas why?
In my case my docker layer was blocked. I solved this problem by changing the timeout under System > Http > Connection/Socket timeout.
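Raising the server-side timeout, as noted above, is the real fix. As a client-side workaround while the server is still cutting transfers short, curl can resume a truncated download with -C -. This is a sketch; the placeholder URL from the question is kept as-is.

```shell
# Retry a truncated download, resuming from the bytes already received.
# -C - tells curl to continue from the current size of the output file.
URL='http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.5.0/assertj-core-2.5.0.jar'
for attempt in 1 2 3 4 5; do
  curl -f -C - -O "$URL" && break
  echo "attempt $attempt interrupted, resuming..."
  sleep 2
done
```

This only papers over the symptom for manual downloads; build tools like Gradle will still fail until the Nexus timeout itself is raised.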

Cannot connect to H2O in R using h2o.init()

I'm trying to connect to H2O for the first time in RStudio with the following command:
library(h2o)
h2o.init()
R version 2.3.2 and h2o version 3.10.2.2. Any idea how to fix this so I can connect?
H2O is not running yet, starting it now...
running command ''/usr/bin/java' -version 2>&1' had status 1
Note: In case of errors look at the following log files:
/var/folders/w7/vpssy9010lg4xxlwkg5f2zwm0000gn/T//RtmpZhyGXB/h2o_chrisstroud_started_from_r.out
/var/folders/w7/vpssy9010lg4xxlwkg5f2zwm0000gn/T//RtmpZhyGXB/h2o_chrisstroud_started_from_r.err
No Java runtime present, requesting install.
Starting H2O JVM and connecting: ............................................................
[1] "localhost"
[1] 54321
[1] TRUE
[1] -1
[1] "Failed to connect to localhost port 54321: Connection refused"
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0curl: (7) Failed to connect to localhost port 54321: Connection refused
[1] 7
Error in h2o.init() : H2O failed to start, stopping execution.
If you have a proxy enabled, it won't work.
Sys.setenv(https_proxy="")
Sys.setenv(http_proxy="")
Sys.setenv(http_proxy_user="")
Sys.setenv(https_proxy_user="")
fixed it for me.
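Note that the log above also says "No Java runtime present, requesting install": h2o.init() launches a local JVM, so the refused connection to localhost:54321 can simply mean the H2O server never started. Before (or in addition to) clearing the proxy variables, it is worth confirming Java is installed:

```shell
# h2o.init() starts a local Java process; if this fails, the H2O server
# never comes up and the connection to localhost:54321 is refused.
java -version 2>&1 || echo "no Java runtime found; install a JDK and retry h2o.init()"
```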

Is there a way to disable SSL/TLS for GitHub Pages?

I have been looking for a place to store some of my XML schemas publicly without actually having to host them. I decided GitHub Pages would be the ideal platform. I was correct except that I cannot figure out how to turn off SSL/TLS. When I try to fetch my pages with plain old HTTP I get a 301 Moved Permanently response.
So obviously this isn't a big deal. Worst case scenario it takes a little longer to download my schemas, and people generally only use schemas that they've already cached anyway. But is there really no way to turn this off?
From GitHub help:
HTTPS enforcement is required for GitHub Pages sites created after June 15, 2016 and using a github.io domain.
So, you have two solutions:
- find a github.io repository older than June 15, 2016
- set a custom domain name on your github.io
But is there really no way to turn this off?
No, and a simple curl -L would follow the redirection and get you the page content anyway.
For instance (get an xml file in a tree structure):
vonc@voncvb C:\test
> curl --create-dirs -L -o .repo/local_manifests/local_manifest.xml https://raw.githubusercontent.com/legaCyMod/android_local_manifest/cm-11.0/local_manifest.xml
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 530 100 530 0 0 1615 0 --:--:-- --:--:-- --:--:-- 1743
vonc@voncvb C:\test
> tree /F .
C:\TEST
└───.repo
└───local_manifests
local_manifest.xml
