In Python, if I install mambaforge or conda, I can create a file with the .yml extension and list inside it the names of the packages I want to install, along with their specific versions. How can I install packages in a similar way in Julia?
I understand that if I have already installed Julia packages with the add command in the package manager, then I have a file named Project.toml which I can use to install the same packages later. However, this still does not look as convenient as Python's way of installing packages.
Upon further investigation I realized that to install Julia packages from an empty Project.toml file, I should add [deps] to the file, followed by the names of the packages I want, and then give each package a UUID, which can be found here. For example:
[deps]
Images = "916415d5-f1e6-5110-898d-aaa5f9f070e0"
Even so, this is still tedious, as I have to find all those UUIDs myself.
How can I install packages in Julia the same way I described for Python?
Is there a particular reason that you want to write package names to a .yml file and then read the packages from there? After all, you can generate the Project file and add multiple dependencies automatically:
(@v1.8) pkg> generate MyProject # or whatever name you like
(@v1.8) pkg> activate MyProject
(MyProject) pkg> add Countries Crayons CSV # some example packages
(In recent versions of Julia, an installation prompt will appear if a package isn't already installed).
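If you later want to recreate the same environment (for example on another machine), a minimal sketch, assuming the generated Project.toml and Manifest.toml are kept alongside your code:
(@v1.8) pkg> activate MyProject
(MyProject) pkg> instantiate # installs everything recorded in Project.toml / Manifest.toml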
Speaking from experience, learning to use environments in Julia can be challenging for a new user, but rewarding! The documentation for Pkg.jl is helpful here.
If you are just assembling an environment for your own code, there is probably no need for you to manually edit Project.toml. On the other hand, if you are maintaining a package, you might wish to edit the file directly (e.g., for specifying compatibility).
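If you do edit Project.toml by hand, a [compat] section is the typical thing to add; here is a minimal sketch (the package names and version bounds are only placeholders):
[compat]
julia = "1.8"
CSV = "0.10"
Pkg will then refuse to resolve package versions outside these bounds when the environment is updated.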
Maybe you can use this:
using TOML
using HTTP
using DataStructures
## Download Registry.toml if it doesn't already exist in the current directory
function read_reg()
    if !isfile("Registry.toml")
        HTTP.download(
            "https://raw.githubusercontent.com/JuliaRegistries/General/master/Registry.toml",
            "Registry.toml"
        )
    end
    reg = TOML.parsefile("Registry.toml")["packages"]
    return reg
end

## Find the UUID of a specific package in the Registry.toml file
function get_uuid(pkg_name)
    reg = read_reg()
    for (uuid, pkg) in reg
        if pkg["name"] == pkg_name
            return uuid
        end
    end
end

## Build a [deps] table mapping each package name to its UUID and write it out as Project.toml
function create_project_toml(pkgs::Vector{String})
    deps = OrderedDict{String, String}()
    for pkg in pkgs
        deps[pkg] = get_uuid(pkg)
    end
    open("Project.toml", "w") do io
        TOML.print(io, OrderedDict("deps" => deps))
    end
end

## Test on the packages "ClusterAnalysis" and "EvoTrees"
create_project_toml(["ClusterAnalysis", "EvoTrees"])
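Once the file has been written, a minimal sketch of how you could use it (run from the directory containing the generated Project.toml):
using Pkg
Pkg.activate(".") # activate the directory with the generated Project.toml
Pkg.instantiate() # resolve versions and install the packages listed under [deps]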
antonio wants to draw more attention to this question:
Explain what is the proper workflow when you are building two packages, where one depends on the other. What is the correct way to run devtools::check() on the depending-on package?
I want to add a local package dependency using R devtools.
The suggested way to add packages to the package DESCRIPTION file is with one of two functions, use_package() and use_dev_package(), from the usethis package. The latter adds a dependency on an in-development package. The function help shows the following prototype:
use_dev_package(package, type = "Imports", remote = NULL)
where remote is
a character string to specify the remote, e.g. ‘"gitlab::jimhester/covr"’, using any syntax supported by the remotes package.
The remotes vignette shows the following
# Local
Remotes: local::/pkgs/testthat
So the command should be along these lines:
use_dev_package(foopack, type = "Imports", remote = "local::<foopack>")
However, what should the path to foopack be? An absolute one, or one relative to the project dir? The root package directory, the R directory with the code, or perhaps the foopack.tar.gz build?
All attempts failed for me.
Needless to say, beyond having the local dependency properly listed in the DESCRIPTION file, I need it to be seen by the devtools build & check functions.
Edit
As regards use_dev_package(), I found a solution:
if I use devtools::check(), then the dependency appears in the search path, and use_dev_package() does not complain any more (see answer below).
However, it is still unclear to me what arguments I should use to run a development check() on the main package, in particular when the package has a vignette.
Ideally, I should be able to pass the check with local dependencies by passing cran = FALSE, but this still gives
"Package required but not available".
It seems that I have to check the local dependencies before adding them to the DESCRIPTION file.
devtools::check("path/to/foopack")
usethis::use_dev_package("foopack", remote = "local::path/to/foopack")
The paths can be relative or absolute, and even a single colon works.
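For reference, after a call like the one above the main package's DESCRIPTION should end up containing something along these lines (a sketch; foopack and the path are placeholders):
Imports:
    foopack
Remotes:
    local::path/to/foopack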
It might be worth noting that, when I build the main package, I can use the ordinary:
devtools::build()
but, for a successful check, I need to use the remote argument:
devtools::check(remote = TRUE)
I can't see a rationale for restating what is in the DESCRIPTION file, but I do not have enough expertise to say it's a bug.
Let's see what the others say in this regard.
Edit
Unfortunately, it seems that the remote argument above does not apply to vignettes. So, if I add a vignette to the package, checks fail with local packages.
Until an actual solution is found, all I can do is (sadly) to ignore vignette checks:
devtools::check(remote = TRUE, vignettes = FALSE)
I want to set up a local repository for R packages. I'd like the repository to work like Sonatype Nexus (it can proxy the central repository and cache artifacts after they are downloaded from the central repository).
Currently Nexus does not support the R repository format, so it doesn't suit what I need to do.
Is there any existing solution for creating such a repository? I don't want to create a full CRAN mirror, which is too heavy for me.
First, you'll want to make sure you have the following path and its directories in your system: "/R/src/contrib". If you don't have this path and these directories, you'll need to create them. All of your R packages files will be stored in the "contrib" directory.
Once you've added package files to the "contrib" directory, you can use the setRepositories function from the utils package to create the repository. I'd recommend adding the following code to your .Rprofile for a local repository:
utils::setRepositories(ind = 0, addURLs = c(WORK = "file://<your higher directories>/R"))
After editing your .Rprofile, restart R.
ind = 0 will indicate that you only want the local repository. Additional repositories can be included in the addURLs = option and are comma separated within the character vector.
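For example, a sketch of a call that registers the local repository together with an additional repository (the WORK label and placeholder path come from above; the second URL is just an example):
utils::setRepositories(ind = 0,
  addURLs = c(WORK = "file://<your higher directories>/R",
              CRAN = "https://cran.r-project.org"))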
Next, create the repository index with the following code:
tools::write_PACKAGES("/<your higher directories>/R/src/contrib", verbose = TRUE)
This will generate the PACKAGE files that serve as the repository index.
To see what packages are in your repository, run the following code and take a look at the resulting data frame: my_packages <- available.packages()
At this point, you can install packages from the repo without referencing the entire path to the package installation file. For example, to install the dplyr package, you could run the following:
install.packages("dplyr", dependencies = TRUE)
If you want to take it a step further and manage a changing repository, you could install and use the miniCRAN package. Otherwise, you'll need to execute the write_PACKAGES function whenever your repository changes.
After installing the miniCRAN package, you can execute the following code to create the miniCRAN repo:
my_packages <- available.packages()
miniCRAN::makeRepo(
  pkgs = my_packages[, 1],
  path = "/<your higher directories>/R",
  repos = getOption("repos"),
  type = "source",
  Rversion = R.version,
  download = TRUE,
  writePACKAGES = TRUE,
  quiet = FALSE
)
You only need to execute the code above once for each repo.
Then, check to make sure each miniCRAN repo has been created (you only need to do this once per repo):
miniCRAN::pkgAvail(
  repos = getOption("repos"),
  type = "source",
  Rversion = R.version,
  quiet = FALSE
)
Whenever new package files are placed into the local repo you can update the local repo's index as follows:
miniCRAN::updateRepoIndex("/<your higher directories>/R/")
Finally, as an optional step to see if the new package is in the index, create a data frame of the available packages and search the data frame:
my_packages <- available.packages(repos = "file://<your higher directories>/R/")
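For example, to check for a specific package (an illustrative one-liner; the package name is just an example):
"dplyr" %in% my_packages[, "Package"]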
This approach has worked pretty well for me, but perhaps others have comments and suggestions that could improve upon it.
I want to deploy a basic trained R model as a webservice to AzureML. Similar to what is done here:
http://www.r-bloggers.com/deploying-a-car-price-model-using-r-and-azureml/
Since that post, the publishWebService function in the R AzureML package has changed; it now requires a workspace object as the first parameter, so my R code looks as follows:
library(MASS)
library(AzureML)
PredictionModel = lm(medv ~ lstat, data = Boston)
PricePredFunktion = function(percent) {
  return(predict(PredictionModel, data.frame(lstat = percent)))
}
myWsID = "<my Workspace ID>"
myAuth = "<my Authorization code>"
ws = workspace(myWsID, myAuth, api_endpoint = "https://studio.azureml.net/", .validate = TRUE)
# publish the R function to AzureML
PricePredService = publishWebService(
  ws,
  "PricePredFunktion",
  "PricePredOnline",
  list("lstat" = "float"),
  list("mdev" = "float"),
  myWsID,
  myAuth
)
But every time I execute the code I get the following error:
Error in publishWebService(ws, "PricePredFunktion", "PricePredOnline", :
Requires external zip utility. Please install zip, ensure it's on your path and try again.
I tried installing programs that handle zip files (like 7-Zip) on my machine, as well as loading the utils library in R, which allows R to interact directly with zip files, but I couldn't get rid of the error.
I also found the R package code that is throwing the error, it is on line 154 on this page:
https://github.com/RevolutionAnalytics/AzureML/blob/master/R/internal.R
but it didn't help me in figuring out what to do.
Thanks in advance for any help!
The Azure Machine Learning API requires the payload to be zipped, which is why the package insists on the zip utility being installed. (This is an unfortunate situation, and hopefully we can find a way in future to include a zip with the package.)
It is unlikely that you will ever encounter this situation on Linux, since most (all?) Linux distributions include a zip utility.
Thus, on Windows, you have to do the following procedure once:
Install a zip utility (RTools has one and this works)
Ensure the zip is on your path
Restart R – this is important, otherwise R will not recognize the changed path
Upon completion, the litmus test is whether R can see your zip. To do this, try:
Sys.which("zip")
You should get a result similar to this:
zip
"C:\\Rtools\\R-3.1\\bin\\zip.exe"
In other words, R should recognize the installation path.
On previous occasions when people told me this didn’t work, it was always because they thought they had a zip in the path, but it turned out they didn’t.
One last comment: installing 7-Zip may not work. The reason is that 7-Zip installs a utility called 7z, but R will only look for a utility called zip.
I saw this link earlier, but the additional points that tripped up my code were:
1. The address and path of Rtools were not as straightforward as expected.
2. You need to restart R.
With regard to the address: always check where Rtools was actually installed. I also used this code to set the path; note that zip is always added at the end of the Rtools path:
##Rtools.bin="C:\\Users\\User_2\\R-Portable\\Rtools\\bin"
Rtools.bin="C:\\Rtools\\bin\\zip"
sys.path = Sys.getenv("PATH")
if (Sys.which("zip") == "" ) {
system(paste("setx PATH \"", Rtools.bin, ";", sys.path, "\"", sep = ""))
}
Sys.which("zip")
You should get a return of
"C:\\Rtools\\bin\\zip"
From looking at Andrie's comment here: https://github.com/RevolutionAnalytics/AzureML/commit/9cf2c5c59f1f82b874dc7fdb1f9439b11ab60f40
it seems we can just download Rtools and be done with it.
Download RTools from:
https://cran.r-project.org/bin/windows/Rtools/
During installation select the check box to modify the PATH
At first it didn't work. I then tried R 32-bit, and that seemed to work. Then R 64-bit started working again. Honestly, I'm not sure if I did something in the middle to make it work. It only takes a few minutes, so it's worth a punt.
Try the following:
- Download the Rtools file, which usually contains the zip utility.
- Copy all the files in the "bin" folder of Rtools.
- Paste them into the "~/RStudio/bin/x64" folder.
I want to use packrat on a Windows 7 machine with no internet connection.
I have downloaded all binary packages from http://cran.r-project.org/bin/windows/contrib/3.1/ into the local folder C:/xyz/CRAN_3_1.
The problem is now that
packrat::init(options=list(local.repos="C:/xyz/CRAN_3_1"))
throws a bunch of warnings and errors like
Warning: unable to access index for repository http://cran.rstudio/bin/...
Warning: unable to access index for repository http://cran.rstudio/src/...
Fetching sources for Rcpp (0.11.4) ... Failed
Package Rcpp not available in repository or locally
As it seems packrat tries to find
the binary version of Rcpp on CRAN (fails since there is no internet connection)
the source of Rcpp on CRAN (fails since there is no internet connection)
the local source of the package (fails since I only have the binaries)
What I don't understand is why packrat does not also search for the local binary package...
Question 1: I could download the source CRAN repository to get around this problem. But I would like to know from you guys whether there is an easier solution to this, i.e., whether it is possible to make packrat accept a local binary repo.
Question 2: When I create my own package myPackage with packrat enabled, will the myPackage-specific local packrat library also be included in the package? That is, assume that I give the binary myPackage zip File to one of my colleagues who does not have one of the packages that myPackage depends on (let's say Rcpp). Will Rcpp be included in myPackage when I use packrat? Or does my colleague have to install Rcpp himself?
I managed to hack around this problem. Please bear in mind that I have never used packrat before and that I do not know its "proper" behaviour. But my impression is that the hack works.
Here is how I did it:
Open your project, load packrat via library(packrat)
type fixInNamespace("snapshotImpl",ns="packrat") - a window opens - copy its content into the clipboard
Go to /yourProjDir/ and create a file snapshotImplFix.R
Copy the clipboard's content into this file ...
... but change the first line to
snapshotImplFix=function (project, available = NULL, lib.loc = libDir(project),
dry.run = FALSE, ignore.stale = FALSE, prompt = interactive(),
auto.snapshot = FALSE, verbose = TRUE, fallback.ok = FALSE,
snapshot.sources = FALSE)
Note snapshot.sources = FALSE! Save and close the file.
Create /yourProjDir/.Rprofile and add
setHook(packageEvent("packrat", "onLoad"), function(...) {
  source("./snapshotImplFix.R")
  tmpfun = get("snapshotImpl", envir = asNamespace("packrat"))
  environment(snapshotImplFix) = environment(tmpfun)
  utils::assignInNamespace(x = "snapshotImpl", value = snapshotImplFix, ns = "packrat")
})
Points 2-6 fix the problem with the snapshot.sources argument being TRUE by default (I did not find a better way to change that...)
Finally, we have to tell packrat to take our local repository. It's important that you have the right folder structure. Therefore I moved the repo from C:/xyz/CRAN_3_1 to C:/xyz/CRAN_3_1/bin/windows/contrib/3.1. Do not forget to run library(tools);write_PACKAGES("C:/xyz/CRAN_3_1/bin/windows/contrib/3.1"); if you also have to move your files.
Open yourProjDir/.Rprofile again and add at the end
local({
  r = getOption("repos")
  r["CRAN"] = "file:///C:/xyz/CRAN_3_1"
  r["CRANextra"] = r["CRAN"]
  options(repos = r)
})
Note the three slashes right after file:! Save and exit the file.
Close the project and re-open.
Now you can execute packrat::init() and it should run without errors.
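As a quick sanity check (an illustrative snippet, not part of the hack itself), you can confirm that R now resolves packages against the local repository:
getOption("repos") # CRAN should point at file:///C:/xyz/CRAN_3_1
head(available.packages()[, "Package"]) # should list packages from the local repo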
It would be great if someone with more experience regarding packrat could give his/her input so that I can be sure that this hack works. Any pointers to proper solutions are highly appreciated, of course.
I am trying to use the libspatialite library to extend SQLite on a NetBSD platform. I've taken the first step of creating a package for libspatialite in pkgsrc (libspatialite-4.1.1). The package appears to work: pkg_info says it's installed, and I've verified that the files from the PLIST (in the code chunk below) have been installed in /usr/pkg/. However, when I try to install the package I built for spatialite-tools, configure says that libspatialite isn't installed. Also, I can't figure out how to load the library in sqlite3 with load_extension(X,Y); what is the library file referred to in the documentation?
pkgsrc/databases/libsqlite/PLIST:
#comment $NetBSD$
include/spatialite.h
include/spatialite/debug.h
include/spatialite/gaiaaux.h
include/spatialite/gaiaexif.h
include/spatialite/gaiageo.h
include/spatialite/geopackage.h
include/spatialite/gg_advanced.h
include/spatialite/gg_const.h
include/spatialite/gg_core.h
include/spatialite/gg_dxf.h
include/spatialite/gg_dynamic.h
include/spatialite/gg_formats.h
include/spatialite/gg_mbr.h
include/spatialite/gg_structs.h
include/spatialite/gg_wfs.h
include/spatialite/gg_xml.h
include/spatialite/spatialite.h
include/spatialite/sqlite.h
lib/libspatialite.la
lib/pkgconfig/spatialite.pc
Do you have a buildlink3.mk file in the libspatialite package?
If not:
In order to avoid a package building against implicit dependencies, pkgsrc only makes libraries which have been explicitly listed visible to the building package.
This stops the issue of a package picking up an optional dependency which just happens to be installed, then building a binary package which uses that library but does not have it listed in the package metadata. The resultant binary package will work fine on that system... until the optional dependency is removed, and it will fail on any other system without that hidden dependency.
Anyway... buildlink3.mk files are used in pkgsrc to make the necessary files visible during a build. A libspatialite buildlink3.mk might look like the one below (adjust 1.0 to the current library version):
# $NetBSD$
BUILDLINK_TREE+= libspatialite
.if !defined(LIBSPATIALITE_BUILDLINK3_MK)
LIBSPATIALITE_BUILDLINK3_MK:=
BUILDLINK_API_DEPENDS.libspatialite+=libspatialite>=1.0
BUILDLINK_PKGSRCDIR.libspatialite?= ../../devel/libspatialite
BUILDLINK_LIBDIRS.libspatialite+= lib/spatialite
BUILDLINK_RPATHDIRS.libspatialite+= lib/spatialite
BUILDLINK_INCDIRS.libspatialite+= include/spatialite
.endif # LIBSPATIALITE_BUILDLINK3_MK
BUILDLINK_TREE+= -libspatialite
Then in the depending package add something like:
.include "../../devel/libspatialite/buildlink3.mk"