Move packages to a different directory (Julia 0.7)

I would like to move ~/.julia/packages to the julia installation directory.
What else should I move?
Which environment variables should I set?
JULIA_PKGDIR?
JULIA_DEPOT_PATH?
JULIA_LOAD_PATH?
push!(DEPOT_PATH, "newdirname") created the registries and compiled directories, but any newly added packages still do not get installed in the "newdirname" directory.
export JULIA_DEPOT_PATH="newdirname" removed ~/.julia/packages and replaced it with the new one, and packages started getting installed in the new directory. But when I move the entire installation, packages included, to a different machine without internet access, Julia forces me to rebuild the packages and then fails because there is no connectivity. So what would make this move work?

You can add the new path by putting the statement
push!(DEPOT_PATH, "newdirname")
in startup.jl.
CAUTION: Much of the installed code under the .julia directories contains text, written at installation time, with pathnames pointing to your old directories! I'd expect many installed modules to break their installed state if moved. Adding entries to DEPOT_PATH via push! is for adding additional places to load files from, NOT for moving existing installations!

Related

Installing R libraries by simply pasting the folders with packages to R-3.6.1\library location

I know there are plenty of ways to install packages, but I obtained a zipped folder containing lots of folders with R packages. Is it OK to simply unzip the folder and copy all the package folders into the R-3.6.1\library location? Will that work properly?
You're talking about the installed package folders that were built and installed on another computer?
You'd want to be sure they are from the same operating system and version of R.
It also depends on whether there are any unmet or potentially conflicting dependencies. If not, in theory it should work.
But you'll then get notified of package updates, so it will only save you the time of installing them in the first place.
You could always switch the package library location when starting RStudio by setting the R_LIBS_USER environment variable. Then update those packages (and get a sense of how safe they are), and you'll probably run into fewer issues when you soft-copy them across to your primary location; a sketch follows below.
And please back up the primary location first in case you need to restore it to that point.
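A minimal sketch of that environment-variable route, assuming the pasted packages sit in a hypothetical C:/R/pasted-library folder:
# Set R_LIBS_USER before starting RStudio (e.g. in your .Renviron file):
# R_LIBS_USER=C:/R/pasted-library
# Or, for the current session only, prepend the location by hand:
.libPaths(c("C:/R/pasted-library", .libPaths()))
# Update the pasted packages in place to see how healthy they are:
update.packages(lib.loc = "C:/R/pasted-library", ask = FALSE)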

How can I copy an entire renv-based project to a new PC (which does not have internet access)?

I have been given access to a beefy machine on which to run a large simulation.
I have developed the code in an RStudio project with renv. renv makes a local copy of all the packages and records their versions in a lock file.
The target machine (which runs Windows) does not have access to the internet. I have copied the project file, the code files, the renv folder (which includes all the local copies of the packages), the lock file, and the .Rprofile file to a folder on the target machine.
When I open the project on the target machine, the .Rprofile executes source("renv/activate.R"). However, this fails to load the project, instead giving me the following message:
The following package(s) are missing their DESCRIPTION files:
... Long list of packages ...
These may be left over from a prior, failed installation attempt.
Consider removing or re-installing these packages.
Trouble is, I can't reinstall them since this machine does not have access to the internet. I could manually go through each package, download the binaries on my work machine, transfer them over to the target machine, and install them one by one, but this seems like a very painful thing to do.
Is there a way for me to convince renv, or R itself, to just use the packages in the local renv folder?
From the Cache section of https://rstudio.github.io/renv/articles/renv.html:
When using renv with the global package cache, the project library is instead formed as a directory of symlinks (or, on Windows, junction points) into the renv global package cache. Hence, while each renv project is isolated from other projects on your system, they can still re-use the same installed packages as required.
I think that implies that copying the renv folder means copying those junction points (which are something like shortcuts / symlinks), not the actual underlying folders.
It looks like you already have a solution, but another option would be to call renv::isolate() to ensure that your project doesn't link to packages within the global cache, and instead just maintains an isolated library directly.
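For reference, a minimal sketch of the renv::isolate() route, run on the machine that still has internet access and assuming renv is already active in the project:
# Replace cache links with real copies inside renv/library:
renv::isolate()
# Refresh renv.lock so the library and lock file agree:
renv::snapshot()
# The project folder is now self-contained and can be copied wholesale.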
In the end I just wrote a small script to copy the files over.
sourceFolder <- "some/path"
targetFolder <- "some/other/path"
subFolders <- list.files(sourceFolder)
for (i in seq_along(subFolders)) {
  subFolder <- subFolders[i]
  file.copy(
    # file.path() inserts the separator that a bare paste0() would omit
    from = file.path(sourceFolder, subFolder),
    to = targetFolder,
    overwrite = TRUE,
    recursive = TRUE
  )
  message(subFolder)  # report progress as each package is copied
}

How to exclude a folder to be downloaded when hosting R package in github

My package is hosted on GitHub, and users can install it through devtools::install_github.
Now I'm using pkgdown to generate a documentation site, which created a 10 MB docs folder. Then I found that devtools::install_github always downloads the whole master zipball, which has become quite slow.
I tried to exclude the docs folder with these attempts:
.Rbuildignore: it turned out this only applies to the bundled package, while install_github installs the source package, so it doesn't work.
Putting the package in a pkg folder and the generated docs folder outside of it. However, the whole master zipball is still always downloaded, even with subdir = "pkg" specified.
Keeping development in one branch and creating a special package branch without the docs folder, merging the two branches but letting the package branch exclude docs. I tried to make .gitignore branch-specific, but that doesn't seem to work, so this appears to be impossible.
My newest attempt is to create a separate repo solely for the website and just let pkgdown create the website in that folder, like build_site(path = "../docsite/docs"). This should solve the problem and is simple and clean. The only imperfection is that the website URL will not follow the usual pattern.
EDIT: with the latest version of pkgdown there is no path parameter anymore; you need to specify the destination in the site configuration YAML, which works better (you don't need to pass it to every command).
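For illustration, a sketch of that configuration route, assuming a sibling docsite checkout; the destination field in _pkgdown.yml tells pkgdown where to write the site:
# In _pkgdown.yml (top level), point the output outside the package repo:
#   destination: ../docsite/docs
# Every build then lands in the external folder with no extra arguments:
pkgdown::build_site()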

R Packrat Fails to load private library

I have developed a solution using R and want to transfer it to the production server (CentOS 7), which has no internet connection for installing packages. To facilitate installation of the packages used in my R script, I bundled them into the project with packrat.
Using packrat::bundle(), I created a tar file of the project, moved the file to the server, and untarred it there.
According to a blog post, once I open the project, "When R is started from that directory, Packrat will do its magic and make sure the software environment is the same as on the source machine."
However, when I open the project on the server (using RStudio Server 0.99), nothing happens, and it throws errors about unknown packages.
When I manually execute the packrat/init.R file, the error below is thrown:
Error in ensurePackageSymlink(source, target) :
Target '/home/R_Projects/prjName/packrat/lib-R/base' already exists and is not a symlink
Well, I found the problem and solved it. The symlink error is related to CentOS (it is not related to R). I simply removed all the folders inside
/home/R_Projects/prjName/packrat/lib-R
Because these folders exist, packrat is unable to create symlinks with the same names inside the lib-R folder. Once they are removed, it creates a link (shortcut) to the actual folder where each R package is located.
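A minimal sketch of that cleanup done from R, using the path in the error message above:
libR <- "/home/R_Projects/prjName/packrat/lib-R"
# These real directories block packrat from creating same-named symlinks:
stale <- list.dirs(libR, recursive = FALSE)
unlink(stale, recursive = TRUE)
# Restarting R in the project directory then lets packrat recreate the links.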
Hope it helps future readers.

Move R packages to a new computer which has no internet

Normally I install packages using:
install.packages("foo")
and a repo over the internet. But I now have a new machine where I want to replicate the packages from my existing installation without having to pull everything off the internet all over again. (I have a ton of packages and slow internet access.)
Both machines are Windows and run the same R version. (2.13.1)
Is there a way to do this? The closest I can get is knowing I can install from local zip files using:
install.packages("pathtozip", repos = NULL)
But does R store all Zips somewhere? I found a few in locations like:
C:\Documents and Settings\foouser\Local Settings\Temp\RtmpjNKkyp\downloaded_packages
But not all.
Any tips?
The function .libPaths will give you a vector of all the libraries on your machine. Run this on your old machine to find all of them. You can simply copy all these files into the libraries on your new machine (run .libPaths on it too to find out where).
Alternatively, if you want to set up a real repository (i.e. basically a CRAN mirror) on your computer or on a network drive that you can update, you can put binary or source packages into a folder and run tools::write_PACKAGES on that folder. You can then run install.packages using the contriburl argument and point it to your repository folder.
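A sketch of that local-repository route with hypothetical paths, assuming the downloaded .zip binaries have already been copied into the folder:
# Index the folder of Windows binary packages:
tools::write_PACKAGES("C:/localrepo", type = "win.binary")
# Install from the folder instead of a CRAN mirror:
install.packages("foo", contriburl = "file:///C:/localrepo")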
All the packages that you have installed are stored in a folder called win-library\r-version, for example
C:\Users\Ehsan\Documents\R\win-library\2.15
So it is enough to copy all the folders inside 2.15 to the same folder on your new machine. Because you have the same version of R, you do not need to update them with update.packages().
On your original computer, run
write.csv(unique(data.frame(installed.packages())[, 1]), "packages.csv", row.names = FALSE)
Save this .csv into the working directory of your new computer, then run
install.packages(as.character(read.csv("packages.csv")[, 1]))
You can check what your working directory is using getwd().
