Global cache versus library -- what is used to load packages? - r

Can someone clarify how the global cache differs from a project library in renv? Does renv first search the project library, then the global cache?

The global cache is an implementation detail. It never determines which packages are part of the project library or what gets loaded in a project.
The only purpose of the global cache is, well, caching installed packages, which makes setting up a new project faster: rather than having to download and install a package, ‘renv’ can simply link the already installed, cached package into the project library:
Future calls to renv::restore() and renv::install() will become much faster, as renv will be able to find and re-use packages already installed in the cache.
Because it is not necessary to have duplicate versions of your packages installed in each project, the renv cache should also help you save disk space relative to an approach with project-specific libraries without a global cache.
If you prefer, you can (mostly) ignore the existence of the global cache entirely. ‘renv’ should work as if it didn’t exist.
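If you want to see where the cache lives, or opt a particular project out of it, renv exposes both directly; a minimal sketch (the cache path printed will differ per machine):

# Print the location of the global package cache
renv::paths$cache()

# Opt this project out of cache linking; packages are then installed
# directly into the project library instead of being linked from the cache
renv::settings$use.cache(FALSE)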

Related

Using Renv behind a proxy without password in plaintext

I'm working on R projects behind a proxy server, which is why I use the keyring library to store my proxy credentials and to authenticate with the proxy manually whenever it is required. This way, I don't need to write HTTPS_PROXY=http://usr:pw@proxy:port anywhere in plaintext, neither in global environment files nor project-wise. Of course, at runtime the session environment (Sys.getenv()) does contain this string, but at least only for the session.
So far so good. Now I need to use virtual environments because of some package version mismatches in my projects. For that I ran renv::init(). After closing and reopening the project, RStudio seems to freeze while loading it. I guess renv somehow tries to reach the package sources (some are on CRAN, some are on a local GitLab), which cannot work as the proxy is not set.
When I create a .Renviron including the proxy settings with my username and password, everything works fine.
Do you know a way to prevent renv from trying to connect to the package sources at project startup? Or do you think the problem lies somewhere else?
My best guess is that renv is trying to make a request to the active package repositories on startup, as part of its attempt to verify that the lockfile + library are in sync. If that's the case, you could disable this via:
RENV_CONFIG_SYNCHRONIZED_CHECK = FALSE
in your .Renviron. See https://rstudio.github.io/renv/reference/config.html for more details.
Alternatively, you could tell renv to load your credentials earlier on in a couple of ways, e.g.:
Try adding the initialization work to either your project .Rprofile or (with RENV_CONFIG_USER_PROFILE = TRUE) your user ~/.Rprofile;
Try adding the initialization code to a file located at renv/settings.R, which renv will source relatively early on during load.
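For the second suggestion, a minimal sketch of what such an initialization file could look like, assuming your credentials are stored in a keyring entry; the service name "proxy", the user "usr", and the host/port are placeholders, not real values:

# renv/settings.R (or the project .Rprofile): set the proxy before renv
# needs to contact any package repositories
pw <- keyring::key_get(service = "proxy", username = "usr")
Sys.setenv(
  http_proxy  = sprintf("http://usr:%s@proxy:port", pw),
  https_proxy = sprintf("http://usr:%s@proxy:port", pw)
)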

Maintain different versions of R package for open source contribution

Packrat is often recommended as the virtual-environment tool for R, but it doesn't fully meet my need of contributing to R open source. Packrat's "virtual environment" is stored directly in the project directory, requiring me to modify the .gitignore to ignore it when I make a pull request to the open-source upstream.
In contrast, something like conda stores the virtual environment somewhere else, leaving no trace in the project codebase itself.
So how do R open source contributors manage dependencies during package development? Ideally the solution would work well with devtools and RStudio.
There is nothing wrong with having Packrat's files in .gitignore.
You can use the .git/info/exclude file, thus avoiding touching the .gitignore at all.
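For example, with the default Packrat layout, appending something like the following to .git/info/exclude keeps the private library and sources out of your working-tree status without touching the upstream .gitignore (patterns follow Packrat's own .gitignore recommendations):

packrat/lib*/
packrat/src/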

Julia: Recompiling stale cache file?

The following message appears every time I attempt to use 'Gadfly', 'Bio', or several other packages (I'm using 'Bio' in the example):
julia> using Bio
INFO: Recompiling stale cache file C:\Users\CaitlinG\emacs251\.julia\lib\v0.5\Distributions.ji for module Distributions.
INFO: Recompiling stale cache file C:\Users\CaitlinG\emacs251\.julia\lib\v0.5\Bio.ji for module Bio.
Julia 0.5.1 (all packages updated)
Windows 10 (fully updated)
Emacs 25.1
This is inconvenient since I can only assume it is not a "typical" component of importing a package. Can the issue be resolved by deleting the .julia directory?
Thanks.
Moving my comment to an answer since that appears to have resolved the question:
Julia caches its precompiled output within the .julia/lib folder. If any of the files there are older than the original source, it'll recompile them. It seems like Julia was having trouble overwriting the cache for a few specific packages here, so it was persistently recompiling them. By deleting the lib folder, you clear out these caches. Julia will recompile all packages, but it should now write them with the correct permissions that will allow them to be overwritten in the future.
Deleting the entire .julia folder is a much more drastic step that risks losing edits you've made to packages, and you'd need to re-install all the packages you've added.
The messages about recompiling a stale cache file are not warnings, but rather for your information. It means that something changed on your system and the current cache file is no longer considered valid, so instead of providing you a potentially old cachefile, Julia automatically cleans up and recompiles the cache.

Where does Meteor store atmosphere package templates/files

How can I modify the source/templates of packages that I am using inside my Meteor project?
For example, I download a package from Atmosphere, begin using it, and then want to make some minor tweaks to it.
What is the best approach to making changes to an installed Atmosphere package?
You should fork the package from the GitHub source and make changes to your own version. If you think those changes might benefit others, you can make a pull request for your changes and, if the original author(s) agree, your changes can now benefit others too.
If you make changes in the deployed package under your project directory, then those will probably be clobbered the next time you run $ meteor update.
If you're still curious as to where they live, look under /.meteor

Including patches with an installer using a basic msi Installshield project

I'm currently stuck with an InstallShield project for installing our ASP.NET application and need to implement upgrading. From my initial investigation, it seems extremely complicated for what is essentially copying over a number of files.
Of the options available (patches, and small, minor, and major upgrades), what seems to suit our needs best is a patch, but it is delivered as a separate .exe.
Is there a way to include patches in the full setup.exe, or is there another recommendation that makes the whole process less complicated?
EDIT
Any alternative recommendation still needs to be delivered as part of an installer.
No, there is no way to include patches in the installer setup.exe. Patches, as well as small and minor updates, are applied to an already installed application; that is, users have already used the original installation package to install your application, and a patch contains only the small set of files that are modified.
What you want is a major upgrade. This kind of package contains all the required files, and it can be used to install the application for the first time. In cases where the application is already installed, this kind of installation package will automatically remove the old version and install the new one.
If it involves only copying files, then IMO the best option is to provide the files in the required directory structure and ask the user to overwrite the existing copies. A slightly more user-friendly approach would be to zip up the directory structure along with a batch file, ask the user to unzip it into a designated folder under the application directory, and then run the batch file to overwrite the files.
