R execute script/function during installation

I am developing a package and I would like to know the best way to execute a script at installation time.
Essentially I want to make sure that third-party tools are installed,
retrieve them when I can and raise an error if some dependency is missing.
I am not referring only to R packages, but also to system-wide headers, fonts and similar dependencies.
What is the best strategy to follow in this case?

Scripts to be run before installation should be placed in an executable file called configure (which is executed on Linux/Unix/macOS) or in a file called configure.win (which is executed on Windows).
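A minimal sketch of such a configure script, assuming the third-party tool to check for is pandoc (substitute whatever your package actually needs):
#!/bin/sh
# Abort the installation early if a required system tool is missing.
if ! command -v pandoc >/dev/null 2>&1; then
  echo "ERROR: 'pandoc' is required to install this package but was not found on PATH." >&2
  exit 1
fi
exit 0
The file must be executable (chmod +x configure) and sit at the top level of the package source; configure.win follows the same idea for Windows.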

Related

Julia package available from a registry

I added the package Knet with Pkg.add("Knet") and noticed that several packages were installed including CUDA. However, after the installation finished when I try:
using CUDA
it says that this package is not found but that it is available from a registry. It seems that this package is a dependency of Knet and it does get installed, but then one cannot access it right away. Do you know what is happening behind the scenes? Thanks.
The underlying mechanism is a bit complex, and is described in detail here.
But the general logic is as follows: you can use (with using or import) the packages that you have explicitly installed. However, such packages might depend on other packages. Julia will automatically decide which other packages need to be installed, but they will not be visible in your project unless you explicitly install them.
In fact, typically, on one computer you will have hundreds of packages installed in one place (to avoid having to download and precompile them each time), but each individual project will have access only to the packages that you explicitly specify you want to use in that project. The information about which packages should be visible in an individual project is typically contained in the Project.toml file, as described here.
You can find more information on how to manage projects in Julia here.
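So, to load CUDA directly in your own code, the usual fix is to add it to the active project explicitly, for example:
using Pkg
Pkg.add("CUDA")   # records CUDA in the active project's Project.toml
using CUDA        # now it can be loaded directly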

How to make sure the user of a shiny app is using the right package versions in R

Due to recent experience with several bugs created by updating packages, I wonder what the best approach is for the following problem:
I currently provide a stand-alone version, so to speak, of my shiny app (just the script files to run it locally) and run a long list of require() calls to load / install the needed packages. However, in the end I would like to use fixed package versions to avoid bugs introduced by changes in packages.
Is there a way to ensure that the user, who may have older or newer versions of packages on their computer, is using the right version of all the packages my app needs?
You can consider using packrat: https://rstudio.github.io/packrat/.
Unfortunately, private libraries don’t travel well; like all R libraries, their contents are compiled for your specific machine architecture, operating system, and R version. Packrat lets you snapshot the state of your private library, which saves to your project directory whatever information packrat needs to be able to recreate that same private library on another machine.
Short tutorial:
RStudio - File - New Project - New Directory - New Project - "Do: use Path" - Create Project
Enter in the R(Studio) console:
Code:
packrat::init()
.libPaths() # test if libpath has changed
install.packages("reshape2") # installs within one of the packrat libpaths
Installing package into ‘C:/R/packRatTest/packrat/lib/x86_64-w64-mingw32/3.4.3’
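Once the packages you need are installed, the snapshot/restore cycle looks roughly like this (all three are functions from the packrat package):
packrat::snapshot()   # record the exact package versions in packrat.lock
packrat::status()     # check whether the library and the snapshot are in sync
packrat::restore()    # on another machine: reinstall the recorded versions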
The assumption is that you can use and share RStudio Projects, but I think it would be hard to work without them anyway ;).
Try writing your shiny app as a package. You can, somewhat, control that through the DESCRIPTION file.
Since you said you're using scripts, take a look at: https://github.com/chasemc/electricShine
Even if you don't use it, hopefully looking at the code will help with things like setting the download repo to a specific MRAN date.
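As a rough illustration of that idea (the snapshot date below is arbitrary), pointing install.packages() at a dated MRAN snapshot pins every package to the versions published on that day:
snapshot <- "https://mran.microsoft.com/snapshot/2019-03-01"  # illustrative date only
install.packages(c("shiny", "reshape2"), repos = snapshot)    # versions as of that date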

Don't use R system library

I'm trying to use a linux server with R installed. Apparently the R system library has old versions of non-base packages installed like dplyr and testthat.
Because I don't have permission to edit the system library, I'm unable to update the packages.
My plan is to use only a user library, so I can control the package versions myself. However, I'm unable to remove the "/usr/lib64/R/library" folder from .libPaths(). I tried changing the environment variables R_LIBS_SITE and R_LIBS with the .Renviron and .Rprofile files to a different folder, but the /usr/lib64/R/library folder will always be present. Removing it with the command .libPaths(.libPaths()[1:2]) doesn't work either.
Is there a way to remove the system library from .libPaths(), so I'm not depending on the update policy of the server admin?
You can't remove the system library, because that's where the base packages live. They can't be installed anywhere else, and R won't work without them.
Best would be for you to get your sysadmin to update the system library. Those obsolete packages probably contain bugs.
If you can't do that, then run update.packages(instlib = "local") to install all the latest versions in the library named "local". (Substitute your own local lib name, of course.) This requires all your users to specify .libPaths("local") when they start, and some will likely forget, so it's not as good.
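A minimal sketch of that approach, using ~/Rlibs as an example library path (pick whatever location you prefer):
lib <- path.expand("~/Rlibs")                  # example path for a personal library
dir.create(lib, showWarnings = FALSE)          # create it once
.libPaths(c(lib, .libPaths()))                 # search the personal library first
update.packages(instlib = lib, ask = FALSE)    # install current versions into it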
It might be easiest for you to just install a full copy of R in your own account. Then you'll have control of things, and anyone using your copy will get your library.
(There's a new release (3.5.3) coming in ten days; you might wait for that, or install one of the betas or RCs, which should be available now, then update again when the final release arrives.)
For me, it works to use
.libPaths(.libPaths()[2:1])
This will still search the system library, but only after it searches my personal library, so if I have a newer version, it uses that. Note that I used .libPaths()[2:1], not .libPaths()[1:2].

Automatic network Installation

Recently I have been attempting to install a software package automatically over a moderately sized computer network for a university lab. The package is aspenOne v8.4. I have tried creating response files, but every time I finish the install I am unable to locate the response file. I included the f1 flag option to specify file location and name, and I have also checked the %systemroot% directory as well, to no avail. I do not believe one is being created. I have also tried creating a response file for the msi package included on the disc with msiexec, but I was unsuccessful in that path as well. The install requires too much time to install manually across the network, so I was hoping there might be another option to automatically deploy the application package throughout the network. Is there a way to do this?
Thank you in advance.
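For reference, the record/playback pattern described here usually looks something like the following with generic InstallShield/MSI tooling (the file names and paths below are placeholders, and whether aspenOne's setup honours these switches is exactly the open question):
rem record a response file during one interactive install
setup.exe /r /f1"C:\temp\aspen.iss"
rem play it back silently on other machines
setup.exe /s /f1"C:\temp\aspen.iss"
rem for a plain MSI, a silent install with verbose logging
msiexec /i AspenOne.msi /qn /L*V C:\temp\aspen_install.log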

Running qcollectiongenerator during application compile process

I've been working on a program called RoboJournal for a long time. The next release has full documentation included: whenever the user presses F1 or clicks the Help item in the RoboJournal program, the help file is displayed in Qt Assistant (way classier than simply opening a browser window to some online documentation).
In its base form, the documentation consists of lots of loose HTML and image files included in the source package. These loose files are supposed to be compiled into a QCH compiled help file and QHC collection file during build time so Qt Assistant can display the documentation properly. On Windows, this was fairly easy because I was able to write a batch script to automate the entire build process (including compiling the documentation and moving the output files to the right place).
On Linux, it's a bit more complicated. True, I could write a Bash or Perl script that compiles the documentation along with the rest of the program but I have no guarantee that the people who will eventually create my app's Debian packages from the source package I give them will use the script. The source package is used to create all the Debian packages so everything has to work flawlessly with the standard build procedure (or the source package is worthless). Therefore, I need the compile process to produce the same results whether the user runs the script or not. As it is now, the user has to build and install the documentation manually. Surely there's some way to automate this.
Is it possible to have Qmake add instructions to run qcollectiongenerator to the makefile (in order to build my application's help files) so it gets handled properly during the "make" step? That way, the QHC and QCH files will be ready to install to their proper locations (in my case, /usr/share/doc/robojournal-0.4.1) along with everything else when the user runs "make install". I've considered compiling the QCH and QHC files in advance and providing them in the source package but the whole point of building from source is to be able to re-create the entire app from its base components.
I know I probably have to add additional instructions to my .PRO file but I’m not sure what or how. I've found something that looks promising (http://www.qtcentre.org/archive/index.php/t-49484.html) and gives me hope that it's possible for Qmake to do what I need but I’m not sure how applicable those instructions are to my situation. Do I have to create a PRI file just for this or can I add the instructions directly to the main project file?
You can try to use the QMAKE_POST_LINK variable:
QMAKE_POST_LINK += build_help.sh
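If the goal is for make itself to build and install the help files, one rough sketch of the .pro additions could look like this (the doc/ layout and file names are assumptions based on the question, and qcollectiongenerator is assumed to be on PATH):
doc_dir = $$PWD/doc
qhc_out = $$doc_dir/robojournal.qhc
# Extra target: produce the .qhc (and referenced .qch) during "make".
helpcomp.target   = $$qhc_out
helpcomp.commands = qcollectiongenerator $$doc_dir/robojournal.qhcp -o $$qhc_out
helpcomp.depends  = $$doc_dir/robojournal.qhcp
QMAKE_EXTRA_TARGETS += helpcomp
PRE_TARGETDEPS += $$qhc_out
# Install the compiled help during "make install".
helpfiles.files = $$qhc_out $$doc_dir/robojournal.qch
helpfiles.path  = /usr/share/doc/robojournal-0.4.1
INSTALLS += helpfiles
Whether this lives in the main .pro file or in a separate .pri that you include() is mostly a matter of taste; the effect is the same.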
P.S.
I have no guarantee that the people who will eventually create my app's Debian packages from the source package I give them will use the script.
I think this is really not your problem :) It's up to them to properly build the package.

Resources