We recently decided to make the Julia language available on our cluster systems. The cluster system is not able to connect to the internet.
Is there any way to download all Julia packages and make them available for our different users to install and use offline?
Another option we have is a system that can connect to the internet temporarily, but it is always connected to the main cluster system. Is there any way to use this system as a mirror for the Julia packages?
We want to use Julia 1.0.1.
Our cluster operating system is CentOS 5.5.
Note: I have seen this question asked before, but that one is for Julia 0.6 and a single package copied by hand. I want users to be able to run the Pkg.add("pkgName") command, but have the package manager fetch the packages from our offline system instead of the internet.
Thank you for your help and time.
Caution:
Side effects are not known!
Please test thoroughly before putting this into production!
a) Collect the required packages, along with their dependencies, in compiled form and put them in the stdlib folder (for example: /opt/julia/julia-1.1.0/shared/julia/stdlib/v1.1/).
b) Add the stdlib path to the environment variables JULIA_DEPOT_PATH and JULIA_LOAD_PATH, as in the sketch below.
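A minimal sketch of step b), assuming the packages were collected under the example path from step a):

# Hypothetical stdlib path from step a); adjust to your installation
STDLIB=/opt/julia/julia-1.1.0/shared/julia/stdlib/v1.1
export JULIA_DEPOT_PATH=$STDLIB
export JULIA_LOAD_PATH=$STDLIB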
The following is a crosspost of https://stackoverflow.com/a/74800608/18431399
PackageCompiler.jl seems like the best tool for using modern Julia (v1.8) on secure systems. The following approach requires a build server with the same architecture as the deployment server, something your institution probably already uses for developing containers, etc.
Build a sysimage with PackageCompiler's create_sysimage() (a sketch follows at the end of this answer)
Upload the build (sysimage and depot) along with the Julia binaries to the secure system
Alias a script to julia, similar to the following example:
#!/bin/bash
set -Eeu -o pipefail
unset JULIA_LOAD_PATH
export JULIA_PROJECT=/Path/To/Project
export JULIA_DEPOT_PATH=/Path/To/Depot
export JULIA_PKG_OFFLINE=true
/Path/To/julia -J/Path/To/sysimage.so "$@"
I've been able to run a research pipeline on my institution's secure system, for which there is a public version of the approach.
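For step 1, the sysimage build on the build server might look something like this (the project path and package names here are placeholders, not part of the original pipeline):

# Build a sysimage containing the project's packages (names are hypothetical)
julia --project=/Path/To/Project -e 'using PackageCompiler; create_sysimage(["DataFrames", "CSV"]; sysimage_path="/Path/To/sysimage.so")'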
Related
I've created a program that runs in R that I plan on distributing to a lot of other people. Currently the R script is run completely automatically and behind the scenes with one .sh script, which is exactly how it is intended to be: I'm trying to make it so there's no need for client intervention. The R script itself loads the packages and installs them if they aren't present, which takes away the task of installing the packages themselves.
Is there a way I can provide a folder within my application's folder, which they already download, that contains Rscript and its dependencies, so the code can use that location of Rscript to run the R program I have created? The goal is for them to be able to download it and run it without needing an internet connection to download R, and maybe even the required packages if possible.
Any help or ideas are appreciated.
I assume the process you want is called "creating a binary package". Binaries are programs (like EXE files) that can run directly on the target CPU without any interpreter software (like the Python interpreter for Python scripts, or the Java VM for Java applications). I'm not so familiar with the packaging of R programs, but I found some materials regarding this issue:
1 - Building binary package R
2 - https://seandavi.github.io/post/build-linux-r-binary-packages/
3 - https://support.rstudio.com/hc/en-us/articles/200486508-Building-Testing-and-Distributing-Packages
The second link assumes Linux as the target system. In contrast to interpreted languages, binary files are often OS dependent (Linux, Windows, or Mac). I personally don't know how compatible packages are between Linux systems with different library sets.
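As a rough sketch, on a machine matching the target Linux system a binary package can be built from a source tarball like this (the package name is a placeholder):

# Produces an OS-specific binary package next to the source tarball
R CMD INSTALL --build mypackage_1.0.tar.gz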
Please comment if you find some information misleading, I'll correct the answer.
I'm trying to get a shiny app deployed on Shiny Server. I can do that without any issues, but when trying to deploy an app that has a number of dependencies (remote and local) we keep running into issues.
We used renv to track the dependencies (on the Windows dev box) and rebuild it from scratch on the Linux prod box, but even though the dependencies are rebuilt and some get loaded, others do not. The .Rprofile of the user running the app is pointing to the renv activation script.
For the sake of clarity, we need and want all the R code to be built from the source code on the Linux box.
What is the best or standard way (or even a poor way that works) to deploy the libraries for a shiny app to the shiny server? Is renv even the right tool for this scenario or is there a better tool?
I've tried reading the Shiny Server documentation, but the closest it comes is mentioning that it uses the .Rprofile of the user running the app; there doesn't seem to be any sort of guide on the best way to deploy dependent libraries.
This renv documentation discusses some reproducibility caveats:
system dependencies, and
changes in CRAN (e.g. a binary no longer being available).
Since you are moving from a Windows to a Linux system, your packages may have unmet system dependencies (things that need to be installed outside of R) that you didn't encounter on Windows. For example, rJava is required for some of the Excel-related R packages, and getting its related system dependencies installed and working on Linux can sometimes be a challenge. You can use the RStudio Package Manager website to figure out what system dependencies are required for different R packages on your particular Linux OS. Also, the error messages you get when running these apps on Linux should point you in the right direction. These system dependencies are what you'll have to manage yourself, since renv doesn't handle them.
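As a sketch, the rebuild on the Linux box is then two steps: install the system dependencies outside of R, then let renv restore the packages recorded in the lockfile (default-jdk is just an example of a system dependency; use yum instead of apt on RHEL-family systems):

# Example system dependency, e.g. for rJava (installed outside of R)
sudo apt-get install -y default-jdk
# Restore the R packages pinned in renv.lock
Rscript -e 'renv::restore()'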
But for a more production-level solution you can try Docker and ShinyProxy. For apps with many dependencies or especially external dependencies (e.g. Python, SQL, etc.) you can guarantee more reproducibility using Docker. ShinyProxy can be used to host apps built into docker images. This is more work, but you ensure the entire system is reproducible, not just the R version and R packages. ShinyProxy also adds additional hosting capabilities like user authentication.
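A minimal sketch of that workflow, assuming you already have a Dockerfile for the app (the image name and port are hypothetical; ShinyProxy is then configured to launch this image):

# Build the app image and test it locally before registering it with ShinyProxy
docker build -t myorg/my-shiny-app .
docker run --rm -p 3838:3838 myorg/my-shiny-app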
I have an R Notebook that I am building to provide an analysis for somebody, and I am wondering if I should choose another option as I don't know if she will be able to run the Notebook without having R installed.
Is it possible to run an R Notebook as a single entity or must you have R installed in order to do it?
To rerun the notebook, they require R. But the whole point of R Notebooks is that they produce a static document as output. That document (usually in HTML format) can be shared in isolation, and does not require any additional software besides a web browser to be viewed.
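For example, rendering the notebook once on your own machine produces the shareable HTML (the file name is a placeholder):

# Produces a standalone HTML document viewable in any browser without R
Rscript -e 'rmarkdown::render("analysis.Rmd")'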
The notebook will need R to run. Distributing a notebook without the R dependency would be a bit more elaborate, for example by installing RStudio Server within a Docker container. In that particular case, the user will need to have Docker installed and know how to start a container; from there on, the user can interact with the code through a web browser.
Another option would be to use the cloud solution that some companies offer. It offers sharing functionality and you don't have to worry about the infrastructure or distribution of your work. There are some free plans that may work for you, but the real power is in premium features.
This is sort of related to a previous post of mine. I have the need to use the bigmemory library on my 32-bit Windows PC to do some ugly matrix calculations. Unfortunately, it appears that the maintainers have temporarily ceased production of Windows binaries. I have Ubuntu on my home PC. I would really like to take the .tar.gz file and build it into a Windows binary that I can actually run at work. I realize there are more efficient ways, like installing Rtools on the Windows device. However, our IT keeps our admin rights on lockdown, so I can never edit my PATH environment variable. Could anyone provide some general guidance for doing this? Are there any tools I need to install on my Ubuntu PC above and beyond R?
I found similar questions, but nothing that thoroughly answered my questions.
Unless the package source is incompatible with current versions of R, you could use the R project's win-builder site to build a Windows binary. Quoting from the linked site, win-builder is a service:
intended for useRs who do not have Windows available for checking and building Windows binary packages.
As a convenience, Hadley Wickham's devtools package includes a utility function, build_win(), that you can use for this purpose. From ?build_win:
Works by building source package, and then uploading to http://win-builder.r-project.org/. Once building is complete you'll receive a link to the built package in the email address listed in the maintainer field. It usually takes around 30 minutes.
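So, from the package's source directory, the upload is a one-liner (assuming devtools is installed):

# Builds the source package and uploads it to win-builder; a link to the binary arrives by email
Rscript -e 'devtools::build_win()'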
Windows has four sets of environment variables (system, user, volatile, and process sets). The first three sets are stored in the registry, but the process set is not, so even if they have locked down the registry it's typically still possible to set process environment variables (including the PATH) in a local process, i.e. on a temporary basis. So you might double-check your assumption that you can't modify anything: it's more likely that you can't modify the system variables and registry but can still modify the set in your local process. To check this from the Windows cmd line, enter this:
set mytest=123
set mytest
and if the second line shows that mytest has the value 123 then you likely have all the permissions you need.
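The same trick works for the PATH itself, so you can prepend Rtools and R for the current cmd session only (the install paths below are just examples):

rem Prepend Rtools and R to the PATH for this process only
set PATH=C:\Rtools\bin;C:\Program Files\R\R-3.0.0\bin;%PATH%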
Furthermore, anything you need to set is handled automatically for you by R.bat in the batchfiles distribution, so you don't have to set anything yourself.
Just ensure that Rtools and R are installed into the standard locations (you can tell them to skip the setting of any registry keys during the installation process), ensure R.bat is on your path or in the current directory, and run:
R.bat CMD INSTALL mypackage.tar.gz
without setting environment variables, registry keys or path.
If that does not work, try Rpathset.bat, also from the batchfiles, which is not automatic like R.bat but on the other hand is extremely flexible, since you modify the SET statements in it to whatever you want.
There is a PDF document that comes with the batchfiles which gives more info.
On Windows, applications are typically packaged as MSI; on Red Hat Linux, as RPM. What would be the best open source packaging method to deploy applications to all platforms, including different flavors of Unix and Windows?
Contents would include EXEs, Unix binaries, Java JAR files, user data, and even database scripts to be run.
(I recognize contents would vary per destination OS, i.e. binaries would be different, Windows EXE vs. Unix binary, etc., but config files, for example, may be the same, or in the case of Java even the bytecode JARs.)
The key feature I'd like the packaging to support is different users and permissions for different directories; however, I recognize that supporting this feature across platforms may be very difficult.
Rather than build a package that is supposed to work across all of your platforms, which is likely impossible, you should have your build system build different packages for each target platform.
With CPack (it comes with CMake) you can create packages for Windows (with NSIS), Linux (RPM and DEB), and OS X with "make package". CMake also simplifies cross-platform building.
For a sample you can look at avogadro's CMakeLists.txt and AvoCPack.cmake
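As a rough sketch of the workflow (the generators to use are configured in CMakeLists.txt or chosen per platform):

# Configure and build out-of-source, then let CPack generate the package
mkdir build && cd build
cmake ..
make package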
I have a client that uses IzPack to create a single installer (it's Java-based) that installs their app on Windows, OS X and Linux.
http://izpack.org/
NSIS is an open-source solution which, as far as I know, is able to build installers that run on Windows and Unix-likes alike. However, for software deployment on Windows (especially in corporate environments), MSI is the way to go, and NSIS is more of a headache.
So I wouldn't advise that you try to build a single package/installer for different platforms. Rather, as RibaldEddie indicated, build multiple packages: one for each platform. That also allows you to restrict the contents of each package to the files relevant to that platform.
If you'd like to support packaging for multiple distributions, I'd suggest helping the packagers for those distributions out; use some sort of well-known build system for your software (GNU's autotools or something like scons or waf), and document the build, optional dependencies, and so forth pretty well.
That way, when a Debian, Ubuntu, Red Hat, SuSE, whatever, packager comes along, they'll be able to create the package for you. You can optionally include packaging templates for one or more distributions in a separate VCS tree that is available, if you'd like.
If you are looking at packaging a closed-source/proprietary application for multiple systems, you'd probably do best to package up a .tar.gz file and document the installation process for it. You'll also want to make sure that the build process used doesn't embed any path information into the application, so that it can be run in /opt, /usr, or /usr/local, which are some popular choices for third-party add-on software.
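A minimal sketch of that approach (the names and the /opt prefix are examples):

# Package the relocatable application tree as a tarball
tar -czf myapp-1.0-linux-x86_64.tar.gz myapp-1.0/
# The user unpacks it under a prefix of their choice
tar -xzf myapp-1.0-linux-x86_64.tar.gz -C /opt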
BitRock InstallBuilder allows you to create installer packages for each of the platforms you mentioned (as well as RPM and DEB packages, etc., from a single project file).