Installing atom packages with dependencies whilst offline - atom-editor

I've just joined a new office and their security is very tight. Essentially, we cannot go online without connecting to another machine. This means any applications that attempt to connect online won't connect to anything.
I'm trying to set up Atom for Python development (I've not used Atom before and it's all that's available to me!) - but the lack of internet is causing an issue.
I understand that to install a package, I can download it from GitHub and extract it to ~/.atom/packages - and this works! But what do I do with packages whose dependencies haven't been downloaded? Is there a simple way to get a package and its dependencies whilst offline?
I've also noticed that although my office has Atom installed, there are no 'apm' or 'npm' commands in the terminal... is this common?
thanks

Related

Best (standard) way to deploy the libraries for a shiny app to the shiny server?

I'm trying to get a shiny app deployed on Shiny Server. I can do that without any issues, but when trying to deploy an app that has a number of dependencies (remote and local) we keep running into issues.
We used renv to track the dependencies (on the Windows dev box) and rebuild it from scratch on the Linux prod box, but even though the dependencies are rebuilt and some get loaded, others do not. The .Rprofile of the user running the app is pointing to the renv activation script.
For the sake of clarity, we need and want all the R code to be built from the source code on the Linux box.
What is the best or standard way (or even a poor way that works) to deploy the libraries for a shiny app to the shiny server? Is renv even the right tool for this scenario or is there a better tool?
I've tried reading the Shiny Server documentation, and the closest it comes is mentioning that it uses the .Rprofile of the user running the app; there doesn't seem to be any sort of guide on the best way to deploy dependent libraries.
This renv documentation discusses some reproducibility caveats:
system dependencies, and
changes in CRAN (e.g. a binary no longer being available).
Since you are moving from a Windows to a Linux system, your packages may have unmet system dependencies (things that need to be installed outside of R) that you didn't encounter on Windows. For example, rJava is required for some of the Excel-related R packages, and getting its related system dependencies installed and working on Linux can sometimes be a challenge. You can use the RStudio Package Manager website to figure out what system dependencies are required for different R packages on your particular Linux OS. Also, the error messages you get when running these apps on Linux should point you in the right direction. These system dependencies are what you'll have to manage yourself, since renv doesn't handle them.
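If it helps, here is a minimal sketch of asking for those system requirements from within R. It assumes the remotes package (version 2.2.0 or later) and network access to the public Package Manager API; the "ubuntu"/"20.04" values and the rJava example are placeholders for your own setup.

    # Sketch: list the system libraries an R package needs on a given Linux distro.
    # Requires remotes >= 2.2.0 and access to the public Package Manager API.
    install.packages("remotes")

    sysreqs <- remotes::system_requirements(
      os         = "ubuntu",   # placeholder: your distribution
      os_release = "20.04",    # placeholder: your release
      package    = "rJava"     # the package whose system dependencies you want
    )

    # Each element is a shell command such as "apt-get install -y default-jdk";
    # review them and run them with root privileges outside of R.
    writeLines(sysreqs)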
But for a more production-level solution you can try Docker and ShinyProxy. For apps with many dependencies, or especially external dependencies (e.g. Python, SQL, etc.), you can guarantee more reproducibility using Docker. ShinyProxy can be used to host apps built into Docker images. This is more work, but you ensure the entire system is reproducible, not just the R version and R packages. ShinyProxy also adds additional hosting capabilities like user authentication.

Packrat::restore() for system dependencies

I was developing a shiny-app on a Windows machine in Rstudio. Now I need to develop it on a Linux machine and later on will need to deploy it on a server. Because of the need to run the app across platforms, I was looking into some environment control application so that I don't have to tune and reinstall the needed packages manually.
I tried to use Packrat for this purpose. I made a snapshot on my Windows machine, copied the project to my Ubuntu machine, reopened it in RStudio, installed Packrat, and the restore of the packages ran automatically. Then I ran into an issue: some of the packages that require system dependencies were not installed automatically, e.g. rgdal and jqr. Therefore, I had to install those system dependencies manually in a terminal (it took me a while because there were about 10 packages that required extra system dependencies).
I am wondering if there is an easier way to handle this automatically. Later on, I will need to work with a system administrator to deploy the app to the server. I am wondering if Packrat has the capability of automatically installing system dependencies on a Linux machine/server. If anyone has encountered this issue before, or has a better option, please let me know!
Thank you!
Hello and welcome to StackOverflow.
You are facing a question that is actually much harder to tackle than you may think at first: deployment of complex R package dependencies across different operating systems is a truly hard and, truth be told, unsolved problem!
You can of course use packrat and renv for R package dependencies and snapshots of particular versions. But this does not do anything for system-level dependencies, which are simply taken as "given". So you cannot just transfer the project to another box and say "abracadabra". Sorry!
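For the R-package side of things, here is a minimal sketch of the snapshot/restore cycle the question describes, using packrat (renv works analogously with renv::snapshot() and renv::restore()):

    # On the Windows development machine, inside the project directory:
    packrat::init()      # turn the project into a packrat project (run once)
    packrat::snapshot()  # record the exact package versions in packrat.lock

    # On the Linux machine, after copying the project over and opening it:
    packrat::restore()   # rebuild the recorded packages from source

    # Note: restore() only rebuilds R packages; system libraries (e.g. those
    # needed by rgdal or jqr) must still be installed with the OS package manager.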
The closest we may have gotten to fixing this is Docker, where you can create a portable unit of execution that can be deployed wherever Docker runs: Windows, macOS, different Linux flavours, ... as it encodes everything.

R-Studio setup with offline CRAN repository in Windows

I would like to know what the mechanism is, if there is one, for setting up a local CRAN repository in an environment that has no internet access. I have a Windows environment, but I would love to know if it can also be done in a Linux environment.
I have heard that I'll need a web server to allow RStudio to find the local repository. I'm not sure if that's true, but I would like to find out all the steps to set up RStudio with a local-repository environment.
The idea is to have a fully functional RStudio with the full CRAN repository available in an offline environment, where any package can be installed easily. I couldn't find any source/link online that details how this can be achieved.
I know RStudio offers a package management tool that provides this functionality, but I would like to get this done without spending any money.
I managed to solve this problem by creating a local web server using Apache and then downloading the full CRAN repository (Windows binaries only). I also had to edit my Rprofile.site file to point it at my local web server.
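For anyone trying the same thing, here is a minimal sketch of the R side of that setup. The C:/cran path, the R version in the path, and the http://localhost/cran URL are examples standing in for wherever your Apache document root and mirrored files actually live:

    # 1. (Re)build the PACKAGES index so install.packages() can see the binaries,
    #    in case the index files were not copied along with the mirror.
    #    Adjust "4.3" to the R version your binaries were built for.
    tools::write_PACKAGES("C:/cran/bin/windows/contrib/4.3", type = "win.binary")

    # 2. Point R at the local repository, e.g. in R_HOME/etc/Rprofile.site:
    local({
      r <- getOption("repos")
      r["CRAN"] <- "http://localhost/cran"   # URL served by the local Apache
      options(repos = r)
    })

    # 3. Installs then work offline against the local mirror:
    # install.packages("data.table")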

R: Terminal IDE for Centos6 server without admin access

I need a replacement for RStudio Server which I can install and manage myself on a remote server (no sudo access). Gedit + XQuartz on my MacBook performs very poorly due to the lack of integration with R.
I was looking at vimR, and it appears to have the functionality that I need and should be able to easily extend to Python and other programming languages, which is important. But I think this guide is out of date, and the installation of dependencies is convoluted and ultimately requires installing via the package manager which is not an option.
Are there other alternatives to this? Google has not brought up anything useful so far.

some issue with using atom sync-settings tool

At the risk of misunderstanding something, I can't get my packages to sync. I went through the following scenario:
I install new packages on Machine 1 and upload the settings through the "Sync Backup" command in Atom. I can see that the new packages are listed in the packages.json file in the gist.
On Machine 2, I restore the settings and can indeed see settings being restored, like keymaps. However, I don't get my new packages. I have restarted and reloaded Atom without luck.
Are there any extra steps I need to take to get the new packages on Machine 2?
You can try atom-package-sync. It is a package that I created a couple of weeks ago. It works a little bit like the synchronization in Google Chrome: you just log in and it syncs your packages and settings automatically across all your Atom instances. It is based on package-sync, but I find it easier to use.
