At the risk of having misunderstood something, I can't get my packages to sync. I went through the following scenario:
I install new packages on Machine 1 and upload the settings through the "Sync Backup" command in Atom. I can see that the new packages are listed in the packages.json file in the gist.
On Machine 2, I restore the settings and can indeed see settings being restored, like keymaps. However, I don't get my new packages. I have restarted and reloaded Atom without luck.
Are there any extra steps I need to take to get the new packages on Machine 2?
You can try atom-package-sync. It is a package that I created a couple of weeks ago. It works a little like Google Chrome's synchronization: you just log in and it syncs your packages and settings automatically across all your Atom instances. It is based on package-sync, but I find it easier to use.
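If you prefer the command line, installing it with Atom's package manager should be as simple as:
apm install atom-package-sync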
Related
I have a new computer on the same network as the old one. On the new one, I first installed RStudio, then the latest version of R (I hope the order does not matter). Now the new R installation gets an "unable to access index for repository" error for every library. A simple task has turned into hours of googling, during which I tried the following:
tick or untick "Use secure download method for HTTP" in Global Options (it was ticked on the old computer)
try different repositories close to my location in the global settings
copy the folder of an existing package (like ggplot2) from the old computer to the new one's library folder (it does not become visible in the Packages pane)
uninstall and reinstall R
compare installations (the only difference is that on the old computer the Rtools folder is under C:/Program Files, while on the new one I accepted the default location, C:/Rtools40)
Please note that both computers are on the same corporate network, so the firewall and other network settings are the same.
Finally I found the reason: our company uses an internal repository that mirrors an external one. After adding the custom repository, everything worked perfectly.
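For anyone hitting the same thing, the repository can also be set in code. A minimal sketch (the URL below is only a placeholder for your company's internal mirror):
# In .Rprofile or at the start of a session; replace the URL with your internal CRAN mirror
options(repos = c(CRAN = "https://cran.mycompany.example"))
install.packages("ggplot2")  # should now resolve against the internal mirror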
I hope this helps if someone runs into a similar problem.
Thanks everyone for your comments!
My problem is that I can't use RStudio at my workplace, as IT does not support it. I want to use the R and RStudio installed on my personal laptop from my company laptop (using a modern browser, which is behind a firewall). I am thinking of two options:
Should I build a Docker image for R and RStudio (I see base images are already available)? I am mostly interested in basic R and the dplyr, haven, xporter, and reticulate packages.
Should I use Binder? I am not a technical person and my programming skills are very limited, so can anyone suggest a way?
What exactly is the difference between the Docker option and Binder?
I know I can use RStudio online and get my work done, but with the new paid account I am running out of project hours and it is sometimes very slow. Thanks in advance.
Here are some examples beyond the modern RStudio MyBinder example:
https://github.com/fomightez/pythonista_skewedf
https://github.com/fomightez/r_phylogenetics_worshop
https://github.com/fomightez/chapter7/tree/master/binder
The modern RStudio MyBinder example has been set up as a template on GitHub, so you can use it as a starting point for your own repository.
The first one is for a special use of a package not on conda. And I started that one from square one.
The other two were converted from content by others to aid in making them Binder-ready.
You essentially list everything you need from conda in the environment.yml, along with the appropriate channels. If you need special stuff that isn't on conda, you need the other configuration files included there.
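A minimal sketch of what such an environment.yml can look like (the package list is purely illustrative, and the RStudio MyBinder template already includes the other pieces needed to launch into RStudio):
channels:
  - conda-forge
dependencies:
  - r-base
  - r-dplyr
  - r-haven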
Getting everything working can take a few iterations of adding things, letting the image build, and testing that your libraries are available, though it sounds like your situation is not overly complex.
The Binder launch badges you see are just images whose URL points the MyBinder federation at your repository. Look at the URL and you should see the pattern: something like rstudio goes at the end of the URL pointing at your repo so the session opens in RStudio. The form at MyBinder.org can help with this; however, it is often easier to adapt a working launch badge copied from elsewhere, since the form isn't currently set up to make the URLs that launch into RStudio.
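For reference, a launch URL that opens the session in RStudio typically looks something like this (the username and repository parts are placeholders):
https://mybinder.org/v2/gh/<your-username>/<your-repo>/HEAD?urlpath=rstudio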
Download anything useful you create in a running session. The sessions time out after about 10 minutes of inactivity, although RStudio usually keeps them active.
Lack of persistence and limited memory, storage, and power can be drawbacks. The inherent reproducibility and portability are advantages.
MyBinder.org doesn't work with private repos. If you have code you don't want to share, you can upload it into the temporary session and use the repo only for specifying the environment. You could host a private BinderHub that does allow the use of private git repositories; however, that is probably overkill for your use case and would exceed your ability level at this time.
GitHub isn't the only place to host repositories that can be pointed at the MyBinder system. If you go to the MyBinder.org page and click where it says 'GitHub' on the left side of the top line of the form, you can see a list of the sources where you can host a repository and have the system build an image and launch a container from it.
Building the image from a source repository takes a few minutes the first time. Once the image is built on the service, though, launch typically takes less than 30 seconds. Each time you make a change to the source repo, a rebuild is necessary. Some changes don't make the new build as long as the initial one, since some optimization is done to rebuild only what is necessary after a change. Keep in mind there are several members of the federation around the world, and if your traffic gets sent to one where the built image isn't yet available, it will be built from scratch there first.
The Holepunch project exists to offer some help for users working in the R ecosystem; however, with the R-Conda system that is now integrated into MyBinder, it is pretty much as easy to do it the way I described. Last I knew, the Holepunch route produces a Dockerfile, which isn't as easy to troubleshoot as the current R-Conda route. Dockerfiles are essentially a last-resort configuration file that MyBinder can handle; the other configuration files are much easier and don't require knowing Dockerfile syntax. MyBinder aims to let you take advantage of Docker containers with a specified environment without needing to know anything about Docker.
There is a Binder help category for getting help at the Jupyter Discourse forum. Some of the posts already there may help you troubleshoot.
Notice of a common pitfall
Most of the configuration files for making a repository Binder-ready are simply text and can be edited right in the GitHub browser interface, without needing git or even cloning the repo locally.
Last I knew, there are two exceptions to this. The postBuild and start configuration files have permissions that allow them to be run as scripts, and these get altered so the files no longer work if you edit them via the GitHub browser interface. (This was my experience when I last tried; your mileage may vary, or things may have changed by now.) To edit those, you need git available on a system you control: pull a working copy from some other source, edit it on the machine where git works, add it to your repo, and push it back up from your local computer.
(If this is a problem, you can post in the Binder help category of the Jupyter Discourse forum, and you and I could coordinate: I fork your repo, edit those files to your specifications, and then make a pull request so you can bring those changes back into your repository.)
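For reference, a rough sketch of that local round trip (the repository name is a placeholder; this assumes git and a shell where chmod is available):
git clone https://github.com/<your-username>/<your-repo>.git
cd <your-repo>
# copy in or edit postBuild locally, then make sure it keeps the executable permission
chmod +x postBuild
git add postBuild
git commit -m "Add executable postBuild"
git push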
If you are using Jupyter notebooks extensively, then it may make sense to use Binder.
But if you simply want to use R and RStudio, then all you need is Docker. A good resource is
https://github.com/rocker-org/rocker
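For example, running the RStudio image from that project locally typically looks like this (the password value is a placeholder you choose yourself):
docker run --rm -p 8787:8787 -e PASSWORD=yourpassword rocker/rstudio
Then open http://localhost:8787 in your browser and log in as user rstudio with the password you set.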
Due to recent experience with several bugs created by updating packages, I wonder what the best approach is for the following problem:
I currently provide a stand-alone version, so to say, of my Shiny app (just the script files to run it locally) and run a long list of require() calls to load / install the needed packages. However, in the end I would like to use fixed package versions to avoid bugs created by changes in packages.
Is there a way to ensure that the user, who may have older or newer versions of packages on their computer, is using the right version of all the packages my app needs?
You can consider using packrat: https://rstudio.github.io/packrat/.
Unfortunately, private libraries don't travel well; like all R libraries, their contents are compiled for your specific machine architecture, operating system, and R version. Packrat lets you snapshot the state of your private library, which saves to your project directory whatever information packrat needs to be able to recreate that same private library on another machine.
Short tutorial:
RStudio - File - New Project - New Directory - New Project - "Do: use Path" - Create Project
Enter in the R(Studio) console:
Code:
packrat::init()
.libPaths() # test if libpath has changed
install.packages("reshape2") # installs within one of the packrat libpaths
# Console output: Installing package into ‘C:/R/packRatTest/packrat/lib/x86_64-w64-mingw32/3.4.3’
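To capture and later recreate that private library on another machine (the snapshot step the packrat documentation above describes), the continuation would be roughly:
packrat::snapshot()  # record the exact package versions used in this project
# ... later, after copying or cloning the project to another machine and opening it:
packrat::restore()   # reinstall the recorded versions into the project's private library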
The assumption is that you can use and share RStudio Projects, but I think it would be hard to work without them anyway ;).
Try writing your Shiny app as a package. You can, to some extent, control package versions through the DESCRIPTION file (see the sketch below).
Since you said you're using scripts, take a look at: https://github.com/chasemc/electricShine
Even if you don't use it, hopefully looking at the code will help with things like setting the download repo to a specific MRAN date.
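To illustrate the DESCRIPTION-file approach mentioned above, version requirements for a package-style Shiny app can be declared in the Imports field, for example (package names and version numbers here are only illustrative):
Imports:
    shiny (>= 1.7.0),
    dplyr (>= 1.0.0),
    haven (>= 2.4.0)
Note that this only enforces minimum versions; for pinning exact versions, tools like packrat (discussed above) or a fixed snapshot repository are the usual route.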
I have recently experienced a serious problem with RStudio when developing a package. Whenever I open an existing project in RStudio where versions are controlled with Git, it takes very long to respond to any command. It is also impossible to type something in the console (e.g. 1 + 1) and obtain the result. Even quitting RStudio has to be done with the Task Manager. There is no problem when I create a new project / package or when I open an R script directly.
This problem appears both when the project is saved on Dropbox and when it is in a local repository.
To work around this issue, every time I need to modify my code I create a new project and then move all my current R scripts and the ".git" folder to the new repository.
I would appreciate if anybody could help me with this issue.
I had a similar problem to yours. Changing the attribute of my .git folder to hidden solved my problem.
We recently discovered an issue where projects using git for version control could become laggy / unusable on Windows if the .git folder within the project had become a non-hidden directory.
https://github.com/rstudio/rstudio/issues/1918
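As a minimal sketch of that fix (Windows only, since the hidden attribute is a Windows concept; run from an R console whose working directory is the project root):
# Mark the .git folder as hidden again (uses the Windows attrib command)
shell("attrib +h .git")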
My enterprise has a Git repository. To make changes, I have to make changes in my forked repository and then make a pull request.
I primarily use RStudio, so I have enabled its integration with Git. I can make changes to my forked repository and then push, pull, sync, etc. The problem is that I still have an additional step of logging into GitHub and making a pull request for my forked repository. Is there a way of doing this from RStudio?
I too use RStudio for R development, and I do not believe there is a way to do this. The reason is that this is more than just adding code to a branch: you're requesting a management action, pulling part of your code into another branch of the code base. RStudio appears to be limited to pulling, syncing, and committing. You likely need a separate, more full-featured GitHub client.
This could be done via the GitHub API, which could be called from an R package using the httr or curl package; such a package could then provide an RStudio addin, which would let you check everything using a nice Shiny app!
Now we only need to look for someone who wants to develop this… Can’t seem to find it (Jan 2022).
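As a rough sketch of what such a package could wrap (the owner, repository, branch names, and token environment variable below are all placeholders, and a fork-based workflow is assumed):
library(httr)
# Open a pull request on the upstream repository from a branch in your fork
resp <- POST(
  "https://api.github.com/repos/upstream-org/upstream-repo/pulls",
  add_headers(
    Authorization = paste("token", Sys.getenv("GITHUB_PAT")),
    Accept = "application/vnd.github+json"
  ),
  body = list(
    title = "My proposed change",
    head  = "your-username:your-feature-branch",  # source: fork and branch
    base  = "main"                                # target branch upstream
  ),
  encode = "json"
)
stop_for_status(resp)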