Has anyone had success accessing R and RStudio through a cloud platform like OneDrive? Once my projects started compiling on one of my local drives, the drive began filling up because of the 5-6 ongoing projects I have. I still need access to all of them, so deleting anything isn't an option, and manually saving every file I produce to a cloud is troublesome too. Currently I have a 256 GB SSD with STATA, SPSS and MortPak installed in addition to R. I'd appreciate any advice on this.
Related
I'm currently working on a piece of university research software that uses statistical models to run calculations around Item Response Theory. The source code is written in Go, and it communicates with an Rscript server to run scripts written in R and return the generated results. As expected, the software has some dependencies it needs in order to work properly (one of them, as mentioned above, is having R/Rscript installed along with some of its packages).
Since I'm new to software development, I can't find a proper way to manage all these dependencies on Windows or Linux (I'm prioritizing Windows right now). What I was thinking of is some kind of script that checks whether [for example] R is properly installed and, if so, whether each required package is also installed, roughly along the lines of the sketch below. If those checks pass, the software could then be installed without further problems.
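Something like this minimal sketch is what I mean; the package names ("mirt" and "jsonlite") are only placeholders for whatever the software actually depends on:

    # Minimal sketch of the dependency check; package names are placeholders
    required_pkgs <- c("mirt", "jsonlite")

    missing <- setdiff(required_pkgs, rownames(installed.packages()))

    if (length(missing) > 0) {
      # Either abort the installer with a message, or install them automatically
      install.packages(missing, repos = "https://cloud.r-project.org")
    }

The Go installer would first verify that Rscript itself is on the PATH (e.g. with exec.LookPath) and only then run a check like this.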
My question is: what's the best way to do something like that, and is it possible to do the same for other dependencies, such as Python, Go and some of their libraries? I'm also open to suggestions if installing programming languages locally on the machine isn't the proper way to manage software dependencies, or if there's a more convenient approach than writing a script.
Sorry if any needed information is missing; please let me know and I'll add it.
Thanks in advance
Is there any way to make the R + shiny build smaller, so it's more lightweight when deploying shiny apps (or plumber APIs, or any other R processes for that matter)?
Background
I have been deploying shiny apps on Kubernetes, and the builds are considerably larger than those of similar apps written in other languages (e.g. Python).
I've also deployed some to Heroku, and those builds are also quite large (in the hundreds of MB, whereas similar apps in other languages might only be a few tens of MB).
What I know so far
I know base R is quite large, but packages are the bulk of the build size, so I have trimmed those as much as possible by not importing anything unnecessary and by extracting individual functions from some packages so I don't have to include the whole package.
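As an illustration, here is a rough way to see which installed packages dominate the library size (a minimal sketch; it assumes the first library path is the one baked into the image, and reports sizes in MB):

    # Rough sketch: on-disk size of each installed package, largest first
    lib <- .libPaths()[1]
    pkg_dirs <- list.dirs(lib, recursive = FALSE)

    sizes <- vapply(pkg_dirs, function(d) {
      files <- list.files(d, recursive = TRUE, full.names = TRUE)
      sum(file.info(files)$size, na.rm = TRUE)
    }, numeric(1))

    names(sizes) <- basename(pkg_dirs)
    head(sort(sizes / 1e6, decreasing = TRUE), 10)   # top 10, in MB

Packages near the top of that list are the ones worth dropping or replacing with extracted functions.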
I think leprechaun is the best option (as in easy AND light) out there right now. Its purpose is to be similar to golem, but without golem's heavy requirements. (Link to the docs: https://github.com/devOpifex/leprechaun/tree/master/docs)
I am an MS Office veteran with self-taught basic GIS skills (Tatuk Editor), including the use of SQLite-based layers that link to MS Access. In the past few years I've been learning to use QGIS, and for the most part the experience has been very positive.
What hasn't been so great in the QGIS learning curve is my attempt to link a QGIS-created GeoPackage layer (using the SQLite ODBC driver) to an MS Access application for the express purpose of editing and, ideally, programmatic updating of attribute fields in existing records. Yes, the gpkg table will link, but unfortunately the connection is read-only. The problem apparently stems from an rtree trigger in the underlying GeoPackage database that won't allow edited or updated records to be written/saved.
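For anyone who wants to look at the triggers involved: a GeoPackage is an ordinary SQLite file, so they can be listed directly with any SQLite client. A minimal sketch using R's RSQLite (the file name is a placeholder, and this only inspects the triggers; it doesn't make the ODBC link writable):

    # List the spatial-index (rtree) triggers in a GeoPackage
    library(RSQLite)

    con <- dbConnect(SQLite(), "layer.gpkg")   # placeholder file name
    dbGetQuery(con,
      "SELECT name FROM sqlite_master
       WHERE type = 'trigger' AND name LIKE 'rtree_%'")
    dbDisconnect(con)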
At the recommendation of a friend who is better versed in these technicalities, I tried to resolve the 'no gpkg editing' problem by adding the SpatiaLite .dll files to the system folder and the appropriate extensions in the ODBC setup dialog, all without success. I next dumped the 32-bit version of my Office 365 software and transitioned to the 64-bit version, which fortunately didn't faze my existing documents, databases, etc., but had no effect whatsoever on the 'no gpkg editing' problem. At the end of the day, I'm no closer to the desired solution, i.e. an editable connection between Access and the gpkg table.
Without going into immense detail about the various steps I've tried, I will stop here and give folks an opportunity to respond. I'm hopeful that someone reading this has not only encountered the 'no gpkg editing' problem when linking to a GeoPackage from MS Access, but has also learned how to resolve it. If you are that person, please explain the process as best you can. If it simply can't be done, I would appreciate knowing that as well.
I have the exact same problem. I downloaded the SpatiaLite DLLs and tried putting them in the same folder as the ODBC driver, in System32, and in other folders. No dice. I tried both the 32-bit and the 64-bit driver. No dice. I tried the environment variable. No dice.
I'm also an ArcGIS user who will miss being able to use Access databases. Now that Pro can edit GeoPackages, we'd have a great option if we could edit the data in Access via ODBC. Frustrating!
There is an enormous amount about this on the web (see here and here and here) but no clear solution. So this question may be a duplicate, but the older threads never reached a solution either.
I was just wondering if anyone has found a way to resolve the extremely annoying conflict between RStudio's autosave function and Google Drive, which results in the error message 'the process cannot access the file because it is being used by another process' interrupting your work every 5-10 seconds.
I recently bought a Windows PC after years of reading that PCs were better for running RStudio than a Mac, but what a blunder that was. No such problems on macOS.
Google Drive is great and cheap, and I wouldn't have a huge problem switching to a different cloud storage provider, but from what I gather it's not just Google Drive: any cloud-syncing storage software has this same incompatibility with RStudio.
I find it incomprehensible that this is a problem Windows users of R simply have to live with.
So has anyone solved this problem? And if not, how do you work in RStudio and save your work without going crazy?! I would really love to know.
I just started using GCP for data science and I'm a bit overwhelmed by all the different tools. I have customer data in BigQuery that I'd like to analyze further for customer segmentation purposes.
However, I am not allowed to download the data or keep any copies locally. Most of the R + BigQuery tutorials I've seen seem to do exactly that, though. I am currently looking into analyzing the data with Datalab, but there I can't seem to use R, only Python.
What would be a safe and cheap (!) way to analyze BigQuery data (< 100 GB) in R without downloading it? What GCP tool would be suited to this? Is there a way that does not involve running R code from Python?
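For context, the kind of pattern I'm hoping for is something like the sketch below, where the aggregation is translated to SQL and executed inside BigQuery, and only the small result comes back. This is only an illustration: it assumes the bigrquery, DBI and dplyr/dbplyr packages, and the project, dataset, table and column names are placeholders.

    # Minimal sketch: keep the heavy lifting inside BigQuery, pull back only aggregates
    library(DBI)
    library(bigrquery)
    library(dplyr)

    con <- dbConnect(
      bigrquery::bigquery(),
      project = "my-gcp-project",   # placeholder
      dataset = "customer_data"     # placeholder
    )

    customers <- tbl(con, "customers")   # lazy reference, nothing is downloaded yet

    segments <- customers %>%
      group_by(country) %>%                                          # placeholder column
      summarise(n = n(), avg_spend = mean(spend, na.rm = TRUE)) %>%  # placeholder column
      collect()                            # only the aggregated rows leave BigQuery

Whether that still counts as "not downloading" under my constraints (only aggregates ever leave BigQuery) is part of what I'd like to clarify.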