Google Drive for Desktop sync problem with RStudio: generates random empty folders

My research lab has a lab Google account to which we are expected to save/upload all of our files. To avoid the messy challenge of maintaining local files that RStudio can read as well as cloud versions on the shared Google Drive account, I have recently begun using Drive for Desktop to stream files from the Drive to my desktop and open them in RStudio. (More specifically, I share my folder in the lab Drive with my personal Drive, which in turn is synced/streamed to my Mac desktop.)
However, I believe this creates conflicts when Drive tries to sync/access files while RStudio is also updating those files (see: zero, one, two, three, four, five, six, seven). I do get a pop-up every so often that I have to close, but that is manageable. The bigger problem is that the Drive is now filled with random empty folders (see screenshot below). These folders are located within .Rproj.user. Does anyone know how to prevent these random folders from being generated when I use RStudio in this fashion? It is annoying because they show up in the Google Drive 'home page' and 'Recents' for everyone who accesses the shared Drive.
The other posts linked above describe the need to exclude some file types from syncing in order to prevent conflicts, but the directions were written for the old Google Drive Backup and Sync client. I cannot find these settings in the new Drive for Desktop client, and I do not know whether this would solve the random folder generation problem anyway.
Has anyone encountered this problem before, know what causes it, or how to fix it?
Many thanks in advance.
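For what it's worth, a sketch of one possible workaround (not something the linked posts confirm, and the paths are illustrative): keep the live RStudio project, including .Rproj.user, on the local disk, and copy only finished outputs into the streamed Drive folder from R:

# Work in a local project; push finished results into the streamed Drive folder afterwards.
local_out <- list.files("outputs", full.names = TRUE)
drive_dir <- "~/Google Drive/My Drive/lab-project/outputs"   # wherever Drive for Desktop mounts the shared folder
file.copy(local_out, file.path(drive_dir, basename(local_out)), overwrite = TRUE)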

Related

Change the directory where the SQLite *.db-journal file is created

Good day,
I have a small application created in Lazarus / Free Pascal. If I run this application from a folder on my computer, it starts and SQLite creates a temporary .db-journal file in the current directory. Since the application is portable, it also runs from a flash drive, and that is where the problem comes in. Some computers (e.g. at work) do not allow writing to external media, so when I start the application there, it does not start and an error is displayed saying the database cannot be opened (tested on a locked SD card). To avoid having to copy the application onto each computer, I would like to know whether it is possible to redirect the creation of the temporary .db-journal file to another directory, for example the user directory "C:\WINDOWS\USERS\<user>". Is it usually always possible to write there?
Of course, I searched the net, but so far I have not found anything that would help me, so I am asking here. Thank you for any advice or guidance.
Jirka
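As far as I know, SQLite always creates the rollback journal next to the database file, so its location cannot simply be redirected; what can be changed is the journal mode, e.g. PRAGMA journal_mode = MEMORY keeps the journal in RAM so no .db-journal file appears on disk (at the cost of crash safety). A minimal sketch of that pragma, shown in R with RSQLite only because the rest of this page is R-centric; the same SQL statement can be issued from Free Pascal:

library(DBI)
# "app.db" stands in for the real database file
con <- dbConnect(RSQLite::SQLite(), "app.db")
dbGetQuery(con, "PRAGMA journal_mode = MEMORY")   # returns the mode now in effect, e.g. "memory"
dbDisconnect(con)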

webdav access a textfile line by line

I have looked all over (spent about 7 hours). I have found numerous articles on how to map a drive (Google Drive, OneDrive, etc.). What I cannot seem to find an answer to is this: once I have mapped the drive, can I use the files on that drive just like I use files on a server, i.e. open the file, read a record, write a record? I have created a file, mapped a network drive, written records to the file and retrieved records from the file. I have a home-grown database that is implemented as a large binary (as opposed to text) file. I have to seek to a byte position and read a fixed number of bytes. If WebDAV is copying the whole file to my computer and then writing it back, that would make my file access far too slow, and I cannot seem to find an answer. Some programmers I have talked to say I cannot even do that, yet I can. Any direction would be very much appreciated.
Charlie
That's likely because standard WebDAV doesn't allow updating only a byte range of a resource, so the whole file has to be written back.
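Reads are a different story: a plain HTTP GET with a Range header fetches just the bytes you ask for, provided the WebDAV server honours it. A minimal sketch (the URL and credentials are made up; shown in R with httr to match the rest of this page):

library(httr)
resp <- GET("https://dav.example.com/data/records.bin",
            add_headers(Range = "bytes=4096-4159"),
            authenticate("user", "password"))
stopifnot(status_code(resp) == 206)   # 206 Partial Content means the Range was honoured
chunk <- content(resp, as = "raw")    # the 64 requested bytes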

Working with two separate repos and syncing changes between them

I am a somewhat new/basic git user and I'm having a problem that I cannot seem to find an answer for. I am trying to figure out a way to store two different branches from a GitHub repo locally on my computer. My understanding is that when I clone a repo to my laptop from GitHub, it also downloads all of the branch and commit history to my local machine. I want to continue to use GitHub as version control/backup for my project. However, I am working with colleagues who understand git less than I do, so I am trying to find a way to keep everything simple for them at the same time.
Here is a description of the situation:
We are developing an analysis in RStudio to examine information about quantitative writing in college students.
I am writing the R analysis scripts and want to maintain a safe backup with github
I am sharing the project files with my colleagues via Google drive since they do not know how to use git/ github
I have reached a point where I am going to change the fundamental file structure of the project. However, I do not want to disrupt their ability to perform analyses while I am making these changes.
My colleagues need to be able to save analysis outputs to the project folder where they are synced back to me via Google Drive and then pushed to GitHub.
I can think of two ways to handle this situation but both seem to have problems that I can't see around.
Create a branch in github, make changes to the branch and then merge the branch with the master
This won't work because I am sharing the files via Google Drive and you can only have one branch of a repo checked out on your local machine at a time. Once I clone the branch to my machine, that is what gets shared via Google Drive, and any changes I make disrupt everyone else's workflow.
Create a second copy of my repo, make changes there, and then push those changes to the original repo that gets shared to colleagues via Google Drive
I have no idea how to do this. Everything I have read discusses how to push/pull between different github users. How can I do this as a single user?
Did I forget anything important?
Any help/suggestions greatly appreciated.
You can do it all in the same directory by adding a second remote. If you've already added GitHub as 'origin', then run:
git remote add gdrive https://example.com/path/to/repo.git
Then you can push the changes you've made to either of the two remotes.
To push to GitHub:
git push origin
and to push to the Google Drive remote:
git push gdrive
See this GitHub doc page.

Files disappearing from Dropbox

Suddenly (at about the same time as upgrading to 0.9.0), my machine seems to be deleting older versions of Meteor-based apps from anywhere on the machine, including Dropbox. Thousands of files are disappearing. It is a little terrifying. Am I doing something wrong? Can I get them back?
You can retrieve deleted files from dropbox.com. There is a small trash bin icon ("Show deleted files") that lets you recover them.
I don't know what your files are, but if a program is deleting them based on their format, for instance, you may want to zip/rar them before putting them into the box. This can also save space if the goal is to archive them.

Run R from dropbox

Often, in "restricted security" situations where programs can't be installed on a computer, I run R from a flash drive. Works like a charm. I've recently started using Dropbox and was thinking it could be used in a similar fashion to the flash drive. For anyone who has tried this: does it work?
I can test it myself but don't want to go to the bother if it's a dead end.
Thanks in advance.
PS: This has the added advantage that you can store an .Rprofile in the folder, so that people with whom you share the Dropbox folder can then run your R code. This is particularly nice for people unfamiliar with R.
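A minimal sketch of such an .Rprofile (the script name is illustrative; R sources an .Rprofile found in the startup working directory, i.e. the shared folder):

# .Rprofile kept at the top level of the shared Dropbox folder
local({
  cat("Loading the shared analysis...\n")
  if (file.exists("analysis/run_analysis.R")) {
    source("analysis/run_analysis.R")   # paths stay relative to the Dropbox copy of the project
  }
})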
It should just work.
R is set up in such a way that all its files are relative to a given top-level directory. Whether that is an F:\ or Z:\ drive on your flash drive, or your Dropbox folder, should not matter.
By the same token, R can run happily off a shared folder, be it via Samba, NFS or another mechanism.
It is fine if you want to share .Rprofile or .Rhistory. However, I see a problem with .RData, because it can be big (for example, 4 GB). Saving a 100 MB file to Dropbox takes me minutes, and an .RData file can be far bigger.
An alternative would be a remote server, where you could connect through ssh.
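One way to keep the shared folder light (a sketch, not from the answers above; the object name and path are made up) is to save only the objects collaborators actually need with saveRDS() instead of relying on a full .RData workspace image:

saveRDS(fit, "results/fit.rds")     # 'fit' stands in for whatever object you want to share
fit <- readRDS("results/fit.rds")   # collaborators reload it once Dropbox has synced the file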
