I need the mounted Google Drive to be accessible by another user in Colab. The particular use case is RStudio Server running in Colab, since RStudio cannot be run as root.
If I do the usual:
from google.colab import drive
drive.mount('/drive')
Then /drive can only be accessed by root and cannot be used by the rstudio user.
Is it possible to mount so that another user (say rstudio) can access the drive?
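One approach (a sketch, not an official Colab feature) is to mount as root as usual and then re-expose the tree for the other user with bindfs, which makes the files appear to be owned by that user. The user name and paths below are assumptions:

```
# Run in a Colab cell after drive.mount('/drive'); assumes a user named
# "rstudio" already exists on the VM.
!apt-get -qq install -y bindfs
!mkdir -p /home/rstudio/drive
# Re-mount /drive at /home/rstudio/drive with ownership mapped to rstudio.
!bindfs -u rstudio -g rstudio /drive /home/rstudio/drive
```

The rstudio user then works against /home/rstudio/drive instead of /drive.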
Someone uploaded a folder to Google Drive and shared it with me. I opened an IPython notebook in Google Colab. To connect it with Google Drive, I did:
from google.colab import drive
drive.mount('/content/gdrive')
I continued to run the code and do some imports, and at some point I need to give the path for this folder. I tried:
path = "/content/gdrive/MyDrive/the_folder/"
But when I checked on the left-hand side, under "Files" -> "gdrive" -> "MyDrive", it's not even there, so no wonder it's not found when I run the code later. Did I mount it incorrectly?
If you want to work in Google Colab with a Google Drive folder that someone else shared with you (one you do not own), you should first create a shortcut to that folder inside your own Drive.
This can be done the following way: right-click the folder in the Shared with me tab of Google Drive, then click Add shortcut to Drive. Your Drive will then contain a shortcut to the folder that was created and shared by the other person.
After the regular mount procedure, the folder will then be accessible from Google Colab.
Hi, I accidentally deleted a Jupyter notebook that I was running in Dataproc.
I can't see a checkpoints folder in my GCS bucket. Any suggestions of recovery?
If you never saved checkpoints before, or you explicitly deleted the notebook from the Dataproc web UI, then it's very likely your data cannot be recovered.
Things you can check to ensure GCS is able to save your checkpoints:
Make sure GCS is set up correctly to save checkpoints: check /etc/jupyter/jupyter_notebook_config.py and make sure c.GCSContentsManager.bucket_name is present and set to the right bucket. If it is not present, set it and then restart Jupyter from the Jupyter menu in the Dataproc web UI (Kernel > Restart).
Make sure your account has admin/write access to your bucket. How: https://cloud.google.com/storage/docs/access-control/
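For reference, the relevant line in /etc/jupyter/jupyter_notebook_config.py would look something like this (the bucket name is a placeholder):

```python
# /etc/jupyter/jupyter_notebook_config.py
# Tell the Dataproc Jupyter GCSContentsManager which bucket to save
# notebooks and checkpoints to. Bucket name below is a placeholder.
c.GCSContentsManager.bucket_name = 'my-dataproc-notebooks-bucket'
```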
Is there a way to change the location where the Jupyter notebook checkpoints are saved? Or in other words, use a different location for the ".ipynb_checkpoints" folder?
I have several notebooks saved on a network drive, but would rather have the checkpoints save to a local drive, as sometimes when working from home my VPN connection is patchy and I lose access to the network.
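One commonly used approach (a sketch; verify the option name against your Jupyter version, and the local path below is a placeholder) is to point the contents manager's checkpoints at a local directory in your Jupyter config file:

```python
# ~/.jupyter/jupyter_notebook_config.py
# Store checkpoint data under one local directory instead of in a
# .ipynb_checkpoints folder next to each notebook on the network drive.
c.FileContentsManager.checkpoints_kwargs = {'root_dir': r'C:\Temp\jupyter_checkpoints'}
```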
I am trying to make a shortcut of some sort, hopefully a .desktop file, to run a jar file from a network share. I want to be able to do this without mounting the drive, if possible. I've looked at a few other posts on this site, and most talk about using Nautilus or some file-sharing program to open a path to the file, but I want to be able to run the file from the network drive. Any help is appreciated.
I've been working on an alternative with a drive that mounts on login. The current setup I'm trying is to get a script to run at login using environment variables. I've tried a number of the CIFS options, but nothing seems to work without prompting for the user's password.
My fstab file has the following:
//share.domain.com/folder /mnt/folder cifs noauto,users 0 0
I have a small shell script to mount it.
mount /mnt/folder
Is there any way to get the drive to mount without the user typing in their password? Note: this should work for multiple users on the system. So a credentials file isn't optimal.
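If the server allows guest access (an assumption here), one sketch is to add the guest option to the existing fstab entry, which tells mount.cifs to connect without a password and works for any user on the system:

```
# /etc/fstab -- "guest" connects without a password; only works if the
# share itself permits guest/anonymous access.
//share.domain.com/folder /mnt/folder cifs noauto,users,guest 0 0
```

The small mount script then stays exactly as before (mount /mnt/folder), with no password prompt and no per-user credentials file.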
I set up a machine with Ubuntu 14.04.1 yesterday and I am trying to use it as remote storage for a project I am working on remotely with a few people. I know nothing about hosting servers, so I am attempting to avoid the issue entirely by just treating it like a local area network using Hamachi.
The Ubuntu machine has 2 hard drives - a boot drive on which Linux is installed and a larger data drive. I am attempting to share a directory on the data drive via samba so that it can be accessed via Hamachi by windows 7 machines.
I am able to see the directories that I have shared, but when I try to enter them, I get a permission denied error. When I share a directory within my /home/user/ directory, it works fine. Is there any way that I can share a directory on my data drive?
Perhaps I could make a symbolic link from my user directory to the data drive? Would that actually work? I am not familiar.
If the data drive is formatted as NTFS, you can't change the Unix permissions on it, which will be the issue.
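Since NTFS does not carry Unix permissions, ownership and access mode are instead fixed for the whole volume at mount time via ntfs-3g options. A sketch (the device name and uid/gid values are placeholders for your setup):

```
# /etc/fstab -- map every file on the NTFS volume to uid/gid 1000 with
# group-writable permissions, so Samba can serve it without errors.
/dev/sdb1 /mnt/data ntfs-3g defaults,uid=1000,gid=1000,umask=002 0 0
```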