Is there a way to change the location where Jupyter notebook checkpoints are saved? In other words, can I use a different location for the '.ipynb_checkpoints' folder?
I have several notebooks saved on a network drive, but would rather have the checkpoints saved on a local drive, as sometimes when working from home my VPN connection is patchy and I lose access to the network.
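From what I can tell, the checkpoint folder is controlled by the classic Notebook server's FileCheckpoints class, so something like the following in jupyter_notebook_config.py might work (the local path is just an example, and untested; note that with one absolute path, identically named notebooks from different folders could collide):

```python
# ~/.jupyter/jupyter_notebook_config.py
# (create one with: jupyter notebook --generate-config)

# The default is '.ipynb_checkpoints', created next to each notebook.
# An absolute path keeps checkpoints off the network drive;
# '/home/me/.jupyter_checkpoints' is a hypothetical local folder.
c.FileCheckpoints.checkpoint_dir = '/home/me/.jupyter_checkpoints'
```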
Related
I need the mounted Google Drive to be accessible by another user in Colab. The particular use case is RStudio Server running in Colab, since RStudio cannot be used by root.
If I do the usual:
from google.colab import drive
drive.mount('/drive')
Then /drive can only be accessed by root and cannot be used by the Rstudio user.
Is it possible to mount so that another user (say rstudio) can access the drive?
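One possible workaround is to re-expose the root-owned mount to the rstudio user with bindfs. A sketch (untested; it assumes the rstudio user already exists and that bindfs is installable in the Colab VM):

```shell
# Run in a Colab cell after drive.mount('/drive') has succeeded.
apt-get install -y bindfs
mkdir -p /home/rstudio/drive
# bindfs presents /drive at a second mount point with ownership
# remapped, so the rstudio user can read and write the files.
bindfs --force-user=rstudio --force-group=rstudio /drive /home/rstudio/drive
```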
I would like to clone a GCP Vertex AI notebook in a different availability zone. The zone the notebook was created in has not been available for several days! I can't find a way to do this from the User Managed Notebook interface. So I created a clone of the notebook's VM in the Compute Engine interface, but then I don't have a url proxy to access Jupyter Lab.
Any guidance would be appreciated.
Thanks,
Jay
When you say "The zone the notebook was created in has not been available for several days", that sounds very alarming and is probably inaccurate, because it would mean a GCP zone is down at the disk or compute level. https://status.cloud.google.com/index.html does not show any related alert, and I don't see this in our monitoring systems.
A notebook VM is composed of two disks: a boot disk, and a data disk that is mounted at the /home/jupyter folder. Disks can't be moved to a different zone.
I would suggest creating a new notebook and copying the files over. Documentation here: https://cloud.google.com/vertex-ai/docs/workbench/user-managed/migrate
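If the old VM's disks ever become reachable again, staging the files through a GCS bucket might look like this (the bucket name is hypothetical):

```shell
# On the old notebook VM: copy the data disk contents to a bucket.
gsutil -m cp -r /home/jupyter gs://my-staging-bucket/notebook-backup
# On the new notebook, created in a healthy zone: pull them back down.
gsutil -m cp -r gs://my-staging-bucket/notebook-backup/jupyter/* /home/jupyter/
```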
Hi, I accidentally deleted a Jupyter notebook that I was running in Dataproc.
I can't see a checkpoints folder in my GCS bucket. Any suggestions for recovery?
If you never saved checkpoints before, or you explicitly deleted the notebook from the Dataproc web UI, then it's very likely your data can't be recovered.
Things you can check to ensure GCS is able to save your checkpoints:
Make sure GCS is set up correctly to save checkpoints: check /etc/jupyter/jupyter_notebook_config.py and make sure c.GCSContentsManager.bucket_name is present and set to the right bucket. If it isn't, set it and then restart Jupyter from the Jupyter menu in the Dataproc web UI (Kernel > Restart).
Make sure your account has admin/write access to your bucket. How: https://cloud.google.com/storage/docs/access-control/
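For reference, the relevant line in /etc/jupyter/jupyter_notebook_config.py would look roughly like this (the bucket name here is hypothetical):

```python
# /etc/jupyter/jupyter_notebook_config.py (sketch)
# Tell the GCS contents manager which bucket to persist notebooks to.
c.GCSContentsManager.bucket_name = 'my-dataproc-notebooks-bucket'
```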
I am trying to make a shortcut of some sort, hopefully a .desktop file, to run a jar file from a network share. I want to be able to do this without mounting the drive, if possible. I've looked at a few other posts on this site, and most talk about using Nautilus or some file-sharing program to open a path to the file, but I want to be able to run the file from the network drive. Any help is appreciated.
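What I'm imagining is something along these lines, relying on GVFS exposing the share through FUSE instead of a permanent mount (untested; the jar name and the exact /run/user/.../gvfs path are assumptions and vary by distro):

```ini
# network-jar.desktop (sketch)
[Desktop Entry]
Type=Application
Name=Jar on network share
# 'gio mount' attaches the share on demand; the FUSE path under
# /run/user/<uid>/gvfs then makes it visible to java as a normal file.
Exec=sh -c 'gio mount smb://share.domain.com/folder; java -jar "/run/user/$(id -u)/gvfs/smb-share:server=share.domain.com,share=folder/app.jar"'
Terminal=false
```

One caveat: gio mount may still prompt for credentials unless they are already stored in the keyring.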
I've been working on an alternative with a drive that mounts on login. The current setup I'm trying is to get a script to run at login using environment variables. I've tried a number of the CIFS options, but nothing seems to work without prompting for the user's password.
My fstab file has the following:
//share.domain.com/folder /mnt/folder cifs noauto,users 0 0
I have a small shell script to mount it.
mount /mnt/folder
Is there any way to get the drive to mount without the user typing in their password? Note: this should work for multiple users on the system. So a credentials file isn't optimal.
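For the record, the main password-less, multi-user option I'm aware of is Kerberos: if the machines are joined to the domain and users get a ticket at login (e.g. via SSSD or pam_krb5), the cifs 'multiuser' mount option lets the kernel open a separate authenticated session per accessing user. A sketch of the fstab line (everything beyond the share path above is an assumption about the environment):

```ini
# /etc/fstab (sketch): root mounts once using machine credentials;
# 'multiuser' makes cifs.ko authenticate each accessing user with
# their own Kerberos ticket, so no passwords are stored anywhere.
//share.domain.com/folder /mnt/folder cifs sec=krb5,multiuser,_netdev 0 0
```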
I set up a machine with Ubuntu 14.04.1 yesterday and I am trying to use it as remote storage for a project I am working on remotely with a few people. I know nothing about hosting servers, so I am attempting to avoid the issue entirely by just treating it like a local area network using Hamachi.
The Ubuntu machine has 2 hard drives - a boot drive on which Linux is installed and a larger data drive. I am attempting to share a directory on the data drive via samba so that it can be accessed via Hamachi by windows 7 machines.
I am able to see the directories that I have shared, but when I try to enter them, I get a permission denied error. When I share a directory within my /home/user/ directory, it works fine. Is there any way that I can share a directory on my data drive?
Perhaps I could make a symbolic link from my user directory to the data drive? Would that actually work? I am not familiar.
If the data drive is NTFS, you can't change POSIX permissions on individual files, which will be the issue; ownership for the whole mount is instead set at mount time (e.g. with uid/gid options in fstab).
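If the data drive is (or gets reformatted as) a native Linux filesystem like ext4, then fixing ownership plus a share definition along these lines usually works (the path and user name are examples):

```ini
# /etc/samba/smb.conf (sketch) -- share a directory on the data drive.
[projectdata]
   path = /mnt/data/project
   browseable = yes
   read only = no
   valid users = user
```

with the directory itself made readable and writable by that account, e.g. `sudo chown -R user:user /mnt/data/project`.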