I'm using NCL scripts that open other NCL scripts. NCL scripts in the same directory open fine, but scripts that live in another directory (under the same parent directory) cannot be opened. It seems that NCL does not have the right "home" directory set. How do I set this?
I obtained NCL via the conda installer (https://www.ncl.ucar.edu/Download/conda.shtml), and the tests mentioned there worked fine.
The problem was that the virtual machine I am running on has a limit of 256 open files. When this limit is reached, the error reported is that files cannot be found. I solved it by adding ulimit -n 1024 to my .bash_profile.
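For reference, a quick way to check and raise the limit in a bash shell (1024 is just the value that worked here; pick whatever your workload needs):
ulimit -n                                 # show the current open-file limit
echo 'ulimit -n 1024' >> ~/.bash_profile  # persist a higher limit for future logins
ulimit -n 1024                            # apply it to the current shell right away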
When I remove files from the Jupyter notebook environment, the disk space does not free up. I removed about 40 GB of files, and they disappeared from the listing, even from ls -a, yet df -h shows that nothing changed. I also killed all the processes using these files and even rebooted the system.
When I remove files using rm, everything is fine. How can I free up the space, or restore those files so I can delete them with rm?
I encountered the same issue, and later found out that files deleted in the Jupyter notebook interface are automatically moved to the trash rather than being permanently deleted right away.
This behavior was added long ago: https://github.com/jupyter/notebook/pull/1968
Thus, to free up space, you need to empty your system's trash folder. On Linux, you can run rm -rf $HOME/.local/share/Trash/files to empty the trash.
On Windows or macOS, you just need to "Empty Trash" from the desktop.
To restore files instead, access them in your Trash folder, which (on my system) is located in the .local folder in your home directory.
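For example, to check what the trash is holding and to pull one file back out so it can be removed with rm (the file name below is just a placeholder):
du -sh $HOME/.local/share/Trash                     # total size of the trash
ls $HOME/.local/share/Trash/files                   # list the deleted files
mv $HOME/.local/share/Trash/files/bigfile.dat ~/    # restore one file (placeholder name)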
This worked for me. I'm using JupyterLab on an Amazon Linux 2 AMI.
Ref: recycle bin in Linux: https://www.pcquest.com/recycle-bin-linux/
I freed up the space and solved the issue while working with Workbench (Google Cloud Vertex AI); the same applies to AI Platform. Open a terminal from Workbench via File > New > Terminal.
df -h -T /home/jupyter
The command above shows how much free space is available.
The command below then deletes the trash to free up space:
sudo rm -rf /home/jupyter/.local/share/Trash/*
After deleting the trash, the space should show up as free.
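To confirm the space was actually reclaimed, the same df command can be run again:
df -h -T /home/jupyter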
I was working in a Jupyter notebook until it froze. It wouldn't save or shut down, so I restarted my computer. I launched Jupyter notebook from an Anaconda prompt and my folder directory opened as usual, but when I tried to open the notebook from before, I got an error screen saying permission denied: (name of notebook).ipynb. When I hit close, the notebook shuts down.
I checked the folder permissions, and I have full control. I can create a new .ipynb without any issues and can open other notebooks in the same folder without any problem. I tried to trust the notebook through the Anaconda prompt (jupyter trust), and it says the notebook is missing.
I need to recover this particular notebook as it has all my work. Help! Any ideas? Thanks in advance.
I work in the Anaconda prompt in an environment other than the root, so the answer suggesting sudo chmod doesn't work for me.
I had possibly the same problem. In my case, Jupyter notebook must have crashed or hit a problem while autosaving.
As a result, the folder where the notebook is saved contains a temporary file called .~nameofnotebook.ipynb.
This file didn't show up in Jupyter notebook, only in the file explorer. Make sure to save a copy of the notebook file before deleting anything, in case your problem is different. I then deleted the notebook file and renamed the temporary file to drop the .~ prefix.
The renamed temporary file opens fine and none of my data was lost.
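As a terminal sketch of that backup-and-rename (the notebook name is a placeholder; adjust it to yours):
cp nameofnotebook.ipynb nameofnotebook_backup.ipynb   # keep a safety copy of the broken notebook
mv .~nameofnotebook.ipynb nameofnotebook.ipynb        # promote the autosaved temp file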
Changing the name of the file is enough; then you are good to go.
I am using:
download.file(url,path_file,mode="wb",quiet=quiet)
with R version 3.2.3 (2015-12-10) on Windows 7 to download a large number of images (TIFF files). I have to copy roughly 300'000 files, but it failed at some point with the following error:
"cannot open destfile 'tmp/74114070005_531__0.tiff' , reason
'Too many open files'"
The issue is that from time to time the download of a URL fails and R creates an empty 0-byte file, which Windows then locks so I cannot remove it. Each failed file stays open until I exit R, and after enough failed downloads I get the "too many open files" error above.
Is there a way to close the connection for each file? I tried closeAllConnections(), but it had no effect.
Is there a way to run an R command that will "restart" R so that Windows unlocks the files?
Any other ideas are welcome.
Thanks
Fabien
This is an issue with Windows locking files. The best thing is to avoid Windows; for those who must use it, there is a workaround, which I tested on Windows 7:
download handle.exe (or handle64.exe) from Sysinternals: https://technet.microsoft.com/en-ca/sysinternals/bb896655.aspx
open a cmd prompt window as administrator
find the handles of the locked files: handle64.exe -nobanner -p python.exe pattern_of_your_file
close each reported handle, passing the process ID and the handle value: handle64.exe -p your_pid -c your_handle_value (it will ask for confirmation)
This avoids having all these locked files accumulate.
I'm trying to run an R script through a .bat file. When I run the commands myself line by line it works, but when I run the .bat file it doesn't.
This is the .bat file
cd "C:\Program Files\R\R-3.1.2\bin"
R CMD BATCH "C:\Users\Administrator\Downloads\testa_vps.R"
This is the R script
setwd('C:\Users\Administrator\Documents')
file.create('mycsv.csv')
I'm not an expert with Windows and generally stick to Unix-like systems for things like this, but I have found that running programs non-interactively (e.g. via .bat files) is usually less error-prone when you add the appropriate directories to your (user) PATH variable, rather than cd-ing into the directory and calling the executable from within the .bat file. For example, my user PATH variable contains C:\PROGRA~1\R\R-3.0\bin\; - the directory that contains both R.exe and Rscript.exe. (PROGRA~1 is an alias for Program Files that you can use in an unquoted file path, since it contains no spaces.)
After you do this, you can check that your PATH modification was successful by typing Rscript in a new Command Prompt - it should print usage information for Rscript rather than the typical "xxx is not recognized as an internal or external command" error.
In the directory C:\Users\russe_000\Desktop\Tempfiles, I created test_r_script.r, which contains
library(methods)
setwd("C:\Users\russe_000\Desktop\Tempfiles")
file.create("mycsv.csv")
and test_r.bat, which contains
Rscript --vanilla --no-save "C:\Users\russe_000\Desktop\Tempfiles\test_r_script.r"
Clicking on the Windows Batch File test_r ran the process successfully and produced mycsv.csv in the correct folder.
I've never worked with a Windows server, but I don't see why the process would be fundamentally different from a personal computer; you may just need your sysadmin to modify the PATH variable if you don't have sufficient privileges to alter environment variables.
As already suggested by @nrussel in the comments, you should use Rscript.exe for this.
Create a file launcher.bat with the following content:
cd C:\Users\Administrator\Documents
Rscript testa_vps.R
In addition, add C:\Program Files\R\R-[your R version]\bin\x64; or C:\Program Files\R\R-[your R version]\bin\i386; to the System PATH variable in the Environment Variables menu, depending on whether you run R on a 64-bit or 32-bit system.
I just tested the approach above successfully on a Windows Server 2008 64-bit system and mycsv.csv got created as expected.
EDIT
One important point I forgot to mention: you need to write the path in the setwd() call in your R file using \\ instead of \, because the backslash is an escape character in R strings. (Forward slashes also work: setwd('C:/Users/Administrator/Documents').)
setwd('C:\\Users\\Administrator\\Documents')
Note: I added cmd /k to the .bat file so that the cmd window stays open after clicking on the file.
Every time I start RStudio, the same leftover objects appear in my working environment.
I can use rm(list=ls()) to remove them temporarily, but every time I restart RStudio they show up again.
I used getwd() to check my working directory, but I did not see any .RData file there. How can I get rid of these objects?
Any help is greatly appreciated.
I use Mac OS 10.10.
Click on RStudio in the menu bar and go to Preferences.
In the R General section, uncheck the Restore .RData into workspace at startup option.
The default is to reload the working environment when you restart RStudio.
I think that you, at some point, chose to save your environment to your working directory (most likely ~, i.e. your home directory, which is the default RStudio working directory).
The easiest way to clear your default environment is to remove the .RData file from your home directory. It will not appear in any Finder window, because in a Unix-like OS (like OS X) files starting with . are hidden. So do the following:
Open a terminal window
If not already there, go to your home folder: cd ~
Check if there's an .RData file: ls -lA .RData
If the file exists, delete it: rm .RData (if you want, create a backup first with cp .RData RData_backup before deleting)
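Putting those steps together (the backup line is optional):
cd ~                     # go to the home directory
ls -lA .RData            # check whether a saved workspace image exists
cp .RData RData_backup   # optional: back it up before deleting
rm .RData                # delete it so RStudio starts with a clean environment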