I am looking for an R function that returns an identifier of the computer the script is being run on, or at least distinguishes between two known computers.
I have two PCs, both running Windows and RStudio. I use the desktop in the office and the laptop over VPN, typically working on the same projects, always using RStudio.
My scripts and permanent data sets are in a common repository. However, since I/O to that repository is slow, I keep a local directory for temp files.
On the desktop, I have a dedicated drive, and each project lives in its folder 'D:/workspace/this_project/'. On the laptop, the path is 'C:/Users/myself/Documents/workspace/this_project/' or just '~/workspace/this_project/'.
Currently, I keep two setwd() statements at the top of each script, and I just rely on the fact that one of them will fail because of the file structure.
setwd('~/workspace/this_project') # will fail on the desktop
setwd('D:/workspace/this_project') # will fail on the laptop
This seems like a bad practice.
I've looked through ?"environment variables" and don't see how to get my computer's name on the network or something else that is persistent and unique to the computer.
The desired solution could modify the desktop's tilde expansion to D:/ on that machine only, so that a common '~/workspace/' could be used, or a function using_laptop() like this:
set_project_wd <- function(folder_nm){
  if (using_laptop()) setwd(paste0('~/workspace/', folder_nm))
  else setwd(paste0('D:/workspace/', folder_nm))
}
If you call Sys.info() you can get your details:
names(Sys.info())
[1] "sysname" "release" "version" "nodename" "machine" "login"
[7] "user" "effective_user"
The entry under nodename will be your PC's name on the network.
Then you can do something like:
set_project_wd <- function(folder_nm){
  if (Sys.info()[["nodename"]] == "mylaptopname") setwd(paste0('~/workspace/', folder_nm))
  else setwd(paste0('D:/workspace/', folder_nm))
}
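If you prefer the using_laptop() helper sketched in the question, a minimal version along the same lines (here "mylaptopname" is a placeholder; run Sys.info()[["nodename"]] once on each machine to learn the real names):

using_laptop <- function() {
  # nodename is the machine's network name; compare it against the
  # laptop's known name ("mylaptopname" is a placeholder)
  Sys.info()[["nodename"]] == "mylaptopname"
}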
Related
I am on Windows 10. I want to write a function (in R) to copy the files stored in a camera (actually on the camera's SD card, but I cannot simply read the memory card in the PC) to a different storage unit (say, the PC or an external HDD).
The camera is connected to the PC via a USB cable.
The problem I am facing is that, when opening the File Explorer, the camera shows up as an entry under "This PC" with no letter to indicate the drive (e.g., 'G:/').
While I can see the files using the File Explorer window, I cannot find a way to get to those files from a CLI-type interface (e.g., the command prompt, or the R console).
Googling, I found that 'This PC' is not a folder but rather a link to something in the registry called a CLSID, for which the identifier should be {20D04FE0-3AEA-1069-A2D8-08002B30309D}. However, this is very confusing to me and I cannot figure out how to use this information.
Is there a way to do it? And if so: how?
Please consider that I do not know much about command-prompt commands (I am way better off in R).
A CLSID is just a GUID. My Computer is an implementation of IShellFolder.
My Computer is part of the shell namespace. Several entries in the shell namespace are virtual (Control Panel, Scheduled Tasks, etc.) and cannot be accessed with low-level file functions or cmd.exe.
While it would be possible to develop a tool that does something like shellcopy Computer\MyCamera\*.jpg x:\backup, I'm not aware of any existing tools that do this. You might have to code it yourself.
In the old days you would call SHGetDesktopFolder to get the root and then use the returned IShellFolder to navigate, but these days it is simpler to use IShellItem instead.
To do this it is crucial to understand how IShellFolder and PIDLs work. See Introduction to the Shell Namespace for more information...
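That said, scripting hosts can reach the same namespace through the Shell.Application COM object, so one workaround from R is to generate and run a small PowerShell script. This is only a rough sketch under assumptions: the device name "MyCamera" and the destination C:\backup are placeholders, and a real camera usually nests its photos a few folders deep (storage card, DCIM, ...), which you would need to recurse into:

# Sketch: enumerate "This PC" via the Shell.Application COM object and
# copy a camera item's first-level contents ("MyCamera" is a placeholder).
ps_file <- tempfile(fileext = ".ps1")
writeLines(c(
  '$shell  = New-Object -ComObject Shell.Application',
  '$thisPC = $shell.NameSpace(0x11)   # 0x11 = ssfDRIVES, i.e. "This PC"',
  '$camera = $thisPC.Items() | Where-Object { $_.Name -eq "MyCamera" }',
  '# a real camera usually nests folders (storage card, DCIM, ...);',
  '# recurse into them with $item.GetFolder.Items() as needed',
  '$dest = $shell.NameSpace("C:\\backup")',
  'foreach ($item in $camera.GetFolder.Items()) { $dest.CopyHere($item) }'
), ps_file)
system2("powershell", c("-NoProfile", "-ExecutionPolicy", "Bypass", "-File", ps_file))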
I have RStudio on Windows. It works fine before I connect to the internet through my VPN. After I connect, commands start to hang, autocomplete can take forever, and simple operations like 4 + 4 can take a minute or more.
I have a feeling RStudio is making connections under the hood. I would like to disable all of these connections, no matter what.
I have experienced the same thing - I am assuming your actual work is on a file server and not local. In my case the culprit was not RStudio directly, but rather the RStudio project file. RStudio creates a generally small, hidden folder named .Rproj.user in the project directory with some settings. This folder, living on the file server, caused constant reads/writes through my VPN connection.
I unfortunately had to either (a) move projects off the file server into a local copy (not bad, since I can use a company GitLab), or (b) delete the .Rproj file and the .Rproj.user folder from the project directory on the server and use something like here::here() as a workaround in my workflow (sketched below).
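For reference, a minimal sketch of the here::here() workaround (paths are illustrative): with the .Rproj file gone, here() can anchor on a '.here' sentinel file instead.

library(here)
set_here('C:/local/this_project')    # run once: drops a '.here' sentinel in that folder
read.csv(here('data', 'input.csv'))  # paths resolve relative to the project root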
As another possibility, I have seen installations of R itself done onto a personal server drive instead of locally, to avoid needing administrator privileges to install. This is not a great idea and can also result in extremely slow performance through a VPN connection. You can check where R is installed as well (see below). It sounds like this is not the problem, though, based on what you described.
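Both locations are quick to check from the R console:

R.home()      # where R itself is installed
.libPaths()   # where packages are installed and loaded from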
Maybe it is something else, but this is what I found last week based on a very similar experience.
I wrote an R script on an institutional computer, saved it on an external device (a USB pen drive), and tested it on a computer not belonging to my institution. When I ran it to test it, using R from the Terminal and from the GUI, I never saved it on the host computer.
This is because it has to remain secret until it is published. I would simply like to know whether R or Unix itself saved it elsewhere while it ran, even though I loaded it directly by giving its path under /Volumes and so on.
In other words, the question is: does running an R script off a USB drive leave any trace of it on the host computer?
P.S.: I checked the R history to look for it, but it seems not to be saved there.
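As a first place to look (a sketch of the obvious locations, not a forensic guarantee), an R session does write to a per-session temporary directory on the host, and interactive sessions may write a history file:

tempdir()                    # per-session temp directory on the host disk (removed on exit)
Sys.getenv('R_HISTFILE')     # explicit history file location, if set
file.exists('~/.Rhistory')   # default history file for interactive sessions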
I am quite unsure how the move-files/directories use case technically works in a client-and-NAS scenario - perhaps someone can enlighten me or tell me if this is normal OS behavior.
I have a NAS (Synology DiskStation) on a Gigabit LAN with sometimes big directories (in the range of ~10 GB) which I want to move somewhere else on the same NAS (even on the same hard disk).
The problem is that if I move a directory from, let's say,
//diskstation:/dir_foo/dir_1/src_1
to
//diskstation:/dir_foo/dir_2/
via my Windows 7 desktop PC in Explorer (I even tried it in Finder on a MacBook), this can take up to 10 minutes or so, and I really wonder why this is the case.
To me this seems as if the whole data set is first transported over the LAN to my client PC and then moved back to the NAS!?
Shouldn't Explorer or the NAS notice that this is a local file operation, so that the data doesn't have to be transported through my LAN and the move would be much quicker?
How can I analyze whether the file movement is really executed over the LAN? Because if I wanted to do this kind of operation via VPN from outside, it would be pretty much unusable...
Is this normal behavior?
It's hard to give a firm answer, because it depends. What access protocol are you using, and what operation are you performing? Is it a drag-and-drop in your GUI?
Your NAS does what it's told. It almost certainly implements some sort of internal rename function, which means you don't need to copy data in order to 'move' it.
If you do this from the command line, using 'move' or 'mv' (depending on DOS/Unix), do you have the same problem? I'm prepared to bet you don't, because you're telling the NAS to rename, and it will, and it'll be fine.
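The same principle can be tested from R as well (paths are illustrative): a rename within a single share is executed by the NAS itself, so only the request, not the data, crosses the network.

# A rename within the same share is performed server-side by the NAS;
# only the rename request travels over the LAN.
ok <- file.rename('//diskstation/dir_foo/dir_1/src_1',
                  '//diskstation/dir_foo/dir_2/src_1')
if (!ok) warning('rename failed; target may be on a different share or volume')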
Move it from the NAS's own GUI (e.g., Synology's web interface) instead of the file explorer.
If you are using Windows Explorer to move the files, your OS will first download the file from the source directory to the client PC and then upload it to the target directory; this is because you are using Samba (SMB) shares.
If you want to move files quickly within your NAS, the best way is to use PuTTY or WinSCP, which work over SSH, SFTP, etc.
I am having difficulty connecting to an existing Informix database. I am attempting to mimic the configuration that is present on another machine, which currently works. By the way, that other machine is on the same network and accesses the DB through a tunnel, so I am pretty sure the issue isn't related to the network configuration.
Regardless, here are the steps that I took to try to make the connection:
Downloaded clientsdk.3.50.TC9DE and installed it. The working machine uses 3.50.TC2DE, but I couldn't find the installer for that version. (Note that at first I tried using 3.50TC9; I am not sure if that makes a difference.)
Matched the ODBC config in the new machine to the working machine
The working machine has a host name in the Host Name field. I assume this was allowed because the host was set to an IP in the hosts file. Regardless, I am using the IP.
Also, I am using C:\Windows\SysWOW64\odbcad32.exe (the 32-bit ODBC administrator on 64-bit Windows) to create the DSN.
Made sure that the INFORMIXDIR and PATH entries were correct, as per http://www.dbforums.com/informix/694408-odbc-test-connection-not-successful.html#post2633932. I don't think the locales are the issue because they aren't set in the working machine's Setnet32. Also, I made sure that the locales matched in the ODBC environment settings.
Also, since my INFORMIXDIR is in C:\Program Files (x86)\IBM\Informix\Client-SDK\bin, I tried replacing Program Files (x86) with PROGRAM~2 and Client-SDK with CLIENT~1, to no avail.
Tried setting INFORMIXDIR directly in my system environment variables (outside of Setnet32)
Set DBPATH to match the working system in both the user and system environment variables.
Set INFORMIXSERVER to the server in both Setnet and the system environment variables.
Completely lowered the firewall on my machine.
I can ping and telnet into the server.
I have also tried:
Tried this on Windows XP
Tested the ILogin demo. The result was a popup with "Customer Records Found" in the title bar and an empty text-area field.
Reinstalled into C:\informix instead of C:\Program Files(x86)...
Rebooted after various steps.
At this point I am at a loss. Has anyone run into this? The only other things I can think of are that I am using Win7 64-bit (with 32-bit drivers) and that the driver is 9DE, not 2DE.
Alright, so half of the battle is over: I was able to get a "Test connection was successful" on my Win7 machine. We had a copy of the 2.90.TC6 driver available on our file server from way back. I installed it and it worked. So my guess is that the database I am working with isn't compatible with 3.50.TC9DE.
I guess my next course of action is to try and find an installer for 3.50.TC2DE so that I can match the production system.