Why does "move directories" on a NAS take so long? - networking

I am not quite sure how the "move files/directories" use case technically works between a client and a NAS - perhaps someone can enlighten me or tell me whether this is normal OS behavior.
I have a NAS (Synology DiskStation) on a Gigabit LAN with directories that are sometimes large (in the range of ~10 GB) which I want to move somewhere else on the same NAS (even on the same hard disk).
The problem is that if I move a directory from, let's say,
//diskstation:/dir_foo/dir_1/src_1
to
//diskstation:/dir_foo/dir_2/
via Explorer on my Windows 7 desktop PC (I even tried it with Finder on a MacBook), this can take up to 10 minutes or so, and I really wonder why that is.
To me it looks as if all the data were first transferred over the LAN to my client PC and then moved back to the NAS!?
Shouldn't Explorer or the NAS notice that this is a local file operation, so that the data doesn't have to travel through my LAN and the move finishes much more quickly?
How can I check whether the file move really goes over the LAN? Because if I wanted to do this kind of operation externally over VPN, it would be pretty much unusable...
Is this normal behavior?

It's hard to give a firm answer, because it depends. What access protocol are you using, and what operation are you performing? Is it a drag-drop in your GUI?
Your NAS does what it's told. It almost certainly implements some sort of internal rename operation, which means it doesn't need to copy data in order to 'move' it.
If you do this from the command line, using 'move' or 'mv' (depending on DOS/Unix), do you have the same problem? I'm prepared to bet you don't, because you're telling the NAS to rename, and it will, and it'll be fine.
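As a rough illustration, here is the server-side variant, assuming SSH is enabled on the DiskStation and the share lives under /volume1 (the usual Synology layout - adjust names and paths to your setup):
# log in to the NAS itself (SSH has to be enabled in DSM first)
ssh admin@diskstation
# then, on the NAS, a move within the same volume is just a rename:
# no file data is copied and nothing crosses the LAN
mv /volume1/dir_foo/dir_1/src_1 /volume1/dir_foo/dir_2/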

Move it from the NAS's own web GUI instead of the file explorer on your PC.

If you are using Windows Explorer to move the files, then your OS will first download the files from the source directory to the client PC and then upload them to the target directory; this is because you are accessing the NAS through SMB (Samba) shares.
If you want to move files quickly within your NAS, the best way is to use PuTTY or WinSCP, which work over SSH/SFTP, so the move runs on the NAS itself.
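If you want to verify whether the data really crosses the LAN (as asked above), one rough check - assuming you can SSH into the NAS and that its LAN interface is eth0, which you may need to adjust - is to watch the interface byte counters while the move runs; for a server-side rename they should barely change:
# print received/transmitted byte counters for eth0 once per second
while true; do
  cat /sys/class/net/eth0/statistics/rx_bytes /sys/class/net/eth0/statistics/tx_bytes
  sleep 1
done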

Related

R Studio incredibly slow only after I connect to my work VPN?

I have RStudio on Windows. It works fine before I connect to the internet through my VPN. After I connect, commands start to hang, autocomplete can take forever, and simple operations like 4 + 4 can take a minute or more.
I have a feeling RStudio is making network connections under the hood. I would like to disable all of these connections, no matter what.
I have experienced the same thing - I am assuming your actual work is on a file server and not local. I found the culprit in my case to be not RStudio directly, but rather the RStudio project files. RStudio creates a small, hidden folder named .Rproj.user in the project directory with some settings. Because this lived on the file server, it caused constant reads/writes through my VPN connection.
I unfortunately had to either (a) move projects off the file server into a local copy (not bad, since I can use a company GitLab), or (b) delete the .Rproj and .Rproj.user folders from the project directory on the server and use here::here() or something like that as a workaround in my workflow.
As another possibility, I have seen installations of R itself done onto a personal server drive instead of locally. This has been done to avoid needing administrator privileges to install. This is not a great idea and can also result in extremely slow performance through a VPN connection. You can check to see where R is installed as well. Sounds like this is not the problem though based on what you described.
Maybe it is something else, but this is what I found last week for me based on a very similar experience.
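If you want to check where R itself is installed and where your packages live (in case either sits on a network drive), a quick check from a terminal - assuming Rscript is on your PATH - is:
# print the R installation directory and the package library paths
Rscript -e "print(R.home()); print(.libPaths())"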

Best way to execute code on remote machine

I am looking for the best way to execute code on a remote machine. Ideally, I am looking for something like CUDA, which lets you choose whether work runs on the GPU or the CPU, but for choosing between distinct machines.
I have tried different approaches:
I connect to my machines with ssh, copy my script over, and execute it. No particular issue, but it is not very handy. Maybe this approach could be streamlined, since I currently open my ssh connections from the terminal or Termius.
I tried another way with mosh: same outcome, but quicker.
Currently, I am working on attaching to a remote Spyder kernel so that I have a direct link to where the code executes.
I've seen that there is also a possibility with nohup, but I still have to dig into that approach to understand its possibilities well.
Everything works well, but I am looking for a more convenient solution.
Thank you in advance for your answers!
You could use sshfs alongside ssh to mount the remote filesystem on your machine; it's easier than always copying the code over by hand. If you do, I would recommend running your jobs inside screen or something like it, so that a dropped connection causes no problems.
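A minimal sketch of that workflow, assuming the remote machine is reachable as user@remotehost and that sshfs and screen are installed (all names here are placeholders):
# mount the remote home directory locally so you can edit the code in place
mkdir -p ~/remote
sshfs user@remotehost:/home/user ~/remote
# log in and run the script inside screen so it survives a dropped connection
ssh user@remotehost
# ...then, in that remote session:
screen -S job
python myscript.py    # detach with Ctrl-a d, reattach later with: screen -r job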
Personally, I like to work with Visual Studio Code and its SSH FS extension for this purpose.
Another alternative is to work with X2Go. X2Go gives you access to a graphical desktop on the remote computer over a low-bandwidth (or high-bandwidth) connection.

IBM Domino Designer JavaScript editor lags in virtual environment

I have installed Domino Designer in a Windows VM on VirtualBox on OS X.
When I start entering code in the JavaScript editor, Domino starts working on every letter I type: the hourglass icon appears and the network symbol in the status bar flashes. This can take up to several seconds per keystroke.
If I try to type anything before the hourglass disappears, the keyboard input may hang, and the result is a long run of the same letter that I then have to delete again (causing the hourglass to appear for every letter I delete, too).
I have tried to disable functionality like "Content Assist", "Quick Diff" and other helpful stuff without luck.
I would really appreciate hints or tips to make this nightmare vanish...
I've not used Domino Designer, but my first thought would be that your VM isn't handling the processing the Designer requires.
What are the specs of your Windows VM? Did you allocate enough RAM, for example? Make sure they match the requirements for running the Designer:
http://www-969.ibm.com/software/reports/compatibility/clarity-reports/report/html/softwareReqsForProduct?deliverableId=1351628933716&osPlatform=Windows
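If the VM does turn out to be short on resources, you can check and raise its allocation from the OS X side with VBoxManage (the VM name "Windows XP" below is a placeholder, and the VM must be powered off before you change it):
# show the current memory/CPU allocation of the VM
VBoxManage showvminfo "Windows XP" | grep -iE "memory|cpu"
# give it more RAM and an extra CPU core (run while the VM is shut down)
VBoxManage modifyvm "Windows XP" --memory 4096 --cpus 2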
Thanks to Joel for leading me down the right path.
I did several things, and together they now give me a much better environment. I still see the hourglass from time to time, but it does not mess up my code anymore, and most of the time it does not bother me.
What I did was the following:
Changed the memory settings for Domino in this file:
[notes dir]\framework\rcp\deploy\jvm.properties
New values:
vmarg.Xmx=-Xmx1024m
vmarg.Xms=-Xms512m
vmarg.Xmca=-Xmca512k
Then I changed the virtual memory of my guest Windows install to a fixed swap file of 4096 MB.
Finally, I connected my Mac to a faster network using a Thunderbolt-to-Ethernet adapter. I don't think this last step made any difference, but at least I now have a faster and more reliable network connection.

detect which computer I'm running an R script on

I am looking for an R function to return an identifier of the computer the script is being run on, or at the least to distinguish between one of two known computers.
I have two PCs, both running Windows and RStudio. I use the desktop in the office and the laptop over VPN, typically working on the same projects, always using RStudio.
My scripts and permanent data sets are in a common repository. However, since I/O to that repository is slow, I keep a local directory for temp files.
On the desktop, I have a dedicated drive, and each project lives in its folder 'D:/workspace/this_project/'. On the laptop, the path is 'C:/Users/myself/Documents/workspace/this_project/' or just '~/workspace/this_project/'.
Currently, I keep two setwd() statements at the top of each script, and I just rely on the fact that one of them will fail because of the file structure.
setwd('~/workspace/this_project') # will fail on the desktop
setwd('D:/workspace/this_project') # will fail on the laptop
This seems like a bad practice.
I've looked through ?"environment variables" and don't see how to get my computer's name on the network or something else that is persistent and unique to the computer.
The desired solution could modify the laptop's tilde expansion to D:/ on the laptop only so that a common '~/workspace/' could be used, or a function using_laptop() like this:
set_project_wd <- function(folder_nm){
  if (using_laptop()) setwd(paste0('~/workspace/', folder_nm))
  else setwd(paste0('D:/workspace/', folder_nm))
}
If you call Sys.info() you can get your details:
names(Sys.info())
[1] "sysname" "release" "version" "nodename" "machine" "login"
[7] "user" "effective_user"
The entry under nodename will be your PC's name.
Then you can do something like:
set_project_wd <- function(folder_nm){
  if (Sys.info()[["nodename"]] == "mylaptopname") setwd(paste0('~/workspace/', folder_nm))
  else setwd(paste0('D:/workspace/', folder_nm))
}

Running Visual Studio in Parallels for Mac - problem with debugging sites on the OS X drive

I've installed Parallels Desktop on my MacBook to be able to run Visual Studio 2008 in an XP installation. Everything works great except that I decided to put my websites in my Sites folder on the OS X file system (which happens by default anyway, because the My Documents folder is mapped to the Mac's Documents folder, and I'd rather keep my code there so that both OSes can easily access it).
When trying to build or debug I get this error:
Failed to start monitoring changes to 'Z:\xxx...'
How do I get this to work under Parallels, from the shared drive?
Parallels uses network drives to simulate folders on OS X, and Windows can't monitor changes to network drives, so if you do this directly, it'll be broken.
If you want to keep them in sync though, use Live Mesh (http://www.mesh.com) and install it on both the host and guest. A little roundabout, but it'll make it so both copies are maintained (and Live Mesh is handy for other things too)
I recently switched to keeping my source code on my Mac volume so I could use Time Machine to back it up, and I immediately got this same problem with my ASP.NET app. Other, non-ASP.NET applications built just fine, by the way.
I tried all sorts of things, including using Samba on the Mac side to share the directory, which led to the "too many BIOS commands" error described elsewhere. Unfortunately for me, the Registry hacks to fix that problem never worked for some reason.
I finally found another solution that avoids Samba and just uses the regular Parallels Shared Folders. It too is a Registry hack, but this one simply turns off file change monitoring for ASP.NET. It is a bit heavy-handed, but gets my builds to work again.
The reference for this change is here:
http://support.microsoft.com/kb/911272
The downside to this approach, I am finding, is that you need to be more deliberate about recompiling, or restarting the web server, as changes during development don't just magically appear anymore. I am still deciding whether that is a useful tradeoff.
UPDATE: After several days of this, development was just too difficult and, sadly, what I reverted to was keeping my source inside the Parallels virtual disk. To enable Time Machine backups and Spotlight searches, I used a lightweight MS utility called SyncToy to push stuff out of Parallels and out to my Mac drive several times a day. Despite the high hack factor, it is working well.
I know this isn't strictly a solution, but VMware Fusion is superior when it comes to shared drive space on a virtual machine. It's what I currently use, and it hasn't let me down thus far...
People always give me odd looks when they see Visual Studio on my Mac :P
Try moving the project onto the VM's C: drive. It's not an ideal situation, but you can access the VM's C: drive from OS X.
I have a similar problem with a PHP site that uses an MS Access database (it's a client's system). I have aliases that point to the PHP site on the VM so that I can still do all of my coding in OS X. To do this, I created a network share on the VM and then connected to it from OS X; once connected, make the aliases. If the network drive is not mounted and you open a file in OS X, it will try to reconnect. It means the VM needs to be running to get to the files, but this isn't normally a problem since the VM is hosting the site anyway.
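As a rough sketch of that setup from the OS X side (the share and host names here are placeholders - adjust to whatever you called the share on the VM):
# mount the share exposed by the Windows VM (Finder's "Connect to Server" does the same thing)
mkdir -p ~/vm-site
mount -t smbfs //user@windows-vm/website ~/vm-site
# then make an alias/symlink so your editor always finds the code in the usual place
ln -s ~/vm-site ~/Sites/client-site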
.NET has funny issues when trying to debug objects on a network drive.
Make sure that you have full trust granted between your Mac and the XP install on your local network.
Check out: http://msdn.microsoft.com/en-us/library/aa302361.aspx
If that research doesn't pan out, I'm afraid you will have to look into the option of keeping the code on the VM disk and moving it over when you need it.
I see a similar problem on my machine, which is connected to a Windows domain. My Documents is mapped to a network share and I can't debug, run, etc. I eventually had to move to my local disk for debugging.
I definitely recommend Live Mesh as a way to keep directories in sync. Just keep the VM's directory in sync with the Mac's directory.
Or use SVN to hold copies on both machines and commit/update as appropriate. That way you get versioning and history, and if your project grows bigger, you can share it with other devs.
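A minimal sketch of that workflow (the repository URL and file names are placeholders):
# one-time: check out the project on both the Mac and the VM
svn checkout http://svnserver/repos/mysite mysite
# on the machine where you just edited code
svn add NewPage.aspx                # only needed for files SVN doesn't know about yet
svn commit -m "describe the change"
# on the other machine, pull the changes down
svn update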
I know Dropbox also has history and sharing, but not check-in/check-out, conflict handling, and all the other advantages of real source control.
Oh, if you have money you can also go for TFS. I would but it is just too expensive :)
