I have already synced files from a remote server to my local machine with rsync.
When I ran rsync, I excluded several folders, and after the sync I removed some files from the local copy.
How can I resync so that I get the latest versions of the files I still have locally, while ignoring both the files I deleted locally and the folders I excluded from the beginning?
Instead of just deleting the local files, you could additionally add them to an rsync exclude list. A wrapper around the remove command could do that automatically.
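For example, here is a minimal sketch of such a wrapper (the script name, the exclude-file location, and the assumption that you pass paths relative to the sync root are all mine, not a standard tool):

#!/bin/sh
# rm-and-exclude: delete files locally and remember them in an rsync exclude list
EXCLUDES="$HOME/.rsync-excludes"
for f in "$@"; do
    echo "/$f" >> "$EXCLUDES"   # record the path so future syncs skip it
done
rm -- "$@"

Future syncs would then pick the list up with something like rsync -av --exclude-from="$HOME/.rsync-excludes" host:/remote/dir/ /local/dir/ (in addition to your original excludes).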
I accidentally checked out a bunch of files in an undesired CVS directory, causing me to run out of disk space. How do I undo the action safely? Theoretically, I'd think running (Unix) rm -r on the directory and then redoing the checkout for the correct path would work, but I don't want to risk causing potential alterations to the repo itself. I can't seem to find anything online that explains how to remove checked-out files from the local view only. Guidance appreciated.
You can just delete your locally checked-out CVS files. Nothing in the CVS repository will be changed. The repository is only changed when you run cvs commit (or cvs tag and a few other CVS commands).
You can also move your locally checked-out files to another directory.
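For instance, assuming the accidental checkout landed in a directory called wrongmodule (a placeholder name), the following should be safe:

cd /path/to/workdir
rm -rf wrongmodule            # removes only the local working copy
cvs checkout correct/module   # redo the checkout with the intended path

Neither command touches the repository itself.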
What happens if we try to create a file in an unmounted folder?
Is the file created on the local system?
I have a mounted folder that sometimes gets unmounted for various reasons.
I'm going to schedule an Oracle procedure that writes a file in there; what will happen if the folder is somehow unmounted?
Is the file created on the local system?
Yes. Or, better: the file is created in that directory (don't call Unix directories "folders"... :)), wherever that directory currently is. While the remote file system is mounted, the directory is part of it and the file goes to the remote side; when it is unmounted, the mount point is just an ordinary local directory, so the file is created on the local disk.
Write permissions on the directory also play a role, and the mount command can change those permissions, but that is another, more complicated matter.
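If you want the scheduled job to refuse to write while the share is unmounted, a small guard like this could help (a sketch; /mnt/oracle_out is a placeholder path):

#!/bin/sh
TARGET=/mnt/oracle_out
# mountpoint(1) exits non-zero when the directory is not a mount point
if ! mountpoint -q "$TARGET"; then
    echo "$TARGET is not mounted, aborting" >&2
    exit 1
fi
# ...safe to write into "$TARGET" here...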
I have a directory on my computer containing many files, and sometimes I need to sync it with a clone on another device.
I use rsync.
Rsync does not notice when a file has been renamed or moved to another directory, and it is sometimes very slow.
Is there a smart syncing tool that can detect when a file was renamed or moved in the main directory and simply rename it on the clone, without a physical remove-and-copy?
I'm going to start using grunt-rev & grunt-usemin with grunt-watch for my web development needs (a RESTful Web App specifically).
I have a local development machine which will run grunt-watch to attach revision identifiers to my JS files. I git commit and git push my tree to a git repo, and then ask the production server to git pull the changes from the git repo to show them to the web visitors.
The problem is that I don't want my git repo to store different filenames (due to grunt-rev) on each commit. That would be bad, because then I wouldn't be able to do git diff between commits without my screen getting flooded with the contents of files that appear and disappear, and it could (sometimes) take up much more storage than if it only stored the small diffs of the files.
The only solution I see is to add the build directory containing the versioned filenames to my .gitignore, so as not to store those files (with the constantly changing filenames) in git. But wouldn't that mean I would have to run grunt-watch on my production server as well, in order to produce the build directory with the versioned filenames there too? That gets complicated: a new process has to run on the remote server, with its own small chance of errors while processing the files. Not the solution I was hoping for.
Do you have another solution? What would you suggest I do?
What I do to solve this is remove the previous "build" files before committing and deploying new ones. There is no need to keep older generated files, because you can always rebuild them from the source files (which are in git).
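For example, assuming the generated files live in a build/ directory and your Gruntfile registers a build task (both placeholders for whatever your setup actually uses), the deploy step could be:

rm -rf build/            # drop the previously generated, revved files
grunt build              # regenerate them from the sources in git
git add -A
git commit -m "deploy"
git push

This way each commit holds only one generation of revved filenames, so git diff stays readable.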
I'm working on a web application where a user uploads a list of files, which should then be immediately rsynced to a remote server. I have a list of all the local files that need to be rsynced, but they will be mixed in with other files that I do not want rsynced every time. I know rsync will only send the changed files, but this directory structure and contents will grow very large over time and the delay would not be acceptable.
I know that when doing a remote rsync, I can specify a list of remote files, e.g.:
rsync "host:/path/to/file1 /path/to/file2 /path/to/file3"
... but that does not work once I remove "host:" and try to specify the files locally.
I also know I can use --files-from, but that would require me to create a file ahead of time with a list of files that I want to rsync (and then delete it afterwards). I think it'd be cleaner to just effectively say "rsync these 4 specific files to this remote server", but I can't seem to get that to work.
Is there any way to do what I'm trying to accomplish, or do I have to resort to creating a tmp file with a list in it?
Thanks!
You should be able to list the files much as in the example you gave. I did this on my machine to copy two specific files from a directory with many other files present:
rsync test.sql test2.cpp myUser@myHost:path/to/files/synced/
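Applied to your case, syncing a handful of specific local files would look like this (host and paths are placeholders):

rsync /path/to/file1 /path/to/file2 /path/to/file3 /path/to/file4 user@host:/destination/dir/

Note that listed this way the files all land flat in the destination directory; if you need the source directory structure recreated on the remote side, rsync's -R/--relative option does that. Also, --files-from accepts - to read the list from standard input, so even that approach does not strictly require a temporary file.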