Vagrant shared/synced folders permissions - symfony

From my research I understand that VirtualBox synced folders have their permissions set during the mounting process. I am unable to change them later, so the permissions MUST be identical for every single file/folder in the shared folder. When I try to change them, with or without superuser permissions, the changes are reverted straight away.
How can this work with, for example, the Symfony PHP framework, where different files/folders need different permissions? (i.e. app/console needs execute rights, but I don't want 7XX everywhere.)
I have found in a different but similar question (Vagrant and symfony2) that I could set the permissions to 777 for everything in the Vagrantfile, but this is not desirable as I use Git on my source code, which is then deployed to the live environment. Running everything under 777 in production is, nicely put, not correct.
How do you cope with this? What are your permission setups?

A possible solution could be using the rsync synced folder strategy, along with the vagrant rsync and vagrant rsync-auto commands.
This way you lose bidirectional sync, but you can manage file permissions and ownership.
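A minimal Vagrantfile sketch of this approach; the mount point, owner and excludes below are only assumptions for a typical Symfony layout:
# one-way rsync, with ownership and excludes handled at sync time
config.vm.synced_folder ".", "/vagrant", type: "rsync",
  owner: "www-data", group: "www-data",                 # assumed web server user
  rsync__exclude: [".git/", "app/cache/", "app/logs/"]
Run vagrant rsync-auto in a separate terminal to keep the guest updated while you edit on the host.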

I am in a similar situation. I started using Vagrant mount options, and found that as I upgraded parts of my tech stack (kernel, VirtualBox, Vagrant, Guest Additions) I got different behavior when trying to set permissions in synced folders.
At some point, I was perfectly fine updating a few of the permissions in my shell provisioner. At first, the changes were reflected in both the guest and the host. At another point in time, it behaved the way I expected, with the changes reflected only in the guest and not in the host file system. After updating the kernel and VirtualBox on my host, I noticed that permission changes in the guest are reflected on the host only.
I have been trying to use DKMS to compile VirtualBox against an older version of my kernel. No luck yet.

Now that I have a little more experience, I can actually answer this question.
There are three solutions to this problem:
Use Git in your host system, because Vagrant's basic shared folder setup somehow forces 777 (at least on Windows hosts).
Use Vagrant's NFS shared folder option (not available on Windows out of the box); see the sketch after this list.
Configure a more complex rsync setup as mentioned in Emyl's answer (slower sync speeds).
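A minimal sketch of the NFS option; the private network address is just an example, but the VirtualBox provider does require a host-only network for NFS:
config.vm.network "private_network", ip: "192.168.33.10"   # NFS needs a private network
config.vm.synced_folder ".", "/vagrant", type: "nfs"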

Index files in host and remote machine using PhpStorm

I work on a Symfony project using Vagrant. The host machine runs Windows. Because the request time was very high, I decided to install the vendor files inside the VM, while the entire "rest" of the project remains inside the synced folder (project root => /vagrant).
Everything is working fine and the request time is under 100ms now. But there is one issue left. I have to install the vendor on my Windows machine first and then again in the VM, otherwise PhpStorm is not able to index the files correctly (I know, this is a logical consequence).
So my question is whether it is possible to host a project on the Windows machine, with the files under for example "C:\Users\SampleUser\Project\ProjectX" and the vendor installed under "/home/vagrant/vendor", and have PhpStorm index the files of both directories.
Otherwise I will have to live with this and code completion won't work.
Or I will have to install the libraries on both machines to improve the request time and keep a more or less "good" workflow.
I hope I could explain well enough what my actual problem is.
Thank you very much for your time.
Had the same exact problem. Indeed a bummer.
One possible solution is to leave the vendor folder on the VM and manually copy it to your host machine.
Pros:
PHPStorm is able to index files
Cons:
If you add a dependency, you have to copy some parts of the vendor folder manually to the host machine
To those facing the same problem, I might advise SFTP (Tools -> Deployment -> Configuration in PHPStorm) - files can be transferred without leaving the IDE window. The only thing to do is get the VM box password, which is located at
%USERNAME%/.vagrant.d/boxes/your box/box version/virtualbox/Vagrantfile
Second solution: if you are using VirtualBox, you can use config.vm.synced_folder with type: "virtualbox" (the sync works both ways, host<->guest) and leave the vendor folder in your project so it syncs all the time; see the sketch at the end of this answer.
Pros:
vendor folder always up to date, no manual work
Cons:
Horrible performance (tested myself)
If you want to use non-virtualbox rsync (type: "rsync"), you will not get the ability to sync back from the guest (someone, please correct me if I'm wrong!), so you are left with the 1st solution.
It would be great if we could include the vendor folder directly from the VM (using some kind of rsync/symlink magic) to the "Languages & Frameworks -> PHP -> include path" list, at least when using VirtualBox, but oh well...
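A minimal sketch of that second solution, assuming the usual project root and /vagrant mount point:
config.vm.synced_folder ".", "/vagrant", type: "virtualbox"   # two-way sync, but slow for a large vendor/ folder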

Vagrant 2 way folder sync

I've set up a Vagrant box that runs my web server to host my Symfony2 application.
Everything works fine except the folder synchronization.
I tried 2 things:
config.vm.synced_folder LOCALFOLDER, HOSTFOLDER
config.vm.synced_folder LOCALFOLDER, HOSTFOLDER, type: "rsync"
Option 1: The first option works; I actually don't know how the files are shared, but it works.
Files are copied both ways, but the application is SUPER slow.
Symfony is generating cache files, which might be the issue, but I don't really know how to troubleshoot this and see what is happening.
Option 2: Sync is only done one way (from my local machine to the Vagrant box), which covers most cases and is fast.
The issue is that when I use the Symfony command line on the Vagrant box to generate some files, they are not copied over to my local machine.
My question is:
What is the best way to proceed with two-way syncing? With option 1, how can I exclude some files from syncing (as that might be the issue)?
With Option 2 how can I make sure changes on remote are copied to my local machine?
If the default synced folder strategy (VirtualBox shared folders, I imagine) is slow for your use case, you can choose a different one and, if you need it, keep the two-way sync:
If your host OS is Linux or Mac OS X, you can go with NFS.
If your host OS is Windows you can instead choose SMB.
Rsync is very fast but, as you've pointed out, one-way only. A sketch of the NFS and SMB options follows.
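A minimal sketch of both options; pick the line matching your host OS (the /vagrant mount point is the usual default):
# Linux / Mac OS X host (NFS also needs a private_network, as noted earlier)
config.vm.synced_folder ".", "/vagrant", type: "nfs"
# Windows host (SMB requires running vagrant up from an elevated prompt)
config.vm.synced_folder ".", "/vagrant", type: "smb"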
As Vagrant doesn't seem to offer a "built-in" way to do this, here is what I did:
Configure a Vagrant rsync folder for the folders that will contain application-generated files (in Symfony2 that is your Bundle/Entity folder); see the sketch after these steps. Note that I didn't sync the root folder, because some folders don't have to be rsynced (cache/logs...) and also because it was taking way too much time for the rsync process to parse all the folders/subfolders when I know that only the Entity folder will be generated.
As the rsync back has to be done from the Vagrant box to the host, I use the vagrant-rsync-back plugin and run it manually every time I use a command that generates code.
https://github.com/smerrill/vagrant-rsync-back#getting-started
Create a watcher on my local machine that tracks any change in the code and rsyncs it to the Vagrant box.
https://gist.github.com/laurentlemaire/e423b4994c7452cddbd2
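A sketch of that setup; the bundle path is only an assumption (adjust it to your project), and the plugin commands are taken from the README linked above:
# Vagrantfile: rsync only the folder where code gets generated
config.vm.synced_folder "src/AppBundle/Entity", "/vagrant/src/AppBundle/Entity",
  type: "rsync", rsync__exclude: [".git/"]
# host shell: install the plugin once, then pull generated files back when needed
$ vagrant plugin install vagrant-rsync-back
$ vagrant rsync-back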
Vagrant mounts your project root as the /vagrant folder inside the box as a two-way share.
You can run your command there to get the required files synced. Any I/O will be damn slow (as you already mentioned), but you will get your files. For other stuff use your one-way synced folder.

What should I do to secure my server, chmod-wise?

I want to let my friend access my server so he can host his website. Let's call him Joris.
# useradd joris
Note that I'm on Debian. So now a /home/joris has been created. This is cool and all. BUT. He can
cd /
cd /etc/
cd /var/www/
He can cd practically everywhere; maybe not delete things, but he can see everything, which I don't want him to. Is that normal?
First, I would suggest reading the Debian Administrator's Handbook, either by running aptitude install debian-handbook or by using a search engine to find a copy online. It covers many security topics that will be of use to you, especially when sharing a server with multiple users.
As far as being able to access various directories, Debian is VERY relaxed for my tastes with its default permissions setup. Check the default UMASK settings (/etc/login.defs) so that you can have a more secure setup when adding users.
I remove o-rx (read/execute for "other" users) from things like /var/www and grant access using Access Control Lists (ACLs); a short sketch follows. If you are unfamiliar with ACLs, I highly recommend familiarizing yourself with them, as they are much more robust than the default permissions system.
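A minimal sketch, assuming the friend's site lives under a hypothetical /var/www/joris and the acl package is installed:
$ chmod o-rx /var/www                       # stop "other" users from listing/entering /var/www
$ setfacl -R -m u:joris:rwx /var/www/joris  # let joris manage only his own docroot
$ getfacl /var/www/joris                    # verify the ACL entries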
As far as what you should protect, that will depend on your setup. For most things in /etc it will be self-explanatory whether or not you can remove read access for users outside the owner/group (like your web server configuration directory). You can also use permissions to limit access to specific binaries that users should never have access to, like mysql or gcc.
In the long run your setup will be unique to your specific needs. Reading the Debian Handbook will be immensely helpful in helping you secure your box not only from the outside, but from the inside as well.
Hope this helps point you in the right direction.

How to work on several machines, keep them in sync without Internet?

For quite a while now, I have been using Dropbox to sync a Git repository on several virtual machines (one Windows, one Mac, one Linux). I would commit my changes on one of the machines and Dropbox would take care of syncing the changes of the files and the repo onto the other machines.
This works very seamlessly. I code on OS X, test the code on Windows and Linux, maybe make some changes there, then commit from one of the three.
However, it has three major drawbacks:
It requires an internet connection. I frequently have to rely on my cellphone for internet connectivity, which is unreliable if I'm on a train and only good for a few hundred MB per month.
Dropbox syncs EVERYTHING, including object files, Visual Studio debug databases and a whole lot of unnecessary stuff that does not need to be synced.
It always goes through Dropbox servers, which is fine for some minor project or some open source stuff, but I don't want to push my work files to an untrusted server.
So, how do you manage an environment like this?
Edit:
Actually, all three virtual machines live on the very same laptop, so network connections between them are not a problem. Also, I frequently code on one OS and compile on another, going back and forth until I have found all the errors. I don't want to spam the company repo with hundreds of incremental commits.
Edit 2:
To give you an idea of what I am looking for, here is a partial solution I came up with: on each machine, I created a git repository of the files I want to work with. Typically, I will start working on a bug/feature on one machine, then commit my work. On the next machine, I will call git reset origin to load the changes from the first machine, then continue working on the commit using git commit --amend. This goes back and forth a few times. Once I am done, I will finally commit the changes for real (no more amending) and start working on the next feature/bug.
However, this workflow feels cumbersome and inelegant. What I am looking for is something that results in the same output (one commit on the repo) but is created fluently across the three machines.
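For reference, a shell sketch of that amend-based round trip, assuming each machine sees the shared repository as origin; the branch name feature-x is hypothetical:
# machine A: provisional commit of the work in progress
$ git commit -am "WIP: feature-x"
# machine B: move the local branch to A's provisional commit (discards local edits)
$ git fetch origin
$ git reset --hard origin/feature-x
# ...keep working on B, then fold the new changes into the same commit
$ git commit -a --amend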
You could consider setting up your own software versioning server.
Most clients for these servers have implementations on various operating systems and platforms.
But if you want to communicate between machines that are not in a LAN, you're going to need an internet connection.
The versioning server's network communication can be exposed over NAT through a gateway to the internet. You could implement security by setting up a tunnel mechanism. Any client would then tunnel to a gateway server and then communicate with the versioning server.
As for control over which files are actually versioned: I have some experience with SVN, with which you can select at the file level which files to add to versioning. The SVN client will then simply ignore the rest of the files and directories.
Edit:
Reading the edit of the original author's question:
Maybe set up a fourth virtual machine containing the versioning server. SVN isn't (by any stretch of the imagination) hard to manage (RTM). Have the three virtual machines connect to the server on the fourth. (This is, of course, only if it's possible to run the machines in parallel on the same host.)
If you can share a disk between the three, put the master repo on that. (Make sure you make backups! Especially for removable media.)
Edit: In more detail, you can have your master repo on a USB stick or a shared partition on your hard drive (as you indicate you are using multiple virtual machines on the same hardware).
To set up a private git repo, simply create an empty directory and run git init.
Assuming you are on your Ubuntu box and have a USB stick with a file system you can read and write from all your operating systems, mounted at /media/usbgit, run this:
vnix$ mkdir /media/usbgit/mycode
vnix$ cd /media/usbgit/mycode
vnix$ git init
Initialized empty Git repository in /media/usbgit/mycode/.git/
(Given that you already have a git repo, you probably just want to clone it to the USB stick instead:
vnix$ cd /media/usbgit
vnix$ git clone /home/you/work/wherever/mycode
Initialized empty Git repository in /media/usbgit/mycode/.git/
This will now contain all the commits from the repo you pulled from.)
Now you have an empty repository which you can clone and pull from on all the boxes. Once you have the USB stick mounted, you can clone from it.
vnix$ cd
vnix$ mount | fgrep usbgit
/dev/whatever on /media/usbgit type NTFS (rw)
vnix$ git clone /media/usbgit/mycode
Initialized empty Git repository in /home/you/mycode/.git/
warning: You appear to have cloned an empty repository.
All of this is doable with SVN too (use svnadmin create to initialize a repository, and svn checkout file:///media/usbgit/mycode to check it out), but you will lose the benefits of a distributed VCS, which seem useful in your scenario.
In particular, with a distributed VCS, you can have multiple private repositories (each working directory is a repository on its own) and you can sync with and pull from your private master and a public repository e.g. on Github -- just make sure you know what you have where.

Keeping dot files synched across machines?

Like most *nix people, I tend to play with my tools and get them configured just the way that I like them. This was all well and good until recently. As I do more and more work, I tend to log onto more and more machines, and have more and more stuff that's configured great on my home machine, but not necessarily on my work machine, or my web server, or any of my work servers...
How do you keep these config files updated? Do you just manually copy them over? Do you have them stored somewhere public?
I've had pretty good luck keeping my files under a revision control system. It's not for everyone, but most programmers should be able to appreciate the benefits.
Read
Keeping Your Life in Subversion
for an excellent description, including how to handle non-dotfile configuration (like cron jobs via the svnfix script) on multiple machines.
I also use Subversion to manage my dotfiles. When I log in to a box, my confs are automagically updated for me. I also use GitHub to store my confs publicly. I use git-svn to keep the two in sync.
Getting up and running on a new server is just a matter of running a few commands. The create_links script just creates the symlinks from the .dotfiles folder items into my $HOME, and also touches some files that don't need to be checked in.
$ cd
# checkout the files
$ svn co https://path/to/my/dotfiles/trunk .dotfiles
# remove any files that might be in the way
$ .dotfiles/create_links.sh unlink
# create the symlinks and other random tasks needed for setup
$ .dotfiles/create_links.sh
It seems like everywhere I look these days I find a new thing that makes me say "Hey, that'd be a good thing to use DropBox for"
Rsync is about your best solution. Examples can be found here:
http://troy.jdmz.net/rsync/index.html
I use git for this.
There is a wiki/mailing list dedicated to the topic.
vcs-home
I would definitely recommend homesick. It uses git and automatically symlinks your files. homesick track tracks a new dotfile, while homesick symlink symlinks new dotfiles from the repository into your home folder. This way you can even have more than one repository.
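A short sketch of getting started with homesick; the repository URL and castle name below are only examples:
$ gem install homesick
$ homesick clone https://github.com/you/dotfiles.git   # clone your "castle"
$ homesick symlink dotfiles                             # symlink its dotfiles into $HOME
$ homesick track .vimrc dotfiles                        # start tracking a new dotfile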
You could use rsync. It works over ssh, which I've found useful since I only set up new servers with ssh access.
Or, create a tar file that you move around everywhere and unpack.
I store them in my version control system.
I use svn ... having a public and a private repository ... so as soon as I get on a server I just
svn co http://my.rep/home/public
and have all my dot files ...
I store mine in a git repository, which allows me to easily merge beyond system dependent changes, yet share changes that I want as well.
I keep master versions of the files under CM control on my main machine, and where I need to, arrange to copy the updates around. Fortunately, we have NFS mounts for home directories on most of our machines, so I actually don't have to copy all that often. My profile, on the other hand, is rather complex - and has provision for different PATH settings, etc, on different machines. Roughly, the machines I have administrative control over tend to have more open source software installed than machines I use occasionally without administrative control.
So, I have a random mix of manual and semi-automatic process.
There is netskel, where you put your common files on a web server and the client program maintains the dot-files on any number of client machines. It's designed to run on any level of client machine, so the shell scripts are proper sh scripts and have a minimal number of dependencies.
Svn here, too. Rsync or unison would be a good idea, except that sometimes stuff stops working and I wonder what was in my .bashrc file last week. Svn is a life saver in that case.
Now I use Live Mesh which keeps all my files synchronized across multiple machines.
I put all my dotfiles in to a folder on Dropbox and then symlink them to each machine. Changes made on one machine are available to all the others almost immediately. It just works.
Depending on your environment you can also use (fully backed-up) NFS shares...
Speaking of storing dotfiles publicly, there are
http://www.dotfiles.com/
and
http://dotfiles.org/
But it would be really painful to manually update your files, as (AFAIK) none of these services provides any API.
The latter is really minimalistic (no contact form, no information about who made/owns it, etc.).
briefcase is a tool to facilitate keeping dotfiles in git, including those with private information (such as .gitconfig).
By keeping your configuration files in a public git repository, you can share your settings with others. Any secret information is kept in a single file outside the repository (it's up to you to back up and transport this file).
-- http://jim.github.com/briefcase
mackup
https://github.com/lra/mackup
lra/mackup is a utility for Linux & Mac systems that syncs application preferences using almost any popular shared storage provider (Dropbox, iCloud, Google Drive). It works by replacing the dotfiles with symlinks.
It also has a large library of hundreds of supported applications: https://github.com/lra/mackup/tree/master/mackup/applications
