Improving Web Development Using Virtualization
https://web.archive.org/web/20090207084158/http://aspnet.4guysfromrolla.com:80/articles/102908-1.aspx
Virtualization is, in essence, running multiple miniature (virtual) PCs inside your primary PC. One of its great benefits is that it lets you isolate and test an application or set of applications in an environment free of interference from anything else. It used to be that getting a new machine with a new development environment meant acquiring another piece of hardware, or rebuilding your system for the new environment. With virtualization, you simply install the environment you need into one of the virtual machines and run it as necessary. When you're done, you can shut it down.
Virtualization is the ultimate in isolation -- it allows you to do things on one piece of hardware that are simply not possible without it. For instance, suppose you need to install software in a test environment on a member server because it won't run on a domain controller. You simply fire up two virtual machines at the same time -- one the domain controller, the other the member server. Both virtual machines can run on the same physical hardware at the same time without either being aware that it is sharing a machine. The result is a quick way to stand up testing environments.
Virtualization technology also allows virtual systems to be frozen in place. In other words, the exact state of the machine can be preserved for an indefinite period of time. If you work on a project until it's released and stable and need to come back and start working on it again, you can freeze the system when you stop working on the project and then restart it a year -- or more -- later. When the system is restarted, it will be as if no time had passed: the system is restored exactly as it was left.
This particular feature is great for developers who support multiple systems, including consultants with different clients and different projects to support over time. You don't have to worry about recreating an environment to test a bug fix; you simply thaw out your virtual machine and go.
Virtualization programs also offer a feature called undo disks. Undo disks let you operate on the system and, if you decide you don't want to keep your work, simply decline to accept the changes held in the undo disks. Poof -- like magic, everything you did is undone, as if it never happened.
Related
I've had this setup, but it didn't seem quite right.
How would you improve Content Delivery (CD) development across multiple .NET (customer) development teams?
CMS Server -> Presentation Server Environments
CMS Production -> Live and Preview websites
CMS Combined Test + Acceptance (internally called "Staging") -> Live ("Staging")
CMS Development (DEV) -> Live (Dev website) and sometimes Developer local machines (laptops)
Expectations and restrictions:
Multiple teams and multiple websites
Single DEV CMS license (typical for customers, I believe?)
Enough CD licenses for each developer
Preferably, a developer could program and run changes locally--was this a reasonable expectation?
Worked
We developed ASP.NET pages using the Content Delivery API against the same broker database for local machines and CD DEV. Local machines had the CD DLLs and their own license files, and ran and debugged fine with queries and component presentation calls.
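For illustration, here is a minimal sketch of the kind of local component presentation call this describes, written against the 2009-era Tridion .NET Content Delivery API as I recall it -- the class and method names, and all of the numeric IDs (publication 5, component 123, template 456), are assumptions to check against your version's documentation:

    // Hypothetical IDs throughout; the factory reads whichever broker
    // database the local CD configuration points at -- in our case,
    // the shared DEV broker database.
    using System;
    using Tridion.ContentDelivery.DynamicContent;

    class BrokerDemo
    {
        static void Main()
        {
            ComponentPresentationFactory factory = new ComponentPresentationFactory(5);
            ComponentPresentation cp = factory.GetComponentPresentation(123, 456);
            if (cp != null)
            {
                Console.WriteLine(cp.Content);
            }
        }
    }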
Bad
We occasionally published to both the Dev presentation server and developer machines, which seems wrong in hindsight, but I think it was to get schema files onto our local machines. And yes, we didn't trust the Dev broker database.
Problematic:
Local machines sometimes needed Tridion-published pages, but we couldn't reliably publish to local machines:
Setting multiple publication destinations for a single "Local Machine" publication target wouldn't work--we'd often take these "servers" home.
The VPN blocked access to laptops offsite (we used the "incoming" folder at the time).
Managing publication targets for each developer and setting up CD on each new laptop was good practice (as in exercise, not necessarily a good idea), but just a little tedious.
Would these hindsight approaches apply?
Synchronize physical files from Dev to local machines ourselves?
Don't run presentation sites locally (localhost), but rather build, upload the DLL, and test from Dev?
Were we simply missing a fourth CMS environment? As much as we liked our sales guy, we weren't interested in purchasing another CM license.
How could you better setup .NET CD for several developers in an organization?
Edit: @DominicCronin pointed out that this is only a subset of a proper DTAP setup. I've updated my terms and created a separate question to clarify DTAP with Tridion.
The answer to this one depends heavily on the publish model you choose.
When using a dynamic model with a framework like DD4T, a single dev environment will suffice. There is one CMS and one CD server in that environment, and everything is published to a broker database. The CD environment can double as an auto-build system; the developers work purely locally on a localhost website (which gets its data from the dev broker database), and their changes are checked into a VCS (from which the auto-build can be done).
This solution can get by with only a single CMS because hardly any code is developed on the CMS side (templates are standardized and all work is done on the CD side).
It gets more complex if you are using a static or broker publishing model. Then I think the solution is indeed to split Dev up into Unit-Dev and Dev, as indicated by Nuno and Chris.
This solution requires coding on both the CMS and CD sides, so every developer benefits hugely from having his or her own local CMS and CD environment.
Talk to your Tridion account manager and agree on a license package that suits the development model you want to have. Of course, they want to maximise their income, but the various things that get counted are really meant to ensure that big customers pay accordingly, and that smaller customers get something they can afford at a price that reflects the benefits they get. In fact, setting up a well-thought-out development street with a focus on quality is the very thing that will ensure good customer satisfaction and a long-running engagement.
OK - so the account managers still have internal rules to follow, but they also have a fair amount of autonomy in coming to a sensible deal with a customer. I'm not saying this will always work, but it's far better than blindly assuming that they will insist on counting every server the same way.
On the technical side - sure, try to have local developer setups plus a common master dev server, a la Chris's fifth option. These days, your common dev environment should probably be seen as a build/integration server: the first place where the team guarantees all the tests will run.
Requirements for CM and CD development aren't very different, although you may be able to publish to multiple developer targets from one CM if there's not much CM development going on. (This is somewhat true of MVC-ish approaches, but it's no silver bullet.)
I understand that two users cannot work on the same machine at the same time, and that TestComplete interacts with the GUI the way a user would. But is there perhaps some way to solve this problem?
I don't know TestComplete, so I can't say anything about workarounds specific to this product (they may well exist!), but one option is always to set up virtual machines and run the tests in there. Some of the most popular virtualization products (they all have free editions) are VirtualBox, VMware, and Microsoft Virtual PC.
Try UI Automation (the MS UIAutomation library or the UIAutomation PowerShell module, for example); it frequently pulls the AUT to the foreground. Moreover, it has the ability to set the focus (AutomationElement.SetFocus(), Set-UIAFocus).
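For example, here is a minimal C# sketch using the MS UIAutomation library mentioned above -- the window title "Untitled - Notepad" is just a stand-in for your application under test:

    // Requires references to UIAutomationClient and UIAutomationTypes.
    using System;
    using System.Windows.Automation;

    class FocusDemo
    {
        static void Main()
        {
            // Find a top-level window by its title.
            AutomationElement window = AutomationElement.RootElement.FindFirst(
                TreeScope.Children,
                new PropertyCondition(AutomationElement.NameProperty, "Untitled - Notepad"));

            if (window == null)
            {
                Console.WriteLine("Window not found.");
                return;
            }

            // SetFocus gives the element keyboard focus and typically brings
            // its window to the foreground, much like a user clicking on it.
            window.SetFocus();
        }
    }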
Currently our team (web devs, one designer, and one copywriter) all work on separate workstations but make our changes in the same dev environment (we all mount the same shared drive). It's a marketing site rather than a web application, so there are no builds or deployments; we just push changes to the live site once they're done. Still, I think it's important for us to keep versions of files, especially server-side code, even if it makes up only a tiny percentage of our content (mostly static pages).
I'd like to use version control for our setup, but I'm not sure whether SVN or Git will play along with more than one person checking in and out from the same dev environment. I have experience with SVN, CVS, Git, Perforce, and PVCS, but I have always worked with individual dev environments.
I'd like a solution that doesn't require us to run separate dev environments as we lack the infrastructure.
Most, if not all, version control implementations are intended to be used by more than one person. Both Subversion and Git will happily do everything you require, but you might find Subversion easier to get up and running quickly. If you intend to host it in a Windows environment, take a look at VisualSVN Server.
The major difference between Git and SVN is that Git is a distributed system whereas SVN relies on a central repository. In software development there are good arguments for using a distributed system, but the needs you describe would be easily served by the much simpler SVN implementation (I think).
Another good reason for using SVN (under Windows) is that the TortoiseSVN client is one of the best examples of a user interface to any version control system. It is extremely easy to learn, well documented, and well supported by the open-source community.
It may also be worth investigating the various providers of hosted version control systems if you don't want the overhead of maintaining your own source control servers.
You don't need lots of server hardware. Why can't each dev/designer/etc. run the site on their own computer? You do have at least one computer each? :)
Git or SVN in that case is just a matter of taste.
The way we used to do this at an old workplace was to give each user an account, and to give the "project" an account as well. We would each check out a local working copy; the project (in your case, the tools that people in your business use, I guess) would check out a copy too. Once the devs were all satisfied with a certain build, we would issue an update to the project's working copy.
In your setup you say that you all have the same dev environment. Do you mean you each have your own PC on the network, or are you sharing a PC? Either way, you should each be able to have your own SVN accounts and local working directories, so this would not be a problem.
Depending upon the amount of data you are talking about, an option to consider would be Dropbox ("secure backup, sync, and file sharing made easy"). It supports versioning, and lets you share folders.
It also has the benefit of providing offsite backup of, and remote web access to, your data.
With SVN, you can have each commit automatically trigger an update of the live site. Having everyone work in the same directory is still somewhat awkward (as it always was, anyway ...), but old habits die hard, and assuming people insist on it, SVN does expose its repository via WebDAV, so it should be possible to mount it as a network filesystem on the desktop machines.
"Currently our team (web devs, one designer and one copywriter) all work on separate workstations but do our changes on the same dev environment (we all mount the same shared drive)" Adding SVN would do away with the need for the shared drive. You would each work in a local directory that is checked in to SVN. Tortoise would be a perfect client for you.
"it's a marketing site and not a web application, so no builds or deployments, we just push changes to the live site once they are done" A simple batch job (ANT script or other) could be written and given to each of you. Once the files are ready for deployment simply execute the batch and have it check out the latest files from SVN and copy to your web server.
"doesn't require us to run separate dev environments as we lack the infrastructure"
That doesn't make a lot of sense.
Presumably, each of you has a separate workstation. And your workstations are separate from your web server. Just guessing, but that's typical.
You can -- trivially -- each have a private development copy on your workstations. You can then use SVN to synchronize your various changes.
You can tag a version as "good to go".
Someone can -- when you've got everything looking right -- do an update on the web server to get the official version into production.
This doesn't require any more infrastructure than you already have in place.
With SVN you can have the devs committing to the server, and have one special user on the server checking out the latest version. So shipping your work involves committing to SVN, then logging in to the server and doing a checkout of the latest trunk.
You can use the same pattern with Git and other decentralized version control systems. The advantage of these systems is that they don't enforce the central-server pattern. Developer A could, for instance, push to B, who then pushes to the server.
I am developing a distributed file system using Java; I cannot give many details at this moment. I need to test some things on Linux, so I will use VMware Server and install Linux inside a virtual machine. Is there any difference between the emulated network card and a real Ethernet interface?
I am developing a distributed file system ... I will use VMware Server and install Linux inside a virtual machine.
VMware is great for this sort of thing. There should be no difference except, as RichieHindle said, in performance, especially if you're planning to run multiple VMs on the same server.
Use real hardware if you want usable performance benchmark results.
Java is its own 'VM'... on top of a layer of virtualization in the guest OS... on top of VMware... on a CPU with a virtual execution model. Take a little virtualization here, add a little virtualization there, and pretty soon we're talking about some real abstraction!
From the point of view of application code, no, there's no difference.
The only visible difference might be in performance - the speed of response and exact timings of things might be different, but you're talking microseconds (see the measurement sketch below).
There's so much general-purpose software that works flawlessly under VMs that the answer to almost every question of the form "Are VMs different from real machines?" at the application level is "No".
(Things might be different if you were talking about kernel-level driver software.)
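If you want to put a number on those timings yourself, here is a rough, self-contained C# probe (the question concerns Java, but the measurement translates directly; port 9050 and the iteration count are arbitrary choices). Run it once on real hardware and once inside the VM and compare:

    using System;
    using System.Diagnostics;
    using System.Net;
    using System.Net.Sockets;
    using System.Threading;

    class LatencyProbe
    {
        static void Main()
        {
            TcpListener listener = new TcpListener(IPAddress.Loopback, 9050);
            listener.Start();

            // Echo server on a background thread: write back every byte read.
            Thread server = new Thread(delegate()
            {
                using (TcpClient peer = listener.AcceptTcpClient())
                {
                    NetworkStream s = peer.GetStream();
                    int b;
                    while ((b = s.ReadByte()) != -1)
                        s.WriteByte((byte)b);
                }
            });
            server.IsBackground = true;
            server.Start();

            using (TcpClient client = new TcpClient())
            {
                client.Connect(IPAddress.Loopback, 9050);
                client.NoDelay = true; // send each byte immediately
                NetworkStream stream = client.GetStream();

                const int rounds = 10000;
                Stopwatch sw = Stopwatch.StartNew();
                for (int i = 0; i < rounds; i++)
                {
                    stream.WriteByte(42);
                    stream.ReadByte();
                }
                sw.Stop();
                Console.WriteLine("Average round trip: {0:F1} microseconds",
                    sw.Elapsed.TotalMilliseconds * 1000.0 / rounds);
            }
        }
    }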
It's one of those things I see a lot but never really think about. Do you think that, for the purpose of web application development (specifically ASP.NET WebForms/MVC), it's advantageous to do such a thing, and if so, what kind of advantages come out of it?
By virtualization I mean using products like Hyper-V to separate server contexts, such as your SQL Server and web server.
The first question is: virtualization of what? Do you mean server virtualization? Running VMware on each dev's laptop with multiple OSes? Moving everything to the cloud?
Virtualization of servers, in a web app context, is not really different from that in general IT - most of the servers on the Internet, including Stack Overflow's, are bought to handle peak loads and spend most of their time idling away the cycles, so virtualizing them makes sense once you have more than a certain number.
VMware on the desktop (or comparable products on other operating systems) is superb because your devs can run a full instance of your server environment, including multiple virtual servers connected in a virtual network - this is about as close to the real thing as you can get, minus hardware costs and minus devs messing with each other's servers. For clients, you can use Linux and multiple Windows installs to test various browsers, font sizes, etc. quickly - also a big win.
Moving everything to the cloud makes sense in many cases, but is probably a topic for a separate full-sized question :)
One big advantage I see is that every developer can have his or her own sandbox to work in. If someone messes up their sandbox, they can take a clean image and all is OK again. That means there is room to experiment without losing valuable time getting back to the normal setup; you can simply do a rollback.
I have some doubts about whether you should use virtualisation for production environments, though. It depends on the application, of course.
The only time I would use a virtual machine for ASP.NET development is if the app required a specific setup, such as relying on installed software, weird settings, or particular shares. Every developer has their own web server and can run their own database, so if it's a "basic" web app I don't see much value in virtual machines... it's pretty hard to break anything with a basic web app deployment :)
With a virtual server, you can test your code in a production-like environment, and it is also possible to quickly revert to the original setup. For many applications, this is most useful in the period just after you write the code but before it goes to production.
I'm a fan of virtualization and use it in testing and production (VMware and Hyper-V), but over the last year I've found it less important on a dev machine. TFS provides me with all the backup/rollback ability I need, multiple versions of .NET can now coexist on the same machine, and VS2008 can target all those versions.
In a development environment, a virtual environment is useful for putting several different servers on one box: you can have an instance for your web app, one for your services, one for your database, and so on. That way it mimics your production environment if you use separate servers.
One of the benefits of using virtualization in production is that your application is not tied to a specific machine. If you wanted to move your web server instance to another box, it is trivial to do so. You don't need to install or configure things on the new server and hope that everything is set up properly.
One problem I have had in testing virtual instances, though, is that they can run slower for some applications, specifically engineering apps that like to run the CPU at 100%. So test before you leap.