OK, I have looked through this site and just can't seem to find the answer I'm looking for.
We have a multi-server setup for our web sites, typically DEV/QA/PRODUCTION. Our sites are .NET, so they require some sort of build before being deployed. We are using SVN for source control and are looking for a tool/website/something that will allow our project managers to push-button deploy changes to the different environments. It seems there is no silver bullet for this; am I correct in that assumption?
I like the functionality of Springloops but can't find any information regarding its use with .NET. Sparing the details, it would be nice to see a diagram of sorts of the whole end-to-end process. What I mean by that is: if multiple different tools are the only answer (no silver bullet), then a diagram that shows where the tools sit in relation to the whole process.
Look at CruiseControl.NET. It'll let you automate your builds, and if there are errors it can notify certain people and even flash a big red light in the office.
The other thing it can do is automatically deploy to a chosen environment such as Dev/Test/Stage/Prod when there are no errors.
Though you may want to make Prod a manual press of a button. :)
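For the deploy step, CC.Net just runs whatever script you point it at as a post-build task, so deployment can be as simple as a file mirror. A minimal sketch of such a script, assuming a plain file-copy deployment; all paths and server names here are made up:

    @echo off
    rem deploy.bat -- mirror the build output to a target environment
    rem usage: deploy.bat \\qa-server\wwwroot\mysite
    set TARGET=%1
    if "%TARGET%"=="" echo usage: deploy.bat target-path & exit /b 1

    rem /MIR mirrors the tree (copies new/changed files, removes stale ones)
    robocopy "C:\Builds\MySite\Output" "%TARGET%" /MIR /R:2 /W:5

    rem robocopy exit codes 8 and above indicate failure
    if %ERRORLEVEL% GEQ 8 exit /b %ERRORLEVEL%
    exit /b 0

Prod would then be the same script, just triggered by a person pressing the button instead of by the build.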
The new one-click publishing features in Visual Studio 2010 are pretty easy to set up and use: http://vishaljoshi.blogspot.com/2009/05/web-1-click-publish-with-vs-2010.html
Supports configuring IIS and deploying SQL packages too.
There is an msdeploy.exe included with the Web Deploy tooling that you could set up to run with .bat files or some configurable script runner.
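A hedged sketch of such a .bat file (the install path varies by Web Deploy version, and the site and server names here are placeholders):

    @echo off
    rem push the local build output to a remote IIS site over Web Deploy
    "%ProgramFiles%\IIS\Microsoft Web Deploy\msdeploy.exe" ^
        -verb:sync ^
        -source:contentPath="C:\Builds\MySite" ^
        -dest:contentPath="Default Web Site/MySite",computerName=QA-SERVER ^
        -whatif

    rem -whatif only previews the changes; remove it to deploy for real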
Try ClickOnce deployment. Once an update is deployed on the server, the client applications automatically get updated on every client at their next run.
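Note that ClickOnce targets client (WinForms/WPF) applications rather than web sites, which is worth keeping in mind here. For completeness, a hedged sketch of pushing an update from the command line with mage.exe, the manifest tool that ships with the SDK (file names, version, and certificate are all hypothetical; most people just use Visual Studio's Publish wizard instead):

    rem regenerate the application manifest for the new version's files
    mage -Update MyApp.exe.manifest -FromDirectory 1.0.0.2 -Version 1.0.0.2

    rem point the deployment manifest at the new application manifest
    mage -Update MyApp.application -AppManifest MyApp.exe.manifest -Version 1.0.0.2

    rem sign the deployment manifest, then copy everything to the server;
    rem clients pick the update up automatically on their next run
    mage -Sign MyApp.application -CertFile mycert.pfx -Password secret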
I am a C# WinForms programmer, not used to ASP.NET.
As a WinForms programmer, I build regularly to detect syntax errors.
Recently I opened a Kentico website in Visual Studio and to my surprise found that there were build errors.
Does this matter?
My instinct is to go about correcting the site until it builds. This is a side track from what I set out to do.
If you are attempting to build any kind of quality into your project/software, then yes, it does matter if it builds.
Regarding Kentico and build times: if you're using a website rather than a web project, build times are typically longer, ranging anywhere from a few minutes to, in my experience, upwards of an hour. The build times also depend greatly on the machine doing the building. A machine with a Celeron processor, 1GB of RAM, and a 5400 RPM drive will take much longer to build than a machine with an i7 processor, 16GB of RAM, and a solid-state drive that can read/write 500+ MB/s. Also keep in mind that Kentico out of the box has over 9,000 system files in it, so as a website it will take some time to build.
One of the first things I check when a site doesn't build is whether all the referenced DLLs are in the website/project. If they're not, that causes several errors and is usually a very simple fix. If you have any errors in code that resides in the /App_Code directory, your site will NOT run at all when you publish it. If you have errors in any other directory, the site will run, BUT wherever those code files are referenced on the website, the pages will display errors. So in your instance, if you have web part files in the /CMSWebparts/OurCompany folder and those web parts are placed on pages within the website, those pages will error out even though the rest of the site is running.
In my opinion, just fix the errors and be done with them. Then check the code into a version control system to keep track of the changes.
Does this matter?
It depends on what you are trying to achieve with your website. If you want to make it available to the public, then building is definitely something you should treat as a top priority. If, on the other hand, you just want to have the source code open in Visual Studio on your local machine for reading purposes, then building is not necessary.
I have a problem debugging a web forms application that is configured to use IIS for debugging, under Windows 7 and Visual Studio 2010. An example has just occurred: I make a change to the code-behind for a web form, save, and apparently rebuild before starting the app using F5.
The app starts, and I get an error message when trying to do something in the app. I tell the debugger to break when an exception is thrown and try my task again, only to be told
The source file is different from when the module was built.
where the module is C:\Windows\Microsoft.NET\Framework64\v2.0.50727\Temporary ASP.NET Files\root\9d7b45ca\11a98b19\assembly\dl3\5e6cf0b2\636409d4_dfeecb01\PerfixEMS_Admin.DLL
The physical folder for my test web site is set to the web application project's source folder, so I have always assumed that IIS will look in the bin folder for required assemblies, and these will be rebuilt as expected. Why is this not happening?
Cleaning the solution usually works for me.
Update
Given the high number (320) of projects, I understand why Clean and Build won't work for you. You should however try it at least once to see if it fixes things.
If it does fix your problem but the fix doesn't last, you'll need to do one of two things.
Clean just the one file
Delete the offending temp file. You probably won't be able to do this while VS is running, since it may have a lock on the DLL. You may also have to stop IIS. You can use Process Explorer to look for the processes that hold a lock.
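If you go the delete route, the blunt version (clearing all the shadow copies rather than hunting down the one file) looks like this from an elevated prompt; the framework path below matches the module path in the question:

    rem stop IIS so the worker process releases its locks (close VS too)
    iisreset /stop

    rem clear the shadow-copy cache that holds the stale DLL
    rd /s /q "C:\Windows\Microsoft.NET\Framework64\v2.0.50727\Temporary ASP.NET Files\root"

    iisreset /start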
Use a custom solution
It's unlikely that you're going to be modifying all 320 projects at the same time. Create a custom solution for just the projects you're working on. You'll still be able to step through any project you have the DLL and PDB for if you need to.
Which to do
Using a custom solution has its problems, since you can no longer use project references for projects not in your solution. This impacts your team's source control. You'll also have to make sure the DLLs and PDBs from outside your solution are in a stable location, and you'll need a way to detect when those other projects have changes that you care about.
These problems can be overcome with a careful check-in process for project changes, scripts that copy files, and working with team members to figure out how to communicate changes.
On the other hand, closing VS for every change or running Clean and Build isn't really tenable either.
This may be a workaround, but I just need to see whether it works or not; then we can investigate the original case further. For now, try this:
1- publish this website to a different folder
2- open the newly published version in your preferred browser (e.g. http://localhost/APP_NAME).
3- from VS, open "Debug" menu, choose "Attach to process..."
4- select the IIS worker process "w3wp.exe" and click "Attach".
(if you can't find it, make sure that the checkbox "show processes in all sessions" is checked)
5- start debugging your source code normally and let me know what happens. Thanks.
We are a MSFT shop with a far-reaching MSDN license.
After many years of doing things wrong, we finally have to start doing automated testing.
My group is the guinea pig for this. We need to create what was not there before. We looked at the multitude of options out there. Some people get by just fine with third-party alternatives such as CC.Net, Bamboo, MbUnit, etc. We want to give MSTest, Coded UI, and Team Build a good try; we might as well, given our MSDN licensing and MSFT focus.
The plus and minus of doing things the MSFT way is that MSFT makes monolithic things. You have to install various tools that play nicely with each other, but not necessarily with outsiders. The plus is that when things are done correctly, it should all function rather smoothly. There is the option of gated check-ins, of using TFS to store the reports, etc.
Frankly, I am confused by all of the options. Our traditional build system was hacked together with a bunch of Perl, batch scripts, and executables. Now the build team has switched to Team Build, which ought to be cleaner, but for the most part it is just a wrapper around the same old Perl crap.
I am inclined to hack things together for testing too, because I can at least see what the pieces are. So, I envision the poor man's version as:
* A dedicated fast computer to run tests
* Some script to copy build files (test code as well as product code) over to that computer.
* A batch/Perl script which would run mstest.exe from the command line and execute a few batches of tests, filtered by category, within some test DLLs (the product is so huge that we do want to organize tests by various categories); something like the sketch after this list.
* Some script which would invoke the previous script remotely from the build server using psexec.exe (http://technet.microsoft.com/en-us/sysinternals/bb897553), grab the XML output from a shared drive, and then send out an email with results to those who are interested.
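For what it's worth, the mstest/psexec plumbing described in the list above might look roughly like this; machine names, paths, category, and credentials are all placeholders, and the email step is left to the script runner:

    rem run-tests.bat -- runs on the test machine
    rem execute one category of tests from a test DLL, write results to a share
    mstest /testcontainer:C:\Tests\Product.Tests.dll ^
           /category:Nightly ^
           /resultsfile:\\buildshare\results\nightly.trx

    rem on the build server: kick the script off remotely
    psexec \\TEST-PC01 -u DOMAIN\testuser -p secret C:\Tests\run-tests.bat

The .trx results file is XML, so it can be parsed and mailed by whatever script runner you choose. psexec's -i switch can target an interactive logged-on session, which may matter for the Coded UI concern mentioned below.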
This can probably work, but then I have to worry about how well error handling will work with so many potential points of failure. It would be nice to configure things the "right way", taking advantage of whatever MSFT has cooked up. I am just not sure where to turn for a good guide. Have you done something like this?
Eventually we will want to have a farm of test computers, in case we run out of the allotted time. Something else of concern: for Coded UI tests to succeed, I think a user has to be logged in, so I am not sure psexec will be of much help there.
Can you share your positive/negative experience, point me to a good guide perhaps? Thanks!
Here are some tips off the top of my head if you want to get started with testing using the MS tools:
If you have an MSDN subscription, install a test rig by installing the Test Controller on your network and the Test Agent service on each of the machines that will be collecting diagnostic data. See the following link for reference: http://msdn.microsoft.com/en-us/library/dd293551.aspx.
Add a Test Project to your solution. See the first part of the following blog post: http://blogs.microsoft.co.il/blogs/eranruso/archive/2010/03/27/visual-studio-2010-coded-ui-test-user-guide-create-a-simple-coded-ui-test.aspx.
Automated test options can be configured through the .testsettings file(s) that are added automatically when you add a Test project (you can also manually add these files to your solution).
Install Team Foundation Server (2010 recommended) in order to take advantage of automating your tests with a daily build. You will also need TFS 2010 if you want to use the VS2010 Test Manager tool to define test environments and plan manual tests (these can be fully automated with Coded UI). Customize your new automated build to set up / deploy your application after the build, and set the build to run tests. Deployment will likely not be necessary for unit tests, but it will be for Web Performance and Coded UI test types.
If you have VS Ultimate or Test Professional licenses, you can also go further and set up virtual test labs using "Lab Management" features.
I have a solution with 2 projects in Visual Studio 2008 SP1, .NET Framework 3.5 SP1.
an ASP.NET web site.
a Class Library (DLL) project.
I have a reference from the web site to the class library, as the class library is my data layer. But anyway, the thing happens even with just this basic setup: a solution with these two types of projects and a reference from the web site to the class library.
Now, each time I modify something in the Class Library and I build it, Visual Studio creates a file called app_offline.htm and then deletes it (it sends it to the Recycle Bin).
This is really annoying because at the end of the day I end up with a full Recycle Bin, and, being the perfectionist I am, I want to keep it clean. I'm not the only one with this problem: here and here.
I know now the cause of the problem, but still not how to fix it. If you haven't heard about app_offline.htm before, see ScottGu's article on it.
Does anyone know a solution to the problem? Some setting in VS to delete the file forever after the Build process? (I really don't want to set my Recycle Bin to do that, as I do delete things unintentionally from time to time and I'd like to be able to recover those.)
This file does not go into the Recycle Bin for me. Perhaps you have some draconian utilities installed that do this? Many anti-virus tools and general system-utility suites used to do this back in 2000, but I do not have experience with later versions.
Update: You can use Process Monitor to find out which process moves this file to the recycle bin.
[Disclaimer: I'm adding an answer firstly because I hope it will get the question seen by more people (I admit it) and secondly because I have no character limit in an answer, as opposed to a comment.]
I followed Sander's suggestion and used Process Monitor to track which process moves this file to the Recycle Bin.
It was indeed devenv.exe.
There are several events where it performs operations like QueryDirectory, QueryOpen, CreateFile, and CloseFile, and devenv.exe is the only process that has anything to do with app_offline.htm.
Still... how can I make Visual Studio stop filling up my Recycle Bin? (Way to go, Dan, putting a question in the 'answer'. (: )
I started seeing the same problem shortly after we suffered a VSTS server outage. The VSTS server went down for a day, so I had to open the solution in offline mode. After the VSTS server came back online, I had to reopen the solution under source control, and the app_offline.htm files started occurring non-stop every time I recompile my web projects.
THIS IS REALLY ANNOYING!
I am not sure how to stop it yet, but I know how to reliably recreate the problem in my environment:
Windows XP Pro, VS2008, SourceGear (Source Control System).
Whenever I perform a checkout, the app_offline.htm file is instantly created in, and deleted from, the root folder. The source control system is using SQL Enterprise, so I am not sure it is related to the references some posts make to SQL Express.
Again, I still don't know how to stop it, but maybe this will help others figure out how/when the file is generated and deleted.
Use Web Application projects, not the Web Site templates, those are for 'dummies'. :)
I had this problem because I published directly to an Azure web service from the dev machine.
The answer is here, with another possible workaround here.
This is all I could find on the subject. Unfortunately it's also speculation.
http://petermcg.wordpress.com/2008/05/12/silverlight-app-offline/
I have a Windows 2003 Server with IIS, I installed VisualSVN Server on it.
I have two developers, who are going to use TortoiseSVN.
Since this is my first time ever setting up an SVN server, I am kind of confused about how this will all work. The way I see it, each developer would have a copy of the repository on his or her local PC; would each person be required to have IIS installed on their PC as well, to test their copies before checking in?
Should I create a testing folder on the server and then a production-ready one? It seems as if that would cause more issues with copies.
What would you do?
EDIT
I don't know what I was thinking; I forgot that VS has a built-in web server when you debug, so the issue about setting up IIS on either client or server is now a non-issue. But I am confused: I imported the site into the repo, and it said it was on revision 2, but I don't see any of the files in the repo folder. Do I create a virtual folder in IIS pointing to the repo that I created?
No, each developer uses your repository and checks out their own working copy to do their work. They do not need IIS or an SVN server installed on their systems, just an SVN client such as TortoiseSVN.
I recommend reading up on the Subversion FAQ.
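In command-line terms (TortoiseSVN drives the same operations through Explorer menus), each developer's cycle looks like this; the repository URL and local path are placeholders:

    rem one-time: create a private working copy on the developer's PC
    svn checkout http://yourserver/svn/yoursite/trunk C:\dev\yoursite

    rem day to day: pull in everyone else's changes, work, then commit
    cd /d C:\dev\yoursite
    svn update
    svn commit -m "describe what changed"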
Your devs don't have a local repository; they have a Working Copy on their PC. Typically, this is the most recent version of the app plus whatever changes the developers have made but not yet committed.
As this is a web app, your developers will need some kind of web server locally to test it. This could be IIS, or Visual Studio's built-in web server (although the latter does behave differently from IIS in subtle ways).
You said in a comment: "My problem is I dont want the devs to commit to the live site in case there was a bug.".
The devs commit to the SVN repository on the server; at some point you will want to export (aka 'publish') a copy of the latest version in your repository to your live site. To make sure this works, you can check out a specific version from the server, test it, and upload it only if it passes the tests. Devs will always check in code with bugs (even though it builds), because it's better to check code in frequently than to build up lots of changes locally and then commit them; with the latter, there are bound to be conflicts with work other developers have done.
Branching and tagging are useful concepts here: when you have a version which is almost right, you 'branch' it away from the main 'trunk' of the source code tree, fix any issues in the branch (back-porting fixes to the trunk as required), and when you have a working version you 'tag' it (as version x.y.z) and upload it. This way you can always refer to the particular version of the code you have uploaded, which makes it a lot easier to track down bugs that turn up in production. As others have suggested, read the SVN documentation for more info.
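In command-line terms, that branch/tag/publish cycle is just a few cheap server-side copies plus an export; the repository URL and paths here are placeholders:

    rem branch the current trunk for release stabilization
    svn copy http://yourserver/svn/yoursite/trunk http://yourserver/svn/yoursite/branches/1.0-release -m "Branch for 1.0"

    rem once it passes testing, tag the exact version you will upload
    svn copy http://yourserver/svn/yoursite/branches/1.0-release http://yourserver/svn/yoursite/tags/1.0.0 -m "Tag 1.0.0"

    rem export a clean copy (no .svn folders) to push to the live site
    svn export http://yourserver/svn/yoursite/tags/1.0.0 C:\publish\yoursite-1.0.0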
It depends on how you work. There are other discussions about folder structure and such which play directly into how you use version control.
Uh, no, no local repositories. Setting up SVN is easy... well, almost. You'll want to look for the SVN Windows installer and set it up on the server. You'll want to install Apache, and then you'll have a little hurdle setting up the httpd.conf file to expose SVN over HTTP. There's a little complexity in setting up security, so go with Windows authentication; you'll need WebDAV. Google it.
Once that's done, any SVN client can hit it, check out a copy, and work with SVN normally. If you get really stuck, comment here and I'll go get a copy of our install and config for you.
The good news is that it's rock solid, once you get it setup it'll run forever.
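Once the server pieces are installed, standing up a repository and getting an existing site into it is only a couple of commands (paths and URL are placeholders). This may also explain the "revision 2 but no files in the folder" confusion from the question: the repository folder holds a database, not plain copies of your files, and you only ever see the files through a checkout:

    rem on the server: create the repository (a database, not a plain file tree)
    svnadmin create C:\svn\repos\yoursite

    rem from any machine: import the existing site as the initial revision
    svn import C:\projects\yoursite http://yourserver/svn/yoursite/trunk -m "Initial import"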
"Pragmatic Version Control Using Subversion" and the SVN red-bean are the two sources you need to see.
Set up SVN on a single server and have all your developers point to it.
I've installed TortoiseSVN on the server and do updates/checkouts of the release website. Some people don't like checking in compiled code, but I like having the production compiled site in SVN.
If you use TortoiseSVN on the server, do the initial checkout into the Inetpub/website directory; then on rollouts you just need to update the directory using TortoiseSVN -> Update.
Of course, deploying straight from a check-in without first rolling out and testing on staging servers is considered bad practice, but it depends on your team size.
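In command-line terms (TortoiseSVN -> Update does the same thing), the rollout then reduces to this; the URL and path are placeholders:

    rem one-time: check the site out into the web root
    svn checkout http://yourserver/svn/yoursite/trunk C:\Inetpub\wwwroot\yoursite

    rem each rollout: bring the server up to the latest committed version
    svn update C:\Inetpub\wwwroot\yoursite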
I have used the following resources for learning SVN:
http://www.polymorphicpodcast.com/shows/subversion/
http://www.dimecasts.net/Casts/ByTag/SVN
I found both quite good, and learning by watching can be easier, especially for getting started.
No: your central server maintains the repository. Your developers check out working copies, make changes, and then commit them back to your repository.
You actually have quite a few things to figure out if you want to do a successful deployment of subversion.
One really good article about setting up Subversion on Windows: https://blog.codinghorror.com/setting-up-subversion-on-windows/
No. The SVN server only needs to be installed on a single computer. Each developer points at this computer and gets, locally, a full or partial working copy of the repository.
You may also want to buy the O'Reilly book about Subversion. I don't remember the title, sorry, but it helped me a lot.
All the best! Sylvain.