Build Process in TFS 2010

I have read articles on build automation and it looks simple, but I am really not sure about parameterized builds. I believe there must be an XML file for that.
When we say a build is automated, I take it to mean that our code/binaries end up in the test environment, and that all application-related settings are also configured, with just a few clicks to build and push.
What are the required tools? What is MSBuild?
Please shed some light on this.

MSBuild is an executable that you run from the command line, passing it the project file (.csproj), which, as you said, is an XML file containing all the build instructions you have configured.
I created a series of videos that describe how to create simple MSBuild tasks, how to organize them, and so on; for more info, follow this link:
MSBuild Tutorial
MSBuild is an exe.
When you run MSBuild from the command line, you pass it the project file.
You will need to unload the project in Visual Studio before you can edit the .csproj (project) file.
The .csproj (project) file is where the build instructions live.
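To make the parameterized part concrete, here is a minimal sketch of calling MSBuild from a PowerShell prompt; the project name, property values, and output path are placeholders rather than anything from the original question. Properties passed with /p: override the values stored in the .csproj XML:

# Minimal sketch of a parameterized MSBuild invocation (MyApp.csproj and the
# paths are assumptions). The .NET 4.0 MSBuild path matches the TFS 2010 era.
& "C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" MyApp.csproj `
    /t:Rebuild `
    "/p:Configuration=Release" `
    "/p:OutputPath=C:\drops\MyApp"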

You are asking about build automation, and your tags mention TFS 2010. If so, you only need a cursory understanding of MSBuild to get started: it is what eventually calls the compiler, but in all honesty what you need is the layer above it, namely the build templates and definitions, along with how to set up the build agents and controller.
Here is a good overview document by Martin Woodward; it should give you enough to figure things out, or to ask more specific questions.

Related

Creating a build and publish script for a .NET application using PowerShell

I am new to PowerShell and am trying to write a PowerShell script to build and publish a C#/.NET web application. I have tried the code below to generate the build:
$projectFileAbsPath = .csproj path
$msbuild $projectFileAbsPath /t:rebuild /p:PlatformTarget=x86 /fl /p:outputpath=C:\test
By executing the above line I only get the project's own DLL in the destination folder; supporting/referenced DLLs and related artifacts are missing.
Please help me understand where I am going wrong and what additional steps I need to take. Also, do I need a build configuration file? If yes, please let me know if there is a sample posted on the internet.
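One common approach, sketched below with placeholder paths, is to build the solution rather than the individual project so that referenced projects are built and their DLLs copied along with the web project's output, collecting everything with OutDir:

# A sketch only; the solution name and paths are assumptions.
# Building the .sln (instead of a single .csproj) lets MSBuild build the
# referenced projects too, and OutDir gathers all of the output in one folder
# (note the trailing backslash, which older MSBuild versions expect).
$msbuild  = "C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
$solution = "C:\Source\MyWebApp\MyWebApp.sln"
& $msbuild $solution /t:Rebuild "/p:Configuration=Release" "/p:OutDir=C:\test\"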

Visual Studio Team Services Compile LESS scripts during build

I'm deploying an ASP.NET Web Application to an Azure Website using VSTS's Continuous Integration. Everything works great except compiling LESS files.
I looked through the build steps and I couldn't find anything specific to LESS. Is there any documentation on how to do this?
It's pretty easy, actually. You just have to set the file's Build Action property to "Content" and everything should be good to go.
If that doesn't do the trick, I found this blog post detailing another method to try (note that I haven't tried this technique myself yet):
In Visual Studio, open the properties of your web project, go to the "Build Events" section, and then in the "Post-build event command line" box, insert the following line:
$(SolutionDir)\packages\dotless.1.1.0\Tools\dotless.Compiler.exe -m "$(ProjectDir)\content\*.less" "$(ProjectDir)\content\*.css"
Every time the project builds, this command will compile any .less file in the \content folder into a corresponding .css file, minifying it as well (with the -m switch).
Here is the post that contained this information:
http://tech-journals.com/jonow/2011/05/13/using-less-css-with-asp-net
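If you want to verify that the compiler behaves before wiring it into the build, the same command can be run by hand from PowerShell; the folder names and the dotless package version below are assumptions you would adjust:

# Sketch: run the dotless compiler manually against the \content folder.
$solutionDir = "C:\Source\MyWebApp"             # assumption: your solution folder
$projectDir  = "$solutionDir\MyWebApp.Web"      # assumption: your web project folder
& "$solutionDir\packages\dotless.1.1.0\Tools\dotless.Compiler.exe" -m `
    "$projectDir\content\*.less" "$projectDir\content\*.css"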

OpenCover in localhost with C# MVC 4 app

I just want to try OpenCover to get coverage statistics for my app, but I don't understand how to use it. So here are my questions:
Do the DLLs have to be in the same directory? (My solution has several projects.)
Is there an example of getting coverage using OpenCover?
Is it necessary to run the site in IIS Express, or is the ASP.NET development server OK?
Thanks a lot!
I set up OpenCover as an external tool to make it easier on me.
Download the exe and drop it in a folder with as short a path as possible, then set up an external tool as follows:
Title : Open Cover {this is your choice}
Command: {your path to opencover}\OpenCover.Console.exe
Arguments: -register:user -target:"C:\Progra~1\Microsoft Visual Studio 11.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe" -targetargs:"$(TargetName)$(TargetExt)" -output:coverage.xml -targetdir:"$(ProjectDir)\bin\debug"
Initial directory: $(TargetDir)
Set it to use the output window and close on exit. You will need to adjust the test runner program to suit you; I use VS 2012, and if you do too, that will make it easier on you.
To use it, click your test project in Solution Explorer, then click the Open Cover external tool and it will generate the coverage report for you. I use it with ReportGenerator.
Set it up as an external tool too:
Title: Report Generator
Command: {Your path to report generator}\ReportGenerator.exe
Arguments: $(TargetDir)coverage.xml $(TargetDir)\coverageResults
Again, set it to close on exit and use the output window.
After generating your coverage report, you can then use ReportGenerator to create a nice-looking HTML version that you can click through to see the stats.
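For reference, roughly the same pair of invocations can be run straight from a PowerShell prompt instead of Visual Studio external tools; every path and assembly name below is an assumption you would adjust to your own machine:

# Sketch: run OpenCover around vstest.console.exe, then turn the XML into HTML.
$openCover = "C:\Tools\OpenCover\OpenCover.Console.exe"
$vstest    = "C:\Progra~1\Microsoft Visual Studio 11.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe"
$testDir   = "C:\Source\MyApp\MyApp.Tests\bin\Debug"

& $openCover -register:user "-target:$vstest" "-targetargs:MyApp.Tests.dll" `
    "-targetdir:$testDir" "-output:$testDir\coverage.xml"

& "C:\Tools\ReportGenerator\ReportGenerator.exe" "$testDir\coverage.xml" "$testDir\coverageResults"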
hth
The docs that are installed alongside OpenCover carry a lot of useful information about running it. You should have a copy of this file https://github.com/sawilde/opencover/blob/master/main/OpenCover.Documentation/Usage.pdf in your download package (MSI/ZIP/NuGet).
The DLLs do not all need to be in the same directory, but you will normally find that this happens anyway as a result of the build process. Any assembly you want to gather coverage from needs its PDB to be either in the same directory as the assembly or in the folder referenced by the -targetdir switch.
Yes, you can use it to run IIS Express or the ASP.NET development server; point the -target switch at whichever one you want to launch.
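For example, here is a hedged sketch of gathering coverage while the site runs under IIS Express (all paths and the port are assumptions); browse the site, then stop IIS Express and OpenCover writes coverage.xml:

# Sketch: coverage for a site hosted in IIS Express; PDBs are expected in \bin.
& "C:\Tools\OpenCover\OpenCover.Console.exe" -register:user `
    "-target:C:\Program Files (x86)\IIS Express\iisexpress.exe" `
    "-targetargs:/path:C:\Source\MyMvcApp /port:8080" `
    "-targetdir:C:\Source\MyMvcApp\bin" `
    -output:coverage.xml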

Can't msdeploy web package to temporary folder?

We've got an ASP.NET web application we're trying to get pseudo-deployed to a folder, and I'm starting to think Microsoft are crazy -- why is it so hard to get a WAP to do a "Local FileSystem" deploy as part of MSBuild?
I can build with this:
msbuild .\SubSite.csproj "/p:Platform=AnyCPU;Configuration=Release" /t:Package
And get a nice package.zip which I can deploy to a website...
However, I have two projects in my solution which I need to combine before I ship them, so I want to deploy both packages into a folder and then re-package that folder. Despite the documentation on TechNet to the contrary, it doesn't seem to be possible to do:
msdeploy -verb:sync -source:package=.\SubSite.zip -dest:contentPath=.\Www\SubSite
Because you get the error:
Source (sitemanifest) and destination (contentPath) are not compatible for the given operation.
Does anyone have a suggestion for how I could web-deploy two sites inside one another without manually copying files out of a "PackageTmp" folder? I'm aware that I could just skip zipping the package and copy the files out manually, but I'm not happy about having to create a custom target, and there seems to be no other way, just to do something that should be built in.
We did eventually figure out how to do this, but I'm not really happy about it ;-)
Basically (as mentioned elsewhere on SO), you can call msbuild with the target set to _WPPCopyWebApplication. You can also specify/override the WebProjectOutputDir property when you do that. Something like this (where ${name} are variables we're using):
msbuild ${SourcePath}\Www\UI\UI.csproj "/p:Platform=AnyCPU;Configuration=Release;WebProjectOutputDir=${OutputPath}\AppRoot" "/t:_WPPCopyWebApplication"
msbuild ${SourcePath}\Www\Mobile\Mobile.csproj "/p:Platform=AnyCPU;Configuration=Release;WebProjectOutputDir=${OutputPath}\AppRoot\Mobile" "/t:_WPPCopyWebApplication"
msbuild ${SourcePath}\Www\Service\WebService.csproj "/p:Platform=AnyCPU;Configuration=Release;WebProjectOutputDir=${OutputPath}\AppRoot\WebServices" "/t:_WPPCopyWebApplication"
We can then package up the whole "AppRoot" in a separate step.
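That final step can be an ordinary contentPath-to-package sync with msdeploy; Combined.zip is a placeholder name and ${OutputPath} is the same variable used above:

msdeploy -verb:sync -source:contentPath=${OutputPath}\AppRoot -dest:package=${OutputPath}\Combined.zip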

Best way to manage generated code in an automated build?

In my automated NAnt build we have a step that generates a lot of code off of the database (using SubSonic) and the code is separated into folders that match the schema name in the database. For example:
/generated-code
    /dbo
        SomeTable.cs
        OtherTable.cs
    /abc
        Customer.cs
        Order.cs
The schema names are there to isolate the generated classes that an app will need. For example, there is an ABC app that will pull in the generated code from this central folder. I'm doing that in a pre-build event, like this:
del /F /Q $(ProjectDir)Entities\generated\*.cs
copy $(ProjectDir)..\..\generated-code\abc\*.cs $(ProjectDir)Entities\generated\*.cs
So on every build, the NAnt script runs the generator, which puts all the code into a central holding place, and then it kicks off the solution build... which includes pre-build events for each of the projects that need their generated classes.
So here's the friction I'm seeing:
1) Each new app needs to set up this pre-build event. It kind of sucks to have to do this.
2) On our build server we don't generate code, so I actually have an IF $(ConfigurationName) == "Debug" before each of those commands so that it doesn't happen for release builds.
3) Sometimes the commands fail, which fails our local build. It will fail if:
- there is no generated code yet (just setting up a new project, no database yet)
- there is no existing code in the directory (first build)
Usually these are minor fixes, and we've just hacked our way to getting a new project or a new machine up and running with the build, but it's preventing me from reaching my one-click-build Nirvana.
So I'd like to hear suggestions on how to make this a bit more durable (one possible shape for this is sketched below). Maybe move the copying of the code into the application folders into the NAnt script? That seems kind of backwards to me, but I'm willing to listen to arguments for it.
OK, fire away :)
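One way to take the sharp edges off that pre-build event is to push the copy into a small PowerShell step that NAnt or the pre-build event can call. This is a sketch only; the script name, parameter names, and the Debug-only guard are assumptions drawn from the friction points above, and the point is simply that a missing source folder becomes a no-op instead of a build failure:

# copy-generated.ps1 (hypothetical helper): copies generated code into a
# project, but skips silently when there is nothing to copy yet so that a
# fresh project or a fresh machine does not fail the build.
param(
    [string]$SchemaFolder,     # e.g. ..\..\generated-code\abc
    [string]$TargetFolder,     # e.g. <ProjectDir>\Entities\generated
    [string]$Configuration
)

if ($Configuration -ne "Debug") { return }        # release/server builds: do nothing
if (-not (Test-Path $SchemaFolder)) { return }    # no generated code yet: do nothing

New-Item -ItemType Directory -Force -Path $TargetFolder | Out-Null
Remove-Item -Path (Join-Path $TargetFolder "*.cs") -Force -ErrorAction SilentlyContinue
Copy-Item -Path (Join-Path $SchemaFolder "*.cs") -Destination $TargetFolder -Force -ErrorAction SilentlyContinue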
How often does your DB schema change? Wouldn't it be possible to generate the database-related files on demand (e.g. when the schema changes) and then check them into your code repository?
If your database schema doesn't change, you can also package the compiled *.cs classes and distribute the archive to other projects.
We have two projects in our solution that are built entirely out of generated code. Basically, we run the code generator .exe as a post-build step for another project, and along with generating the code, it automates the active instance of Visual Studio to make sure that the generated project is in the solution, that it has all of the generated code files, and that they are checked out/added to TFS as necessary.
It very rarely flakes out during the VS automation stage and makes us run it "by hand", but that's usually only if you have several instances of VS open with more than one instance of the solution open and it can't figure out which one it's supposed to automate.
Our solution and process are such that the generation should always be done and correct before our auto-build gets to it, so this approach might not work for you.
Yeah, I'd like to take VS out of the equation so that a build from VS is simply compiling the code and references.
I can manage the NAnt script... I'm just wondering whether people have advice on having one NAnt script, or possibly one per project, which can push the code into the projects rather than having it pulled.
That does mean you have to opt in to generate code.

Resources