I want to figure out how to integrate TeamCity builds with Springloops deployments.
Let's say I have a Git repo called api with two branches, dev and dev.build:
api
|-- dev
|-- dev.build
I have set up TeamCity with a VCS trigger on dev to build on commit, which creates artifacts that I want to use for deployment. (In this case it's an ASP.NET website with various DLLs.)
I also have Springloops that I use right now for deployments. Ideally I would like to deploy from dev.build. Is there a way to commit build artifacts from TeamCity to the dev.build branch, and then deploy from that branch?
The basic workflow would be
Commit code to dev
TeamCity merges dev into dev.build
TeamCity builds from dev.build
TeamCity commits artifacts (DLLs) to dev.build
Springloops auto deploys from dev.build
I have read arguments against storing build binaries/artifacts in Git, but right now I do my deployments from Springloops, and ideally I could keep that setup. I know you can call git as a command-line build step, but I'm having trouble putting all the pieces together, especially how to merge dev into dev.build as a TeamCity build step and then build from dev.build.
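To make the pieces concrete, here is a minimal sketch of the merge-then-commit flow, run in a throwaway local repo so it is self-contained. The branch names come from the workflow above; the bin/ path, file names, and the final push are assumptions.

```shell
#!/bin/sh
# Demo of the merge-then-commit-artifacts flow, run in a throwaway repo so it
# is self-contained. In TeamCity, the marked section below would be the body
# of a "Command Line" build step running in the checkout directory.
set -e

repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "ci@example.com"
git config user.name "CI"
git checkout -q -b dev
echo "class App {}" > app.cs
git add app.cs
git commit -qm "commit code to dev"
git branch dev.build
echo "// more work" >> app.cs
git add app.cs
git commit -qm "more work on dev"

# --- what the TeamCity build step would run ---
git checkout -q dev.build
git merge -q dev -m "Merge dev into dev.build"
mkdir -p bin && echo "compiled output" > bin/app.dll  # stand-in for real MSBuild output
git add -f bin/                                       # -f: bin/ is usually .gitignored
git commit -qm "Add build artifacts"
# on the real build server you would now run: git push origin dev.build
```

After the push, Springloops would see the new commit on dev.build and could auto-deploy from there.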
Is this possible at all? Am I thinking about this completely wrong? Do I have other options?
Edit/Update
I've found that it will be smarter to switch to using WebDeploy to deploy TeamCity artifacts (a WebDeploy package) instead of committing build artifacts to a Git repository and deploying via Springloops. I will hopefully stop using Springloops for deployments and deploy directly to my IIS sites via WebDeploy through a TeamCity build task. This way build artifacts (the \bin folder) stay out of Git, and web.config transforms can be used instead of manually making web.config edits on production IIS sites.
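For reference, a WebDeploy publish from a command-line build step could be shaped roughly like this. The package name, server URL, site name, and credentials are all placeholders, and the script only assembles and prints the command rather than executing it.

```shell
#!/bin/sh
# Sketch of an msdeploy.exe invocation for pushing a WebDeploy package to IIS.
# Package name, server URL, site name, and credentials are all placeholders;
# the script only assembles and prints the command so the shape is visible.
set -e

PACKAGE="MySite.WebDeploy.zip"            # package produced by the TeamCity build
SERVER="https://web01:8172/msdeploy.axd"  # Web Management Service endpoint
SITE="Default Web Site/MySite"

CMD="msdeploy.exe -verb:sync \
 -source:package='$PACKAGE' \
 -dest:auto,computerName='$SERVER',userName=deploy,password=SECRET,authType=Basic \
 -setParam:name='IIS Web Application Name',value='$SITE'"

echo "$CMD"  # on a Windows build agent, this command would be executed instead
```

In TeamCity this would typically live in a "Command Line" build step that runs after the step producing the package.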
The other option I would recommend is to build a TeamCity artifact, which stores the result of your job in TeamCity's integrated lightweight artifact repository.
That way, Springloops could look for artifacts to deploy directly on a TeamCity URL (or on a TeamCity agent).
That would scale better than trying to put those artifacts in a source code repo like Git, which isn't made for that.
Regarding the TeamCity build itself, it could be composed of two jobs (one dependent on the other), with the first one using the automatic merge feature to merge dev into dev.build.
Yep, this is totally doable in TeamCity. In fact, JetBrains covers a lot of this in a blog post here.
Related
Our current CI pipeline uses TeamCity and Octopus Deploy, but I'm evaluating changing this to Azure DevOps for both the build and deployment.
The solution consists of ASP.NET code stored in GitHub, which is compiled by TeamCity and then deployed to a server along with some other packages that don't change. The directory they deploy to is wiped so that each deployment is a fresh install.
So far I've managed to create the build pipeline and am then using the IIS deployment process to deploy the build (I haven't got it to wipe the existing site yet).
What do I use for the other parts of the solution which are static though? In Octopus these are stored in the package library and have been manually uploaded. Should I be looking at Azure artifacts for this?
Also how should I go about deploying these? Should I create multiple web app deploy steps, do something different or is there a way to select multiple packages on the one web app deploy step?
Since the package library in Octopus Deploy (last time I used it) is set up as a NuGet feed, it's a pretty fair comparison. Azure Artifacts allows you to upload several package types, including NuGet, so you should be able to use it the way you used the package library.
As for the cleanup of your site, it depends a bit on the task you use during the release.
If you are using the 'IIS web app deploy' step, checking the 'Remove Additional Files at Destination' box would fit your needs.
Since you refer to your target environment as 'a machine', I'm not sure what kind of infrastructure you are using, but in the case of an IaaS setup, PowerShell remoting[1] might also be a way to go.
[1] https://www.howtogeek.com/117192/how-to-run-powershell-commands-on-remote-computers/
You can still use Octopus packing and deployment steps in your Azure DevOps pipeline by installing the extension here.
What you can do with your static content is include it in your code repo in a way that gets compiled and packed by Octopus and then deployed as a whole. However, if moving the static content from Octopus into your code is not allowed, then you could try handling the merging of the compiled solution plus static content by adding different process steps in Octopus.
The equivalent of the package library was Azure Artifacts. It can be set up as a NuGet feed; however, I had issues publishing into it, potentially due to two-factor authentication.
My eventual solution was to set up repos containing the static files each with a pipeline that just copied files and published an artifact at the end.
I have a standard way of working for building and deploying my ASP.NET applications on a build server (usually Jenkins). I have a custom build script and publish profile that builds my solution, copies some files, and publishes the web project, all with a single call to MSBuild.
I am trying to recreate this in Azure DevOps. My first obstacle seems to be that build and "publish" (or "release" in DevOps parlance) are separate steps. But it also seems like maybe I don't need to "publish"?
So
How do I control what ends up in the "artifacts" folder of the build? Is that through Azure? Can I do it with a .MSBuild file? And
Do I even need the concept of "publish" or does the release step just copy all artifacts to the destination (a VM in this case)? Can I control what gets deployed?
I am having a hard time finding any kind of basic tutorial that covers ASP.NET projects being deployed this way.
How do I control what ends up in the "artifacts" folder of the build? Is that through Azure? Can I do it with a .MSBuild file?
Yes, we could use the MSBuild Arguments parameter /p:PackageLocation=$(build.artifactstagingdirectory) to control what ends up in the "artifacts" folder.
When we use the task Visual Studio build to build the project, we could edit the MSBuild Arguments parameter to be like: /p:DeployOnBuild=true /p:PublishProfile=Test /p:PackageLocation=$(build.artifactstagingdirectory)
Then we could use Publish Build Artifacts task to publish build artifacts to Azure Pipelines, TFS, or a file share.
Do I even need the concept of "publish" or does the release step just copy all artifacts to the destination (a VM in this case)? Can I control what gets deployed?
Yes, you can think of the release step as simply copying artifacts to the destination. If you want to control what gets deployed, you can filter the artifacts with the Contents setting of the Copy Files task.
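For example, the Contents field of the Copy Files task accepts minimatch patterns like the following (a sketch; which folders to exclude depends on your project):

```text
**\*
!**\*.pdb
!**\obj\**
```

This copies everything, then excludes .pdb files and the obj folders from the artifact.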
Check the article Deploying Web Applications Using Team Build and Release Management for the details.
Hope this helps.
If you have existing scripts that work, just use those :). The new YAML builds are moving slowly away from fully featured UI tasks with lots of logic to more bare bones scripts that do what is needed.
It is a common practice though to split build from publish. That way you can put in additional checks, reviews and approvers on the release stage.
In your case it would mean:
Build stage
Use MsBuild to build & create a publish package either in the form of a folder or a zip file.
Use the Publish Pipeline Artifact task to store this folder or zip file
Release stage
Use the Download Pipeline Artifact task to restore the file to publish
Optionally use an Infrastructure as Code tool to prep the target environment.
Use msdeploy.exe or another tool to publish the package. Optionally generate and pass in a settings file to override certain settings.
This way you can have multiple release stages to generate test environments, temporary POC environments etc by simply using another set of variables and/or settings override file.
There are special tasks in Azure Pipelines that wrap around these tools and provide an easy-to-use UI and some additional logic. These can be very useful if you don't have intimate knowledge of the command-line tools. If you do know your way around them, there may not be a very strong reason to use the fancy tasks.
I'm doing a website deployment in Azure with a Bitbucket source.
When I do the deployments, I can see it always builds the source.
That is actually not required for me, because it is a Kentico 10 website (a .NET website project).
How do I avoid building during source deployment / pulling the latest from Bitbucket?
You should stop using the continuous integration process in Bitbucket and hook up your own process to do an xcopy (preferably a delta copy) to the target website folder.
Using the out-of-the-box (OOB) tools, it is not possible to deploy without a build. So you can do a few things:
FTP
Visual Studio publish
Command-line copy after a successful local build.
Another setup, using more branches in Bitbucket, could reduce the number of builds you trigger when deploying, but it will still build the solution.
You could continue to use CI but make sure you hook your environments to proper branches so they only deploy when you perform a merge into that branch.
How do I avoid building during source deployment / pulling the latest from Bitbucket?
You could check the deployment details under "DEPLOYMENT > Deployment options".
And you could leverage Kudu and check the auto-generated deploy.cmd file under D:\home\site\deployments\tools\deploy.cmd.
For your requirement, I would recommend customizing your deploy.cmd file and putting the .deployment and deploy.cmd files into your Bitbucket repository. The simple way is to download your current deployment script and modify the scripts under the Deployment section: remove the script for building your solution and leave only the kudu sync step, and change the value of the -f option from "%DEPLOYMENT_TEMP%" to "%DEPLOYMENT_SOURCE%" when invoking %KUDU_SYNC_COMMAND%. For details, follow Custom Deployment Script.
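As a sketch (a fragment of the generated file, not a runnable script), the trimmed Deployment section might end up looking like this, with the MSBuild step removed and kudu sync pointed straight at the repository source; the variable names follow the generated script as described above:

```cmd
:: Deployment
:: ----------
:Deployment
echo Handling deployment without a build step.

:: Copy the repository contents straight to the target; no build, no temp folder.
IF /I "%IN_PLACE_DEPLOYMENT%" NEQ "1" (
  call :ExecuteCmd "%KUDU_SYNC_COMMAND%" -v 50 -f "%DEPLOYMENT_SOURCE%" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.hg;.deployment;deploy.cmd"
  IF !ERRORLEVEL! NEQ 0 goto error
)
```

The -i list keeps repository metadata and the deployment scripts themselves out of the site folder.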
If you want to deploy the full content of your repo with no build or transformation at all, just set SCM_SCRIPT_GENERATOR_ARGS=--basic in the Azure App Settings. This will force the script generator to treat it as a 'basic' site, and won't do any build.
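If you prefer to set that app setting from a script, the Azure CLI can write it as well. Resource group and app name below are placeholders, and the script just prints the command rather than running it:

```shell
#!/bin/sh
# Sketch: writing the Kudu script-generator flag as an Azure App Setting
# via the Azure CLI. RG and APP are placeholders.
set -e

RG="my-resource-group"
APP="my-kentico-site"

CMD="az webapp config appsettings set -g $RG -n $APP \
 --settings SCM_SCRIPT_GENERATOR_ARGS=\"--basic\""

echo "$CMD"  # run the printed command once the names are filled in
```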
See wiki for more info.
I have done a fair amount of research for publishing and deploying web apps with VSTS and so far I have found the following:
1. Use .pubxml files and pass DeployOnBuild=true in the MSBuild arguments.
2. Copy and publish artifacts and create a separate release definition.
We are currently using the first method, but found that the build can be marked as successful while the publish fails (in our case it was a bad transform).
So we decided to look into separating the publish step and to create a release for our changes.
The issue I am running into is that the Copy Files and Publish Artifacts steps only copy DLLs, and our .js and .html/.css changes don't seem to make it to the UI. As a workaround I added these patterns to the Copy Files step, but it is taking a long time to publish. (There are no issues with the release definition.)
**\*
!$tf\**
!**\$tf\**
!**\Debug\**
!**\*.pdb
All the guides online for using Publish Artifacts and creating a separate release definition seem to say the same thing, and only push DLLs from the bin folder to the IIS server.
Here is my build definition:
So my question is, why aren't UI changes being deployed to the website when the release completes, if this isn't an issue anywhere else?
No, not really. We usually publish not only DLLs and .js/.html/.css files as build artifacts, but the whole web app.
The Visual Studio Build task is usually used with MSBuild Arguments like the following (if you use the ASP.NET build template, this is the default setting):
/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactstagingdirectory)\\"
In this way, the web app (published files) will be copied to $(build.artifactstagingdirectory) directly, so you don't need the Copy Files task.
If you need to deploy all files, you can use ** in Copy Files task.
Then when you deploy the app to IIS server in release definition, the whole web app in build artifacts will be downloaded.
What is the preferred method of deploying a web project? Just copying the DLLs after compiling in release mode and registering them? Or using NSIS to build an installer, or the MS setup project?
I usually use a Web Deployment Project per website or web application; it allows me to manage build configurations and merging, and to define pre- and post-build MSBuild tasks.
You can also take a look at the Microsoft Web Application Installer. It's really nice, but it's still in beta; however, you can try it.
This depends greatly on where your webapp is going and the experience you wish to provide.
If you deploy to a web host, it's best to use xcopy deployment and documentation. If you have a real end system, it's simpler to create an installer to do all the legwork for your customers and maybe save yourself some documentation work.
I would recommend investing in setting up a continuous integration process (CruiseControl.NET or TeamCity, etc.), as you are probably not going to deploy to your customer only once.
Having an automatic deploy at the push of a button is a Godsend. If you invest a few days you can have automatic deploy to a dev-environment every time someone checks in code (and it compiles and all tests pass), set up daily deploys to a test environment and have a button to automatically deploy it to a staging environment whenever you want.
Andreas, I am in the process of getting CC.NET. In the meantime, I am using the Web Deployment Project. Going through the setup, it creates a 'release' folder with a bin folder containing DLLs, and the .aspx files in the parent folder.
I assume I can now create an MSI file using the 'release' folder, or do I need to do something different to create an MSI which I can run on the client server?