Configuring NuGet for a Visual Studio-generated Docker build - .NET Core

Visual Studio now generates a Dockerfile for dotnet projects, and we are using it (with slight tweaks) for our continuous integration.
However, that Dockerfile has no provision for configuring NuGet. It even copies only the .csproj file from the build context before running dotnet restore, so that the restore step is not re-run on every change during development.
But our project requires some packages from an internal, password-protected repository, so I need to provide package sources and credentials to the dotnet restore command inside the image build.
What is the current best practice for injecting an (environment-specific) NuGet configuration?

This is documented here: https://github.com/dotnet/dotnet-docker/blob/main/documentation/scenarios/nuget-credentials.md.
To summarize, there are a variety of ways in which this can be done:
Use a multi-stage build to protect a nuget.config that contains hard-coded credentials. This is only recommended if you ensure that the credentials are kept out of source control and the nuget.config file is ephemeral.
Pass secrets by file with BuildKit. This is similar to the previous option but makes use of Docker build secrets to provide access to the nuget.config file (a sketch follows the list below).
Use environment variables in nuget.config. In this scenario, the nuget.config file would reference environment variables for its credential values. The environment variables would then be set by the build machine when executing a docker build.
Use the Azure Artifact Credential Provider. This is only possible if you make use of Azure Artifacts for your package feed.
No matter which option you choose, be sure that credentials are never stored within an image layer that is published.
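For the BuildKit option, a minimal sketch of the restore stage could look like the following (the secret id nugetconfig, the SDK image tag and the project file name MyApp.csproj are placeholders, not taken from the linked document):

# syntax=docker/dockerfile:1
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /src
COPY MyApp.csproj .
# The nuget.config is mounted only for this RUN step and is never written into an image layer.
RUN --mount=type=secret,id=nugetconfig \
    dotnet restore --configfile /run/secrets/nugetconfig

The build would then be invoked with something like docker build --secret id=nugetconfig,src=$HOME/.nuget/nuget.config . so the file with the credentials stays on the build machine.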

Related

Setting dotnetcli no_proxy

I'm pulling my hair out. I've got an on-premises installation of Azure DevOps and a pair of build agents. We're trying to move to .NET Core, but we have never been able to get pushing NuGet packages into a DevOps feed to work. This should be straightforward.
The whole environment is hidden behind a corporate firewall and proxy, and while the proxy config is good for nuget pull and any other activity you care to name, we cannot invoke nuget push (or dotnet push) to our internal package repository. The only error I get is a 502 (bad gateway) from tunnel.js, even though I've explicitly set the address of the DevOps server in NO_PROXY (environment variables, .proxy and .proxybypass for the DevOps agent, netsh winhttp proxy, the build agent user's internet connection settings, and the %AppData%\NuGet\NuGet.Config file). Git works, nuget restore works, build works, packaging works, but dotnet push (or nuget push) fails with this error.
Can anyone suggest any other places I might need to set a proxy bypass or no_proxy setting?
There can be many reasons why you are getting this problem; it may be related to your organization's network, to user roles and permissions, or the task may even be restricted by a policy.
But if none of the above applies in your case, then you should configure the NuGet tools to authenticate with Azure Artifacts and other NuGet repositories. If all of the Azure Artifacts feeds you use are in the same organization as your pipeline, you can use the NuGetAuthenticate task without specifying any inputs. Check the Restore and push NuGet packages within your organization document for more information.
This task must run before you use a NuGet tool to restore or push packages to an authenticated package source such as Azure Artifacts. This task installs the Azure Artifacts Credential Provider into the NuGet plugins directory if it is not already installed.
If your agent is behind a web proxy, the NuGetAuthenticate task will not set up nuget.exe, dotnet, and MSBuild to use the proxy. In that case, set the proxy for those tools yourself, either through the http_proxy (and optionally no_proxy) environment variables or via the NuGet configuration, for example:
nuget.exe config -set http_proxy=http://my.proxy.address:port
nuget.exe config -set http_proxy.user=mydomain\myUserName
Check the NuGet CLI environment variables documentation for more information.
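As a further hedged sketch (the host name devops.mycorp.local and the proxy address are placeholders), the bypass for the on-premises server could be set both as machine-level environment variables and in the agent's NuGet config:

rem Machine-level environment variables picked up by the agent and the dotnet tooling
setx HTTP_PROXY "http://my.proxy.address:port" /M
setx NO_PROXY "devops.mycorp.local" /M

rem Persist the same settings in %AppData%\NuGet\NuGet.Config (no_proxy is a supported NuGet proxy setting alongside http_proxy)
nuget.exe config -set http_proxy=http://my.proxy.address:port
nuget.exe config -set no_proxy=devops.mycorp.local

Restart the agent service afterwards so the new environment variables are picked up.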

What is the equivalent of Octopus package library in Azure Devops

Our current CI pipeline uses TeamCity and Octopus Deploy, but I'm evaluating changing this to Azure DevOps doing both the build and the deployment.
The solution we have consists of some ASP.NET code stored in GitHub which is compiled by TeamCity; that, along with some other packages which don't change, is then deployed to a server. The directory they deploy to is wiped so that it is a fresh install each time.
So far I've managed to create the build pipeline and am then using the IIS deployment process to deploy the build (I haven't got it to wipe the existing site yet).
What do I use for the other parts of the solution which are static though? In Octopus these are stored in the package library and have been manually uploaded. Should I be looking at Azure artifacts for this?
Also how should I go about deploying these? Should I create multiple web app deploy steps, do something different or is there a way to select multiple packages on the one web app deploy step?
As the package library in Octopus Deploy (the last time I used it) is set up as a NuGet feed, it's a pretty fair comparison. Azure Artifacts allows you to upload several package types, including NuGet, so you should be able to use it the way you used the package library.
As for the cleanup of your site, it depends a bit on the task you use during the release.
If you are using the 'IIS web app deploy' step, checking the 'Remove Additional Files at Destination' box would fit your needs (a sketch follows below).
As you refer to your target environment as 'a machine', I'm not sure what kind of infrastructure you are using, but in the case of an IaaS setup, PowerShell remoting [1] might also be a way to go.
[1] https://www.howtogeek.com/117192/how-to-run-powershell-commands-on-remote-computers/
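A sketch of that IIS deploy step in YAML form (the site name and package path are placeholders; the input RemoveAdditionalFilesFlag corresponds to the 'Remove Additional Files at Destination' checkbox, and the task runs in a deployment group job on the target machine):

- task: IISWebAppDeploymentOnMachineGroup@0
  inputs:
    WebSiteName: 'MySite'                                   # placeholder site name
    Package: '$(System.DefaultWorkingDirectory)/**/*.zip'   # placeholder artifact path
    RemoveAdditionalFilesFlag: true                         # wipes files that are not in the package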
You can still use Octopus packing and deployment steps in your Azure DevOps pipeline by installing the extension here.
What you can do with your static content is to include it in your code repo in a way that gets compiled and packed by Octopus and the deployed as a whole. However, if moving the static content from Octopus to your code is not allowed, then you could try handling the merging of the compiled solution plus static content via adding different process steps in Octopus.
The equivalent of the package library is Azure Artifacts. It has the functionality to be set up as a NuGet feed; however, I had issues publishing into it, potentially due to two-factor authentication.
My eventual solution was to set up repos containing the static files, each with a pipeline that just copied the files and published an artifact at the end.
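A minimal YAML sketch of such a copy-and-publish pipeline (the branch name, agent image and artifact name are placeholders):

trigger:
- main

pool:
  vmImage: 'windows-latest'

steps:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**'                                   # everything in the static-content repo
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'static-content'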

Basic Azure DevOps flow for ASP .Net

I have a standard way of working for building and deploying my ASP.NET applications on a build server (usually Jenkins). I have a custom build script and publish profile that builds my solution, copies some files, and publishes the web project, all with a single call to MSBuild.
I am trying to recreate this in Azure DevOps. My first obstacle seems to be that build and "publish" (or "release" in DevOps parlance) are separate steps. But it also seems like maybe I don't need to "publish"?
So
How do I control what ends up in the "artifacts" folder of the build? Is that through Azure? Can I do it with a .MSBuild file? And
Do I even need the concept of "publish" or does the release step just copy all artifacts to the destination (a VM in this case)? Can I control what gets deployed?
I am having a hard time finding some kind of basic tutorial that covers ASP.NET projects being deployed this way.
How do I control what ends up in the "artifacts" folder of the build? Is that through Azure? Can I do it with a .MSBuild file?
Yes, we could use the MSBuild Arguments parameter /p:PackageLocation=$(build.artifactstagingdirectory) to control what ends up in the "artifacts" folder.
When we use the task Visual Studio build to build the project, we could edit the MSBuild Arguments parameter to be like: /p:DeployOnBuild=true /p:PublishProfile=Test /p:PackageLocation=$(build.artifactstagingdirectory)
Then we could use Publish Build Artifacts task to publish build artifacts to Azure Pipelines, TFS, or a file share.
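A sketch of that build stage in YAML (VSBuild@1 and PublishBuildArtifacts@1 are the standard tasks; the publish profile name Test comes from the arguments above and would have to match a profile in your project):

- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    configuration: 'Release'
    msbuildArgs: '/p:DeployOnBuild=true /p:PublishProfile=Test /p:PackageLocation=$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'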
Do I even need the concept of "publish" or does the release step just copy all artifacts to the destination (a VM in this case)? Can I control what gets deployed?
Yes, you can simply think of the release step as copying artifacts to the destination. If you want to control what gets deployed, you can filter the artifacts when you use the copy task with its Contents setting (see the sketch below).
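A sketch of such a filtering copy step (the folder names and minimatch patterns are placeholders for whatever should reach the VM):

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)/drop'    # the downloaded build artifact
    Contents: |
      **/*.dll
      **/*.config
      !**/*.pdb
    TargetFolder: '$(System.DefaultWorkingDirectory)/deploy'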
Check the article Deploying Web Applications Using Team Build and Release Management for the details.
Hope this helps.
If you have existing scripts that work, just use those :). The new YAML builds are moving slowly away from fully featured UI tasks with lots of logic to more bare bones scripts that do what is needed.
It is a common practice though to split build from publish. That way you can put in additional checks, reviews and approvers on the release stage.
In your case it would mean:
Build stage
Use MSBuild to build and create a publish package, either in the form of a folder or a zip file.
Use the Publish Pipeline Artifact task to store this folder or zip file.
Release stage
Use the Download Pipeline Artifact task to restore the file to publish.
Optionally use an Infrastructure-as-Code tool to prepare the target environment.
Use msdeploy.exe or another tool to publish the package. Optionally generate and pass in a settings file to override certain settings (a sketch follows after this outline).
This way you can have multiple release stages to generate test environments, temporary POC environments etc by simply using another set of variables and/or settings override file.
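A hedged sketch of that msdeploy.exe publish step (the server URL, site name, package and parameter file names are placeholders; in practice the credentials would come from pipeline secrets rather than being typed out):

msdeploy.exe -verb:sync ^
  -source:package="MyApp.zip" ^
  -dest:auto,computerName="https://target-vm:8172/msdeploy.axd?site=MySite",userName="deployUser",password="%DEPLOY_PASSWORD%",authType=Basic ^
  -setParamFile:"MyApp.Test.SetParameters.xml" ^
  -allowUntrusted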
There are special tasks in Azure Pipelines that wrap around these tools and provide an easy-to-use UI and some additional logic. These can be very useful if you don't have intimate knowledge of the command-line tools. If you do know your way around them, there may not be a very strong reason to use the fancy tasks.

Azure website deployment - .Net website

I'm doing a web site deployment in Azure with a Bitbucket source.
When I do the deployments, I can see that it always builds the source.
That is actually not required in my case, because it is a Kentico 10 web site (a .NET website project).
How do I avoid building during source deployment / pulling the latest from Bitbucket?
You should stop using the continuous integration process in Bitbucket and hook in your own process to do an xcopy (preferably a delta copy) to the target website folder.
Using the out-of-the-box (OOB) tools it is not possible to deploy without a build, so you can do a few things:
FTP
Visual Studio publish
Command-line copy after a successful build locally.
Another setup that could reduce the number of builds you have when deploying (but would still build the solution) is to use more branches in Bitbucket.
You could continue to use CI but make sure you hook your environments to proper branches so they only deploy when you perform a merge into that branch.
How do I avoid building during source deployment / pulling the latest from Bitbucket?
You could check the deployment details under "DEPLOYMENT > Deployment options" in the Azure portal.
You could also leverage Kudu and check the auto-generated deployment script under D:\home\site\deployments\tools\deploy.cmd.
For your requirement, I would recommend customizing your deploy.cmd file and putting the .deployment and deploy.cmd files into your Bitbucket repository. A simple way is to download your current deployment script and modify the scripts under the Deployment section: remove the part that builds your solution, leave only the kudu sync part, and change the value of the -f option from "%DEPLOYMENT_TEMP%" to "%DEPLOYMENT_SOURCE%" when invoking %KUDU_SYNC_COMMAND%. For details you could follow Custom Deployment Script.
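A sketch of what the two files could look like (the kudu sync invocation is adapted from a typical auto-generated script, so the exact variable names in yours may differ slightly):

.deployment, in the repository root:

[config]
command = deploy.cmd

deploy.cmd, Deployment section only, with the build step removed:

:: Copy the repository content straight to the site folder, no build
call :ExecuteCmd "%KUDU_SYNC_COMMAND%" -v 50 ^
  -f "%DEPLOYMENT_SOURCE%" -t "%DEPLOYMENT_TARGET%" ^
  -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" ^
  -i ".git;.hg;.deployment;deploy.cmd"
IF !ERRORLEVEL! NEQ 0 goto error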
If you want to deploy the full content of your repo with no build or transformation at all, just set SCM_SCRIPT_GENERATOR_ARGS=--basic in the Azure app settings. This will force the script generator to treat it as a 'basic' site and won't do any build.
See the wiki for more info.
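A sketch of setting that app setting from the command line (the web app and resource group names are placeholders; the same setting can also be added in the portal under Application settings):

az webapp config appsettings set --name my-kentico-site --resource-group my-rg --settings "SCM_SCRIPT_GENERATOR_ARGS=--basic"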

How to implement Continuous integration on asp.net website projects not web application using TFS 17

I have an ASP.NET website project where I want to implement continuous integration using TFS 2017. I really want to know how to get rid of config files with environment-specific keys.
Currently I have created config transform files and publish profiles explicitly for each environment. My build definition will create a package after the build, but the package contains config specific to a single environment.
I want to know how to tokenize this like we do for a web application project.
Any help will be much appreciated.
Thanks.
Here is an example of using a parameters.xml file in a website project and deploying it in a TFS release.
Pay attention to:
Step 2: Use MSBuild arguments as follows:
/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactstagingdirectory)\\"
Step 3: In each environment, use a Replace Tokens step to replace the values you want to change in the web.config file.
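A sketch of what a parameters.xml entry and its token could look like (the parameter name, XPath and token syntax are placeholders and must match whatever your Replace Tokens step expects):

<parameters>
  <!-- Packaging writes the token into the generated SetParameters.xml -->
  <parameter name="MyDbConnectionString" defaultValue="__MyDbConnectionString__">
    <parameterEntry kind="XmlFile" scope="\\web.config$"
                    match="/configuration/connectionStrings/add[@name='MyDb']/@connectionString" />
  </parameter>
</parameters>

At release time the Replace Tokens step can then swap __MyDbConnectionString__ in the SetParameters.xml (or, per the update below, directly in web.config after unzipping) for the environment-specific value.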
Update:
As a workaround, you could use an unzip step to extract the package and a Replace Tokens step to change the web.config file, then zip it again or deploy it to IIS directly.
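A sketch of that workaround, shown here as YAML for brevity (in TFS 2017 you would add the equivalent tasks in the release definition UI; ExtractFiles@1 and ArchiveFiles@2 are built-in tasks, the Replace Tokens task is the common marketplace task, and all paths are placeholders):

- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(System.DefaultWorkingDirectory)/drop/MyApp.zip'
    destinationFolder: '$(System.DefaultWorkingDirectory)/unpacked'
- task: replacetokens@3                # marketplace "Replace Tokens" task; version is an assumption
  inputs:
    rootDirectory: '$(System.DefaultWorkingDirectory)/unpacked'
    targetFiles: '**/web.config'
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(System.DefaultWorkingDirectory)/unpacked'
    includeRootFolder: false
    archiveFile: '$(System.DefaultWorkingDirectory)/MyApp.tokenized.zip'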
