We are currently using SVN scripting to verify that what is deployed to production matches the last release that was deployed and checked into SVN. How can I achieve the same using Artifactory?
Artifactory has a built-in feature for promoting artifacts through each stage. You can use this feature to move/promote artifacts to higher environments so that there is a single source of truth, i.e. Artifactory.
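To verify that what is running in production matches what was promoted, you can compare checksums against Artifactory's checksum search API. A minimal sketch, assuming a Unix-like shell; the server URL, API key variable, repository name, and deployed file path are all placeholders:

# Compute the checksum of the artifact actually deployed to production
DEPLOYED_SHA256=$(sha256sum /opt/app/release.war | awk '{print $1}')

# Ask Artifactory which artifacts, if any, carry that checksum
curl -s -H "X-JFrog-Art-Api: $ARTIFACTORY_API_KEY" \
  "https://artifactory.example.com/artifactory/api/search/checksum?sha256=$DEPLOYED_SHA256&repos=prod-releases"

# An empty "results" array means the deployed file does not match
# anything promoted to the prod-releases repository.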
I'm pulling my hair out. I've got an on-prem installation of Azure DevOps and a pair of build agents. We're trying to move to .NET Core, but we have never been able to get pushing NuGet packages into a DevOps feed to work. This should be straightforward.
The whole environment is hidden behind a corporate firewall and proxy, and while the proxy config is good for nuget pull and any other activity you care to name, we cannot invoke nuget push (or dotnet push) to our internal package repository. The only error I get is a 502 (bad gateway) from tunnel.js, even though I've explicitly set the address of the DevOps server in NO_PROXY (environment variables, .proxy & .proxybypass for the DevOps agent, netsh winhttp proxy, the build agent user's internet connection settings, and the %AppData%\Nuget\Nuget.Config file). Git works, nuget restore works, build works, packaging works, but dotnet push (or nuget push) fails with this error.
Can anyone suggest any other places I might need to set a proxy bypass or no_proxy setting?
There can be many reasons for this problem; it may be related to your organization's network or to user roles and permissions, or the task may even be restricted by a policy.
But if none of the above applies in your case, then you should configure your NuGet tools to authenticate with Azure Artifacts and other NuGet repositories. If all of the Azure Artifacts feeds you use are in the same organization as your pipeline, you can use the NuGetAuthenticate task without specifying any inputs. Check the Restore and push NuGet packages within your organization document for more information.
This task must run before you use a NuGet tool to restore or push packages to an authenticated package source such as Azure Artifacts. This task installs the Azure Artifacts Credential Provider into the NuGet plugins directory if it is not already installed.
If your agent is behind a web proxy, the NuGetAuthenticate task will not set up nuget.exe, dotnet, and MSBuild to use the proxy. In that case, set the http_proxy environment variable (and optionally no_proxy) to your proxy settings, or store the proxy settings in NuGet.Config as shown below.
nuget.exe config -set http_proxy=http://my.proxy.address:port
nuget.exe config -set http_proxy.user=mydomain\myUserName
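Alternatively, here is a minimal sketch of the environment-variable route on a Windows agent, followed by the push; the proxy address, internal server hostname, and feed URL are placeholders, and the API key is a dummy value because NuGetAuthenticate supplies the real credentials:

set http_proxy=http://my.proxy.address:port
set no_proxy=devops.example.local
REM NuGetAuthenticate must have run earlier in the pipeline
dotnet nuget push MyPackage.1.0.0.nupkg --source https://devops.example.local/MyCollection/_packaging/MyFeed/nuget/v3/index.json --api-key az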
Check the NuGet CLI environment variables documentation for more information.
Can Visual Studio be configured to Publish (deploy) and Push (to Git) simultaneously?
I have Visual Studio configured to "Publish" "only files needed to run this application" to a folder on a remote server that IIS points to. When I make local changes, I can publish remotely easily.
I've also configured Git for the project. The publish information is in the repo so that anyone can pull the project, make changes, and Publish. My general practice is to pull, work, push to Git, then Publish the site, all using VS.
What is the best way to synchronize these actions? I don't want anyone to publish the app and forget to push to Git at the same time.
I've worked with dev/production servers using typical web layouts before (i.e. pushing to a Git repo that IS the location of the production files), but in this case that doesn't work because of the minimalist file structure of a "Published" site. I'd have to coordinate the excluded files in Git with the files "not used" for publishing.
Visual Studio 2017, IIS 10.0
EDIT:
The Git server as well as the project are hosted internally (albeit on different servers). Storing the code locally is a requirement; I cannot upload to TFS (so, so unfortunately).
Your requirement can be achieved easily in TFS/VSTS.
First, TFS/VSTS supports Git version control; you can use it to version control your project. Refer to the following link for more details:
https://learn.microsoft.com/en-us/vsts/git/gitquickstart?view=vsts&tabs=visual-studio
Also, TFS/VSTS supports continuous integration and continuous deployment. A continuous integration trigger on a build definition indicates that the system should automatically queue a new build whenever a code change is committed. You can make the trigger more general or more specific, and you can also schedule your build (for example, on a nightly basis). You could also enable the continuous deployment trigger, which will create a release every time a new build is available.
https://learn.microsoft.com/en-us/vsts/build-release/actions/ci-cd-part-1?view=vsts
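Under the hood, the publish that a CI build performs is just an MSBuild invocation against your publish profile. A minimal sketch of the command a build step would run; the project and profile names here are hypothetical:

msbuild MyWebApp.csproj /p:DeployOnBuild=true /p:PublishProfile=FolderProfile /p:Configuration=Release

With CI/CD in place, pushing to Git becomes the only manual step; the build and publish happen automatically, so nobody can publish without having pushed first.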
Does anyone know of any plugin or tool for integrating TFS with MarkLogic that would let us deploy directly from TFS to a MarkLogic modules database?
You will need a TFS client to first fetch the files from TFS, and then load the modules into MarkLogic.
It's the same sort of process that you would have for any version control system, such as Git or SVN. You need to obtain the modules from the version control repository and then deploy them.
For instance, for a CI environment you can configure a Jenkins project to use the TFS Plugin to poll for changes or build periodically on a schedule, and then use your build/deployment scripts, such as an ml-gradle build, to deploy to MarkLogic with the appropriate target (e.g. mlRedeploy, mlReloadModules, etc.) depending upon whether it is a complete install or re-install, or whether you simply want incremental deployment of module changes.
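As a rough sketch, the Jenkins build step would just invoke the ml-gradle tasks, assuming the MarkLogic connection details are already configured in gradle.properties:

# Full (re)deployment of the application to MarkLogic
./gradlew mlDeploy

# Or, for incremental changes, clear and reload the modules database only
./gradlew mlReloadModules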
No, there is no tool/functionality available.
I am currently using Visual Studio Team Services (formerly Visual Studio Online) to version control my projects. When I want to deploy my projects to my VPS, I use the Visual Studio Publish feature, which stores the files on my hard drive, and then I use an FTP client to send the files to my VPS.
But now I am looking at the build and release functions in Visual Studio Team Services, and I don't completely understand them.
Questions:
What is the purpose of the build?
I have created a new standard build definition using the Visual Studio template and the Hosted agent pool.
When I run the build I can see that it creates a new version using the last commit as reference. But what has it done in the backend on the hosted agent?
And where are the files of this newly created build stored? In the log I see Copy Files to: $(build.artifactstagingdirectory), but where is this?
What is the purpose of the release?
I have created a new release plan using the empty template, because I don't have Azure; I use another company where I have a VPS running.
I then added 3 environments called Development, Staging and Production.
All of them use the Hosted agent, but I think I need to adjust this, because if I understand it correctly I can now assign my VPS to my Production environment, or not?
Has anyone done this using Visual Studio Team Services and a VPS running Windows Server 2012?
Are there videos or docs available about this? It's quite confusing what the correct steps are in deploying versions of web projects.
In general, "Build" focuses on integrating and building your code changes and running some basic unit tests, while "Release" is used to deploy the output of "Build" to your environments so that you can run further tests/verification and finally deploy the output to a production environment for your customers to use. See this link for reference: What is the difference between build and release engineering, DevOps and site reliability engineering?
When you start a build without specifying a source version, it will get the latest version of the code and then build it. All the files are placed in the working folder of the build agent during the build process. $(build.artifactstagingdirectory) is a predefined variable that points to a path in the build agent's work folder. We usually copy the build output to this path so that we can access the output easily in the following tasks. For example, use the "Publish Artifact" task to publish the output files in the $(build.artifactstagingdirectory) folder to the server or a file share, and then the team members can get this build from the server or the file share.
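To make that concrete, here is a sketch of what the "Copy Files" step effectively does on a Windows agent; the drive and agent folder names are typical defaults and vary per installation:

REM $(Build.ArtifactStagingDirectory) resolves to a folder like C:\agent\_work\1\a
xcopy /s /i C:\agent\_work\1\s\MyWebApp\bin\Release C:\agent\_work\1\a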
In the release, you can link a release definition to the build definition. When the release starts, it will get the latest artifacts of the build and deploy them to your environment.
The agent is a machine used to execute the tasks in a build/release definition. The Hosted agent is managed by Microsoft, and you can also deploy your own build agent. So you don't need to change the agent settings for the "Production" environment if your VPS can be reached by the Hosted agent. Since you were previously using FTP to upload files to your VPS, you can still publish the files this way by adding an FTP Upload task to the release definition.
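Conceptually, that FTP step does little more than the following; a minimal sketch with a hypothetical host, credentials, and artifact name (in practice the task reads these from its service connection settings):

curl -T MyWebApp.zip ftp://vps.example.com/wwwroot/ --user deployuser:secret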
For more information, refer to Build and Release Management for details.
Can anyone provide insights on using Jenkins to automate deployment in controlled and uncontrolled environments? We have different environments (dev/qa/uat/prod), and currently we use batch files that call msbuild/nant scripts to deploy to the web and DB servers (a web farm). Developers only have access to dev/qa, and production support deploys to uat/prod. Production support gets the source code from the SVN tags folder and runs the batch file to deploy the application.
By using Jenkins, is it possible to eliminate the step of the production support team getting the script from SVN, by having them run the jobs with their credentials via a URL? And what is the general practice for using a source control and CI tool to deploy applications?
My recommendation is to reserve Jenkins for just building the software. That way the users of Jenkins only have access to the development and perhaps QA systems.
To decouple the build system from the process that deploys the software I recommend the use of a binary repository manager like:
Nexus
Artifactory
Archiva
That way, deployment scripts can retrieve any version of a previous build. The use of a repository manager would also enable your QA team to certify a release prior to its deployment onto production.
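For example, the production support team's batch file could pull a specific certified version straight from the repository manager instead of from SVN or Jenkins; a minimal sketch with a hypothetical Nexus URL and artifact coordinates:

curl -fO https://nexus.example.com/repository/releases/com/example/app/1.4.2/app-1.4.2.war
REM ...then hand app-1.4.2.war to the existing msbuild/nant deployment scripts

The -f flag makes the download fail loudly if the requested version does not exist in the repository.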
Finally, consider one of the emerging deployment automation tools. Tools like Chef, Puppet, Rundeck can be used to further version control the configuration of your infrastructure.