TFS to MarkLogic integration for deployment - xquery

Does anyone know of any plugin or tool that integrates TFS with MarkLogic and can help us deploy directly from TFS to the MarkLogic modules database?

You will need a TFS client to first fetch the files from TFS, and then load the modules into MarkLogic.
It's the same sort of process that you would have for any version control system, such as Git or SVN. You need to obtain the modules from the version control repository and then deploy them.
For instance, in a CI environment you can configure a Jenkins project to use the TFS Plugin to poll for changes or build periodically on a schedule, and then run your build/deployment scripts, such as an ml-gradle build, with the appropriate target (e.g. mlRedeploy, mlReloadModules, etc.) depending on whether you want a complete install or re-install, or simply an incremental deployment of module changes.
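As a rough sketch of what that build step could boil down to (the TFS path is a placeholder, the Jenkins TFS Plugin can normally handle the checkout for you, and this assumes an ml-gradle project with the Gradle wrapper checked in):

    rem Fetch the latest modules from TFS into the agent's workspace
    tf get $/MyProject/src /recursive

    rem Full redeploy of the application to MarkLogic
    gradlew mlRedeploy

    rem ...or, if only module changes need to go out, reload just the modules database
    gradlew mlReloadModules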

No, there is no tool/functionality available.

Related

Basic Azure DevOps flow for ASP .Net

I have a standard way of working for building and deploying my asp .net applications on a build server (usually Jenkins). I have a custom build script and publish profile that builds my solution, copies some files and publishes the web project all with a single call to MSBuild.
I am trying to recreate this in Azure DevOps. My first obstacle seems to be that build and "publish" (or "release" in DevOps parlance) are separate steps. But it also seems like maybe I don't need to "publish"?
So
How do I control what ends up in the "artifacts" folder of the build? Is that through Azure? Can I do it with a .MSBuild file? And
Do I even need the concept of "publish" or does the release step just copy all artifacts to the destination (a VM in this case)? Can I control what gets deployed?
I am having a hard time finding some kind of basic tutorial that covers ASP.NET projects being deployed this way.
How do I control what ends up in the "artifacts" folder of the build? Is that through Azure? Can I do it with a .MSBuild file?
Yes, we could use the MSBuild Arguments parameter /p:PackageLocation=$(build.artifactstagingdirectory) to control what ends up in the "artifacts" folder.
When we use the task Visual Studio build to build the project, we could edit the MSBuild Arguments parameter to be like: /p:DeployOnBuild=true /p:PublishProfile=Test /p:PackageLocation=$(build.artifactstagingdirectory)
Then we could use Publish Build Artifacts task to publish build artifacts to Azure Pipelines, TFS, or a file share.
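For reference, those MSBuild Arguments are equivalent to calling msbuild.exe directly like this (the solution name and configuration are placeholders, and $(build.artifactstagingdirectory) is an Azure Pipelines variable that the pipeline expands to a local folder on the agent):

    msbuild MySolution.sln /p:Configuration=Release /p:DeployOnBuild=true /p:PublishProfile=Test /p:PackageLocation="$(build.artifactstagingdirectory)"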
Do I even need the concept of "publish" or does the release step just copy all artifacts to the destination (a VM in this case)? Can I control what gets deployed?
Yes, you can think of the release step as simply copying artifacts to the destination. If you want to control what gets deployed, you can filter the artifacts using the Contents setting of the copy task.
Check the article Deploying Web Applications Using Team Build and Release Management for the details.
Hope this helps.
If you have existing scripts that work, just use those :). The new YAML builds are moving slowly away from fully featured UI tasks with lots of logic to more bare bones scripts that do what is needed.
It is a common practice though to split build from publish. That way you can put in additional checks, reviews and approvers on the release stage.
In your case it would mean:
Build stage
Use MsBuild to build & create a publish package either in the form of a folder or a zip file.
Use the Publish Pipeline Artifact task to store this folder or zip file
Release stage
Use the Download Pipeline Artifact task to restore the file to publish
Optionally use an Infrastructure-as-Code tool to prep the target environment.
Use msdeploy.exe or another tool to publish the package, optionally generating and passing in a settings file to override certain settings (a minimal sketch of this step follows below).
This way you can have multiple release stages to generate test environments, temporary POC environments etc by simply using another set of variables and/or settings override file.
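A minimal sketch of that msdeploy.exe step, assuming a Web Deploy package named MyWebApp.zip, a generated SetParameters file for the Test environment, and the Web Deploy agent listening on the target VM (all names and the password variable here are placeholders):

    msdeploy.exe -verb:sync ^
      -source:package="MyWebApp.zip" ^
      -dest:auto,computerName="https://target-vm:8172/msdeploy.axd?site=MyWebSite",userName="deployUser",password="%DEPLOY_PASSWORD%",authType="Basic" ^
      -setParamFile:"Test.SetParameters.xml" ^
      -allowUntrusted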
There are special tasks in Azure Pipelines that wrap around these tools and provide an easy-to-use UI and some additional logic. These can be very useful if you don't have intimate knowledge of the command-line tools. If you do know your way around them, there may not be a very strong reason to use the fancy tasks.

Visual Studio: Publish a website to remote IIS and Push to GIT simultaneously

Can Visual Studio be configured to Publish (deploy) and Push (to GIT) simultaneously?
I have Visual Studio configured to "Publish" "only files needed to run this application" to a folder on a remote server which IIS is pointing to. When I make local changes, I can publish remotely easily.
I've also configured GIT for the project. The publish information is in the repo so that anyone can pull the project, make changes, and Publish. My general practice is to Pull, work, Push to GIT, then Publish the site--all using VS.
What is the best way to synchronize these actions? I don't want anyone to publish the app and forget to push to GIT at the same time.
I've worked with dev/production servers using typical web layouts before (i.e. push to git repo that IS the location of production files), but in this case that doesn't work because of the minimalist file structure of a "Published" site. I'd have to coordinate the exclude files in GIT with the files "not used" for publishing.
Visual Studio 2017, IIS 10.0
EDIT:
The Git server as well as the project are hosted internally (albeit on different servers). Storing the code locally is a requirement; I cannot upload to TFS (so, so unfortunately).
Your requirement can be achieved in TFS/VSTS easily.
First, TFS/VSTS supports Git version control; you can use it to version control your project. You can refer to the following link for more details:
https://learn.microsoft.com/en-us/vsts/git/gitquickstart?view=vsts&tabs=visual-studio
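For example, once a Git repository exists in your TFS/VSTS project, pointing an existing local repository at it is just standard Git (the account, project, and repository names below are placeholders):

    git remote add origin https://myaccount.visualstudio.com/DefaultCollection/MyProject/_git/MyRepo
    git push -u origin master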
Also, TFS/VSTS supports continuous integration and continuous deployment. A continuous integration trigger on a build definition indicates that the system should automatically queue a new build whenever a code change is committed. You can make the trigger more general or more specific, and also schedule your build (for example, on a nightly basis). You could also enable the Continuous deployment trigger, which will create a release every time a new build is available.
https://learn.microsoft.com/en-us/vsts/build-release/actions/ci-cd-part-1?view=vsts

Azure website deployment - .Net website

I'm doing a web site deployment in azure with bit bucket source.
When I do the deployments, I can see it always builds the source.
Actually, that is not required for me, because it is a Kentico 10 web site (a .NET website project).
How do I avoid building during source deployment / pulling the latest from Bitbucket?
You should stop using the continuous integration process in Bitbucket and hook in your own process that does an xcopy (preferably a delta copy) to the target website folder.
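A rough sketch of that idea, with hypothetical source and target paths, could be as simple as:

    rem /D copies only files newer than the destination (the "delta"), /E includes subdirectories,
    rem /Y suppresses overwrite prompts, /I treats the target as a folder
    xcopy "C:\builds\MyKenticoSite" "\\webserver\wwwroot\MyKenticoSite" /D /E /Y /I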
Using the OOB tools it is not possible to deploy without a build. So you can do a few things:
FTP
Visual Studio publish
command line copy after a successful build locally.
Another setup could reduce the number of builds you trigger when deploying, though it will still build the solution: use more branches in Bitbucket.
You could continue to use CI but make sure you hook your environments to proper branches so they only deploy when you perform a merge into that branch.
How do I avoid building during source deployment / pulling the latest from Bitbucket?
You could check the deployment details in the Azure portal under "DEPLOYMENT > Deployment options".
And you could leverage KUDU and check the auto-generated deploy.cmd file under D:\home\site\deployments\tools\deploy.cmd.
For your requirement, I would recommend customizing your deploy.cmd file and putting the .deployment and deploy.cmd files into your Bitbucket repository. A simple way is to download your current deployment script and modify the scripts under the Deployment section: remove the script that builds your solution, leave just the script for kudu sync, and change the value of the -f option from "%DEPLOYMENT_TEMP%" to "%DEPLOYMENT_SOURCE%" when invoking %KUDU_SYNC_COMMAND%. For details you could follow Custom Deployment Script.
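As a sketch of what the customized script could end up looking like (variable names follow the answer above; the names in your own auto-generated deploy.cmd may differ slightly), the .deployment file in the repository root points Kudu at the script, and the Deployment section keeps only the sync step:

    :: .deployment (repository root) tells Kudu to run the custom script:
    ::   [config]
    ::   command = deploy.cmd

    :: Deployment section of the customized deploy.cmd: no build, just sync the repo to the site
    echo Handling basic deployment (no build step).
    call :ExecuteCmd "%KUDU_SYNC_COMMAND%" -v 50 -f "%DEPLOYMENT_SOURCE%" -t "%DEPLOYMENT_TARGET%" ^
      -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.hg;.deployment;deploy.cmd"
    IF !ERRORLEVEL! NEQ 0 goto error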
If you want to deploy the full content of your repo with no build or transformation at all, just set SCM_SCRIPT_GENERATOR_ARGS=--basic in the Azure App Settings. This will force the script generator to treat it as a 'basic' site, and won't do any build.
See wiki for more info.
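If you prefer setting it from the command line rather than the portal, the Azure CLI can apply that app setting (the resource group and app name are placeholders):

    az webapp config appsettings set --resource-group MyResourceGroup --name MyKenticoSite --settings "SCM_SCRIPT_GENERATOR_ARGS=--basic"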

How is Jenkins helpful to automate deployment process

Can anyone provide insights into using Jenkins for automating deployment in controlled and uncontrolled environments? We have different environments - dev/qa/uat/prod - and currently we are using batch files that call msbuild/nant scripts to deploy to the web and DB servers (a web farm). Developers only have access to dev/qa, and production support will deploy to uat/prod. Prod support gets the source code from the SVN tag folder and runs the batch file to deploy the application.
By using Jenkins, is it possible to eliminate the step of the prod support team getting the script from SVN, by having them run the jobs with their credentials via URL? And what is the general practice for using a source control and CI tool to deploy applications?
My recommendation is to reserve Jenkins for just building the software. That way the users of Jenkins only have access to development and perhaps QA systems.
To decouple the build system from the process that deploys the software I recommend the use of a binary repository manager like:
Nexus
Artifactory
Archiva
In that way, deployment scripts can retrieve any version of a previous build. The use of a repository manager would enable your QA team to certify a release prior to its deployment onto production.
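For example, a deployment script run by production support could pull a specific, QA-certified build straight from the repository manager by version, along these lines (the Nexus URL and artifact coordinates are hypothetical):

    rem Fetch an exact, previously certified build from the binary repository
    curl -o MyWebApp-1.4.2.zip "https://nexus.example.com/repository/releases/com/example/MyWebApp/1.4.2/MyWebApp-1.4.2.zip"
    rem ...then hand the package to the existing msbuild/nant deployment scripts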
Finally, consider one of the emerging deployment automation tools. Tools like Chef, Puppet, Rundeck can be used to further version control the configuration of your infrastructure.

How can I deploy ASP.NET (mvc) site using GIT and for ex. beanstalkapp.com via FTP?

The problem is that when I commit the project directory, everything gets uploaded, including the source code.
Not really sure why you want to upload via FTP? You shouldn't commit your own compiled binaries to source control for deployment though.
You could take a look at AppHarbor: just push your code with git and it will be built and deployed automatically.
more about AppHarbor
Real alternatives to Windows Azure PaaS (web role)?
Does it matter? Since ASP.NET pages can be compiled on the server, having source files on the web server is sometimes normal, and IIS knows not to allow access to them.
That said, uploading output binaries into source control is generally a bad idea - it is better to do the deployment from your build server.
Actually, this is kind of hard.
For months, I've tried to automate our deployment, without complete success. From my experience, I can see only one way to do that:
Have a build server on your deployment machine (or on the same network)
A build server will pull your code from the repository, say, once per minute and check for modifications. If there are modifications, it will execute the build scripts related to this project. I suggest you use TeamCity, because it is very easy to use compared to CruiseControl (I'm not sure if you can use Git with TFS). You can program your build server to build your solution or project and afterwards execute an MSBuild script to copy the files to the production folder (e.g. c:\inetpub\yourapp or \\my_server\inetpub\yourapp). You can use MSBuild's Copy Task to do that.
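As a sketch of such a build step (the solution name and publish output folder are assumptions, and robocopy is shown here as a command-line alternative to MSBuild's Copy Task):

    rem Build the solution in Release configuration
    msbuild MySolution.sln /p:Configuration=Release

    rem Mirror the publish output to the production folder (/MIR also deletes files
    rem that no longer exist in the source, so point it at the right target)
    robocopy "C:\builds\MySite\publish" "\\my_server\inetpub\yourapp" /MIR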
UPDATE 1: I haven't tried it, but if it helps, you can push to an FTP server using git-ftp
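The basic git-ftp flow looks roughly like this (the FTP host, path, and credentials are placeholders):

    git config git-ftp.url "ftp://ftp.example.com/httpdocs"
    git config git-ftp.user "ftpuser"
    git config git-ftp.password "secret"

    rem The first run uploads everything and records the deployed commit on the server
    git ftp init

    rem Later runs upload only the files changed since the last deployed commit
    git ftp push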
UPDATE 2: It seems that someone did some workarounds and successfully deployed his app using git and FTP.
