Azure DevOps Pipeline Task to connect to Unix Server and execute commands

I am seeking to set up a Release Pipeline in Azure DevOps Services that will deploy
an application to a Unix server, where it then executes some unix commands as part
of the deployment.
Would appreciate some guidance on which pipeline Task(s) I can set up to achieve
the following objectives:
Connect to the Unix server.
Execute the required Unix commands.
By the way, the Agents are currently installed on Windows hosts but we are looking to
extend that to Unix servers in due course, so a solution that fits both setups would
be ideal, even though the former is the priority.

You can check out the SSH Deployment task.
Use this task to connect to a remote machine over SSH and run shell commands or a script on it.
If you need to copy files to the remote Linux server, check out the Copy Files Over SSH task.
You will probably need to create an SSH service connection first. See the steps here to create a service connection.
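In a YAML pipeline, the two tasks might be wired up roughly like this (a sketch, not a definitive setup; the service connection name my-unix-server, the target path, and deploy.sh are assumptions, and classic release pipelines expose the same tasks through the designer):

    steps:
    # Copy the build output to the target server over SFTP.
    - task: CopyFilesOverSSH@0
      inputs:
        sshEndpoint: 'my-unix-server'   # SSH service connection (assumed name)
        sourceFolder: '$(Build.ArtifactStagingDirectory)'
        contents: '**'
        targetFolder: '/opt/myapp'      # assumed deployment path
    # Run the required Unix commands on the server.
    - task: SSH@0
      inputs:
        sshEndpoint: 'my-unix-server'
        runOptions: 'commands'
        commands: |
          cd /opt/myapp
          ./deploy.sh                   # assumed deployment script

Both tasks execute on the agent and talk to the target over SSH, so they behave the same whether the agent host is Windows or Linux, which suits the mixed setup described in the question.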

In the end, due to concerns raised about installing private keys on the target server (part of the SSH Deployment setup), we opted to use Deployment Groups, which has enabled us to set up a persistent connection to our Linux server.

Related

How to start asp.net core server on linux and keep it running

I have created a basic ASP.NET Core server on an Azure Ubuntu VM and exposed it on port 80 using nginx.
I connect to the VM via ssh
and start the server with the "dotnet run" command.
That works fine.
However, every time I close the ssh connection, the server stops as well.
Is there a way to start the ASP.NET Core server on Linux and keep it running without an active ssh connection?
Basically what happens is:
You log in with ssh.
You start an application under your user (dotnet run).
You close your ssh session => your user is logged out, which means the application is closed.
You need to start the application as a service, outside of your user session. Here is some information:
https://askubuntu.com/questions/8653/how-to-keep-processes-running-after-ending-ssh-session
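A common way to do that on Ubuntu is a systemd unit. A minimal sketch, assuming the app lives in /home/myuser/myapp and runs as myuser (the paths, user, and service name are all made up here):

    # /etc/systemd/system/myapp.service (assumed file name)
    [Unit]
    Description=ASP.NET Core app
    After=network.target

    [Service]
    WorkingDirectory=/home/myuser/myapp
    ExecStart=/usr/bin/dotnet /home/myuser/myapp/myapp.dll
    Restart=always
    User=myuser

    [Install]
    WantedBy=multi-user.target

Enable it with sudo systemctl enable --now myapp, and the server keeps running after you log out, with nginx still proxying port 80 to it.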
Otherwise I'd advise you to ask on https://askubuntu.com/

BMC Control-M - can it manage remote offsite servers

Our organisation uses Control-M to manage all the scripts within our network. However, we have recently entered a contract to have a new application hosted for us. It is an ecommerce application whose scripts are run through cron jobs.
What I'd like to know is whether Control-M has any offsite agent functionality, so that an agent can be installed at the remote site and kept in communication with the Control-M server within our infrastructure, allowing us to monitor those scripts alongside the rest of our applications.
Thank you.
Control-M supports Agents anywhere as long as you have network connectivity. The default Server/Agent ports are 7005-7012 inclusive, so those will need opening on any firewall.
If you don't want the remote site to run the full Agent (which needs a local install), you can use the Agentless option (WMI on Windows, SSH on Unix), which only needs to be defined on the Control-M Server side.
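If the remote boxes run a Linux firewall, opening that range might look like this with ufw (a sketch; adapt to whatever firewall is actually in place):

    # allow the default Control-M Server/Agent port range
    sudo ufw allow 7005:7012/tcp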
Not sure about your question. The Control-M server uses TCP/IP to contact its agents and does not care whether it is connecting to a server in the same datacenter or on another continent.
Install a VPN between you and your hosted application, then either install a Control-M agent or set up agentless scheduling over SSH, and you are good to go.

Editing files on Google Cloud Engine VM

I have recently set up a VM on Google Cloud to develop and host my web site/application. The setup went fine, and I even have the gcloud SDK up and running. I also have Apache installed and configured. My question is: how do I set up my editing environment (PhpStorm) and upload my files? The ports for FTP and SFTP seem to be blocked.
FTP uses a clear-text protocol and is thus not recommended. To use SFTP:
Make sure you can ssh to your instance: gcutil --project=<project> ssh <instance>. This does two things: (a) makes sure that port 22 is open on your VM, and (b) propagates your public key to the instance, if it's not already there.
Configure PhpStorm to use the key pair authentication mechanism with the private key ~/.ssh/google_compute_engine to log in to the instance.
That's it.
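To verify SFTP works before involving the IDE, a quick terminal check might look like this (the user and address are placeholders, as in the gcutil command above):

    sftp -i ~/.ssh/google_compute_engine <user>@<instance-external-ip>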

Java Deadlock detection on remote server

I've created a simple monitoring Python program that checks remote servers over ssh
using the paramiko and Fabric APIs. Now I want to find out whether a deadlock has occurred in a remote server's Java program.
I know that I can do this locally by using
$JAVA_HOME/bin/jstack -F `pgrep java`|grep "Deadlock"
but how can I do this over ssh using Python?
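A minimal paramiko sketch of the idea, assuming key-based auth (the host, user, and key path below are made up, and $JAVA_HOME may not be set in a non-interactive shell, so you may need the absolute jstack path):

    import os
    import paramiko

    def remote_deadlock_check(host, user, key_path):
        # Connect to the remote server using key-based authentication.
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username=user,
                       key_filename=os.path.expanduser(key_path))
        try:
            # Non-interactive sessions may not set $JAVA_HOME;
            # substitute the absolute jstack path if needed.
            cmd = '$JAVA_HOME/bin/jstack -F $(pgrep java) | grep "Deadlock"'
            _, stdout, _ = client.exec_command(cmd)
            # True if jstack reported a deadlock line.
            return bool(stdout.read().decode().strip())
        finally:
            client.close()

    # Hypothetical values for illustration.
    print(remote_deadlock_check('server.example.com', 'monitor', '~/.ssh/id_rsa'))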

Any issues running a daemon via XSP2?

We want to run a daemon that exposes itself via ASMX, using Mono 2.0 (or later). Instead of dealing with the ASP.NET hosting APIs, we're thinking about just starting a daemon thread in the Application_Start event. XSP2 shouldn't restart the appdomain, so our daemon will be safe.
Are there any downsides to this (besides being a bit odd)? Any other approaches that allow us to have our code running in the same appdomain as the ASMX requests?
Why do you need XSP to run a daemon by calling an ASMX when you can just build a console application (with the same code, or accepting arguments)? That can be called in a terminal or from any shell script and added to cron. Simple, and no server is required.
If you want to do this (not the way I would do it), you can set up a basic server instance (using nginx, lighttpd, or Apache) listening on an internal port, add that server to a dummy host, and from cron or a shell script run:
wget http://dummyhost/mydaemon.asmx
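For example, a crontab entry along these lines would hit the endpoint every five minutes (the schedule is illustrative; the URL is the dummy host from above):

    # poll the ASMX endpoint every 5 minutes
    */5 * * * * wget -q -O /dev/null http://dummyhost/mydaemon.asmx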
