Fabric: rsync between two remote hosts

I want to rsync files from a remote Production server to a remote Backup server using Fabric.
The server credentials are stored in my local ~/.ssh/config:
Host backup
    HostName 1.1.1.1
    Port 33333
    User swasher

Host production
    HostName 2.2.2.2
    Port 44444
    User swasher
Now I want to run rsync on the Production machine, and I need to insert the host/user/port of the Backup server into that command, something like this:
@hosts('production')
def backup():
    run("rsync -avz -e 'ssh -p {PORT}' /from/ {USER}@{HOST}:/to/".format(
        PORT=backup.PORT, USER=backup.USER, HOST=backup.HOST))
How can I get the credentials of the Backup server to run rsync?
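One way (not from the original thread, just a minimal sketch assuming Fabric 1.x, whose paramiko dependency can parse ~/.ssh/config):

import os
from fabric.api import hosts, run
from paramiko.config import SSHConfig

def ssh_host(alias):
    # Look up an alias from ~/.ssh/config; the result is a dict with
    # 'hostname', 'port' and 'user' keys when they are defined there.
    config = SSHConfig()
    with open(os.path.expanduser('~/.ssh/config')) as f:
        config.parse(f)
    return config.lookup(alias)

@hosts('production')
def backup():
    b = ssh_host('backup')
    run("rsync -avz -e 'ssh -p {port}' /from/ {user}@{host}:/to/".format(
        port=b.get('port', '22'), user=b['user'], host=b['hostname']))

Note that the rsync itself runs on Production, so Production must be able to open an SSH connection to Backup (e.g. with a key installed there or agent forwarding).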

Not with Fabric, but it does the job.
Try this to copy directories and nested subdirectories from local to remote:
cmd = "sshpass -p {} scp -r {}/* root#{}://{}".format(
remote_root_pass,
local_path,
remote_ip,
remote_path)
os.system(cmd)
Don't forget to import os. You may check the exit code returned (0 for success).
Also, you might need to install sshpass (yum install sshpass).
And change StrictHostKeyChecking ask to StrictHostKeyChecking no in /etc/ssh/ssh_config.
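For example, a small sketch of the exit-status check mentioned above:

import os

status = os.system(cmd)  # on Unix this is a wait status; 0 still means success
if status == 0:
    print("copy succeeded")
else:
    print("copy failed, status: {}".format(status))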

Related

Trying to scp to an EC2 instance, states sftp only?

scp -Cpv -i /home/jamie/Downloads/jamie1.pem /srv/http/wordpress/wp-content/themes/dt-the7 ec2-user@52.210.108.143:/var/www/html/wp-content/themes/
[...]
debug1: Entering interactive session.
debug1: pledge: network
debug1: Sending command: scp -v -p -t /var/www/html/wp-content/themes/
This service allows sftp connections only.
Can anyone tell me how to also allow ssh/scp connections?
Thanks
You need to modify sshd_config on the server and restart the sshd daemon. The configuration will probably contain something like
ForceCommand internal-sftp
If you comment that out, you should be able to get ssh and scp access.
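For instance, assuming the stock /etc/ssh/sshd_config location, the edited line would look like:

#ForceCommand internal-sftp

followed by a restart, e.g. sudo systemctl restart sshd (the service name varies by distribution).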

Need to run a batch script on a UNIX server and display the output through VBScript

I am currently developing a VBScript to execute a shell script (through the PuTTY software) on a UNIX server:
Set shell = WScript.CreateObject("WScript.Shell")
shell.Exec "D:\Putty.exe hostname -l username -pw password 1.sh"
I am getting a connection refused error.
When I run the command below, without my script (1.sh):
shell.Exec "D:\Putty.exe hostname -l username -pw password"
the connection is established without any issues.
Also, I just want to extract the output; once extracted, the session should close automatically.
This doesn't work in putty.exe. PuTTY does, however, ship a dedicated program for this kind of thing, called plink.exe: there you can pass commands and read the output just as you would expect, and your example should work just as you specified it.
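For example, the original call would become something like (same hypothetical host, credentials and script as in the question):

shell.Exec "D:\plink.exe hostname -l username -pw password 1.sh"

The object returned by Exec exposes the remote output through its StdOut property, e.g. shell.Exec(...).StdOut.ReadAll. plink's usage summary: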
PuTTY Link: command-line connection utility
Release 0.63
Usage: plink [options] [user@]host [command]
("host" can also be a PuTTY saved session name)
Options:
-V print version information and exit
-pgpfp print PGP key fingerprints and exit
-v show verbose messages
-load sessname Load settings from saved session
-ssh -telnet -rlogin -raw -serial
force use of a particular protocol
-P port connect to specified port
-l user connect with specified username
-batch disable all interactive prompts
The following options only apply to SSH connections:
-pw passw login with specified password
-D [listen-IP:]listen-port
Dynamic SOCKS-based port forwarding
-L [listen-IP:]listen-port:host:port
Forward local port to remote address
-R [listen-IP:]listen-port:host:port
Forward remote port to local address
-X -x enable / disable X11 forwarding
-A -a enable / disable agent forwarding
-t -T enable / disable pty allocation
-1 -2 force use of particular protocol version
-4 -6 force use of IPv4 or IPv6
-C enable compression
-i key private key file for authentication
-noagent disable use of Pageant
-agent enable use of Pageant
-m file read remote command(s) from file
-s remote command is an SSH subsystem (SSH-2 only)
-N don't start a shell/command (SSH-2 only)
-nc host:port
open tunnel in place of session (SSH-2 only)
-sercfg configuration-string (e.g. 19200,8,n,1,X)
Specify the serial configuration (serial only)

How do you rsync build files from Gitlab CI to another server

It's unclear to me how to get my build files from the Gitlab CI (hosted on https://ci.gitlab.com) over to my personal server using rsync.
I have set up 1 test job and 1 deploy job.
Under the deploy tab I have entered the bash commands to:
Install rsync
Update packages
Finally, run rsync to transfer the files over SSH to my personal server.
When I enter the SSH credentials (with the verbose flag on) for my private personal server, it appears that the SSH key is the issue. In Gitlab, I have already established the deploy key (for hooks; tested this and it works).
Where do I locate the public SSH key for the Gitlab deploy instance so that I can install that key on my server?
Below is the exact script entered in the Gitlab CI deploy job's script pane:
# Run as root
(
set -e
set -u
set -x
apt-get update -y
apt-get -y install rsync
)
git clone https://github.com/bla/deployments.git $HOME/deploy/deployments
SVR_WEB1_WEBSERVER="000.11.22.333"
USER1="franklin"
GROUP1="team1"
FROM_DIR="/gitlab-ci-runner/tmp/builds/myrepo-1/"
DEST1="subdomains/gitlab/myrepo"
EXCLUSIONS_LIST="${HOME}/deploy/deployments/exclusions/exclusions.txt"
ssh -v "$USER1@$SVR_WEB1_WEBSERVER"
/usr/bin/rsync -avzh --progress --delete -e ssh --group=$GROUP1 -p --exclude-from "$EXCLUSIONS_LIST" "$FROM_DIR" "$USER1@$SVR_WEB1_WEBSERVER:$DEST1"
Providing your private SSH key is dangerous unless you use your own gitlab-ci runners for deployment. That's why it is better to use rsync modules.
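A minimal sketch of the rsync-module approach (all names here are illustrative, not from the thread): the target server runs an rsync daemon with a module defined in /etc/rsyncd.conf,

[deploy]
    path = /var/www/myrepo
    read only = false
    auth users = ciuser
    secrets file = /etc/rsyncd.secrets

and the CI job then pushes with a module password instead of an SSH key:

rsync -avz --delete --password-file=rsyncd.pass "$FROM_DIR" ciuser@example.com::deploy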

SFTP server, limited user access

The usual question that is still challenging to implement: sharing one directory through SFTP (read-only). I am following this wiki: http://en.wikibooks.org/wiki/OpenSSH/Cookbook/SFTP
The changes I made on ssh_config are:
PasswordAuthentication yes

Match Group sftp-users
    ChrootDirectory /home
    AllowTCPForwarding no
    X11Forwarding no
    ForceCommand internal-sftp
Then creating a user in the group:
useradd -M -G sftp-users username-sftp
passwd username-sftp
Restarting ssh to pick up the settings:
restart ssh
Then connecting to this host from another PC provides the sftp prompt:
sftp username-sftp@hostname
But when executing the ls command, I see all directories in the root of the sftp server.
sftp> ls
bin boot cdrom dev etc home initrd.img initrd.img.old lib lib64 lost+found media mnt opt proc root run sbin srv sys tmp usr var vmlinuz
How can I limit the access of sftp-users to only one directory?
I was actually editing the wrong config file. The one that should contain these settings is "sshd_config", not "ssh_config".
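In other words, the Match block from the question belongs in /etc/ssh/sshd_config (restart the ssh service afterwards):

Match Group sftp-users
    ChrootDirectory /home
    AllowTCPForwarding no
    X11Forwarding no
    ForceCommand internal-sftp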

Remote mounting

mount -t smbfs -o username=Administrator,password=Password //serverIP/dev/hda1 /mnt/mountTemp
where
/dev/hda1 - filesystem on the remote machine
/mnt/mountTemp - mount point on the local machine
This command mounts a remote filesystem on your local machine. But is it possible to mount the remote filesystem on the remote machine itself? (The command has to be fired from your local machine.)
You can use ssh to run a command on the remote machine:
ssh user@remote command
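For example, to make the remote machine mount its own /dev/hda1 at its own /mnt/mountTemp (reusing the names from the question; the quotes keep the whole mount invocation as one remote command):

ssh root@serverIP 'mount /dev/hda1 /mnt/mountTemp'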
