Cloudflare Bulk Domain Add Automation - Unix

Looking for a Cloudflare solution.
I am trying to bulk-onboard 20 domains into a CF account.
I am on Windows 10, running the flarectl package for Windows.
I was able to set the necessary keys and environment variables.
Now I want to add the domains in bulk from a txt file.
The solution provided here contains only the Linux command:
https://support.cloudflare.com/hc/en-us/articles/360000841472-Adding-Multiple-Sites-to-Cloudflare-via-Automation
for domain in $(cat domains.txt); do flarectl zone create --zone=$domain --jumpstart=false; done
I am unable to execute it or convert it to Windows. Can anybody tell me the changes needed for Windows?
Kind regards and many thanks

Create a comma-separated file "domains.txt" with the domains you plan to add to Cloudflare. Note that set /P reads only the first line of the file, so keep all the domains on a single line.
Open the command line and type (replacing path_to_file and path_to_flarectl accordingly):
set /P domains=<path_to_file\domains.txt
for %d in (%domains%) do (
path_to_flarectl\flarectl.exe zone create --zone=%d --jumpstart=false
)
Note: I've noticed a 50-entry limit. (If you put this in a batch file, double the loop variable: %%d.)
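For completeness, here is a sketch of a slightly more defensive variant of the Linux one-liner, usable from Git Bash or WSL on the same Windows 10 machine. It assumes one domain per line (unlike the comma-separated file above) and strips carriage returns so a file saved with Windows line endings also works; with DRYRUN=1 (the default here) it only prints the flarectl commands instead of running them.

```shell
#!/bin/sh
# bulk_create <file>: run flarectl zone create for every domain in <file>.
# tr strips CR so a CRLF (Windows-saved) file is handled correctly.
bulk_create() {
  tr -d '\r' < "$1" | while IFS= read -r domain; do
    [ -z "$domain" ] && continue                  # skip blank lines
    if [ "${DRYRUN:-1}" = "1" ]; then
      echo flarectl zone create --zone="$domain" --jumpstart=false
    else
      flarectl zone create --zone="$domain" --jumpstart=false
    fi
  done
}

# bulk_create domains.txt            # preview the commands
# DRYRUN=0 bulk_create domains.txt   # actually create the zones
```

The dry-run default makes it easy to eyeball the 20 generated commands before touching the account.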

Related

RStudio proxy / no-proxy switching

RStudio integration with GitHub works behind my corporate firewall when I set the proxy in the .Renviron file:
http_proxy = http://<proxy>:80
https_proxy = http://<proxy>:80
In my case I don't need to specify a user name and password, which I wouldn't want hard-coded anywhere anyway.
When I work from home I get errors, since RStudio tries to find the proxy and fails.
Is it possible to write the .Renviron file so that R will:
try using the proxy;
if it works, continue;
if it fails, ignore the proxy settings and continue?
First check whether you can use "Conditional file and directory names" from the startup package:
If the name of a file consists of a <key>=<value> specification, then that file will be included / used only if the specification is fulfilled (on the current system with the current R setup).
For instance, a file ~/.Rprofile.d/os=windows.R will be ignored unless startup::sysinfo()$os == "windows", i.e. the R session is started on a Windows system.
You could use a custom .Rprofile file name to unset http(s)_proxy only if, for instance, the nodename is YourHomeLaptop. That is:
~/.Rprofile.d/nodename=YourHomeLaptop.R
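Assuming the startup package is in use, the conditionally loaded file itself can be tiny; the file name encodes the condition, and the body just clears the proxy variables (Sys.unsetenv is base R; the file name is the hypothetical one from above):

```r
# ~/.Rprofile.d/nodename=YourHomeLaptop.R
# Loaded by the startup package only on the machine whose nodename matches.
Sys.unsetenv(c("http_proxy", "https_proxy"))
```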
If you are using the same laptop in both places, use the dirname key instead, with the same project in two clones of the same repository. You can then use a different profile for each clone, with an easy git pull from one repository (for home work) to the other (for office work), and vice versa.
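An alternative outside R is a small launcher script that probes the proxy and only exports it when reachable, which is literally the "try, and fall back" behaviour the question asks for. This is a sketch: the PROXY value and probe URL are placeholders, and the final line assumes RStudio is on the PATH.

```shell
#!/bin/sh
# Probe the (placeholder) corporate proxy; export it only if it answers.
PROXY="http://proxy.invalid:80"

if curl -s -o /dev/null --max-time 3 -x "$PROXY" "http://www.example.com"; then
  export http_proxy="$PROXY" https_proxy="$PROXY"
  echo "proxy reachable: using $PROXY"
else
  unset http_proxy https_proxy
  echo "proxy unreachable: starting without proxy"
fi
# exec rstudio   # launch RStudio with this environment
```

Because RStudio inherits the launcher's environment, no hard-coded proxy needs to live in .Renviron at all.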

Download only new files with WinSCP

I am currently writing a WinSCP script whose aim is to retrieve all the files from an SFTP server and then put them in a specified location on a destination server (on which the script is located, FYI).
Is there any way to check whether a file has already been transferred to the destination server? Is it overwritten when it has, and is that really a bad thing? In such a case, I guess that if the file already exists on the destination server, I would like nothing to happen; if it doesn't exist, I'd like to proceed with the transfer.
You will find the code written so far enclosed below:
# Automatically abort script on errors
option batch abort
# Disable overwrite confirmations that conflict with the previous
option confirm off
# Connect using a private key
open sftp://SERVER@IP_ADDRESS:PORT -privatekey="PRIVATE_KEY" -hostkey="HOSTKEY" -passive=off
# Change remote directory
cd in
cd DIRECTORY
# Force binary mode transfer
option transfer binary
# Get ALL files from the directory specified
get /*.csv \\DIRECTORY
# Remove all .csv files
rm /*.csv
# Exit WinSCP
bye
Thank you very much in advance for your help. I hope that was clear enough; otherwise, please let me know if I can provide further information.
The easiest solution is to add -neweronly switch to your get command:
get -neweronly /*.csv \\DIRECTORY
For very similar results, you can also use synchronize command:
synchronize local \\DIRECTORY / -filemask=*.csv
See also WinSCP article on Downloading the most recent file.
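Putting that suggestion back into the script from the question, the minimal change is the one get line (all placeholders as in the question). One caveat worth noting: the rm that follows would also delete remote files that -neweronly skipped, so it is shown commented out here as a point to consider:

```
option batch abort
option confirm off
# Connect using a private key
open sftp://SERVER@IP_ADDRESS:PORT -privatekey="PRIVATE_KEY" -hostkey="HOSTKEY" -passive=off
cd in
cd DIRECTORY
option transfer binary
# Download only files that are missing locally or newer than the local copy
get -neweronly /*.csv \\DIRECTORY
# rm /*.csv   # careful: this would also delete files -neweronly skipped
bye
```

Saved as, say, download.txt, it can be run unattended from the Windows command line with winscp.com /script=download.txt /log=download.log.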

Best rsync syntax for transferring a WordPress site

I've found several rsync commands for moving my WordPress site from one local machine to a remote server. I've successfully used the following command, suggested by another Stack Overflow user:
rsync -aHvz /path/to/sfolder name@remote.server:/path/to/remote/dfolder
Would you say that it's enough, or would you suggest other attributes?
It's my understanding that an SSH connection would be safer. How does this command change if I want to make it over SSH? Also, are there other things to be done besides including the SSH command (like generating/installing the keys, etc.)? I'm just starting, so a detailed explanation for a noob would be very much appreciated.
Pakal
There are thousands of ways to customize this powerful command. Don't worry about SSH; by default it is using SSH. The rest of the options depend on your requirements. You could consider '--perms' to preserve permissions, and similarly '-t' to preserve times, though note that '-a' already implies both. I don't know if that's relevant for transferring a site.
Also, '-n' will show you a dry run of the transfer scenario, and the '-B' switch allows you to override the default block size.
You should probably look through the options yourself and find the appropriate ones by running 'info rsync'.
The above command will use SSH, and I see no problems with its general usage.

How to log in over network with CMD via UNC path?

I want to be able to connect to a remote machine through its UNC path in either Windows CMD or PowerShell. I have tried:
pushd \\MyServer\"User Folders"\localUser\TestFolder
but when this executes, I get "Logon failure: unknown user name or bad password".
Is "pushd" even the right command to use here? I have files that I want to exchange between the two machines on the same network; could there be permission bits I'm overlooking here?
No, pushd is not the right command. For connecting to a remote share you need the command net use:
net use X: \\SERVER\SHARE /user:DOMAIN\USER
If you're using the same account on both hosts (both a domain account as well as identical local accounts will work) you can omit the /user:DOMAIN\USER part.
Normally you'd connect only to the share, but you can also connect directly to some folder below the share:
net use X: \\SERVER\SHARE\some\subfolder
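Assuming the logon failure really comes from mismatched credentials, a cmd sketch of the whole exchange might look like the following (the share path is from the question; DOMAIN\localUser and report.txt are hypothetical placeholders):

```
net use X: "\\MyServer\User Folders" /user:DOMAIN\localUser
copy "X:\localUser\TestFolder\report.txt" C:\Temp\
net use X: /delete
```

Mapping once, copying, and then deleting the mapping avoids leaving stale drive letters behind.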
pushd should work for you, given that you have the required permissions to access the share as the current user.
Source:
If you specify a network path, the pushd command temporarily assigns
the first unused drive letter (starting with Z:) to the specified
network resource. The command then changes the current drive and
directory to the specified directory on the newly assigned drive. If
you use the popd command with command extensions enabled, the popd
command removes the drive-letter assignation created by pushd.
Note that the PowerShell pushd alias (really Push-Location) does not map a drive letter, but otherwise works the same, i.e. it lets you use the respective share as the current directory.
So, yes, it looks like you have a permission problem. Try accessing the share using Explorer (or net use, as @Ansgar Wiechers suggests in his answer, or even a simple dir \\share\...) to cross-check.

How to pull logs from Unix boxes to my local machine?

I deploy applications to Unix boxes; we work with around 100 of them. Say application A is deployed on 5 boxes: Box1, Box2, Box3, Box4, Box5. Every time we deploy application A, we go to each of Box1-5 and check whether it started properly by looking in the BOX1/A/B/C/logs folder on each and every box, and we do this for every single application.
Is there a way to pull the logs from all of boxes 1-5 to my local machine, and to search the pulled logs by application name (A)?
Thanks for your help in advance.
Your question is not so specific that I can tell you exactly what to do in your particular case, but something like this will aggregate the data on your local stdout, after which you can process it locally as you like:
for I in $(seq 1 5); do echo "box$I:"; ssh username@box$I 'cat /var/log/mylog'; echo; done
Many variations on the theme are possible, but if you can get this one to work, you should soon be able to adapt it to your own needs.
Note that for ssh to do its work without requiring a manual login on each machine, some setup is needed on both the local and remote boxes: run man 1 ssh and review the AUTHENTICATION section, especially the paragraph about the authorized_keys file.
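Building on that loop, here is a sketch that keeps a copy of each box's log locally (rather than only streaming to stdout) and then lets you search by application name. The user name, box names, and remote log path are the question's placeholders, and it assumes the key-based ssh setup described above.

```shell
#!/bin/sh
# pull_logs <app> <box...>: copy each box's log for one application
# into logs/<box>/ on the local machine.
pull_logs() {
  app=$1; shift
  for box in "$@"; do
    mkdir -p "logs/$box"
    scp "username@$box:/A/B/C/logs/$app.log" "logs/$box/"
  done
}

# search_logs <app>: list the locally pulled log files for one application
search_logs() {
  find logs -type f -name "$1.log"
}

# pull_logs A box1 box2 box3 box4 box5
# search_logs A
```

Once the files are local, grep -r across the logs/ tree gives you full-text search over every box's output for a given deployment.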
