How to download files and folders keeping directory structure on Windows 10

Is there an app or process to download full directories and files in batch, either from a txt file containing URLs or by copying and pasting the URLs?
I've tried JDownloader 2, IDM, Free Download Manager, etc., with no luck.
From my research, wget seems like a possibility, but I have never used it before.
Example:
I wish to download to C:\Files\
http://www.files.com/dir1/dir2/*.*
http://www.files.com/dira/*.*
http://www.files.com/dirb/dird/*.*
http://www.files.com/a.jpg
http://www.files.com/b.txt
http://www.files.com/c.nfo
http://www.files.com/d.png
And when I open C:\Files\ I have the following:
dir2\*.*
dira\*.*
dird\*.*
a.jpg
b.txt
c.nfo
d.png
I hope that I explained this clearly enough.
Thank you for reading this!
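For what it's worth, wget can get close to this. Note that the *.* wildcards only work against FTP listings; over HTTP, wget has to crawl directory index pages recursively instead. A minimal sketch, assuming the URLs above are saved in list.txt (all flags are standard wget options, but the exact combination depends on the server):
wget -r -np -nH -P C:\Files -i list.txt
Here -r recurses, -np refuses to climb to parent directories, -nH drops the www.files.com host folder, and -P sets the download root. Adding --cut-dirs=1 would also strip the first directory level (so dir1\dir2 lands as just dir2), but since --cut-dirs removes a fixed number of path components, URLs at different depths may need separate runs.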

Related

WinSCP script to synchronize directories, but exclude several subdirectories

I need to write a script that synchronizes local files with a remote machine.
My file structure is:
ProjectFolder/
.git/
input/
output/
classes/
main.py
readme.md
I need to synchronize everything, but:
completely ignore the .git folder
ignore the files in the input and output folders, but still create the folders themselves
So far my code is:
open sftp://me:password@server -hostkey="XXXXXXXX"
option batch abort
option confirm off
synchronize remote "C:\Users\MYNAME\Documents\MY FOLDER\Python Projects\ProjectFolder" "/home/MYNAME/py_proj/ProjectFolder" -filemask="|C:\Users\MYNAME\Documents\MY FOLDER\Python Projects\ProjectFolder\.git"
close
exit
First question: it doesn't seem to work.
Second question: how do I add masks for the input and output folders when the file paths contain spaces?
Thanks to all in advance.
Masks for directories have to end with a slash.
To exclude the files in a specific folder (while still creating the folder itself), use something like */folder/*:
-filemask="|.git/;*/input/*;*/output/*"
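Put together, a minimal corrected script might look like this (the credentials, host key, and paths are the placeholders from the question; the double quotes around each path already take care of the spaces, so the masks need no extra escaping):
open sftp://me:password@server -hostkey="XXXXXXXX"
option batch abort
option confirm off
synchronize remote "C:\Users\MYNAME\Documents\MY FOLDER\Python Projects\ProjectFolder" "/home/MYNAME/py_proj/ProjectFolder" -filemask="|.git/;*/input/*;*/output/*"
close
exit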

Reading files present in a directory in a remote folder through SFTP

TL;DR: convert the bash line that downloads sftp files, get Inbox/*, to C++ or Python. We do not have execute permission on the Inbox directory.
I am trying to read the files present in a directory on a remote server through SFTP. The catch is that I only have read and write permissions on the directory, not execute. This means any method that requires entering (cd-ing into) the folder will fail. I need to read the file names, since they are variable. From what I understand, ls does not require execute privileges. If I can get a list of the files in the directory, then reading them would be fine. Here is the directory structure:
Inbox
--file-a.txt
--file_b.txt
...
I have tried libssh, but sftp_readdir requires a handle to the open directory. I also looked at Paramiko for Python, but that too requires opening the directory to read the file names.
I am able to do this in bash using send "get Inbox/* ${destination_dir}". Is there any way I can use a similar pattern match in C++ or Python?
Also, I cannot execute bash commands from my binary. Does anyone know of a library in Python or C++ (preferred) that would support this?
I have not posted here in a while, so please excuse me if I am not following the formatting; I will learn from your suggestions. Thank you!
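For reference, a minimal Paramiko sketch of the same pattern the bash get Inbox/* relies on. It assumes the server accepts the SFTP directory-listing request with read permission alone, which the working bash transfer suggests it does; the host and credentials are placeholders:
import fnmatch
import paramiko

# placeholder host and login; replace with the real values
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="user", password="secret")
sftp = paramiko.SFTPClient.from_transport(transport)

# listdir() issues the OPENDIR/READDIR requests for us; no cd into Inbox is needed
for name in sftp.listdir("Inbox"):
    if fnmatch.fnmatch(name, "*"):       # same glob as get Inbox/*; narrow it here if needed
        sftp.get("Inbox/" + name, name)  # download next to the script

sftp.close()
transport.close()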

while loop to restrict the number of downloaded files continues beyond conditional parameters

I want to download a large number of FTP files. I made a list file that contains the links to thousands of FTP files; each download results in a 'gbff.gz' file. Now, say for some reason I want to restrict the number of downloaded files (with the .gz extension) in the current directory to 5.
To test that, I made a while loop in R that uses a bash system command:
setwd("~/Desktop/test")
a<-system('find . | grep -i ".gz$" |wc -l')
while (a<5) {
system('wget -nc -tries=1 -i list.txt')
}
But it seems like the while loop is not working. I mean, ideally it should break out of the loop when the number of .gz files in the current directory reaches 5, but the download continues for all the links in the list file.
N.B. My apologies for making such a hybrid script; as I am already working in R, this seemed easiest to me. I would also appreciate any alternative bash/awk/sed script if that is more suitable.
N.B.2: FYI, I use the -nc flag in wget, so an already existing file should not be re-downloaded.
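For what it's worth, here is a sketch of a version that should behave as intended. Three things keep the original from working: system() returns the exit status unless intern=TRUE is given, so a never holds the file count; a is never recomputed inside the loop; and a single wget -i list.txt call downloads the entire list before the condition is ever rechecked, so the count has to be checked between individual downloads (note also that the wget flag is --tries, not -tries). Assuming the list file is still list.txt:
setwd("~/Desktop/test")
links <- readLines("list.txt")

for (link in links) {
  # recount the .gz files on every pass, capturing the command's output this time
  n <- as.numeric(system('find . -iname "*.gz" | wc -l', intern = TRUE))
  if (n >= 5) break
  # fetch one link at a time so the count above stays meaningful
  system(paste("wget -nc --tries=1", shQuote(link)))
}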

Unzipping Multiple zip files using 7zip command line

I have a number of zip files located in a single folder, e.g.:
file1.gz
file2.gz
file3.gz
file4.gz
I'm looking for a way to automatically unzip these using a batch job into a similarly named folder structure: for example, the contents of file1.gz would drop into a folder named file1.
I have been told that 7-Zip would address my issue, but I can't figure out how to go about it.
Any help is greatly appreciated.
Which OS are you using? This is something you'd do using the shell's capabilities; you could write
for A in *.gz ; do gunzip "$A" ; done
I'm using gunzip here because .gz is actually gzip, but you can use the 7-Zip CLI tool as well, of course. If you're on Windows, then I recommend installing a real shell (the standard cmd.exe cannot really be considered a shell, IMHO).
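Since the question asks for a Windows batch job, a minimal sketch using the 7-Zip command line (it assumes 7z.exe is on the PATH; %%~nA expands to the file name without its extension, which supplies the per-file output folder):
@echo off
rem extract each archive into a folder named after the file (file1.gz -> file1\)
for %%A in (*.gz) do 7z x "%%A" -o"%%~nA"
From an interactive prompt, use a single % instead of %%.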

Unix file permissions, WARNING: can't access

I'm trying to change the permissions of a few files that are used with a webpage I'm uploading to my site. I'm using the Unix command line to do it.
I've tried two commands:
chmod 755 index.html
chmod 644 index.html
But I get the message
chmod: WARNING: can't access index.html
after using these commands, for some reason, and I have no idea why. Initially I thought it might be because I had the file open in a couple of programs (a text editor and a web browser), but I've closed those down and I'm still getting the same problem. Any idea why, and how I can set the permissions correctly so that the file will be viewable by anyone on the web, but only editable by me?
Cheers!
Here's a link that looks similar to your problem, but it's on Solaris:
http://www.unix.com/solaris/45229-unable-chmod-file-directory.html
The solution is on page 2 of that thread, but the Cliff's Notes version is that the person found something else was mounted at that directory. It showed up when they ran
df -k /their_dir_location
Hope this helps.
Another possible issue: if you are using Solaris zones, the directory may be visible in more than one zone while only one zone has write access.
