I would like to programmatically move a group of files from a local directory into a WebDAV directory.
I am guessing a simple batch file would not work because it is a WebDAV directory. Note: the machine is Windows Server 2003, so there is no support for mapping a WebDAV directory to a drive letter; the directory just looks like this: http://dev1:8080/data/xml and cannot be made to look like //dev1/data/xml.
You could use the BMOVE method.
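For context, BMOVE is (as far as I know) a Microsoft batch extension of the standard WebDAV MOVE method, and MOVE only relocates resources already stored on the server; it does not upload local files. A minimal sketch of a standard MOVE using Python's requests library, with hypothetical source and destination URLs based on the question's server:

import requests

# Hypothetical resource paths on the WebDAV server from the question.
src = 'http://dev1:8080/data/xml/old-name.xml'
dst = 'http://dev1:8080/data/xml/new-name.xml'

# MOVE renames/relocates a server-side resource; the Destination header
# names the new location. Add auth=... if the server requires credentials.
resp = requests.request('MOVE', src, headers={'Destination': dst})
print(resp.status_code)  # typically 201 Created or 204 No Content on success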
You could use a WebDAV client such as the one contained in this project (it's Apache-licensed, afaik), then basically call it from a batch file / shell script.
Cadaver may allow you to write a batch script that does all this; otherwise you could use curl directly, but you'd need to know a bit more about the actual WebDAV protocol (you'd basically need to traverse the local directory, issuing a MKCOL for every subdirectory and a PUT for every file; a sketch of that traversal follows below).
I'm not sure how well either of these tools compiles on Windows, but if they don't work out of the box, you could always run them on top of Cygwin. While you're using Cygwin, you can also just write standard shell scripts (/bin/sh or /bin/bash), which will likely be easier than Windows' .BAT format.
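If you go the raw-protocol route, here is a minimal Python sketch of that traversal using the requests library (the server URL, credentials and local path are assumptions; an MKCOL against a collection that already exists simply answers 405, which is harmless):

import os
import requests
from requests.auth import HTTPBasicAuth

# Assumed values; replace with your server, credentials and source directory.
base_url = 'http://dev1:8080/data/xml'
auth = HTTPBasicAuth('user', 'password')
local_root = r'C:\local\xml'

for dirpath, dirnames, filenames in os.walk(local_root):
    rel = os.path.relpath(dirpath, local_root).replace(os.sep, '/')
    remote_dir = base_url if rel == '.' else base_url + '/' + rel
    if rel != '.':
        # MKCOL creates the collection (directory) on the server.
        requests.request('MKCOL', remote_dir, auth=auth)
    for name in filenames:
        with open(os.path.join(dirpath, name), 'rb') as f:
            # PUT uploads each file into the corresponding collection.
            requests.put(remote_dir + '/' + name, data=f, auth=auth)

If you want a move rather than a copy, you could delete each local file after its PUT succeeds.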
You can use python-webdav-library:
from webdav import WebdavClient
url = 'https://somesite.net'
# Connect to the WebDAV server and authenticate.
mydav = WebdavClient.CollectionStorer(url, validateResourceNames=False)
mydav.connection.addBasicAuthorization(<username>, <password>)
# Open the local file and upload it to the target path on the server.
fid = open(<filepath of file you want to upload>, 'rb')
mydav.path = <path to where you want the file to be, e.g. '/a/b/c.txt'>
mydav.uploadFile(fid)
Free WinSCP (for Windows) supports WebDAV (and WebDAVS). WinSCP supports scripting/command-line operations too.
A sample WinSCP script to upload file over WebDAV:
open http://user@webdav.example.com/
put file.txt /path/
close
Save the script to a file (e.g. script.txt) and run it like:
winscp.com /script=script.txt
You can also put everything on a single line:
winscp.com /command "open http://user@webdav.example.com/" ^
"put file.txt /path/" "close"
If you really want to move (not copy) the files, add the -delete switch to the put command:
put -delete file.txt /path/
See an introduction to scripting with WinSCP.
(I'm the author of WinSCP)
Try the code below, which shells out to curl to upload (PUT) the file to the WebDAV collection using digest authentication:
$filename = 'testing.text';
exec('curl --digest --user "' . $username . ':' . $password . '" -T "' .
    $filename . '" "https://sandbox.test.com/dav/content/"');
I have a very big file that has to be transferred to a remote server.
On that remote server there is a job that activates every 5 minutes and, once it sees a file name starting with the right prefix, processes it.
What happens if the job "wakes up" in the middle of transfer? In that case it would process a corrupted file.
Does pscp create a .temp file and rename it accordingly to account for that? Or do I have to handle this manually?
No, pscp does not transfer files via a temporary file.
You would have to use another SFTP client (this assumes you use pscp as an SFTP client; pscp defaults to SFTP, but falls back to SCP if SFTP is not available; if you really need to use SCP, which is rare, you cannot do this at all, as the SCP protocol does not support file rename).
Either use an SFTP client that at least supports file rename: upload explicitly to a temporary file name and rename it afterwards. For that you can use psftp from the PuTTY package, with its put and mv commands:
open user@hostname
put C:\source\path\file.zip /destination/path/file.tmp
mv /destination/path/file.tmp /destination/path/file.zip
exit
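If you would rather script this first approach (upload under a temporary name, then rename) yourself, here is a minimal Python sketch using the paramiko library; the hostname, credentials and paths mirror the placeholders above and are assumptions:

import paramiko

# Assumed connection details; replace with your own.
client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('hostname', username='user', password='password')

sftp = client.open_sftp()
# Upload under a temporary name so the remote job never picks up a partial file...
sftp.put(r'C:\source\path\file.zip', '/destination/path/file.tmp')
# ...then rename it to the final name the job is watching for.
sftp.rename('/destination/path/file.tmp', '/destination/path/file.zip')
sftp.close()
client.close()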
Or use an SFTP client that can upload files via a temporary file name automatically. For example, WinSCP can do that, though by default only for files over 100 KB. If your files are smaller, you can configure it to do it for all files using the -resumesupport switch.
An example batch file that forces an upload of a file via a temporary file:
"C:\Program Files (x86)\WinSCP\WinSCP.com" ^
/log="C:\writable\path\to\log\WinSCP.log" /ini=nul ^
/command ^
"open sftp://username:password#example.com/ -hostkey=""ssh-ed25519 255 ...=""" ^
"put -resumesupport=on C:\source\path\file.zip /destination/path/" ^
"exit"
The code was generated by the WinSCP GUI with the "Transfer to temporary filename" option set to "All files".
See also WinSCP article Locking files while uploading / Upload to temporary file name.
(I'm the author of WinSCP)
Related question: SFTP file lock mechanism.
I want to download only new files from one SFTP server using WinSCP.
Suppose, I have 10 files in source and destination today.
Tomorrow one new file may be added to the source. In this scenario, I want to copy the new file only into destination.
I am using the script below:
open sftp_connection
cd /path
option transfer binary
get "*.txt" localpath
close
exit
Using the above, I am able to copy all files, but I want only the new files that are not yet present in the destination.
Thanks,
Srikrishna.
The easiest solution is to add -neweronly switch to your get command:
get -neweronly *.txt C:\local\path\*
For very similar results, you can also use the synchronize command:
synchronize local C:\local\path /path -filemask=*.txt
See also WinSCP article on Downloading the most recent file.
I have a Dropbox link like https://www.dropbox.com/sh/w4366ttcz6/AAB4kSz3adZ which opens the usual Dropbox site with folders and files.
Is there any chance to download the complete content (as a tar, or directly as a sync) to a Unix machine using wget?
I have seen some posts here where single files were downloaded, but could not find any answer to this. There is an API from Dropbox, but that does not work due to the 64-bit issue on my server, and http://www.dropboxwiki.com/dropbox-addons/dropbox-gallery-download#BASH_Version does not work for me either... any other suggestions?
This help article documents some parameters you can use to get different behaviors from Dropbox shared links:
https://www.dropbox.com/help/201
For example, using this link:
https://www.dropbox.com/sh/igoku2mqsjqsmx1/AAAeF57DR2ou_nZGC4JPoQKfa
We can use the dl parameter to get a direct download. Using curl, we can download it as such:
curl -L https://www.dropbox.com/sh/igoku2mqsjqsmx1/AAAeF57DR2ou_nZGC4JPoQKfa?dl=1 > download.zip
(The -L is necessary in order to follow redirects.)
Or, with wget, something like:
wget --max-redirect=20 -O download.zip https://www.dropbox.com/sh/igoku2mqsjqsmx1/AAAeF57DR2ou_nZGC4JPoQKfa
You can use --content-disposition with wget too.
wget https://www.dropbox.com/sh/igoku2mqsjqsmx1/AAAeF57DR2ou_nZGC4JPoQKfa --content-disposition
It will auto-detect the folder name as the zip filename.
Currently, you're probably better off creating an app that you don't publish, which can access either all your files or just a dedicated app folder (safer). Click the generate API token button about halfway down the app's settings page, and store it securely! You can then use the dedicated download or zip-download API calls to get your files from anywhere, like so:
curl -X POST https://content.dropboxapi.com/2/files/download_zip \
--header "Authorization: Bearer $MY_DROPBOX_API_TOKEN" \
--header 'Dropbox-API-Arg: {"path": "/path/to/directory"}' \
> useful-name.zip
Adding your token as an environment variable makes it easier and safer to type or script these operations. If you're using Bash and have ignorespace in your $HISTCONTROL, you can just type and paste your key with a leading space so it's not saved in your history. For frequent use, save it in a file with 0600 permissions that you can source, as you would an SSH key.
export MY_DROPBOX_API_TOKEN='...'
Yes, you can, as it is pretty easy. Follow the steps below.
Firstly, get the Dropbox share link. It will look like this: https://www.dropbox.com/s/ad2arn440pu77si/test.txt
Then add "?dl=1" to the end of that URL and a "-O filename", so that you end up with something like this: wget https://www.dropbox.com/s/ad2arn440pu77si/test.txt?dl=1 -O test.txt
Now you can easily get files onto your Linux machine.
I have a website project and an Outlook add-in that communicates via a web service to the same database. I'd like to add the Outlook add-in as a "downloadable file" to the interface of the website.
How can I achieve that, at build time, the Outlook add-in installer ends up in the website's "Download" folder?
Is that possible?
Thanks in advance!
I am not sure this is really a good idea, because it may not be OK to upload it every time you build (broken builds? untested bugs?), but anyway, the idea might be this:
find a way to mount the FTP site as disk Z: on the computer and keep it there
you probably want to zip it first, so find and install a command-line zip.exe
find a way to have an automated job start every few minutes (it might be a batch file)
The job (which might be a batch file) should do this:
check the file creation date of C:\build\folder\executable.exe and compare it with the file creation date of Z:\download\folder\executable.zip
only if it is newer, zip C:\build\folder\executable.exe into C:\build\folder\executable.zip and copy C:\build\folder\executable.zip to Z:\download\folder\executable.zip
In what language you write the script is your choice; a Windows batch file could do it (the XCOPY command can copy only newer files). I know PHP and would probably use that with a batch file calling "php my_php_task.php", but you can launch any language interpreter you like; a Python sketch of the same job follows below.
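For illustration, a minimal Python sketch of that job (the paths are the ones used above; note it compares modification times rather than creation dates, and the zipping details are an assumption):

import os
import shutil
import zipfile

# Paths from the steps above; adjust to your build output and mounted drive.
exe = r'C:\build\folder\executable.exe'
local_zip = r'C:\build\folder\executable.zip'
remote_zip = r'Z:\download\folder\executable.zip'

# Only act if the build output is newer than the copy already published.
if not os.path.exists(remote_zip) or os.path.getmtime(exe) > os.path.getmtime(remote_zip):
    # Re-zip the executable...
    with zipfile.ZipFile(local_zip, 'w', zipfile.ZIP_DEFLATED) as z:
        z.write(exe, os.path.basename(exe))
    # ...then copy the fresh zip to the download folder on the mounted drive.
    shutil.copy2(local_zip, remote_zip)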
UPDATE
For zipping you can download this:
http://www.info-zip.org/Zip.html
For copying only newer files you can use XCOPY with the options /D (copy only newer files) and /Y (suppress the overwrite confirmation prompt). Other options here:
http://www.computerhope.com/xcopyhlp.htm
So the batch file might look similar to these two lines:
zip -f C:\build\folder\executable.zip C:\build\folder\executable.exe
xcopy /D /Y C:\build\folder\executable.zip Z:\download\folder\executable.zip
Have it called every 30 seconds and the job is done. The -f option in zip and /D option in xcopy make sure the script does nothing except check creation dates if you have not recently rebuilt the file.
I am a newbie to PowerShell. I want to write a script to do the following:
Check if I have mapped to a network drive
If not, map to it
Once mapped, check the files in 4 folders at a path on the network drive
If the files are newer than those on the local drive, copy them, i.e. only copy new/updated files
Any help with this would be great as I need a starting position to take on this.
You can use net use to determine if the drive is mapped or not:
net use s:
if ($LastExitCode -ne 0)
{
net use s: \\server\share
}
# Folder pairs to check; xcopy /D copies only files newer than the destination copies.
$folders = @{local = 'c:\path1'; remote = 's:\path1'},
           @{local = 'c:\path2'; remote = 's:\path2'}
$folders | Foreach { xcopy $_.remote $_.local /E /C /H /R /Y /D /I }
Don't forget the existing console tools tend to work just fine in PowerShell and sometimes are the easiest way to get the job done.