Copy, verify, and then delete files/children from a network location

I want to develop a script that copies, verifies, and then deletes files from one network location to another (files over x days old).
Here is my algorithm:
Recursively traverse a network location ($movePath)
for all files where $_.LastWriteTime >= x days | ForEach {
    xcopy or robocopy $FileName = $_.FullName.Replace($movePath, $newPath)
    if (the files were written correctly) {
        (delete) Remove-Item $FileName from $movePath
    }
}
Can I combine xcopy's /v (verify) option with robocopy?

Do you want to maintain the subfolder structure (i.e. files from a subfolder in the source go into the same subfolder in the destination)? If so, this should suffice:
$src = 'D:\source\folder'
$dst = '\\server\share'
$age = 10 # days
robocopy $src $dst /e /move /minage:$age
robocopy handles both parts by itself: with /move it deletes a source file only after it has been copied successfully, and /minage:$age restricts the run to files older than $age days.
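If you want an explicit content check before deleting (the closest analogue to xcopy /v), here is a minimal per-file sketch, assuming $movePath, $newPath, and $age as in the question (Get-FileHash needs PowerShell 4 or later; the variable names are just illustrative):

$cutoff = (Get-Date).AddDays(-$age)
Get-ChildItem $movePath -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    ForEach-Object {
        $target = $_.FullName.Replace($movePath, $newPath)
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Copy-Item $_.FullName $target
        # Delete the source only if the copy hashes identically to the original
        if ((Get-FileHash $_.FullName).Hash -eq (Get-FileHash $target).Hash) {
            Remove-Item $_.FullName
        }
    }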

Related

PowerShell script not working on server

I have a simple PowerShell script in which I try to write some log data.
The line that's not working:
Add-Content -Path C:\temp\logg.txt -Value $file.FullName
I placed this line in the middle of the script, but I never get any text in logg.txt. Apart from this, the script executes fine: it compresses some files into a zip file.
This is on a Windows server.
If, however, I run the same script on my local machine (Win7), it does work!
So I think it must be something on the server, but I can't figure out what. I am using -ExecutionPolicy Bypass when I run the script.
EDIT: more info.
The script is called from an ASP.NET webpage:
Process.Start("Powershell.exe", "-ExecutionPolicy Bypass -Command ""& {" & destinationFolderRoot.Replace("\\", "\") & "\scriptParameters.ps1}""")
BaseZipScript:
function New-Zip
{
    param([string]$zipfilename)
    # Write the 22-byte header of an empty zip archive ("PK" end-of-central-directory record)
    Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (dir $zipfilename).IsReadOnly = $false
}
function Add-Zip
{
    param([string]$zipfilename)
    if (-not (Test-Path($zipfilename)))
    {
        # Create the empty archive if it doesn't exist yet
        Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (dir $zipfilename).IsReadOnly = $false
    }
    # Use the Explorer shell COM object to copy files into the zip
    $shellApplication = New-Object -COM Shell.Application
    $zipPackage = $shellApplication.NameSpace($zipfilename)
    foreach ($file in $input) # $input = whatever was piped into this function
    {
        $zipPackage.CopyHere($file.FullName)
        Add-Content -Path "C:\temp\logg.txt" -Value $file.FullName
        # CopyHere is asynchronous; give it time to finish before the next file
        Start-Sleep -Milliseconds 750
    }
}
scriptParameters1:
. "C:\temp\multiDownload\baseZipScript.ps1"
New-Zip c:\temp\multiDownload\user379\2016-09-15_14.39\files_2016-09-15_14.39.zip
dir c:\temp\multiDownload\user379\2016-09-15_14.39\tempCopy\*.* -Recurse |
Add-Zip c:\temp\multiDownload\user379\2016-09-15_14.39\files_2016-09-15_14.39.zip
Start-Sleep -Milliseconds 250
exit
As I said earlier, the script works on my local machine, and it also works on the server in the sense that the files are being zipped; it's just that on the server nothing is logged to logg.txt.
logg.txt does exist on the server, and it is writable in principle: the webpage also logs some info there, and that is being written.
Update:
As the comments guessed, it was as simple as write access to the file. It is still a bit weird, though, because all the files are created by the web service if they don't exist. Anyhow, it works now. Thanks! :)
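For anyone who hits the same silent failure, one way to surface the permission error instead of losing it (a minimal probe sketch, using the log path from the question):

try {
    # -ErrorAction Stop turns the non-terminating write error into a catchable one
    Add-Content -Path "C:\temp\logg.txt" -Value "permission probe" -ErrorAction Stop
} catch {
    # On a permission problem this reports something like "Access to the path ... is denied"
    Write-Warning "Cannot write to log: $_"
}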

Recursion Logic

I've been working on a PowerShell script that will preserve directory structure when cabbing nested folders. I'm having a bit of difficulty getting the recursion logic right, though.
This is a rough draft, so there isn't any try/catch error handling yet. I've commented out the Remove-Item to prevent accidental runs.
The logic I worked out for this is as follows.
Check & get base tree for subdirectories
Go one level in and check again
Continue until no directories and then return one level up.
Cab directory, remove directory, write log for automated extraction (file names of subdirectory cabs).
Repeat process next level up and continue until base directory
function Chkfordir ($clevel)
{
    $dir = dir $clevel | ? { $_.PSIsContainer -eq $true } # Does the current level have folders?
    if ($dir -ne $null) # Yes
    {
        Chkfordir $dir # Go deeper
    }
    if ($dir -eq $null) # Deepest branch
    {
        return # Go back one level and begin cabbing
    }
    $dir | % {
        Compress-Directory $_.FullName (".\" + [string]$_.Name + ".cab")
        echo ($_.FullName + ".cab") >> .\cleaf.log
        #Remove-Item -Recurse $_.FullName
        return
    }
}
The function call Compress-Directory is from here.
Edit 08/18: I finally had a chance to test it, and the logic seems to work now. There were some problems.
Most of the difficulty came from a PowerShell gotcha and the previously unnoticed problem that Compress-Directory is not path-independent. It looks like I'll need to rewrite that function later to be path-independent.
The PowerShell gotcha was a type change for a value on the pipeline. Apparently, after returning from a function, directory items change from System.IO.FileInfo to System.IO.DirectoryInfo, which has differently named members.
echo was replaced with Add-Content, as the redirection operators weren't working for me in PowerShell.
There were also some unaccounted-for states to contend with. A leaf directory with no files would cause Compress-Directory to either error out or complete silently without creating a file (thus not preserving the hierarchy).
The solution was to add an Add-Content for leaf folders before the return, and to move the Add-Content ahead of the Compress-Directory call so there is at least one file in each directory.
I've included my current version below but it is a work in progress.
function Chkfordir ($clevel)
{
    $dir = dir $clevel | ? { $_.PSIsContainer -eq $true } # Get folders
    if ($dir -eq $null) { # Check if deepest branch
        Add-Content (Join-Path $_.PSPath "\leaf.log") ([string]$_.FullName + ".cab")
        return $_ # Return one level up and cab
    }
    $dir | % { # For each subfolder, go deeper
        Chkfordir $_.FullName
        Add-Content (Join-Path $_.PSParentPath "\branch.log") ([string]$_.FullName + ".cab")
        Compress-Directory $_.FullName ([string]$_.Name + ".cab")
        #Remove-Item $_.FullName -recurse
    }
}
You need to recurse for each subdirectory and compress it after the recursive call returns:
function Chkfordir($clevel) {
    Get-ChildItem $clevel |
        Where-Object { $_.PSIsContainer } |
        ForEach-Object {
            Chkfordir $_
            Compress-Directory ...
            ...
        }
}
That way you automatically descend first, then create the archives as you return.
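To make that pattern concrete, here is a self-contained sketch; note it substitutes the built-in Compress-Archive cmdlet (PowerShell 5+) for the linked Compress-Directory function, so it produces .zip rather than .cab files:

function Chkfordir($clevel) {
    Get-ChildItem $clevel -Directory | ForEach-Object {
        Chkfordir $_.FullName # descend first (post-order traversal)
        # Back on the way up: archive the subtree, log it, then (optionally) remove it
        Compress-Archive -Path $_.FullName -DestinationPath ($_.FullName + ".zip")
        Add-Content -Path (Join-Path $clevel "branch.log") -Value ($_.FullName + ".zip")
        #Remove-Item $_.FullName -Recurse
    }
}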

How to merge multiple files from multiple directories/folders

I have 300 directories/folders; each directory contains a single two-column file (xxx.gz). I want to merge all the files from all the folders into a single file. In all files the first column is an identifier (ID), which is the same in every file.
How can I merge all the files into a single file?
I also want the header for each column to be the name of the file in the respective directory.
Directory names are (68a7eb0a-123, b5694957-764, etc.) and file names are (a5c403c2, 292c4a2f, etc.);
the directory name and the respective file name are not the same, and I want the file name as the header.
all directories
ls
6809b1c3-75a5
68e9b641-0cc9
71ae07b8-8bde
b7815cd2-1e69
..
..
each directory contains a single file:
cd 6809b1c3-75a5
ls bd21dc2e.txt.gz
Try this:
for i in * ; do for j in $i/*.gz ; do echo $j >> ../final.txt ; gunzip -c $j >> ../final.txt ; done ; done
Annotated version:
for i in *                        # for each directory under the current working directory
do                                # (assumes nothing else is in there)
    for j in $i/*.gz              # for each gzipped file under those directories
    do
        echo $j >> ../final.txt       # write the path/file name to the final file
        gunzip -c $j >> ../final.txt  # append the gunzipped contents to the final file
    done
done
Result:
$ head -8 ../final.txt
6809b1c3-75a5/bd21dc2e.txt.gz
blabla
whatever
you
have
in
those
files
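If you would rather do this from PowerShell (the language of most of this thread), here is a rough equivalent sketch, assuming the same layout of one .gz file per directory:

Get-ChildItem -Directory | ForEach-Object {
    Get-ChildItem $_.FullName -Filter *.gz | ForEach-Object {
        Add-Content ..\final.txt $_.FullName # path/file name as a header line
        # Decompress the .gz stream and append its contents
        $in = [System.IO.File]::OpenRead($_.FullName)
        $gz = New-Object System.IO.Compression.GZipStream($in, [System.IO.Compression.CompressionMode]::Decompress)
        $reader = New-Object System.IO.StreamReader($gz)
        Add-Content ..\final.txt $reader.ReadToEnd()
        $reader.Close(); $in.Close()
    }
}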

Move/copy folders by groups

I have a folder containing subfolders named *_1, *_2, *_3, *_4 . . . *_1000.
Then there is another set of folders named Destination_folder1, Destination_folder2, Destination_folder3, ... Destination_folder10.
I would like to move (or copy) the subfolders in groups of 100 into the Destination_folders, so that Destination_folder1 contains subfolders *_1 to *_100, Destination_folder2 contains subfolders *_101 to *_200, and so on and so forth. I tried to use:
for i in {1..100}
do
cp -r *_$((i)) Destination_folder$i/
done
but unfortunately the folders are not copied in groups; instead each one is copied individually to its own destination folder. Can anyone help me, please?
Best regards
Use a second loop (remove the word echo when you are happy):
for i in {1..10}; do
    for j in {1..100}; do
        (( dir = 100 * (i - 1) + j ))
        echo cp -r *_$((dir)) Destination_folder${i}/
    done
done
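For what it's worth, here is a PowerShell sketch of the same arithmetic (swap Copy-Item for Move-Item if you want to move instead of copy):

1..10 | ForEach-Object {
    $destIndex = $_
    1..100 | ForEach-Object {
        # Indices 1..100 go to folder 1, 101..200 to folder 2, and so on
        $dir = 100 * ($destIndex - 1) + $_
        Copy-Item -Recurse "*_$dir" "Destination_folder$destIndex\"
    }
}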

Batch file to find a file, create a copy in the same location and run this on multiple directories

Here's what I am trying to do.
I have a few hundred users' My Documents folders, most (not all) of which have a file (key.shk, for the ShortKeys program).
I need to upgrade the software, but doing so makes changes to the original file.
I would like to run a batch file on the server to find the file in each My Docs folder and make a copy of it there called backup.shk.
I can then use this for rollback.
The folder structure looks like this
userA\mydocs
userB\mydocs
userC\mydocs
My tools are xcopy, robocopy, or PowerShell.
Thanks in advance
This PowerShell script works; save it as a .ps1 file:
Function Get-SplitFileName ($FullPathName) {
    # Return the file name (without extension) and the containing directory
    $fileName = [System.IO.Path]::GetFileNameWithoutExtension($FullPathName)
    $directoryPath = Split-Path $FullPathName -Parent
    return $fileName, $directoryPath
}

$Directory = "\\PSFS03\MyDocs$\Abbojo\Insight Software"
Get-ChildItem $Directory -Recurse | Where-Object { $_.Extension -eq ".txt" } | ForEach-Object {
    $name, $path = Get-SplitFileName $_.FullName
    Copy-Item $_.FullName (Join-Path $path ($name + "_backup.txt"))
}
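To match the question more literally (back up key.shk as backup.shk in every mydocs folder), here is a minimal sketch; the share root below is a placeholder for your actual server path:

$root = "\\server\users" # hypothetical share root holding userA\mydocs, userB\mydocs, ...
Get-ChildItem $root -Recurse -Filter key.shk | ForEach-Object {
    # Put backup.shk next to each key.shk so it can be used for rollback
    Copy-Item $_.FullName (Join-Path $_.DirectoryName "backup.shk")
}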
