Recursion Logic - recursion

I've been working on a PowerShell script that will preserve directory structure when cabbing nested folders. I'm having a bit of difficulty getting the recursion logic right, though.
This is a rough draft, so there isn't any try/catch error handling yet. I've commented out the Remove-Item to prevent accidental runs.
The logic I worked out for this is as follows:
1. Check the base tree and get its subdirectories.
2. Go one level in and check again.
3. Continue until there are no more directories, then return one level up.
4. Cab the directory, remove the directory, and write a log for automated extraction (the file names of the subdirectory cabs).
5. Repeat the process one level up and continue until the base directory is reached.
function Chkfordir ($clevel)
{
    $dir = dir $clevel | ? { $_.PSIsContainer -eq $true } # Does the current level have folders?
    if ($dir -ne $null) # Yes
    {
        Chkfordir $dir # Go deeper
    }
    if ($dir -eq $null) # Deepest branch
    {
        return # Go back one level and begin cabbing
    }
    $dir | % {
        Compress-Directory $_.FullName (".\" + [string]$_.Name + ".cab")
        echo ($_.FullName + ".cab") >> .\cleaf.log
        #Remove-Item -Recurse $_.FullName
        return
    }
}
The function call Compress-Directory is from here.
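Since the linked code isn't reproduced here, the following is only a rough sketch of what such a helper might look like, wrapping makecab.exe with a generated directive file. It is an assumption for illustration, not the linked implementation.
# Sketch only - NOT the linked implementation. Builds a .ddf directive file
# for makecab.exe and cabs the directory's top-level files.
function Compress-Directory ($SourceDir, $CabPath) {
    $cabName = Split-Path $CabPath -Leaf
    $cabDir  = Split-Path $CabPath -Parent
    if (-not $cabDir) { $cabDir = "." }

    $ddf = Join-Path $env:TEMP "makecab.ddf"
    @(".Set CabinetNameTemplate=$cabName",
      ".Set DiskDirectory1=$cabDir",
      ".Set Cabinet=on",
      ".Set Compress=on") | Set-Content $ddf

    # List every (non-folder) file in the directory in the directive file
    Get-ChildItem $SourceDir | Where-Object { -not $_.PSIsContainer } |
        ForEach-Object { Add-Content $ddf ('"' + $_.FullName + '"') }

    makecab /F $ddf | Out-Null
}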
Edit: Will re-post code soon (08/18).
Edit (08/18): I finally had a chance to test it, and the logic seems to work now. There were some problems.
Most of the difficulty came from a PowerShell gotcha and the unnoticed problem that Compress-Directory is not path independent. It looks like I'll need to rewrite that function later to make it path independent.
The PowerShell gotcha was a type change for a value on the pipeline. Apparently, after returning from a function, directory items change from System.IO.FileInfo to System.IO.DirectoryInfo, which has differently named member functions.
Echo was replaced with Add-Content, as the redirection operators weren't working for this in PowerShell.
There were also some unaccounted-for states to contend with. A leaf directory with no files would cause Compress-Directory to either error out or complete silently without creating a file (thus not preserving the hierarchy).
The solution was to add an Add-Content for leaf folders before the return, and to move the Add-Content before Compress-Directory so there is at least one file in each directory.
I've included my current version below, but it is a work in progress.
function Chkfordir ($clevel)
{
    $dir = dir $clevel | ? { $_.PSIsContainer -eq $true } # Get folders
    if ($dir -eq $null) { # Check if deepest branch
        Add-Content (Join-Path $_.PSPath "\leaf.log") ([string]$_.FullName + ".cab")
        return $_ # Return one level up and cab
    }
    $dir | % { # For each sub, go deeper
        Chkfordir $_.FullName
        Add-Content (Join-Path $_.PSParentPath "\branch.log") ([string]$_.FullName + ".cab")
        Compress-Directory $_.FullName ([string]$_.Name + ".cab")
        #Remove-Item $_.FullName -recurse
    }
}

You need to recurse for each subdirectory and compress it after the recursive call returns:
function Chkfordir($clevel) {
    Get-ChildItem $clevel |
        Where-Object { $_.PSIsContainer } |
        ForEach-Object {
            Chkfordir $_
            Compress-Directory ...
            ...
        }
}
That way you automatically descend first, then create the archives as you return.
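For reference, a fuller (untested) sketch along those lines, assuming the same Compress-Directory signature used in the question and an illustrative branch.log name for the extraction log:
function Chkfordir($clevel) {
    Get-ChildItem $clevel |
        Where-Object { $_.PSIsContainer } |
        ForEach-Object {
            Chkfordir $_.FullName                                         # descend first
            $cab = Join-Path $_.Parent.FullName ($_.Name + ".cab")
            Add-Content (Join-Path $_.Parent.FullName "branch.log") $cab  # log for automated extraction
            Compress-Directory $_.FullName $cab                           # cab this folder on the way back up
            # Remove-Item $_.FullName -Recurse                            # still commented out, as in the question
        }
}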

Related

PowerShell script not working on server

I have a simple PowerShell script in which I try to write some log data.
The script that's not working:
Add-Content -Path C:\temp\logg.txt -Value $file.FullName
I placed this line in the middle of the script, and I never get any text in logg.txt. The script executes fine apart from this; it compresses some files into a zip file.
This is on a Windows server.
If, however, I run the same script on my local machine (Win7), it does work!?
So I think it must be something with the server, but I can't figure it out. I am using -ExecutionPolicy Bypass when I run the script.
EDIT: more info.
The script is called from an ASP.NET webpage:
Process.Start("Powershell.exe", "-ExecutionPolicy Bypass -Command ""& {" & destinationFolderRoot.Replace("\\", "\") & "\scriptParameters.ps1}""")
BaseZipScript:
function New-Zip
{
    param([string]$zipfilename)
    Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (dir $zipfilename).IsReadOnly = $false
}
function Add-Zip
{
    param([string]$zipfilename)
    if (-not (Test-Path($zipfilename)))
    {
        Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (dir $zipfilename).IsReadOnly = $false
    }
    $shellApplication = New-Object -COM Shell.Application
    $zipPackage = $shellApplication.NameSpace($zipfilename)
    foreach ($file in $input)
    {
        $zipPackage.CopyHere($file.FullName)
        Add-Content -Path "C:\temp\logg.txt" -Value $file.FullName
        Start-Sleep -Milliseconds 750
    }
}
scriptParameters.ps1:
. "C:\temp\multiDownload\baseZipScript.ps1"
New-Zip c:\temp\multiDownload\user379\2016-09-15_14.39\files_2016-09-15_14.39.zip
dir c:\temp\multiDownload\user379\2016-09-15_14.39\tempCopy\*.* -Recurse |
Add-Zip c:\temp\multiDownload\user379\2016-09-15_14.39\files_2016-09-15_14.39.zip
Start-sleep -milliseconds 250
exit
As I said earlier, the script works on my local machine and also on the server, i.e. the files are being zipped, except that on the server nothing is logged to logg.txt.
The logg.txt file does exist on the server, though, because the webpage also logs some info there and that is being written.
Update:
As the comments guessed, it was as simple as write access to the file. It is kind of weird, though, because all the files are created by the web service if they don't exist? Anyhow, it works now. Thx! : )
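For anyone hitting the same thing, here is a quick way to inspect and fix the permissions (the IIS app-pool identity name below is only an assumption; use whatever account actually runs the script):
# See who currently has rights on the log file
Get-Acl C:\temp\logg.txt | Format-List Owner, AccessToString

# Grant modify rights to the identity the web page runs under
# ("IIS APPPOOL\DefaultAppPool" is an assumed name - substitute the real one)
icacls C:\temp\logg.txt /grant "IIS APPPOOL\DefaultAppPool:(M)"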

Batch file to find a file, create a copy in the same location and run this on multiple directories

Here's what I am trying to do.
I have a few hundred users' My Documents folders, most (not all) of which have a file (key.shk, for the ShortKeys program).
I need to upgrade the software, but doing so makes changes to the original file.
I would like to run a batch file on the server to find the file in each My Docs folder and make a copy of it there called backup.shk.
I can then use this for rollback.
The folder structure looks like this
userA\mydocs
userB\mydocs
userC\mydocs
My tools are xcopy, robocopy, or PowerShell.
Thanks in advance.
This PowerShell script works... save it as a .ps1 file.
Function GET-SPLITFILENAME ($FullPathName) {
    $PIECES = $FullPathName.split("\")
    $NUMBEROFPIECES = $PIECES.Count
    $FILENAME = $PIECES[$NumberOfPieces - 1]
    $DIRECTORYPATH = $FullPathName.Trim($FILENAME)
    $FILENAME = [System.IO.Path]::GetFileNameWithoutExtension($FullPathName)
    return $FILENAME, $DIRECTORYPATH
}
$Directory = "\\PSFS03\MyDocs$\Abbojo\Insight Software"
Get-ChildItem $Directory -Recurse | where { $_.extension -eq ".txt" } | % {
    $details = GET-SPLITFILENAME($_.fullname)
    $name = $details[0]
    $path = $details[1]
    copy $_.fullname "$path${name}_backup.txt"
}

tcl deep recursive file search, search for files with *.c extension

Using an old answer to search for a file in Tcl:
https://stackoverflow.com/a/435094/984975
First, let's discuss what I am doing right now.
Using this function (credit to Jacson):
# findFiles
# basedir - the directory to start looking in
# pattern - A pattern, as defined by the glob command, that the files must match
proc findFiles { basedir pattern } {
    # Fix the directory name, this ensures the directory name is in the
    # native format for the platform and contains a final directory separator
    set basedir [string trimright [file join [file normalize $basedir] { }]]
    set fileList {}
    # Look in the current directory for matching files, -type {f r}
    # means only readable normal files are looked at, -nocomplain stops
    # an error being thrown if the returned list is empty
    foreach fileName [glob -nocomplain -type {f r} -path $basedir $pattern] {
        lappend fileList $fileName
    }
    # Now look for any sub directories in the current directory
    foreach dirName [glob -nocomplain -type {d r} -path $basedir *] {
        # Recursively call the routine on the sub directory and append any
        # new files to the results
        set subDirList [findFiles $dirName $pattern]
        if { [llength $subDirList] > 0 } {
            foreach subDirFile $subDirList {
                lappend fileList $subDirFile
            }
        }
    }
    return $fileList
}
And calling the following command:
findFiles some_dir_name *.c
The current result:
bad option "normalize": must be atime, attributes, channels, copy, delete, dirname, executable, exists, extension, isdirectory, isfile, join, lstat, mtime, mkdir, nativename, owned, pathtype, readable, readlink, rename, rootname, size, split, stat, tail, type, volumes, or writable
Now, if we run:
glob *.c
We get a lot of files, but all of them are in the current directory.
The goal is to get ALL the files in ALL sub-folders on the machine, with their paths.
Can anyone help?
What I really want to do is find the directory with the highest count of *.c files.
However, if I could list all the files and their paths, I could count how many files are in each directory and get the one with the highest count.
You are using an old version of Tcl. [file normalize] was introduced in Tcl 8.4 around 2002 or so. Upgrade already.
If you can't, then use glob, but call it once just for files and then walk the directories yourself. See the glob -types option.
Here's a demo:
proc on_visit {path} {
    puts $path
}

proc visit {base glob func} {
    foreach f [glob -nocomplain -types f -directory $base $glob] {
        if {[catch {eval $func [list [file join $base $f]]} err]} {
            puts stderr "error: $err"
        }
    }
    foreach d [glob -nocomplain -types d -directory $base *] {
        visit [file join $base $d] $glob $func
    }
}

proc main {base} {
    visit $base *.c [list on_visit]
}

main [lindex $argv 0]
I would use the ::fileutil::traverse function to do it.
Something like:
package require fileutil::traverse

proc check_path {path} {
    string equal [file extension $path] ".c"
}

set obj [::fileutil::traverse %AUTO% -filter check_path]
array set pathes {}
$obj foreach file {
    if {[info exists pathes([file dirname $file])]} {
        incr pathes([file dirname $file])
    } else {
        set pathes([file dirname $file]) 1
    }
}

# print pathes and find the biggest
foreach {name value} [array get pathes] {
    puts "$name : $value"
}
For quick (1 level) file pattern matching use:
glob **/*.c
If you want to recursively search, then use:
proc ::findFiles { baseDir pattern } {
    set dirs [glob -nocomplain -type d [file join $baseDir *]]
    set files {}
    foreach dir $dirs {
        lappend files {*}[findFiles $dir $pattern]
    }
    lappend files {*}[glob -nocomplain -type f [file join $baseDir $pattern]]
    return $files
}

puts [join [findFiles $basepath "*.tcl"] \n]

insert header to a file

I would like to hear your suggestions on how to insert the lines of a header (all the lines in one file) at the start of another, much bigger file (several GB). I prefer the Unix/awk/sed way of doing this job.
# The header I need to insert into the other file; the lines below are in a file named "header".
##fileformat=VCFv4.0
##fileDate=20090805
##source=myImputationProgramV3.1
##reference=1000GenomesPilot-NCBI36
##phasing=partial
##INFO=<ID=NS,Number=1,Type=Integer,Description="Number of Samples With Data">
##INFO=<ID=DP,Number=1,Type=Integer,Description="Total Depth">
##INFO=<ID=AF,Number=.,Type=Float,Description="Allele Frequency">
##INFO=<ID=AA,Number=1,Type=String,Description="Ancestral Allele">
##INFO=<ID=DB,Number=0,Type=Flag,Description="dbSNP membership, build 129">
##INFO=<ID=H2,Number=0,Type=Flag,Description="HapMap2 membership">
##FILTER=<ID=q10,Description="Quality below 10">
##FILTER=<ID=s50,Description="Less than 50% of samples have data">
##FORMAT=<ID=GT,Number=1,Type=String,Description="Genotype">
##FORMAT=<ID=GQ,Number=1,Type=Integer,Description="Genotype Quality">
##FORMAT=<ID=DP,Number=1,Type=Integer,Description="Read Depth">
##FORMAT=<ID=HQ,Number=2,Type=Integer,Description="Haplotype Quality">
#CHROM POS ID REF ALT QUAL FILTER INFO
header="/name/of/file/containing/header"
for file in "$@"
do
    cat "$header" "$file" > /tmp/xx.$$
    mv /tmp/xx.$$ "$file"
done
You might prefer to locate the temporary file on the same file system as the file you are editing, but anything that requires inserting data at the front of the file is going to end up working very close to this. If you are going to be doing this all day, every day, you might assemble something a little slicker, but the chances are the savings will be minuscule (fractions of a second per file).
If you really, really must use sed, then I suppose you could use:
header="/name/of/file/containing/header"
for file in "$@"
do
    sed -e "0r $header" "$file" > /tmp/xx.$$
    mv /tmp/xx.$$ "$file"
done
The command reads the content of header 'after' line 0 (before line 1), and then everything else is passed through unchanged. This isn't as swift as cat though.
An analogous construct using awk is:
header="/name/of/file/containing/header"
for file in "$@"
do
    awk '{print}' "$header" "$file" > /tmp/xx.$$
    mv /tmp/xx.$$ "$file"
done
This simply prints each input line on the output; again, not as swift as cat.
One more advantage of cat over sed or awk; cat will work even if the big files are mainly binary data (it is oblivious to the content of the files). Both sed and awk are designed to handle data split into lines; while modern versions will probably handle even binary data fairly well, it is not what they are designed for.
I did it all with a Perl script, because I had to traverse a directory tree and handle various file types differently. The basic script was
#!perl -w

process_directory(".");

sub process_directory {
    my $dir = shift;
    opendir DIR, $dir or die "$dir: not a directory\n";
    my @files = readdir DIR;
    closedir DIR;
    foreach (@files) {
        next if (/^\./ or /bin/ or /obj/); # ignore some directories
        if (-d "$dir/$_") {
            process_directory("$dir/$_");
        } else {
            fix_file("$dir/$_");
        }
    }
}

sub fix_file {
    my $file = shift;
    open SRC, $file or die "Can't open $file\n";
    my $fix = "$file-f";
    open FIX, ">$fix" or die "Can't open $fix\n";
    print FIX <<EOT;
-- Text to insert
EOT
    while (<SRC>) {
        print FIX;
    }
    close SRC;
    close FIX;
    my $oldFile = $file;
    $oldFile =~ s/(.*)\.(\w+)$/$1-old.$2/;
    if (rename $file, $oldFile) {
        rename $fix, $file;
    }
}
Share and enjoy! Or not -- I'm no Perl hacker, so this is probably double-plus-unoptimal Perl code. Still, it worked for me!

Equivalent of *Nix 'which' command in PowerShell?

How do I ask PowerShell where something is?
For instance, "which notepad", and it returns the directory that notepad.exe is run from, according to the current path.
The very first alias I made once I started customizing my profile in PowerShell was 'which'.
New-Alias which get-command
To add this to your profile, type this:
"`nNew-Alias which get-command" | add-content $profile
The `n at the start of the last line is to ensure it will start as a new line.
Here is an actual *nix equivalent, i.e. it gives *nix-style output.
Get-Command <your command> | Select-Object -ExpandProperty Definition
Just replace <your command> with whatever you're looking for.
PS C:\> Get-Command notepad.exe | Select-Object -ExpandProperty Definition
C:\Windows\system32\notepad.exe
When you add it to your profile, you will want to use a function rather than an alias because you can't use aliases with pipes:
function which($name)
{
    Get-Command $name | Select-Object -ExpandProperty Definition
}
Now, when you reload your profile you can do this:
PS C:\> which notepad
C:\Windows\system32\notepad.exe
I usually just type:
gcm notepad
or
gcm note*
gcm is the default alias for Get-Command.
On my system, gcm note* outputs:
[27] » gcm note*
CommandType Name Definition
----------- ---- ----------
Application notepad.exe C:\WINDOWS\notepad.exe
Application notepad.exe C:\WINDOWS\system32\notepad.exe
Application Notepad2.exe C:\Utils\Notepad2.exe
Application Notepad2.ini C:\Utils\Notepad2.ini
You get the directory and the command that matches what you're looking for.
Try this example:
(Get-Command notepad.exe).Path
My proposition for the Which function:
function which($cmd) { get-command $cmd | % { $_.Path } }
PS C:\> which devcon
C:\local\code\bin\devcon.exe
A quick-and-dirty match to Unix which is
New-Alias which where.exe
But it returns multiple lines if multiple matches exist, so then it becomes:
function which { where.exe $args | select -first 1 }
I like Get-Command | Format-List, or, shorter, using aliases for the two (shown here for powershell.exe):
gcm powershell | fl
You can find aliases like this:
alias -definition Format-List
Tab completion works with gcm.
To have tab list all options at once:
set-psreadlineoption -editmode emacs
This seems to do what you want (I found it on http://huddledmasses.org/powershell-find-path/):
Function Find-Path($Path, [switch]$All = $false, [Microsoft.PowerShell.Commands.TestPathType]$type = "Any")
{
    ## You could comment out the function stuff and use it as a script instead, with this line:
    #param($Path, [switch]$All = $false, [Microsoft.PowerShell.Commands.TestPathType]$type = "Any")
    if ($(Test-Path $Path -Type $type)) {
        return $path
    } else {
        [string[]]$paths = @($pwd);
        $paths += "$pwd;$env:path".split(";")
        $paths = Join-Path $paths $(Split-Path $Path -leaf) | ? { Test-Path $_ -Type $type }
        if ($paths.Length -gt 0) {
            if ($All) {
                return $paths;
            } else {
                return $paths[0]
            }
        }
    }
    throw "Couldn't find a matching path of type $type"
}
Set-Alias find Find-Path
Check this PowerShell Which.
The code provided there suggests this:
($Env:Path).Split(";") | Get-ChildItem -filter notepad.exe
Try the where command on Windows 2003 or later (or Windows 2000/XP if you've installed a Resource Kit).
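For example (call it as where.exe from PowerShell so it isn't shadowed by the where alias for Where-Object; output varies by machine):
PS C:\> where.exe notepad
C:\Windows\System32\notepad.exe
C:\Windows\notepad.exe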
BTW, this received more answers in other questions:
Is there an equivalent of 'which' on Windows?
PowerShell equivalent to Unix which command?
If you want a command that accepts input either from the pipeline or as a parameter, you should try this:
function which($name) {
    if ($name) { $input = $name }
    Get-Command $input | Select-Object -ExpandProperty Path
}
Copy-paste the function into your profile (notepad $profile).
Examples:
❯ echo clang.exe | which
C:\Program Files\LLVM\bin\clang.exe
❯ which clang.exe
C:\Program Files\LLVM\bin\clang.exe
I have this which advanced function in my PowerShell profile:
function which {
    <#
    .SYNOPSIS
    Identifies the source of a PowerShell command.
    .DESCRIPTION
    Identifies the source of a PowerShell command. External commands (Applications) are identified by the path to the executable
    (which must be in the system PATH); cmdlets and functions are identified as such and the name of the module they are defined in
    provided; aliases are expanded and the source of the alias definition is returned.
    .INPUTS
    No inputs; you cannot pipe data to this function.
    .OUTPUTS
    .PARAMETER Name
    The name of the command to be identified.
    .EXAMPLE
    PS C:\Users\Smith\Documents> which Get-Command
    Get-Command: Cmdlet in module Microsoft.PowerShell.Core
    (Identifies type and source of command)
    .EXAMPLE
    PS C:\Users\Smith\Documents> which notepad
    C:\WINDOWS\SYSTEM32\notepad.exe
    (Indicates the full path of the executable)
    #>
    param(
        [String]$name
    )
    $cmd = Get-Command $name
    $redirect = $null
    switch ($cmd.CommandType) {
        "Alias"          { "{0}: Alias for ({1})" -f $cmd.Name, (. { which $cmd.Definition }) }
        "Application"    { $cmd.Source }
        "Cmdlet"         { "{0}: {1} {2}" -f $cmd.Name, $cmd.CommandType, (. { if ($cmd.Source.Length) { "in module {0}" -f $cmd.Source } else { "from unspecified source" } }) }
        "Function"       { "{0}: {1} {2}" -f $cmd.Name, $cmd.CommandType, (. { if ($cmd.Source.Length) { "in module {0}" -f $cmd.Source } else { "from unspecified source" } }) }
        "Workflow"       { "{0}: {1} {2}" -f $cmd.Name, $cmd.CommandType, (. { if ($cmd.Source.Length) { "in module {0}" -f $cmd.Source } else { "from unspecified source" } }) }
        "ExternalScript" { $cmd.Source }
        default          { $cmd }
    }
}
Use:
function Which([string] $cmd) {
    $path = (($Env:Path).Split(";") | Select -uniq | Where { $_.Length } | Where { Test-Path $_ } | Get-ChildItem -filter $cmd).FullName
    if ($path) { $path.ToString() }
}

# Check if Chocolatey is installed
if (Which('cinst.bat')) {
    Write-Host "yes"
} else {
    Write-Host "no"
}
Or this version, calling the original where command.
This version also works better, because it is not limited to bat files:
function which([string] $cmd) {
    $where = iex $(Join-Path $env:SystemRoot "System32\where.exe $cmd 2>&1")
    $first = $($where -split '[\r\n]')
    if ($first.getType().BaseType.Name -eq 'Array') {
        $first = $first[0]
    }
    if (Test-Path $first) {
        $first
    }
}

# Check if Curl is installed
if (which('curl')) {
    echo 'yes'
} else {
    echo 'no'
}
You can install the which command from https://goprogram.co.uk/software/commands, along with all of the other UNIX commands.
If you have scoop you can install a direct clone of which:
scoop install which
which notepad
There is also always the option of using which itself. There are actually three ways to access which from Windows PowerShell.
The first (though not the best) is WSL (Windows Subsystem for Linux):
wsl -e which command
This requires installation of Windows Subsystem for Linux and a running distro.
Next is GnuWin32, which is a port of several GNU binaries in .exe format as standalone bundled launchers.
Third, install msys2 (a cross-compiler platform); if you go to where it is installed, you'll find in /usr/bin many GNU utilities that are more up to date. Most of them work as standalone .exe files and can be copied from the bin folder to somewhere on your home drive and added to your PATH.
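For example, copying one of those standalone executables into a personal bin folder and adding it to the user PATH could look like this (the msys2 install path and folder names are assumptions):
# Copy which.exe from an assumed msys2 install into a personal bin folder
New-Item -ItemType Directory -Path "$HOME\bin" -Force | Out-Null
Copy-Item "C:\msys64\usr\bin\which.exe" "$HOME\bin"

# Append that folder to the user-level PATH (new sessions will pick it up)
$userPath = [Environment]::GetEnvironmentVariable("Path", "User")
[Environment]::SetEnvironmentVariable("Path", "$userPath;$HOME\bin", "User")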
