Upload multiple files from a PowerShell script over HTTP

I have a web application that can process the POSTing of an HTML form like this:
<form action="x" method="post" enctype="multipart/form-data">
<input name="xfa" type="file">
<input name="pdf" type="file">
<input type="submit" value="Submit">
</form>
Note that there are two type="file" <input> elements.
How can I script POSTing this from PowerShell? I plan to use it to build a simple test framework for the service.
I found WebClient.UploadFile(), but that can only handle a single file.
Thank you for taking the time.

I've been crafting multipart HTTP POSTs with PowerShell today. I hope the code below is helpful to you.
PowerShell itself has no built-in support for multipart form uploads.
There are not many samples about it either. I built the code below based on this and this.
Granted, Invoke-RestMethod requires PowerShell 3.0, but the code in the latter of the above links shows how to do the HTTP POST with .NET directly, so you could get this running on Windows XP as well.
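(As an aside for anyone on newer PowerShell: from version 6.1 on, Invoke-RestMethod has a -Form parameter that builds the multipart body for you, so none of the hand-crafting below is needed there. A minimal sketch using the same parameter names as the function below:)
# PowerShell 6.1+ only: -Form builds the multipart/form-data body natively.
# A FileInfo value (from Get-Item) becomes a file part; a string becomes a plain field.
Invoke-RestMethod -Uri $ResultURL -Method Post -Form @{
    file     = Get-Item -Path $ResultFilePath
    computer = $env:COMPUTERNAME
}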
Good luck! Please tell if you got it to work.
function Send-Results {
param (
[parameter(Mandatory=$True,Position=1)] [ValidateScript({ Test-Path -PathType Leaf $_ })] [String] $ResultFilePath,
[parameter(Mandatory=$True,Position=2)] [System.URI] $ResultURL
)
$fileBin = [IO.File]::ReadAllBytes($ResultFilePath)
$computer= $env:COMPUTERNAME
# Convert byte-array to string (without changing anything)
#
$enc = [System.Text.Encoding]::GetEncoding("iso-8859-1")
$fileEnc = $enc.GetString($fileBin)
<#
# PowerShell does not (yet) have built-in support for making 'multipart' (i.e. binary file upload compatible)
# form uploads. So we have to craft one...
#
# This is doing similar to:
# $ curl -i -F "file=@file.any" -F "computer=MYPC" http://url
#
# Boundary is anything that is guaranteed not to exist in the sent data (i.e. string long enough)
#
# Note: The protocol is very precise about getting the number of line feeds correct (either CRLF or LF works).
#>
$boundary = [System.Guid]::NewGuid().ToString()
$LF = "`n"
$bodyLines = (
"--$boundary",
"Content-Disposition: form-data; name=`"file`"$LF", # filename= is optional
$fileEnc,
"--$boundary",
"Content-Disposition: form-data; name=`"computer`"$LF",
$computer,
"--$boundary--$LF"
) -join $LF
try {
# Returns the response gotten from the server (we pass it on).
#
Invoke-RestMethod -Uri $ResultURL -Method Post -ContentType "multipart/form-data; boundary=`"$boundary`"" -TimeoutSec 20 -Body $bodyLines
}
catch [System.Net.WebException] {
Write-Error( "FAILED to reach '$URL': $_" )
throw $_
}
}
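For the two file inputs in the question ("xfa" and "pdf"), the same string-built body just carries two file parts. A rough, untested sketch; the file paths and the target URL are placeholders:
# Sketch: two file parts ("xfa" and "pdf") in one hand-built multipart body.
# Assumes the files are small enough to hold in memory as ISO-8859-1 strings.
$enc = [System.Text.Encoding]::GetEncoding("iso-8859-1")
$xfaData = $enc.GetString([IO.File]::ReadAllBytes("C:\data\P7-T.xml"))
$pdfData = $enc.GetString([IO.File]::ReadAllBytes("C:\data\P7-T.pdf"))
$boundary = [System.Guid]::NewGuid().ToString()
$CRLF = "`r`n"
$body = (
    "--$boundary",
    "Content-Disposition: form-data; name=`"xfa`"; filename=`"P7-T.xml`"",
    "Content-Type: text/xml$CRLF",
    $xfaData,
    "--$boundary",
    "Content-Disposition: form-data; name=`"pdf`"; filename=`"P7-T.pdf`"",
    "Content-Type: application/pdf$CRLF",
    $pdfData,
    "--$boundary--$CRLF"
) -join $CRLF
Invoke-RestMethod -Uri "http://example.com/x" -Method Post -ContentType "multipart/form-data; boundary=`"$boundary`"" -Body $body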

This bothered me too, and I hadn't found a satisfactory solution. Although the approach proposed here can do the job, it is not efficient when transferring large files. I wrote a blog post proposing a solution based on the HttpClient class in .NET 4.5. If that dependency is not a problem for you, you can check my solution at the following address: http://blog.majcica.com/2016/01/13/powershell-tips-and-tricks-multipartform-data-requests/
EDIT:
function Invoke-MultipartFormDataUpload
{
[CmdletBinding()]
PARAM
(
[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$InFile,
[string]$ContentType,
[Uri][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Uri,
[System.Management.Automation.PSCredential]$Credential
)
BEGIN
{
if (-not (Test-Path $InFile))
{
$errorMessage = ("File {0} missing or unable to read." -f $InFile)
$exception = New-Object System.Exception $errorMessage
$errorRecord = New-Object System.Management.Automation.ErrorRecord $exception, 'MultipartFormDataUpload', ([System.Management.Automation.ErrorCategory]::InvalidArgument), $InFile
$PSCmdlet.ThrowTerminatingError($errorRecord)
}
if (-not $ContentType)
{
Add-Type -AssemblyName System.Web
$mimeType = [System.Web.MimeMapping]::GetMimeMapping($InFile)
if ($mimeType)
{
$ContentType = $mimeType
}
else
{
$ContentType = "application/octet-stream"
}
}
}
PROCESS
{
Add-Type -AssemblyName System.Net.Http
$httpClientHandler = New-Object System.Net.Http.HttpClientHandler
if ($Credential)
{
$networkCredential = New-Object System.Net.NetworkCredential @($Credential.UserName, $Credential.Password)
$httpClientHandler.Credentials = $networkCredential
}
$httpClient = New-Object System.Net.Http.Httpclient $httpClientHandler
$packageFileStream = New-Object System.IO.FileStream @($InFile, [System.IO.FileMode]::Open)
$contentDispositionHeaderValue = New-Object System.Net.Http.Headers.ContentDispositionHeaderValue "form-data"
$contentDispositionHeaderValue.Name = "fileData"
$contentDispositionHeaderValue.FileName = (Split-Path $InFile -leaf)
$streamContent = New-Object System.Net.Http.StreamContent $packageFileStream
$streamContent.Headers.ContentDisposition = $contentDispositionHeaderValue
$streamContent.Headers.ContentType = New-Object System.Net.Http.Headers.MediaTypeHeaderValue $ContentType
$content = New-Object System.Net.Http.MultipartFormDataContent
$content.Add($streamContent)
try
{
$response = $httpClient.PostAsync($Uri, $content).Result
if (!$response.IsSuccessStatusCode)
{
$responseBody = $response.Content.ReadAsStringAsync().Result
$errorMessage = "Status code {0}. Reason {1}. Server reported the following message: {2}." -f $response.StatusCode, $response.ReasonPhrase, $responseBody
throw [System.Net.Http.HttpRequestException] $errorMessage
}
$responseBody = [xml]$response.Content.ReadAsStringAsync().Result
return $responseBody
}
catch [Exception]
{
$PSCmdlet.ThrowTerminatingError($_)
}
finally
{
if($null -ne $httpClient)
{
$httpClient.Dispose()
}
if($null -ne $response)
{
$response.Dispose()
}
}
}
END { }
}
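A call might look like this (the file path and URL are placeholders; the function returns the response body parsed as XML):
# Hypothetical usage; adjust the path, URL, and credentials to your service.
$creds = Get-Credential
$result = Invoke-MultipartFormDataUpload -InFile "C:\temp\P7-T.pdf" -Uri "http://localhost:12345/home/upload" -Credential $creds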
Cheers

I have found a solution to my problem after studying how multipart/form-data is built. A lot of help came in the form of http://www.paraesthesia.com/archive/2009/12/16/posting-multipartform-data-using-.net-webrequest.aspx.
The solution then is to build the body of the request up manually according to that convention. I have left out niceties like correct Content-Length handling, etc.
Here is an excerpt of what I am using now:
$path = "/Some/path/to/data/"
$boundary_id = Get-Date -Format yyyyMMddhhmmssfffffff
$boundary = "------------------------------" + $boundary_id
$url = "http://..."
[System.Net.HttpWebRequest] $req = [System.Net.WebRequest]::create($url)
$req.Method = "POST"
$req.ContentType = "multipart/form-data; boundary=$boundary"
$ContentLength = 0
$req.TimeOut = 50000
$reqst = $req.getRequestStream()
<#
Any time you write a file to the request stream (for upload), you'll write:
Two dashes.
Your boundary.
One CRLF (\r\n).
A content-disposition header that tells the name of the form field corresponding to the file and the name of the file. That looks like:
Content-Disposition: form-data; name="yourformfieldname"; filename="somefile.jpg"
One CRLF.
A content-type header that says what the MIME type of the file is. That looks like:
Content-Type: image/jpg
Two CRLFs.
The entire contents of the file, byte for byte. It's OK to include binary content here. Don't base-64 encode it or anything, just stream it on in.
One CRLF.
#>
<# Upload #1: XFA #>
$xfabuffer = [System.IO.File]::ReadAllBytes("$path\P7-T.xml")
<# part-header #>
$header = "--$boundary`r`nContent-Disposition: form-data; name=`"xfa`"; filename=`"xfa`"`r`nContent-Type: text/xml`r`n`r`n"
$buffer = [Text.Encoding]::ascii.getbytes($header)
$reqst.write($buffer, 0, $buffer.length)
$ContentLength = $ContentLength + $buffer.length
<# part-data #>
$reqst.write($xfabuffer, 0, $xfabuffer.length)
$ContentLength = $ContentLength + $xfabuffer.length
<# part-separator "One CRLF" #>
$terminal = "`r`n"
$buffer = [Text.Encoding]::ascii.getbytes($terminal)
$reqst.write($buffer, 0, $buffer.length)
$ContentLength = $ContentLength + $buffer.length
<# Upload #2: PDF template #>
$pdfbuffer = [System.IO.File]::ReadAllBytes("$path\P7-T.pdf")
<# part-header #>
$header = "--$boundary`r`nContent-Disposition: form-data; name=`"pdf`"; filename=`"pdf`"`r`nContent-Type: application/pdf`r`n`r`n"
$buffer = [Text.Encoding]::ascii.getbytes($header)
$reqst.write($buffer, 0, $buffer.length)
$ContentLength = $ContentLength + $buffer.length
<# part-data #>
$reqst.write($pdfbuffer, 0, $pdfbuffer.length)
$ContentLength = $ContentLength + $pdfbuffer.length
<# part-separator "One CRLF" #>
$terminal = "`r`n"
$buffer = [Text.Encoding]::ascii.getbytes($terminal)
$reqst.write($buffer, 0, $buffer.length)
$ContentLength = $ContentLength + $buffer.length
<#
At the end of your request, after writing all of your fields and files to the request, you'll write:
Two dashes.
Your boundary.
Two more dashes.
#>
$terminal = "--$boundary--"
$buffer = [Text.Encoding]::ascii.getbytes($terminal)
$reqst.write($buffer, 0, $buffer.length)
$ContentLength = $ContentLength + $buffer.length
$reqst.flush()
$reqst.close()
# Dump request to console
#$req
[net.httpWebResponse] $res = $req.getResponse()
# Dump result to console
#$res
# Dump result-body to filesystem
<#
$resst = $res.getResponseStream()
$sr = New-Object IO.StreamReader($resst)
$result = $sr.ReadToEnd()
$res.close()
#>
$null = New-Item -ItemType Directory -Force -Path "$path\result"
$target = "$path\result\P7-T.pdf"
# Create a stream to write to the file system.
$targetfile = [System.IO.File]::Create($target)
# Create the buffer for copying data.
$buffer = New-Object Byte[] 1024
# Get a reference to the response stream (System.IO.Stream).
$resst = $res.GetResponseStream()
# In an iteration...
Do {
# ...attempt to read one kilobyte of data from the web response stream.
$read = $resst.Read($buffer, 0, $buffer.Length)
# Write the just-read bytes to the target file.
$targetfile.Write($buffer, 0, $read)
# Iterate while there's still data on the web response stream.
} While ($read -gt 0)
# Close the stream.
$resst.Close()
$resst.Dispose()
# Flush and close the writer.
$targetfile.Flush()
$targetfile.Close()
$targetfile.Dispose()
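Looking back, the three repeated write steps above (part header, raw file bytes, trailing CRLF) could be factored into a small helper. A sketch along those lines; the function name is made up and not part of the script above:
# Hypothetical helper: writes one file part (header + bytes + CRLF) to the
# request stream and returns the number of bytes written.
function Write-FilePart {
    param (
        [System.IO.Stream] $Stream,   # the stream from $req.GetRequestStream()
        [string] $Boundary,
        [string] $Name,               # form field name, e.g. "xfa" or "pdf"
        [string] $FileName,
        [string] $ContentType,        # e.g. "text/xml" or "application/pdf"
        [byte[]] $Bytes
    )
    $header = "--$Boundary`r`nContent-Disposition: form-data; name=`"$Name`"; filename=`"$FileName`"`r`nContent-Type: $ContentType`r`n`r`n"
    $headerBytes = [Text.Encoding]::ASCII.GetBytes($header)
    $crlf = [Text.Encoding]::ASCII.GetBytes("`r`n")
    $Stream.Write($headerBytes, 0, $headerBytes.Length)
    $Stream.Write($Bytes, 0, $Bytes.Length)
    $Stream.Write($crlf, 0, $crlf.Length)
    return $headerBytes.Length + $Bytes.Length + $crlf.Length
}
# The two upload sections above would then shrink to:
# $ContentLength += Write-FilePart $reqst $boundary "xfa" "xfa" "text/xml" $xfabuffer
# $ContentLength += Write-FilePart $reqst $boundary "pdf" "pdf" "application/pdf" $pdfbuffer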

I've remixed @akauppi's answer into a more generic solution, a cmdlet that:
Can take pipeline input from Get-ChildItem for files to upload
Takes a URL as a positional parameter
Takes a dictionary as a positional parameter, which it sends as additional form data
Takes an (optional) -Credential parameter
Takes an (optional) -FilesKey parameter to specify the formdata key for the files upload part
Supports -WhatIf
Has -Verbose logging
Exits with an error if something goes wrong
It can be called like this:
$url ="http://localhost:12345/home/upload"
$form = @{ description = "Test 123." }
$pwd = ConvertTo-SecureString "s3cr3t" -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ("john", $pwd)
Get-ChildItem *.txt | Send-MultiPartFormToApi $url $form $creds -Verbose -WhatIf
Here's the code to the full cmdlet:
function Send-MultiPartFormToApi {
# Attribution: [@akauppi's post](https://stackoverflow.com/a/25083745/419956)
# Remixed in: [@jeroen's post](https://stackoverflow.com/a/41343705/419956)
[CmdletBinding(SupportsShouldProcess = $true)]
param (
[Parameter(Position = 0)]
[string]
$Uri,
[Parameter(Position = 1)]
[HashTable]
$FormEntries,
[Parameter(Position = 2, Mandatory = $false)]
[System.Management.Automation.Credential()]
[System.Management.Automation.PSCredential]
$Credential,
[Parameter(
ParameterSetName = "FilePath",
Mandatory = $true,
ValueFromPipeline = $true,
ValueFromPipelineByPropertyName = $true
)]
[Alias("Path")]
[string[]]
$FilePath,
[Parameter()]
[string]
$FilesKey = "files"
);
begin {
$LF = "`n"
$boundary = [System.Guid]::NewGuid().ToString()
Write-Verbose "Setting up body with boundary $boundary"
$bodyArray = @()
foreach ($key in $FormEntries.Keys) {
$bodyArray += "--$boundary"
$bodyArray += "Content-Disposition: form-data; name=`"$key`""
$bodyArray += ""
$bodyArray += $FormEntries.Item($key)
}
Write-Verbose "------ Composed multipart form (excl files) -----"
Write-Verbose ""
foreach($x in $bodyArray) { Write-Verbose "> $x"; }
Write-Verbose ""
Write-Verbose "------ ------------------------------------ -----"
$i = 0
}
process {
$fileName = (Split-Path -Path $FilePath -Leaf)
Write-Verbose "Processing $fileName"
$fileBytes = [IO.File]::ReadAllBytes($FilePath)
$fileDataAsString = ([System.Text.Encoding]::GetEncoding("iso-8859-1")).GetString($fileBytes)
$bodyArray += "--$boundary"
$bodyArray += "Content-Disposition: form-data; name=`"$FilesKey[$i]`"; filename=`"$fileName`""
$bodyArray += "Content-Type: application/x-msdownload"
$bodyArray += ""
$bodyArray += $fileDataAsString
$i += 1
}
end {
Write-Verbose "Finalizing and invoking rest method after adding $i file(s)."
if ($i -eq 0) { throw "No files were provided from pipeline." }
$bodyArray += "--$boundary--"
$bodyLines = $bodyArray -join $LF
# $bodyLines | Out-File data.txt # Uncomment for extra debugging...
try {
if (!$WhatIfPreference) {
Invoke-RestMethod `
-Uri $Uri `
-Method Post `
-ContentType "multipart/form-data; boundary=`"$boundary`"" `
-Credential $Credential `
-Body $bodyLines
} else {
Write-Host "WHAT IF: Would've posted to $Uri body of length " + $bodyLines.Length
}
} catch [Exception] {
throw $_ # Terminate CmdLet on this situation.
}
Write-Verbose "Finished!"
}
}

Related

Invoke-WebRequest ASP.NET Error

I have a script which I use to load a webpage, retrieve information from said webpage, then output said information to a file. It had been working perfectly until today, when I started getting an error which reads:
invoke-webrequest : Response object error 'ASP 0251 : 80004005'
Response Buffer Limit Exceeded
/foo/Reports/SearchLocation.asp, line 0
Execution of the ASP page caused the Response Buffer to exceed its configured limit.
At C:\path.ps1:7 char:12
+ $url = invoke-webrequest "http://url/ ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-WebRequest], WebException
+ FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand
I do not believe there have been any changes to the site which it is pulling its data from, and the file it is getting the input information from has no formatting errors.
Some googling around leads me to believe that the issue is that the page has more than 4 MB worth of data to load, and the default buffer size is 4 MB, but I can't find any instructions for how to change the buffer size from PowerShell.
I came across the Clear-WebConfiguration cmdlet, but I'm not certain whether or not that is what I need, or how exactly to implement it within my script. Here is the main portion of my code:
foreach($c in $csv){
[array]$tags = $null
$url = invoke-webrequest "http://url.com" -UseDefaultCredentials
$table = $url.ParsedHTML.getElementsByTagName('table')[7]
$rows = $table.getElementsByTagName('tr')
if($c.'Was this user on the old file?' -eq 'Yes'){
if($table.innerText -like "*N/A*" ){
$man = $c.'Manager Name' -replace ',',';'
$badusers += $c.'User ID' + "," + $c.Name + "," + $man + "," + $c.'CC'
}
else{
foreach($row in $rows){
$bcol = $row.getElementsByTagName('td') | Where-Object{$_.cellIndex -eq 1} | select -First 1
$ccol = $row.getElementsByTagName('td') | Where-Object{$_.cellIndex -eq 7} | select -First 1
$bcol = $bcol.innerText
$ccol = $ccol.innerText
if($ccol -ne $c.'CC'){
$tags += $bcol + ",," + $c.'CC' + "," + $c.'User ID'
}
}
if($tags -ne $null){
$results += $tags
}
}
}
}
Any help on solving this issue is much appreciated.
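For what it's worth, the 4 MB limit appears to live on the IIS server hosting the .asp page rather than in the client script, so nothing in the PowerShell above can raise it; Clear-WebConfiguration removes settings rather than changing them. If you (or an admin) can touch the server, a sketch along these lines raises the classic-ASP buffering limit; the site name and the 64 MB value are assumptions:
# Run on the IIS server, not in the client script; requires admin rights.
# "Default Web Site" is a placeholder for the site hosting SearchLocation.asp.
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' -Location 'Default Web Site' -Filter 'system.webServer/asp/limits' -Name 'bufferingLimit' -Value 67108864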

Running a PowerShell script on an ASP.NET site

I am trying to run a PowerShell script and have its output appear on my ASP.NET site. I have made it work with a very simple script where the only command in the script was
Get-Service | Out-String
and this output everything I expected onto my site,
but when I use the script I actually want info from, it doesn't output anything.
I can tell it runs (or tries to run) because when my site hits the code that invokes the script, it hangs for about 10 seconds.
The script I am trying to run is
$user = "user"
$token = "token"
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
$result = Invoke-WebRequest -Method Get -Uri 'https://site.vsrm.visualstudio.com/defaultcollection/product/_apis/release/releases?definitionId=1&api-version=3.0-preview.2&$expand=environments' -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
$releaseArr = $result.Content | ConvertFrom-Json
[System.Collections.ArrayList]$enviromentName = @()
[System.Collections.ArrayList]$latestRelease = @()
foreach($env in $releaseArr.value[0].environments)
{
$enviromentName.Add($env.name) | Out-Null
}
foreach($releaseValue in $releaseArr.value)
{
For($i = 0; $i -lt $enviromentName.Count; $i++)
{
if($latestRelease[$i] -eq $null)
{
foreach($release in $releaseValue.environments)
{
if($release.name -eq $enviromentName[$i] -and $release.status -eq "succeeded")
{
$latestRelease.Add($releaseValue.name) | Out-Null
}
}
}
}
}
For($i = 0; $i -lt $enviromentName.Count; $i++)
{
Write-Host $enviromentName[$i] " : " $latestRelease[$i]
}
I know this script runs and produces output, but is there some code in this script that would cause it to not output properly?
The code in my asp.net site I am using to call the script is
ResultBox.Text = string.Empty;
// Initialize PowerShell engine
var shell = PowerShell.Create();
// Add the script to the PowerShell object
shell.Commands.AddScript(@"C:\Users\user\Desktop\script.ps1");
// Execute the script
var results = shell.Invoke();
// display results, with BaseObject converted to string
// Note : use |out-string for console-like output
if (results.Count > 0)
{
// We use a string builder to create our result text
var builder = new StringBuilder();
foreach (var psObject in results)
{
// Convert the Base Object to a string and append it to the string builder.
// Add \r\n for line breaks
builder.Append(psObject.BaseObject.ToString() + "\r\n");
}
// Encode the string in HTML (prevents security issues with 'dangerous' characters like < >)
ResultBox.Text = Server.HtmlEncode(builder.ToString());
}
Change "Write-Host" to "Write-Output." Write-Host only outputs to interactive consoles.
You can see this in action:
Make a new PowerShell file and add a write-host statement to it:
[nick@nick-lt temp]$ New-Item -Type File -Path .\example.ps1 -Force
[nick@nick-lt temp]$ Set-Content .\example.ps1 "Write-Host 'Hello World'"
Then try and set a variable to the result of the script:
[nick@nick-lt temp]$ $what = .\example.ps1
Hello World
[nick@nick-lt temp]$ $what
[nick@nick-lt temp]$
Hello World shows up when the script executes but the variable is empty.
Now change it to write-output:
[nick@nick-lt temp]$ Set-Content .\example.ps1 "Write-Output 'Hello World'"
[nick@nick-lt temp]$ $what = .\example.ps1
[nick@nick-lt temp]$ $what
Hello World
The variable actually contains what it is supposed to now.
One of the cardinal rules of PowerShell is not to use Write-Host except in scripts that will be run interactively. .NET needs the results on the output stream, not the host stream.
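Applied to the script in the question, the final loop would then become something like this:
# Emit to the output stream (which shell.Invoke() captures) instead of the host.
For ($i = 0; $i -lt $enviromentName.Count; $i++)
{
    Write-Output ("{0} : {1}" -f $enviromentName[$i], $latestRelease[$i])
}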

Write to text file but escape special characters

I'm trying to use R to write a Perl script to a text file. I just cannot figure out how to escape certain characters.
I've tried backslashes (single and double), square brackets, "\\Q...\\E", etc., but still can't make it work.
Any assistance would be appreciated. Thanks in advance!
taskFilename = "example.txt"
cat("
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempfile);
my @imagedir_roots = ("/Users/Ross/Desktop/images");
my $parallel = 8;
my $exiftool_command = 'exiftool -all= -tagsfromfile @ -all:all --gps:all --xmp:geotag -unsafe -icc_profile -overwrite_original';
# Create the (temporary) -@ files
my @atfiles;
my @atfilenames;
for (my $i = 0; $i < $parallel; ++$i) {
my ($fh, $filename) = tempfile(UNLINK => 1);
push @atfiles, $fh;
push @atfilenames, $filename;
}
# Gather all JPG image files and distribute them over the -@ files
my $nr = 0;
find(sub { print { $atfiles[$nr++ % $parallel] } "$File::Find::name\n" if (-f && /\.(?:jpg|jpeg)/i); }, @imagedir_roots);
# Process all images in parallel
printf("Processing %d JPG files...\n", $nr);
for (my $i = 0; $i < $parallel; ++$i) {
close($atfiles[$i]);
my $pid = fork();
if (!$pid) {
# Run exiftool in the background
system qq{$exiftool_command -@ \"$atfilenames[$i]\"};
last;
}
}
# Wait for processes to finish
while (wait() != -1) {}
", fill = TRUE, file = taskFilename
)
I also played around with this once. If I remember correctly:
double quotes need to be escaped with \ (i.e. write \" to get ")
if you want to write a literal \ you also need to escape it with \ (i.e. \\)
if you want to write \" (backslash plus quote) you need \\\"
taskFilename = "example.txt"
cat("
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempfile);
my #imagedir_roots = (\"/Users/Ross/Desktop/images\");
my $parallel = 8;
my $exiftool_command = 'exiftool -all= -tagsfromfile # -all:all --gps:all --xmp:geotag -unsafe -icc_profile -overwrite_original';
# Create the (temporary) -# files
my #atfiles;
my #atfilenames;
for (my $i = 0; $i < $parallel; ++$i) {
my ($fh, $filename) = tempfile(UNLINK => 1);
push #atfiles, $fh;
push #atfilenames, $filename;
}
# Gather all JPG image files and distribute them over the -# files
my $nr = 0;
find(sub { print { $atfiles[$nr++ % $parallel] } \"$File::Find::name\n\" if (-f && /\\.(?:jpg|jpeg)/i); }, #imagedir_roots);
# Process all images in parallel
printf(\"Processing %d JPG files...\n\", $nr);
for (my $i = 0; $i < $parallel; ++$i) {
close($atfiles[$i]);
my $pid = fork();
if (!$pid) {
# Run exiftool in the background
system qq{$exiftool_command -@ \\\"$atfilenames[$i]\\\"};
last;
}
}
# Wait for processes to finish
while (wait() != -1) {}
", fill = TRUE, file = taskFilename
)

Downloading files from the Internet using PowerShell, with progress

I have been working on a PowerShell script that uses a .txt file to download multiple files from tinyurls. I have been successful in using Jobs to make this happen simultaneously, thanks to those on this forum.
The project requires some pretty large files to be downloaded, and the current method has no progress indicator. I figured some users might think the program had died. I'm looking for a way to give a status of where it is in the download. Here is what I came up with, but I'm lost as to how to pipe this information back out to the console. Any suggestions?
#Checks to see if NT-Download folder is on the Desktop, if not found, creates it
$DOCDIR = [Environment]::GetFolderPath("Desktop")
$TARGETDIR = "$DOCDIR\NT-Download"
if(!(Test-Path -Path $TARGETDIR )){
New-Item -ItemType directory -Path $TARGETDIR
}
$filepaths = Resolve-Path "files.txt"
Get-Content "$filepaths" | Foreach {
Start-Job {
function Save-TinyUrlFile
{
PARAM (
$TinyUrl,
$DestinationFolder
)
$response = Invoke-WebRequest -Uri $TinyUrl
$filename = [System.IO.Path]::GetFileName($response.BaseResponse.ResponseUri.OriginalString)
$filepath = [System.IO.Path]::Combine($DestinationFolder, $filename)
$totalLength = [System.Math]::Floor($response.get_ContentLength()/1024)
$responseStream = $response.GetResponseStream()
$buffer = new-object byte[] 10KB
$count = $responseStream.Read($buffer,0,$buffer.length)
$downloadedBytes = $count
try
{
$filestream = [System.IO.File]::Create($filepath)
$response.RawContentStream.WriteTo($filestream)
$filestream.Close()
while ($count -gt 0)
{
[System.Console]::CursorLeft = 0
[System.Console]::Write("Downloaded {0}K of {1}K", [System.Math]::Floor($downloadedBytes/1024), $totalLength)
$targetStream.Write($buffer, 0, $count)
$count = $responseStream.Read($buffer,0,$buffer.length)
$downloadedBytes = $downloadedBytes + $count
}
"`nFinished Download"
$targetStream.Flush()
$targetStream.Close()
$targetStream.Dispose()
$responseStream.Dispose()
}
finally
{
if ($filestream)
{
$filestream.Dispose();
}
}
}
Save-TinyUrlFile -TinyUrl $args[0] -DestinationFolder $args[1]
} -ArgumentList $_, "$TARGETDIR"
}
Have a look at Write-Progress
PS C:\> for ($i = 1; $i -le 100; $i++ )
{write-progress -activity "Search in Progress" -status "$i% Complete:" -percentcomplete $i;}
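Applied to the read loop in your Save-TinyUrlFile function, that could look roughly like the sketch below. It writes to $filestream (the stream your code actually creates) rather than the undefined $targetStream, and it assumes the server reported a Content-Length. Keep in mind that progress raised inside Start-Job is not rendered live on the parent console; it is collected in the job's Progress stream.
# Sketch only: progress reporting inside the byte-copy loop.
# $responseStream, $buffer, $count, $downloadedBytes, $totalLength, $filename
# and $filestream are the variables already set up in Save-TinyUrlFile.
while ($count -gt 0)
{
    $filestream.Write($buffer, 0, $count)
    $count = $responseStream.Read($buffer, 0, $buffer.Length)
    $downloadedBytes += $count
    if ($totalLength -gt 0)
    {
        Write-Progress -Activity "Downloading $filename" `
            -Status ("{0}K of {1}K" -f [System.Math]::Floor($downloadedBytes / 1024), $totalLength) `
            -PercentComplete ([System.Math]::Min(100, ($downloadedBytes / 1024) / $totalLength * 100))
    }
}
Write-Progress -Activity "Downloading $filename" -Completed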
A far simpler way: rely on BITS:
Start-BitsTransfer -Source $tinyUrl
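With the destination folder your script already creates, that could be as simple as the line below. BITS renders its own progress bar and can resume interrupted transfers; the target file name here is a guess, since a tinyurl does not reveal the real file name up front.
# Hedged example: the file name is a placeholder; resolve the tinyurl first if you want to keep the original name.
Start-BitsTransfer -Source $tinyUrl -Destination "$TARGETDIR\download.bin" -DisplayName "NT-Download"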

How can I perform HTTP PUT uploads to a VMware ESX Server in PowerShell?

VMware ESX, ESXi, and VirtualCenter are supposed to be able to support HTTP PUT uploads since version 3.5. I know how to do downloads, that's easy. I've never done PUT before.
Background information on the topic is here: http://communities.vmware.com/thread/117504
You should have a look at the Send-PoshCode function in the PoshCode cmdlets script module ... it uses a POST, not a PUT, but the technique is practically identical. I don't have a PUT server handy to test against, but basically, set your $url and your $data, and do something like:
param($url,$data,$filename,[switch]$quiet)
$request = [System.Net.WebRequest]::Create($url)
$data = [Text.Encoding]::UTF8.GetBytes( $data )
## Be careful to set your content type appropriately...
## This is what you're going to SEND THEM
$request.ContentType = 'text/xml;charset="utf-8"' # "application/json"; # "application/x-www-form-urlencoded";
## This is what you expect back
$request.Accept = "text/xml" # "application/json";
$request.ContentLength = $data.Length
$request.Method = "PUT"
## If you need Credentials ...
# $request.Credentials = (Get-Credential).GetNetworkCredential()
$put = new-object IO.StreamWriter $request.GetRequestStream()
$put.Write($data,0,$data.Length)
$put.Flush()
$put.Close()
## This is the "simple" way ...
# $reader = new-object IO.StreamReader $request.GetResponse().GetResponseStream() ##,[Text.Encoding]::UTF8
# write-output $reader.ReadToEnd()
# $reader.Close()
## But there's code in PoshCode.psm1 for doing a progress bar, something like ....
$res = $request.GetResponse();
if($res.StatusCode -eq 200) {
[int]$goal = $res.ContentLength
$reader = $res.GetResponseStream()
if($fileName) {
$writer = new-object System.IO.FileStream $fileName, "Create"
}
[byte[]]$buffer = new-object byte[] 4096
[int]$total = [int]$count = 0
$output = ""
$encoding = [System.Text.Encoding]::UTF8   # used to accumulate $output when no $fileName is given
do
{
$count = $reader.Read($buffer, 0, $buffer.Length);
if($fileName) {
$writer.Write($buffer, 0, $count);
} else {
$output += $encoding.GetString($buffer,0,$count)
}
if(!$quiet) {
$total += $count
if($goal -gt 0) {
Write-Progress "Downloading $url" "Saving $total of $goal" -id 0 -percentComplete (($total/$goal)*100)
} else {
Write-Progress "Downloading $url" "Saving $total bytes..." -id 0
}
}
} while ($count -gt 0)
$reader.Close()
if($fileName) {
$writer.Flush()
$writer.Close()
} else {
$output
}
}
$res.Close();
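If all you need is a raw PUT of a file body (which, as I understand it, is what the ESX datastore upload URL expects), WebClient can also do it in a couple of lines. A minimal sketch; the URL and local path are placeholders, and the datastore-style query string is only meant to illustrate the general shape of such a URL:
# Minimal PUT sketch using WebClient; adjust the URL, credentials, and path for your host.
$wc = New-Object System.Net.WebClient
# $wc.Credentials = (Get-Credential).GetNetworkCredential()   # if the server needs auth
$null = $wc.UploadFile("https://esx-host/folder/path/to/file.iso?dcPath=ha-datacenter&dsName=datastore1", "PUT", "C:\local\file.iso")
$wc.Dispose()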
In the VI Toolkit Extensions use Copy-TkeDatastoreFile. It will work with binaries.
