PowerShell Scheduled Task Open Web Page to Execute Script - asp.net

In an effort to set up a cron-like scheduled task on Windows, I've written a PowerShell script using code recommended in a previous Stack Overflow question.
I have backups that I need to clean up daily by deleting old backup files, so I created an ASP.NET page to perform this task. The file is named BackupCleanup.aspx, and I have confirmed that it works when executed on its own by visiting the URL directly. I cannot, however, get it to execute using the PowerShell script below.
The Powershell Script code I'm using is:
# Request the page; the response body itself is not needed
$request = [System.Net.WebRequest]::Create("http://127.0.0.1/BackupCleanup.aspx")
$response = $request.GetResponse()
$response.Close()
I have saved this file with a .ps1 extension and it shows up properly in my OS (Windows Server 2008). I have tried executing it manually by right-clicking and choosing "Run with PowerShell", and I have also scheduled it as a task - both to no avail.
I cannot figure out why the script does not work - any help would be GREATLY appreciated.

Here is the PowerShell script I use to call up web pages using IE. Hopefully it will work for you as well.
Function NavigateTo([string] $url, [int] $delayTime = 100)
{
    Write-Verbose "Navigating to $url"
    $global:ie.Navigate($url)
    WaitForPage $delayTime
}

Function WaitForPage([int] $delayTime = 100)
{
    $loaded = $false
    while ($loaded -eq $false) {
        [System.Threading.Thread]::Sleep($delayTime)
        # If the browser is not busy, the page is loaded
        if (-not $global:ie.Busy)
        {
            $loaded = $true
        }
    }
    $global:doc = $global:ie.Document
}

Function SetElementValueByName($name, $value, [int] $position = 0) {
    if ($global:doc -eq $null) {
        Write-Error "Document is null"
        break
    }
    $elements = @($global:doc.getElementsByName($name))
    if ($elements.Count -ne 0) {
        $elements[$position].Value = $value
    }
    else {
        Write-Warning "Couldn't find any element with name ""$name"""
    }
}

Function ClickElementById($id)
{
    $element = $global:doc.getElementById($id)
    if ($element -ne $null) {
        $element.Click()
        WaitForPage
    }
    else {
        Write-Error "Couldn't find element with id ""$id"""
        break
    }
}

Function ClickElementByName($name, [int] $position = 0)
{
    if ($global:doc -eq $null) {
        Write-Error "Document is null"
        break
    }
    $elements = @($global:doc.getElementsByName($name))
    if ($elements.Count -ne 0) {
        $elements[$position].Click()
        WaitForPage
    }
    else {
        Write-Error "Couldn't find element with name ""$name"" at position ""$position"""
        break
    }
}

Function ClickElementByTagName($name, [int] $position = 0)
{
    if ($global:doc -eq $null) {
        Write-Error "Document is null"
        break
    }
    $elements = @($global:doc.getElementsByTagName($name))
    if ($elements.Count -ne 0) {
        $elements[$position].Click()
        WaitForPage
    }
    else {
        Write-Error "Couldn't find element with tag name ""$name"" at position ""$position"""
        break
    }
}

# Entry point
# Set up a reference to IE
$global:ie = New-Object -ComObject "InternetExplorer.Application"
$global:ie.Navigate("about:blank")
$global:ie.Visible = $true

# Call the page
NavigateTo "http://127.0.0.1/BackupCleanup.aspx"

# Release resources
$global:ie.Quit()
$global:ie = $null
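That said, if all you need is to hit the page (no form filling or clicking), driving IE may be overkill. A minimal sketch without COM, assuming PowerShell 3.0 or later is available:
# Fire the request and discard the body; -UseBasicParsing avoids the IE dependency
Invoke-WebRequest -Uri "http://127.0.0.1/BackupCleanup.aspx" -UseBasicParsing | Out-Null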

I had the same issue. I manually opened PowerShell and executed my script, and I received: "WebPage.ps1 cannot be loaded because running scripts is disabled on this system."
You have to allow scripts to run. Execute the following in PowerShell:
Set-ExecutionPolicy RemoteSigned -Scope LocalMachine
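You can verify the effective policy beforehand with Get-ExecutionPolicy -List. Alternatively, if you'd rather not change the machine-wide policy, the scheduled task itself can bypass it; a sketch of the task's action command line (the script path is an assumption, substitute your own):
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\BackupCleanup.ps1"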

Related

Creating multiple zip files with PHP

I have a problem creating multiple zip files synchronously in PHP/Laravel. If I copy all the generated commands and paste them into the shell, they execute normally, but when I run them from PHP only the first file is generated. =/
Controller code:
foreach ($passwords as $p) {
    if ($i == 0) {
        $command = 'zip -u -j -P '.$p.' '.$dir.'/'.$count.'.zip '.storage_path().'/app/'.$directory.'/'.$file1->getClientOriginalName();
        $commands->push($command);
    } else {
        $command = 'zip --quiet -j -P '.$p.' '.$dir.'/'.$count.'.zip '.storage_path().'/app/'.$directory.'/'.($count+1).'.zip';
        $commands->push($command);
    }
    $count--;
    $i++;
}
foreach ($commands as $p) {
    echo $p.'<br/>';
}
foreach ($commands as $c) {
    $process = new Process($c);
    $process->start();
    sleep(10);
    if ($process->isTerminated()) {
        sleep(1);
    }
    if ($errorOutput = $process->getErrorOutput()) {
        throw new RuntimeException('Process: ' . $errorOutput);
    }
}
Data $commands
The script only generates the file 50.zip.
I'm not sure whether sleep could interfere with the subprocess (the shell command). Please try:
foreach ($commands as $c) {
    $process = new Process($c);
    // Set the timeout instead of sleeping
    $process->setTimeout(10);
    $process->start();
    // Wait for the process to finish
    $process->wait();
    if ($errorOutput = $process->getErrorOutput()) {
        throw new RuntimeException('Process: ' . $errorOutput);
    }
}
The wait() call uses usleep() in a more fine-grained manner, which might help with that. Does it work like this?

Running powershell script on asp.net site

I am trying to run a PowerShell script and have its output displayed on my ASP.NET site. I have made this work with a very simple script where the only command was
Get-Service | Out-String
and it output everything I expected onto my site.
But when I use the script I actually want information from, it doesn't output anything.
I can tell it runs (or tries to run) because when my site hits the code that invokes the script, it hangs for about 10 seconds.
The script I am trying to run is:
$user = "user"
$token = "token"
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
$result = Invoke-WebRequest -Method Get -Uri 'https://site.vsrm.visualstudio.com/defaultcollection/product/_apis/release/releases?definitionId=1&api-version=3.0-preview.2&$expand=environments' -ContentType "application/json" -Headers #{Authorization=("Basic {0}" -f $base64AuthInfo)}
$releaseArr = $result.Content | ConvertFrom-Json
[System.Collections.ArrayList]$enviromentName = #()
[System.Collections.ArrayList]$latestRelease = #()
foreach($env in $releaseArr.value[0].environments)
{
$enviromentName.Add($env.name) | Out-Null
}
foreach($releaseValue in $releaseArr.value)
{
For($i = 0; $i -lt $enviromentName.Count; $i++)
{
if($latestRelease[$i] -eq $null)
{
foreach($release in $releaseValue.environments)
{
if($release.name -eq $enviromentName[$i] -and $release.status -eq "succeeded")
{
$latestRelease.Add($releaseValue.name) | Out-Null
}
}
}
}
}
For($i = 0; $i -lt $enviromentName.Count; $i++)
{
Write-Host $enviromentName[$i] " : " $latestRelease[$i]
}
I know this script runs and produces output, but is there something in it that would cause it not to output properly?
The code in my ASP.NET site that I use to call the script is:
ResultBox.Text = string.Empty;

// Initialize PowerShell engine
var shell = PowerShell.Create();

// Add the script to the PowerShell object
shell.Commands.AddScript(@"C:\Users\user\Desktop\script.ps1");

// Execute the script
var results = shell.Invoke();

// Display results, with BaseObject converted to string
// Note: use | Out-String for console-like output
if (results.Count > 0)
{
    // We use a StringBuilder to create our result text
    var builder = new StringBuilder();
    foreach (var psObject in results)
    {
        // Convert the BaseObject to a string and append it to the string builder.
        // Add \r\n for line breaks
        builder.Append(psObject.BaseObject.ToString() + "\r\n");
    }
    // Encode the string as HTML (prevents security issues with 'dangerous' characters like < >)
    ResultBox.Text = Server.HtmlEncode(builder.ToString());
}
Change "Write-Host" to "Write-Output." Write-Host only outputs to interactive consoles.
You can see this in action:
Make a new PowerShell file and add a write-host statement to it:
[nick@nick-lt temp]$ New-Item -Type File -Path .\example.ps1 -Force
[nick@nick-lt temp]$ Set-Content .\example.ps1 "Write-Host 'Hello World'"
Then try to set a variable to the result of the script:
[nick@nick-lt temp]$ $what = .\example.ps1
Hello World
[nick@nick-lt temp]$ $what
[nick@nick-lt temp]$
Hello World shows up when the script executes, but the variable is empty.
Now change it to Write-Output:
[nick@nick-lt temp]$ Set-Content .\example.ps1 "Write-Output 'Hello World'"
[nick@nick-lt temp]$ $what = .\example.ps1
[nick@nick-lt temp]$ $what
Hello World
The variable actually contains what it is supposed to now.
One of the cardinal rules of PowerShell is not to use Write-Host except in scripts that will be run interactively. .NET needs the results in the output stream, not the host stream.
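As an aside: in PowerShell 5.0 and later, Write-Host writes to the information stream (stream 6), so if you cannot edit the script itself you may be able to merge that stream into the output when invoking it; in earlier versions host output cannot be captured at all. A sketch, assuming the script path from the question:
& "C:\Users\user\Desktop\script.ps1" 6>&1 | Out-String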

Returning output from scriptblock from Get-Job?

I'm trying to test out asynchronous functionality in PowerShell 3.
So I figured I would query the uptimes of some servers remotely, and wrote the following script:
function getServerUptimes() {
    # Obtain credentials for logging into
    # the remote machines...
    $cred = Get-Credential
    $compsHash = @{
        "server1.domain.com" = $null;
        "server2.domain.com" = $null;
    }
    # Define an async block to run...no blocking in this script...
    $cmd = {
        param($computer, $dasCred)
        Write-Host "which: $($computer)"
        $session = New-CimSession $computer -Credential $dasCred
        return (Get-CimInstance win32_operatingsystem -CimSession $session | Select PSComputerName, LastBootUpTime);
    }
    ForEach ($comp in $compsHash.Keys) {
        Write-Host "${comp}"
        # Kick off an async process to create a CimSession
        # on each of these machines...
        Start-Job -ScriptBlock $cmd -ArgumentList $comp, $cred -Name "Get$($comp)"
    }
    $results = Get-Job | Wait-Job | Receive-Job
    # Retrieve the job, so we can look at the output
    ForEach ($comp in $compsHash.Keys) {
        $dasJob = Get-Job -Name "Get$($comp)"
        Write-Host $dasJob.Output
    }
}
However, I don't really seem to get back any output in the resulting $dasJob object. I returned the value in my script block; where is it going?
You already have the output (not including the Write-Host output, of course) in the variable $results. In PowerShell you retrieve job output via Receive-Job. The Output property seems to always be empty (I'm not sure why).
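For illustration, a minimal sketch of the round trip (the script block is just a stand-in):
$job = Start-Job -ScriptBlock { Get-Date }  # queue some background work
Wait-Job $job | Out-Null                    # block until it finishes
$result = Receive-Job $job                  # collect everything the script block emitted
Remove-Job $job                             # jobs persist until removed
$result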

Downloading files from the Internet using Powershell with progress

I have been working on a PowerShell script that uses a .txt file to download multiple files from TinyURLs. I have been successful in using Jobs to make this happen simultaneously, thanks to those on this forum.
The project requires some pretty large files to be downloaded, and the current method has no progress indicator, so I figured some users might think the program had died. I'm looking for a way to give a status of where the download is. Here is what I came up with, but I'm lost on how to pipe this information back out to the console. Any suggestions?
# Check whether the NT-Download folder is on the Desktop; if not found, create it
$DOCDIR = [Environment]::GetFolderPath("Desktop")
$TARGETDIR = "$DOCDIR\NT-Download"
if (!(Test-Path -Path $TARGETDIR)) {
    New-Item -ItemType directory -Path $TARGETDIR
}

$filepaths = Resolve-Path "files.txt"

Get-Content "$filepaths" | Foreach {
    Start-Job {
        function Save-TinyUrlFile
        {
            PARAM (
                $TinyUrl,
                $DestinationFolder
            )
            # Use WebRequest directly so the body can be read in chunks
            $request = [System.Net.WebRequest]::Create($TinyUrl)
            $response = $request.GetResponse()
            # The response URI is the expanded TinyURL; take the file name from it
            $filename = [System.IO.Path]::GetFileName($response.ResponseUri.OriginalString)
            $filepath = [System.IO.Path]::Combine($DestinationFolder, $filename)
            $totalLength = [System.Math]::Floor($response.ContentLength / 1024)
            $responseStream = $response.GetResponseStream()
            $buffer = New-Object byte[] 10KB
            $count = $responseStream.Read($buffer, 0, $buffer.Length)
            $downloadedBytes = $count
            try
            {
                $filestream = [System.IO.File]::Create($filepath)
                while ($count -gt 0)
                {
                    [System.Console]::CursorLeft = 0
                    [System.Console]::Write("Downloaded {0}K of {1}K", [System.Math]::Floor($downloadedBytes/1024), $totalLength)
                    $filestream.Write($buffer, 0, $count)
                    $count = $responseStream.Read($buffer, 0, $buffer.Length)
                    $downloadedBytes = $downloadedBytes + $count
                }
                "`nFinished Download"
                $responseStream.Dispose()
            }
            finally
            {
                if ($filestream)
                {
                    $filestream.Dispose()
                }
            }
        }
        Save-TinyUrlFile -TinyUrl $args[0] -DestinationFolder $args[1]
    } -ArgumentList $_, "$TARGETDIR"
}
Have a look at Write-Progress:
PS C:\> for ($i = 1; $i -le 100; $i++)
{ Write-Progress -Activity "Search in Progress" -Status "$i% Complete:" -PercentComplete $i }
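In your script, a sketch of how it could be slotted into the read loop (variable names follow the question's code):
while ($count -gt 0)
{
    $filestream.Write($buffer, 0, $count)
    $count = $responseStream.Read($buffer, 0, $buffer.Length)
    $downloadedBytes += $count
    $percent = [int](100 * $downloadedBytes / $response.ContentLength)
    Write-Progress -Activity "Downloading $filename" -Status "$percent% complete" -PercentComplete $percent
}
Bear in mind that the script block passed to Start-Job runs in a separate session, so progress written there may not appear in the parent console until the job is received.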
A far simpler way: rely on BITS:
Start-BitsTransfer -Source $tinyUrl
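Start-BitsTransfer draws its own progress bar in the console. A sketch with an explicit destination (the file name here is an assumption; by default the remote file name is used):
Start-BitsTransfer -Source $tinyUrl -Destination "$TARGETDIR\file.bin" -DisplayName "NT-Download"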

Tcl - Recursive walking and FTP upload

How can I do a recursive walk through a local folder in order to upload everything in it to the desired FTP folder? Here's what I have so far:
package require ftp

set host **
set user **
set pass **
set ftpdirectory **
set localdirectory **

proc upload {host user pass dir fileList} {
    set handle [::ftp::Open $host $user $pass]
    ftpGoToDir $handle $dir
    # some counters for our feedback string
    set j 1
    set k [llength $fileList]
    foreach i $fileList {
        upload:status "uploading ($j/$k) $i"
        ::ftp::Put $handle $i
        incr j
    }
    ::ftp::Close $handle
}

#---------------
# feedback
#---------------
proc upload:status {msg} {
    puts $msg
}

#---------------
# go to directory on server
#---------------
proc ftpGoToDir {handle path} {
    ::ftp::Cd $handle /
    foreach dir [file split $path] {
        if {![::ftp::Cd $handle $dir]} {
            ::ftp::MkDir $handle $dir
            ::ftp::Cd $handle $dir
        }
    }
}

proc watchDirChange {dir intv {script {}} {lastMTime {}}} {
    set nowMTime [file mtime $dir]
    if {[string equal $lastMTime ""]} {
        set lastMTime $nowMTime
    } elseif {$nowMTime != $lastMTime} {
        # synchronous execution, so no other after event may fire in between
        catch {uplevel #0 $script}
        set lastMTime $nowMTime
    }
    after $intv [list watchDirChange $dir $intv $script $lastMTime]
}

watchDirChange $localdirectory 5000 {
    puts stdout "Directory $localdirectory changed!"
    upload $host $user $pass $ftpdirectory [glob -directory $localdirectory -nocomplain *]
}

vwait forever
Thanks in advance :)
You're already using the ftp package, so that means you've got tcllib installed. Good. That means in turn that you've got the fileutil package as well, and can do this:
package require fileutil

# How to do the testing; I'm assuming you only want to upload real files
proc isFile f {
    return [file isfile $f]
}

set filesToUpload [fileutil::find $dirToSearchFrom isFile]
The fileutil::find command is very much like a recursive glob, except that you specify the filter as a command instead of via options.
You might prefer to use rsync instead though; it's not a Tcl command, but it is very good and it will minimize the amount of data actually transferred.
