PowerShell script to create a folder and website and deploy using MSDeploy.exe (IIS 7)

I would like to deploy a web application on Windows 2008 R2. I know the separate PowerShell commands to do various tasks, but I would like to put this into a single PowerShell script.
I just need the syntax; can you please help me do the following:
Test if the C:\Inetpub\MyWebsite folder exists; if not, create it.
Test in IIS 7 if MyWebsite exists; if not, create it (I know how to Import-Module WebAdministration and call New-WebSite).
Now the complicated part. I deploy the website from a package prepared by Visual Studio 2010. VS supplies a .cmd file that I just need to execute from a command prompt. This means I have to leave the PowerShell console and open a separate console to run that .cmd file. Is it possible to run a .cmd file from within a PowerShell console?

To answer your questions:
Import-Module WebAdministration
# Check for physical path
$sitePath = "C:\Inetpub\MyWebsite"
if (-not (Test-Path -Path $sitePath))
{
    New-Item -Path $sitePath -Type Directory
}
# Check for site
$siteName = "MyWebsite"
$site = Get-Website | Where-Object { $_.Name -eq $siteName }
if ($site -eq $null)
{
    Write-Host "Creating site: $siteName"
    # Put your New-WebSite code here
}
# Execute your .cmd here
c:\PathToScript\MakeMySite.cmd
You can run .cmd scripts from within PowerShell just fine.
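One thing worth adding: a .cmd file sets a native exit code, which PowerShell exposes as $LASTEXITCODE, so you can check whether the deployment actually succeeded. A sketch, reusing the path and /y flag from the question:

```powershell
# Run the deployment package's script, then inspect the native exit code.
c:\PathToScript\MakeMySite.cmd /y
if ($LASTEXITCODE -ne 0) {
    Write-Error "Deployment failed with exit code $LASTEXITCODE"
}
```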

I have also changed it a little, using the same Test-Path syntax to test whether the website exists:
if (-not (Test-Path -path IIS:\Sites\$SiteName))
{
New-WebSite -Name $SiteName ...etc...
}
Also, for executing the *.cmd file, I lifted some code from the web and saw that people use & (the call operator) to execute external commands. Hope that's OK:
& c:\PathToScript\MakeMySite.cmd arg1 arg2
Thank you very much for your help.
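One caveat with &: it is required whenever the command path is quoted or held in a variable (for example, a path containing spaces), because PowerShell would otherwise treat the quoted string as a value rather than something to execute. A sketch with a hypothetical path:

```powershell
# A bare quoted string is parsed as a string literal; '&' invokes it as a command.
$deployCmd = 'C:\Path With Spaces\MakeMySite.cmd'   # hypothetical path
& $deployCmd arg1 arg2
```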

If you need to run the .cmd file as administrator, you can use the following code:
Start-Process -FilePath C:\PathToScript\MakeMySite.cmd -Verb RunAs -ArgumentList "/y"
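If you also need to wait for the elevated script to finish and check its result, Start-Process can return a process object. A sketch using the same placeholder path; -Wait blocks until the process exits:

```powershell
# -PassThru returns the process object so the exit code can be inspected afterwards.
$proc = Start-Process -FilePath C:\PathToScript\MakeMySite.cmd `
    -Verb RunAs -ArgumentList "/y" -Wait -PassThru
if ($proc.ExitCode -ne 0) {
    Write-Error "Deployment script exited with code $($proc.ExitCode)"
}
```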

Related

Robocopy everything in subdirectory excluding the root files

How do I use robocopy so that the contents of the source root are not copied?
I already have the root files stored elsewhere; I just want to copy the subdirectories and their contents while the source folder still contains files in its root.
This is not possible with native robocopy switches as far as I can tell. You will need to use a script to enumerate the subdirectories and run robocopy against them.
Here is a sample PowerShell command that will accomplish what you want, copying everything from C:\temp\source\ to c:\temp\target\, excluding the files that are in c:\temp\source:
get-childitem c:\temp\source\* |?{$_.PsIsContainer} | %{robocopy $_.FullName c:\temp\target\$($_.Name) /S}
Credit to powershell ignore files in root but robocopy folders and their contents for the basics of this.
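Written out long-hand, the same one-liner is easier to read and adapt (the paths are the example's; /S copies non-empty subdirectories):

```powershell
$source = 'C:\temp\source'
$target = 'C:\temp\target'

# Enumerate only the subdirectories of the source, skipping files in its root
Get-ChildItem -Path $source | Where-Object { $_.PsIsContainer } | ForEach-Object {
    # Copy each subdirectory into a same-named folder under the target
    robocopy $_.FullName (Join-Path $target $_.Name) /S
}
```

On PowerShell 3.0 and later, Get-ChildItem -Directory does the container filtering for you.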
I don't have enough reputation to comment, but the answer Mr. Hinkle gave solved two days of effort and searching. My challenge was moving files that were more than one hour old. This combination of PowerShell and robocopy appears to work. Below is my final code.
# Clear screen
cls
# Disconnect the drive if it exists - we don't know where it is pointing
If (Test-Path p:) {
    net use p: /delete
}
# Map the destination
net use p: \\server\dir1\dir2\dir3 password /USER:domain\user /P:no
Get-ChildItem -Path 'D:\dir1\dir2\' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddHours(-1) } |
    ?{ $_.PsIsContainer } |
    %{ robocopy $_.FullName p:\$($_.Name) /S /MOVE /r:3 /W:1 }
net use p: /delete
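If you would rather stay inside PowerShell instead of shelling out to net use, New-PSDrive can map the share. A sketch, where the UNC path and account are the placeholders from the script above:

```powershell
# Prompt for the password of the placeholder account and map the share as P:
$cred = Get-Credential -Credential 'domain\user'
New-PSDrive -Name P -PSProvider FileSystem -Root '\\server\dir1\dir2\dir3' -Credential $cred

# ... run the robocopy pipeline against P:\ here ...

# Unmap when finished
Remove-PSDrive -Name P
```

Add -Persist if the drive should also be visible outside the current session.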

Npm multiple file script (lessc)

I'm trying to run a lessc PowerShell command line in my package.json.
So in my scripts section I have something like this:
Get-ChildItem *.less -Recurse | ForEach-Object {lessc $_.FullName > $_.BaseName.css}
but it's giving me the following error:
'Get-ChildItem' is not recognized as an internal or external command
while this is executing normally in PowerShell
Get-ChildItem *.less -Recurse | ForEach-Object {echo $_.Name}
A plain lessc command also works as expected.
Any ideas?
Also check this question, which has another solution for this.
The error message suggests that you're not running the statement in PowerShell. lessc is a node.js application, not a PowerShell command, so it's unsurprising that it works when used by itself.
To be able to use PowerShell cmdlets you need to run the statement in PowerShell, e.g. like this:
powershell.exe -ExecutionPolicy Bypass -Command "Get-ChildItem *.less -recurse | Foreach-Object{ lessc $_.FullName > $_.BaseName.css }"
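For reference, a hypothetical scripts entry in package.json would then look something like this (the inner double quotes must be escaped for JSON):

```json
{
  "scripts": {
    "less": "powershell.exe -ExecutionPolicy Bypass -Command \"Get-ChildItem *.less -Recurse | ForEach-Object { lessc $_.FullName > $_.BaseName.css }\""
  }
}
```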

How to deploy applications and include IIS settings?

I am looking for a methodology of how to deploy applications, websites and web services from DEV to ITG and PRO. I am not referring only to files included in the projects, but also settings from IIS, files/folders permissions, etc.
For example, this weekend we had to deploy a new application from ITG to PRO, and the PRO AppPool was set to run .NET 2.0 (from a previous version). It took us some time to realize what was going on, leading of course to a longer downtime than expected.
Currently we are using VS 2013, C# 4.x, IIS 8.x and TFS 2013.
The question here is, if there is a way to deploy an application with a "single click". Is MSBuild suitable for this task? (I have no experience on MSBuild, I just found some stuff while Googling, and I am wondering if I need to go deeper). Can TFS read those settings from the source machine and copy them on the target somehow? Is there any other tool than can complete this task? We would like to stay inside Microsoft's circle, however if something else does exactly that, we might consider it.
This can be accomplished via PowerShell. I'm not a PowerShell expert, so my syntax probably doesn't follow best practice. But I have a script I use to create production and test websites and pre-configure a couple of IIS settings. I then do my deployment from VS, but you could expand the script to perform the build and deployment for you too. It utilizes the WebAdministration module for configuring IIS.
CreateIntranetSite.ps1
param([string]$SiteName, [string]$HostName)

if ($SiteName -eq '') {
    Write-Error "You must provide a SiteName parameter."
}
elseif ($HostName -eq '') {
    Write-Error "You must provide a HostName parameter."
}
else {
    Invoke-Command -ComputerName $HostName -Credential DOMAIN\mason.sa -ArgumentList $SiteName -ScriptBlock {
        param([string]$SiteName)
        $IntranetRoot = "E:\Intranet"
        $DefaultHtml = "<html><head><title>$SiteName</title></head><body><h1>$SiteName</h1><p>The $SiteName virtual application has been successfully configured.</p></body></html>"

        # Import IIS tools
        Import-Module "WebAdministration"
        # Create folder
        New-Item $IntranetRoot\$SiteName -Type Directory
        # Create default page
        Set-Content $IntranetRoot\$SiteName\index.html $DefaultHtml
        # Create app pool
        New-WebAppPool $SiteName
        # Create virtual application
        New-WebApplication -Name $SiteName -Site "Intranet" -PhysicalPath $IntranetRoot\$SiteName -ApplicationPool $SiteName

        # Configure the virtual application
        # Disable anonymous authentication
        Set-WebConfigurationProperty -Filter /system.webServer/security/authentication/anonymousAuthentication -Name enabled -Value false -Location Intranet/$SiteName
        # Enable Windows authentication
        Set-WebConfigurationProperty -Filter /system.webServer/security/authentication/windowsAuthentication -Name enabled -Value true -Location Intranet/$SiteName
    }

    # Launch a browser to verify
    $SiteUrl = ''
    if ($HostName -eq 'wr-test01') {
        $SiteUrl = 'https://testnet.termana.net/' + $SiteName
    }
    elseif ($HostName -eq 'wr-web01') {
        $SiteUrl = 'https://intranet.termana.net/' + $SiteName
    }
    $ie = New-Object -ComObject InternetExplorer.Application
    $ie.Visible = $true
    $ie.Navigate($SiteUrl)
}
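Invocation is then just (the site name here is a made-up example; the host name is one of the two the script checks for):

```powershell
.\CreateIntranetSite.ps1 -SiteName 'HRPortal' -HostName 'wr-test01'
```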

IIS 7 Log Files Auto Delete?

Is there any feature in IIS 7 that automatically deletes log files older than a specified number of days?
I am aware that this can be accomplished by writing a script (and running it weekly) or a Windows service, but I was wondering if there is any built-in feature that does this.
Also, we have currently turned logging off because it was taking up a large amount of space. Will that be a problem?
You can create a task that runs daily using Administrative Tools > Task Scheduler.
Set your task to run the following command:
forfiles /p "C:\inetpub\logs\LogFiles" /s /m *.* /c "cmd /c del @path" /d -7
This command is for IIS7, and it deletes all the log files that are one week or older.
You can adjust the number of days by changing the /d arg value.
One line batch script:
forfiles /p C:\inetpub\logs /s /m *.log /d -14 /c "cmd /c del /q @file"
Modify the /d switch to change number of days a log file hangs around before deletion. The /s switch recurses subdirectories too.
Ref: http://debug.ga/iis-log-purging/
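Registering the daily cleanup task can itself be scripted; a sketch using the ScheduledTasks cmdlets (built in from Windows 8 / Server 2012 onward; the task name and time are arbitrary examples):

```powershell
# Run forfiles daily at 1:00 AM to purge IIS logs older than 14 days
$action  = New-ScheduledTaskAction -Execute 'forfiles.exe' `
    -Argument '/p C:\inetpub\logs /s /m *.log /d -14 /c "cmd /c del /q @file"'
$trigger = New-ScheduledTaskTrigger -Daily -At 1am
Register-ScheduledTask -TaskName 'PurgeIISLogs' -Action $action -Trigger $trigger
```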
A similar solution, but in PowerShell.
I've set up a task to run powershell.exe with the following line as an argument:
dir D:\IISLogs | where { ((Get-Date) - $_.LastWriteTime).Days -gt 15 } | Remove-Item -Force
It removes all files in the D:\IISLogs folder older than 15 days.
Another viable Powershell one-liner:
Get-ChildItem -Path c:\inetpub\logs\logfiles\w3svc*\*.log | where {$_.LastWriteTime -lt (get-date).AddDays(-180)} | Remove-Item -force
In case $_.LastWriteTime doesn't work, you can use $PSItem.LastWriteTime instead.
For more info and other suggestions for keeping the IIS LogFiles folder's disk usage under control, I also suggest reading this blog post that I wrote on the topic.

Powershell script to map to network drive and download files

I am a newbie to PowerShell. I want to write a script to do the following:
Check if I have mapped a network drive
If not, map it
Once mapped, check the files in 4 folders at a path on the network drive
If the files are newer than those I compare against on a local drive, copy them, i.e. only copy new/updated files
Any help with this would be great, as I need a starting position to take on this.
You can use net use to determine if the drive is mapped or not:
net use s:
if ($LastExitCode -ne 0)
{
net use s: \\server\share
}
$folders = @{local = 'c:\path1'; remote = 's:\path1'},
           @{local = 'c:\path2'; remote = 's:\path2'}
$folders | Foreach {xcopy $_.remote $_.local /E /C /H /R /Y /D /I}
Don't forget the existing console tools tend to work just fine in PowerShell and sometimes are the easiest way to get the job done.
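That said, if you would rather do the "copy only newer files" step in PowerShell itself instead of xcopy /D, here is a sketch (the paths are the same placeholders as above):

```powershell
$remote = 's:\path1'
$local  = 'c:\path1'

Get-ChildItem -Path $remote -Recurse | Where-Object { -not $_.PsIsContainer } | ForEach-Object {
    # Rebuild the file's relative path under the local root
    $dest = Join-Path $local $_.FullName.Substring($remote.Length).TrimStart('\', '/')
    # Copy when the destination is missing or older than the source
    if (-not (Test-Path $dest) -or ($_.LastWriteTime -gt (Get-Item $dest).LastWriteTime)) {
        New-Item -ItemType Directory -Path (Split-Path $dest) -Force | Out-Null
        Copy-Item -Path $_.FullName -Destination $dest -Force
    }
}
```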
