How do we get the Analysis Services database size for an Azure Analysis Services tabular model - azure-analysis-services

We want to know the Analysis Services database size after it is deployed to Azure Analysis Services.

You can use the script below to find the overall size and the per-model size of an Azure Analysis Services server, in MB and GB.
Param($ServerName = "<servername>")
# Load the Analysis Services management objects (AMO)
$loadInfo = [Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect($ServerName)
if ($server.Name -eq $null) {
    Write-Output ("Server '{0}' not found" -f $ServerName)
    break
}
$sum = 0
foreach ($d in $server.Databases) {
    Write-Output ("Database: {0}; Status: {1}; Size: {2}MB" -f $d.Name,
        $d.State, ($d.EstimatedSize / 1024 / 1024).ToString("#,##0"))
    $sum = $sum + $d.EstimatedSize / 1024 / 1024
}
$sizeGB = $sum / 1024
Write-Host 'Sum of databases =' $sum 'MB'
Write-Host 'Total size of cube databases =' $sizeGB 'GB'
Thanks,
Mahendar

You can use either the VertiPaq Analyzer tool (http://www.sqlbi.com/tools/vertipaq-analyzer/) or the SSAS memory usage report for Power BI (https://www.kasperonbi.com/new-ssas-memory-usage-report-using-power-bi/).

Use SSMS:
Connect via SSMS, right-click the model name, and select Properties. This gives you the estimated size of your deployed tabular model.
Use Metrics:
In the Azure portal, open your Analysis Services server and go to Monitoring -> Metrics, then check Memory in the list of available metrics.
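If you'd rather pull that Memory metric from a script than from the portal blade, a minimal Az.Monitor sketch might look like the following. The resource ID is a placeholder, and the metric name memory_metric is an assumption about how the portal's Memory checkbox is exposed:
# Sketch: read the server's Memory metric via Az.Monitor.
# Assumptions: Az modules installed, you are logged in with Connect-AzAccount,
# and "memory_metric" is the underlying name of the portal's Memory metric.
$resourceId = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.AnalysisServices/servers/<servername>"
Get-AzMetric -ResourceId $resourceId -MetricName "memory_metric" -TimeGrain 00:05:00 |
    Select-Object -ExpandProperty Data |
    Select-Object TimeStamp, Average
Hope this helps.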


Use PowerShell to download an attached Excel file from a ServiceNow ticket

I'm currently working on a PowerShell script that should be able to download an attached Excel file from a ServiceNow ticket. Before I explain more, please see the basic flow of the automation below.
The user will be asked to enter the ticket number.
The system will then find that incident ticket to accurately get the Excel file needed (I saw online that I need to use sys_id).
It will then be downloaded to a specific path on the user's machine, e.g. "C:\downloads\Demo\".
Following all this, I found a sample script online that I'm trying to configure to match my needs; however, I'm not sure where to get the values in that sample script. You can check the bullets below the script for the questions I have in mind.
$IncidentNumber = Read-Host -Prompt 'Enter Incident Request #'
#$admin = "admin"
#$password = "admin" | ConvertTo-SecureString -AsPlainText -Force
#$Credential = New-Object pscredential -ArgumentList ($admin,$password)
$Uri = "https://dev42835.service-now.com/api/now/table/incident?sysparm_query=number=$($IncidentNumber)&sysparm_fields=sys_id&sysparm_limit=1"
$IncidentResult = Invoke-RestMethod -Uri $Uri #-Method Get -Credential $Credential
if ($IncidentResult.result.sys_id -ne $null) {
    $IncidentAttachments = Invoke-RestMethod -Uri "https://dev42835.service-now.com/api/now/attachment?sysparm_query=table_sys_id=$($IncidentResult.result.sys_id)" #-Method Get -Credential $Credential
    $IncidentAttachments.result | Select-Object file_name, download_link
}
else {
    "Incident Not Found!"
}
Do I really need the credentials to run the script? If yes, is there a way to remove the need for the credentials?
Where can I get the URL that is assigned to the $Uri variable?
I'm new to PowerShell automation, so I would appreciate it if you could recommend a better approach if there is one.
Yes, you need credentials, but don't hard-code them like that. Instead, you can use the built-in Get-Credential cmdlet, which securely collects a username and password. The user will have to enter their own ServiceNow credentials each time this is run.
My version only has one thing you need to configure: the $SubDomain variable, which is specific to your tenant.
$SubDomain = "YourServiceNowSubdomaingoeshere" # Configure this per tenant
$Credential = Get-Credential
if (!$Credential) {
    # User cancelled the credential prompt
    exit 1
}
$IncidentNumber = Read-Host -Prompt 'Enter Incident Request #'
$Uri = "https://$SubDomain.service-now.com/api/now/table/incident?sysparm_query=number=$($IncidentNumber)&sysparm_fields=sys_id&sysparm_limit=1"
$IncidentResult = Invoke-RestMethod -Uri $Uri -Method Get -Credential $Credential
if ($IncidentResult.result.sys_id -ne $null) {
    $IncidentAttachments = Invoke-RestMethod -Uri "https://$SubDomain.service-now.com/api/now/attachment?sysparm_query=table_sys_id=$($IncidentResult.result.sys_id)" -Method Get -Credential $Credential
    $IncidentAttachments.result | Select-Object file_name, download_link
}
else {
    "Incident Not Found!"
}
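The listing above shows each attachment's file_name and download_link but stops short of saving anything to disk. As a minimal sketch of that last step (assuming the $Credential and $IncidentAttachments variables from the script above, and the C:\downloads\Demo\ path from the question), you could pipe each download_link to Invoke-RestMethod -OutFile:
# Sketch: save every attachment on the incident to the target folder.
# Assumes $Credential and $IncidentAttachments from the script above.
$TargetFolder = "C:\downloads\Demo"
if (-not (Test-Path $TargetFolder)) {
    New-Item -ItemType Directory -Path $TargetFolder | Out-Null
}
$IncidentAttachments.result | ForEach-Object {
    $target = Join-Path $TargetFolder $_.file_name
    # download_link is the full attachment URL returned by the attachment API
    Invoke-RestMethod -Uri $_.download_link -Method Get -Credential $Credential -OutFile $target
}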
Yes, you need credentials.
Your URI is the URL of your ServiceNow instance. Change the dev42835 portion to match. If you're unsure of your instance, contact ServiceNow support.
https://dev42835.service-now.com
If you use the REST API Explorer, you can view API endpoints and versions, which will help with forming your requests. You do need the rest_api_explorer role to access the REST API Explorer; if you do not have this role, contact your ServiceNow admin to request it.
https://docs.servicenow.com/bundle/geneva-servicenow-platform/page/integrate/inbound_rest/task/t_GetStartedAccessExplorer.html

Close user sessions AXAPTA

I'm making a website that uses the Dynamics AX Business Connector to connect with AX. It's working fine, but sometimes the users don't log out.
Here is my code:
Microsoft.Dynamics.BusinessConnectorNet.Axapta DynAx = new Microsoft.Dynamics.BusinessConnectorNet.Axapta();
try
{
    DynAx.Logon(null, null, null, null);
    // Execute some methods
    DynAx.Logoff();
}
catch (Exception ex)
{
    DynAx.Logoff();
}
And in AX I can still see the users logged in. Again, this only happens sometimes, which is why I don't know what the cause may be.
Maybe the Dispose() method is better?
Thank you for taking the time to read this.
Logon/logoff works correctly for me, but if you're saying it sometimes doesn't, then the reason is most likely one of the following:
The Business Connector can be flaky. It wasn't a Microsoft priority and was eventually deprecated.
Whatever is happening in your //Execute some methods section could be locking or preventing the logoff.
You may need to update your kernel to get an updated version of the Business Connector.
In my AX 2012 R3 environment I can run the PowerShell code below over and over with success, which points me towards one of the above as the cause.
Add-Type -Path "C:\Program Files\Microsoft Dynamics AX\60\BusinessConnector\Bin\Microsoft.Dynamics.BusinessConnectorNet.dll"
$ax = New-Object Microsoft.Dynamics.BusinessConnectorNet.Axapta

$ax.Logon($null, $null, $null, $null)
$b = $ax.CreateAxaptaRecord("userinfo")
$array = New-Object System.Collections.ArrayList

# Select every user id from the userinfo table
$b.ExecuteStmt("select id from %1")
while ($b.Found) {
    $array.Add($b.get_field("id")) | Out-Null
    $b.Next() | Out-Null
}

$array | Format-Table -AutoSize
$ax.Logoff()
$ax.Dispose()
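If the suspicion is that something in the //Execute some methods block throws (or that the Logoff inside the catch itself fails), a try/finally that always releases the session is a safer shape. Here is a minimal PowerShell sketch of that pattern, using the same assembly path as above:
# Sketch: guarantee Logoff/Dispose even if the work in between throws.
Add-Type -Path "C:\Program Files\Microsoft Dynamics AX\60\BusinessConnector\Bin\Microsoft.Dynamics.BusinessConnectorNet.dll"
$ax = New-Object Microsoft.Dynamics.BusinessConnectorNet.Axapta
try {
    $ax.Logon($null, $null, $null, $null)
    # ... execute some methods ...
}
finally {
    # Runs whether or not the body threw, so the session is always released
    $ax.Logoff()
    $ax.Dispose()
}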

Limit number of instances of Symfony Command

I have a command in my Symfony app launched by cron. I want to be able to limit the number of instances executed at the same time on my server, say 4 instances. I don't have any clue how to do this. I found how to lock the command so that only one instance runs at a time, waiting for it to finish, but I don't know how to launch more than one while still capping the number of instances.
Do you have an idea?
What you are looking for is a semaphore.
There is a LockComponent currently scheduled for Symfony 3.4 (it was pulled from 3.3). It is a major improvement over the LockHandler in the Filesystem component.
In a pinch, you can probably pool a fixed number of locks from the LockHandler. I don't recommend it, because it uses flock on the filesystem, which limits the lock to a single server. Additionally, flock may be limited to the process scope on some systems.
<?php

use Symfony\Component\Filesystem\LockHandler;

define('LOCK_ID', 'some-identifier');
define('LOCK_MAX', 5);

// Build a pool of LOCK_MAX named lock handles
$lockPool = [];
for ($i = 1; $i <= LOCK_MAX; $i++) {
    $lockHandle = sprintf('%s-%s.lock', LOCK_ID, $i);
    $lockPool[$i] = new LockHandler($lockHandle);
}

$activeLock = null;
$lockTimeout = 60; // seconds (microtime(true) returns seconds)
$lockWaitStart = microtime(true);

// Try every slot in the pool until one is acquired or the wait times out
while (!$activeLock) {
    foreach ($lockPool as $lockHandler) {
        if ($lockHandler->lock()) {
            $activeLock = $lockHandler;
            break 2;
        }
    }
    if ($lockTimeout && (microtime(true) - $lockWaitStart > $lockTimeout)) {
        break;
    }
    // Randomly wait between 0.1ms and 10ms before retrying
    usleep(mt_rand(100, 10000));
}
A much better and more efficient solution would be to use the semaphore extension and work some magic with ftok, shm_* and sem_*.
I suggest you use a process control system such as Supervisor. It's pretty simple to use, and you can choose how many instances of your script to start.
http://supervisord.org/
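For example, a minimal supervisord program section (the program name and console command below are placeholders for your own) that keeps exactly four instances of a command running:
[program:my_symfony_command]
command=php /path/to/app/bin/console app:my-command
numprocs=4
process_name=%(program_name)s_%(process_num)02d
autostart=true
autorestart=true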
You could use a shared counter file that holds a counter, which gets increased when the command starts running and decreased when it finishes.
Another solution would be checking the process list with something like this:
$processCount = exec('ps aux | grep "some part of the console command you run" | grep -v "grep" | wc -l');
if (!empty($processCount) && $processCount >= X) {
    return false;
}
You can create a "launcher command" executed by your cron or Supervisor.
This Symfony command can launch your instances with the Process component on your server. You can also check whatever you want to check and do anything you want, much like PHP's exec function.

Download multiple files over HTTP using PowerShell with proper names

I have searched for something similar, and I keep running across the FTP download answers. That is helpful information, but it is proving difficult to translate. I have found a PowerShell script that works, but I am wondering if it can be tweaked for my needs. I don't have much experience with PowerShell scripting, but I'm trying to learn.
The need is this: I have to download and install a series of files to a remote machine, unattended. The files are distributed via email as tinyurls. I currently throw those into a .txt file, then have a PowerShell script read the list and download each file.
A requirement of the project, and why I have turned to PowerShell (and not other utilities), is that these are very specialized machines. The only tools available are the ones baked into Windows 7 Embedded.
The difficulties I run into are:
The files download one at a time. I would like to grab as many simultaneous downloads as the web server will allow (usually 6).
The current script creates file names based off the tinyurl. I need the actual file name from the web server.
Thanks in advance for any suggestions.
Below is the script I'm currently using.
# Copyright (C) 2011 by David Wright (davidwright#digitalwindfire.com)
# All Rights Reserved.
# Redistribution and use in source and binary forms, with or without
# modification or permission, are permitted.
# Additional information available at http://www.digitalwindfire.com.
$folder = "d:\downloads\"
$userAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1"
$web = New-Object System.Net.WebClient
$web.Headers.Add("user-agent", $userAgent)
Get-Content "d:\downloads\files.txt" |
    ForEach-Object {
        "Downloading " + $_
        try {
            # File name is taken from the URL itself, which is wrong for tinyurls
            $target = Join-Path $folder ([IO.Path]::GetFileName($_))
            $web.DownloadFile($_, $target)
        } catch {
            $_.Exception.Message
        }
    }
If you do the web request before you decide on a file name, you should be able to get the expanded path (otherwise you would have to make two web requests: one to get the expanded path and one to download the file).
When I tried this, I found that the BaseResponse property of the Microsoft.PowerShell.Commands.HtmlWebResponseObject returned by the Invoke-WebRequest cmdlet has a ResponseUri property, which is the expanded path we are looking for.
If you get the correct response, just save the file using the name from the expanded path, something like the following (this sample code does not look at HTTP response codes or similar, but expects everything to go well):
function Save-TinyUrlFile
{
    PARAM (
        $TinyUrl,
        $DestinationFolder
    )
    $response = Invoke-WebRequest -Uri $TinyUrl
    # ResponseUri is the final (expanded) URL after redirects
    $filename = [System.IO.Path]::GetFileName($response.BaseResponse.ResponseUri.OriginalString)
    $filepath = [System.IO.Path]::Combine($DestinationFolder, $filename)
    try
    {
        $filestream = [System.IO.File]::Create($filepath)
        $response.RawContentStream.WriteTo($filestream)
        $filestream.Close()
    }
    finally
    {
        if ($filestream)
        {
            $filestream.Dispose()
        }
    }
}
This method could be called using something like the following, given that the $HOME\Documents\Temp folder exists:
Save-TinyUrlFile -TinyUrl http://tinyurl.com/ojt3lgz -DestinationFolder $HOME\Documents\Temp
On my computer, that saves a file called robots.txt, taken from a github repository, to my computer.
If you want to download many files at the same time, you could let PowerShell make this happen for you. Either use the parallel functionality of PowerShell workflows, or simply start a job for each URL. Here's a sample of how you could do it using PowerShell jobs:
Get-Content files.txt | ForEach-Object {
    Start-Job {
        function Save-TinyUrlFile
        {
            PARAM (
                $TinyUrl,
                $DestinationFolder
            )
            $response = Invoke-WebRequest -Uri $TinyUrl
            $filename = [System.IO.Path]::GetFileName($response.BaseResponse.ResponseUri.OriginalString)
            $filepath = [System.IO.Path]::Combine($DestinationFolder, $filename)
            try
            {
                $filestream = [System.IO.File]::Create($filepath)
                $response.RawContentStream.WriteTo($filestream)
                $filestream.Close()
            }
            finally
            {
                if ($filestream)
                {
                    $filestream.Dispose()
                }
            }
        }
        Save-TinyUrlFile -TinyUrl $args[0] -DestinationFolder $args[1]
    } -ArgumentList $_, "$HOME\documents\temp"
}
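Since the question notes the server allows about six simultaneous downloads, you may also want to throttle and then collect the jobs. A minimal sketch (where $jobScript stands in for the Start-Job script block shown above) could look like this:
# Throttle to at most 6 concurrent download jobs, then gather the results.
# $jobScript is assumed to hold the script block from the answer above.
$maxJobs = 6
Get-Content files.txt | ForEach-Object {
    while ((Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Milliseconds 200   # wait for a slot to free up
    }
    Start-Job -ScriptBlock $jobScript -ArgumentList $_, "$HOME\documents\temp"
}
Get-Job | Wait-Job | Receive-Job   # surface output and errors from each job
Get-Job | Remove-Job               # clean up the finished jobs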

What's a good way in PowerShell to check for an IP address switch on a webserver?

Problem
Our web host provider is changing the IP address of one of the servers we are on. We have been given a time frame for when the switch will take place, but no exact details. Therefore, our current poor man's check is to periodically refresh the page in a browser to see if our website is still there.
Question
We are all programmers here, and it is killing me that any manual checking is required. I would know how to do this in other languages, but I want to know if there is a way to write a script in PowerShell to tackle this problem. Does anyone know how I might go about this?
If you can alert when the page is gone or no longer has an expected value, you could use a script like:
$ip = '192.168.1.1'   # the old IP address (must be quoted as a string)
$webclient = New-Object System.Net.WebClient
$regex = 'regular expression to match something on your page'
$ping = New-Object System.Net.NetworkInformation.Ping
do
{
    $result = $ping.Send($ip)
    if ($result.Status -ne 'TimedOut')
    {
        $page = $webclient.DownloadString("http://$ip")
        # Break out once the page is missing, wrong, or returning a 404
        if (($page -notmatch $regex) -or ($page -match '404') -or ($page -eq $null))
        { break }
    }
    Start-Sleep -Seconds 30   # pause between checks
} while ($true)
Write-Host "The website has moved"
This will list the IP Address for each network adapter in your system.
Get-WmiObject -Class Win32_NetworkAdapterConfiguration -Filter IPEnabled=TRUE -ComputerName . | Select-Object -Property IPAddress
