We have an ASP.NET web app with "Always On" enabled that runs a long task. To avoid running this task two or more times simultaneously, a flag is set in the database at the beginning of the task. If the task is forced to shut down, the flag is not removed, and the task won't run again without manual intervention.
I've been looking into whether the concept of recycling a website exists in Azure, but I didn't find much about it. I found, for example, https://stackoverflow.com/a/21841469/1081568, which suggests it is never recycled, but I also find people complaining about web apps with "Always On" set that recycle randomly.
I would like to know under which circumstances an app could be recycled/shut down in Azure. Just for maintenance? Does Azure recycle ASP.NET web apps, or is recycling a concept exclusive to on-premise servers?
And another question: is there a way to capture this shutdown/recycle from Azure and stop my running task gracefully if it's running?
Thanks
As far as I know, Azure normally will not recycle your web app's resources if "Always On" is enabled.
If the web app's "Always On" setting is off, the site will be recycled after a period of inactivity (20 minutes).
As for your second question ("Is there a way to capture this shutdown/recycle from Azure and stop my running task gracefully if it's running?"):
According to your description, I suggest you send a request to the Kudu REST API to get the current web app's process ID.
If the application restarts, the process ID changes. By comparing process IDs, you can detect that the web app was recycled.
For the details of how to get the current web app's process ID, refer to the steps below:
1. Set deployment credentials for your Azure web application (in the portal, under your web app's Deployment credentials settings).
Note: remember the user name and password; we will use them to build the Basic authentication header.
2. Send a GET request to the URL below to get the process information.
URL: https://yourwebsitename.scm.azurewebsites.net/api/processes
Code sample:
string url = @"https://yourwebsitename.scm.azurewebsites.net/api/processes";
var httpWebRequest = (HttpWebRequest)WebRequest.Create(url);
httpWebRequest.Method = "GET";
httpWebRequest.ContentLength = 0;

// Basic authentication using the deployment credentials from step 1.
string loginInformation = "username:password";
byte[] byt = System.Text.Encoding.UTF8.GetBytes(loginInformation);
string encode = Convert.ToBase64String(byt);
httpWebRequest.Headers.Add(HttpRequestHeader.Authorization, "Basic " + encode);

using (HttpWebResponse response = (HttpWebResponse)httpWebRequest.GetResponse())
{
    using (System.IO.StreamReader r = new System.IO.StreamReader(response.GetResponseStream()))
    {
        string jsonResponse = r.ReadToEnd();

        // The endpoint returns a JSON array of process objects (Newtonsoft.Json).
        dynamic result = JsonConvert.DeserializeObject(jsonResponse);
        dynamic resultList = result.Children();
        foreach (var item in resultList)
        {
            Console.WriteLine(item.name + " : " + item.id);
        }
    }
}
Result: the console prints each process name and ID returned by the Kudu API.
You can also find the process ID in the portal: select your web app --> Process explorer.
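If you want to act on a detected recycle automatically, a minimal polling sketch could look like the following. Note that GetW3wpProcessId is a hypothetical helper wrapping the Kudu call shown above; it is not part of any API.

// Hypothetical watchdog loop: remember the last process ID returned by the
// Kudu /api/processes call and treat a change as a recycle/restart.
int? lastKnownId = null;
while (true)
{
    int currentId = GetW3wpProcessId(); // hypothetical wrapper around the code above
    if (lastKnownId.HasValue && lastKnownId.Value != currentId)
    {
        // The web app was recycled: clear the "task is running" flag in the
        // database here so the long task can start again without manual intervention.
        Console.WriteLine("Recycle detected: old={0}, new={1}", lastKnownId, currentId);
    }
    lastKnownId = currentId;
    System.Threading.Thread.Sleep(TimeSpan.FromMinutes(5));
}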
Related
We build and deploy our web application to our dev environment automatically every night (using VSTS). When we come into the office in the morning, the first person to access the application has to wait an extended period for each page to load the first time. Subsequent loads are very fast.
The problem has a greater impact in our live environment where, after a deployment, it is potentially an end user who is the first person to access the application and who complains of slowness. To mitigate this, a member of the team currently accesses every page of the application manually after each deployment to the live environment so that they 'pre-load' every page, which works, but is obviously time-consuming!
I've done a fair bit of searching on the subject, and have configured the appropriate Application Pool on our IIS server (IIS 8.5) so that its Start Mode is set to "AlwaysRunning". I've also edited our applicationHost.config file and set the appropriate Sites with the preloadEnabled="true" attribute. I did this after reading the instructions in this very helpful Microsoft documentation.
However, if I'm reading that documentation correctly, any pre-loading of the website which might alleviate the issue we're having (and I'm not even certain that this is the kind of pre-loading that I'm thinking of) only takes place when the server, the IIS service, or the Application Pool is restarted. This isn't happening in our case. We need the pre-loading to take place following a deployment of the application to the IIS server.
Is there a way to automate this pre-loading?
One way of doing this would be to perform an HTTP request automatically:
As soon as the app is deployed (by running a task from the deploying machine)
Before the application pool has a chance to shut itself down (using Task Scheduler, for instance)
Personally, I use a tool that is run in both cases to keep the site warmed up.
Advantages
Robust control over how and when this warm-up is executed.
It's completely independent from any IIS or web.config setup.
Disadvantages
Generates "bogus" log information.
Keeps the app permanently in memory (the Pool would never time-out, essentially wasting server resources for sites with a low # of visitors).
Sample
Such a tool could be a simple console app written as follows:
var taskInfo = new {
    Url = "http://www.a-website-to-keep-warm.url",
    UseHostHeader = true,
    HostHeader = "www.a-website-to-keep-warm.url",
    HttpMethod = "head"
};

HttpStatusCode statusCode = HttpStatusCode.Unused;
long contentLength = 0;

try
{
    Dictionary<string, string> headers = new Dictionary<string, string>();
    HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(taskInfo.Url);
    webRequest.Method = taskInfo.HttpMethod.ToUpper();

    if (taskInfo.UseHostHeader)
        webRequest.Host = taskInfo.HostHeader;

    using (HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse())
    {
        // Did we warm up the site successfully?
        statusCode = webResponse.StatusCode;
        contentLength = webResponse.ContentLength;

        // Optionally read response headers.
        foreach (string header in webResponse.Headers)
        {
            headers.Add(header, webResponse.Headers[header]);
        }
    }

    decimal kilobytes = Math.Round(contentLength / 1024M, 1);
    Debug.WriteLine($"Got {kilobytes:F1} kB with status code \"{statusCode}\" ...");
}
catch (Exception ex)
{
    Debug.WriteLine($"taskInfo failed with exception: {ex.Message}");
}
In my case, I read a bunch of taskInfo objects from a json file and execute them asynchronously every X minutes, making sure X is lower than the Pool-timeout value. It is also run immediately after every deploy.
Because we're not interested in getting the entire content, it uses an HTTP HEAD request instead of GET. Lastly, it supports multiple sites on the same host by adding a Host header to the request.
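For the scheduling part, a minimal sketch could look like the following. It assumes a tasks.json file containing an array of task objects and uses Newtonsoft.Json; all names here are illustrative, not the actual tool described above.

using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class WarmupTask
{
    public string Url { get; set; }
    public bool UseHostHeader { get; set; }
    public string HostHeader { get; set; }
    public string HttpMethod { get; set; }
}

public static class Warmup
{
    public static async Task Main()
    {
        // Hypothetical config file with an array of warm-up targets.
        var tasks = JsonConvert.DeserializeObject<List<WarmupTask>>(
            File.ReadAllText("tasks.json"));

        while (true)
        {
            // Hit all sites in parallel, then wait less than the pool timeout.
            await Task.WhenAll(tasks.ConvertAll(Ping));
            await Task.Delay(TimeSpan.FromMinutes(15)); // below the 20-minute idle timeout
        }
    }

    private static async Task Ping(WarmupTask t)
    {
        var request = (HttpWebRequest)WebRequest.Create(t.Url);
        request.Method = t.HttpMethod.ToUpper();
        if (t.UseHostHeader)
            request.Host = t.HostHeader;

        using (var response = (HttpWebResponse)await request.GetResponseAsync())
        {
            Console.WriteLine($"{t.Url}: {response.StatusCode}");
        }
    }
}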
I have an MVC web application and a console application created as a separate project inside my web application solution. I want this console application to run as a Windows service at specified intervals. The console application sends mail to certain people. I need to include my application's URL in the mail body, redirecting to my application's login page. Since I am running this service for more than one instance, I can't hard-code the URL. Can someone please help? I tried the code below, but it returns a null value.
var site = HttpContext.Current.Request.Url.Scheme + "://" + HttpContext.Current.Request.Url.Authority + HttpContext.Current.Request.ApplicationPath.TrimEnd('/');
var url = string.Format("<a href='{0}'>Login</a>", site);
I converted this from my comment on the question.
You are trying to obtain information from HttpContext.Current in a situation where no HttpContext is available at all, because your console application is launched directly by the operating system and not in response to an incoming HTTP request (as opposed to the request handling in your MVC application). There simply is no HTTP context to use, hence HttpContext.Current is null in your console application.
You have to establish your own application logic in your console application to determine which URL to use in your emails. What that logic looks like depends on the answer to the question: "what does the URL used in each email depend on, specifically?" In other words: "how should each email know which URL to use?" Once you figure that out, you can think about how to pass that dependency to your Windows service.
Example (I do not know if it describes your case):
there are several web applications on different URLs
each of these web applications can add an email to the queue to be sent
a Windows service (console app) is scheduled to run once in a while and processes the queue by sending the emails; each email has to carry the URL of the application from which it was added
Assuming the example above, you can simply add the email to the queue together with the URL of the application (instead of just the email), and then retrieve that information from the queue in your console application. Each email then has an associated URL. How the queue itself is implemented (SQL, file, ...) is irrelevant.
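As a rough illustration of that idea, here is a minimal sketch. The QueuedEmail type and the EmailQueue persistence helper are made up for the example; only the HttpContext usage reflects the code from the question.

// Illustrative queue item: the web app records its own base URL when enqueuing,
// so the console app never needs an HttpContext.
public class QueuedEmail
{
    public string Recipient { get; set; }
    public string BaseUrl { get; set; } // e.g. "https://server1/app"
}

// In the MVC application (an HttpContext IS available here):
public void Enqueue(string recipient)
{
    var request = System.Web.HttpContext.Current.Request;
    var item = new QueuedEmail
    {
        Recipient = recipient,
        BaseUrl = request.Url.Scheme + "://" + request.Url.Authority
                  + request.ApplicationPath.TrimEnd('/')
    };
    EmailQueue.Add(item); // hypothetical persistence (SQL table, file, ...)
}

// In the console app / Windows service (no HttpContext needed):
public void SendPending()
{
    foreach (QueuedEmail item in EmailQueue.GetPending()) // hypothetical
    {
        string link = string.Format("<a href='{0}'>Login</a>", item.BaseUrl);
        // compose and send the mail using the link ...
    }
}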
I am working on an ERP ASP.NET MVC 5 web application deployed under IIS 7. Now I want to implement a new scanning service, which mainly uses PowerCLI and PowerShell scripts, to scan our network for servers and VMs and get their specifications and statuses.
So I am thinking of the following approach:
1. Since the scanning should be accessible only to specific users and requires the hosting server to have PowerCLI and other tools installed, I want to create a new ASP.NET MVC 5 web application and deploy it under IIS 7 instead of modifying my current ERP system. The new application will have the following action method, which will do the scan:
public ActionResult ScanServer(string token)
{
    // validate the token, do the scan,
    // then send an email with the scanning result
    return new EmptyResult(); // added so the snippet compiles
}
2. Now, inside my current ERP system, I can manually initiate the scan by calling the above action method as follows:
[HttpPost]
[CheckUserPermissions(Action = "", Model = "Admin")]
public ActionResult Scan()
{
    try
    {
        string currentURL = System.Web.Configuration.WebConfigurationManager.AppSettings["scanningURL"];
        using (WebClient wc = new WebClient())
        {
            string url = currentURL + "home/scanserver?token=*******";
            var json = wc.DownloadString(url);
            TempData["messagePartial"] = string.Format("Scan has been completed. Scan report generated");
        }
    }
    catch (WebException ex)
    {
        TempData["messageDangerPartial"] = string.Format("scanningservice can not be accessed");
    }
    catch (Exception e)
    {
        TempData["messageDangerPartial"] = string.Format("scan can not be completed");
    }
    return RedirectToAction("Index"); // added so the method compiles; adjust to your flow
}
Now I did a quick test where I manually started the scan from the ERP, and the scanning service deployed under IIS worked well.
But I have these questions:
The scanning service might take 20-30 minutes to complete. From an architecture point of view, is my current approach valid? I mean, initiating a scan by calling an action method from another application?
Can I force the scanning service web application to call its own action method on a schedule (for example, every 4 hours)?
Thanks
Your best option would be to write a Windows service to install on the web server alongside the web app. This Windows service can use threads or a timer to execute a long-running task (such as scanning your network) at a specified interval and send an email when finished.
You can talk to your service from the app using the database, a config file, or maybe even a registry entry.
If this will not work for you, you can look into task scheduling libraries such as Quartz.NET. If you do use a Windows service, I recommend the excellent Topshelf, which makes it easy to create and deploy one. Here is a nice blog post I found by Scott Hanselman that may help.
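To sketch what such a service might look like (a rough outline using Topshelf and a System.Timers.Timer; the service name, interval, and RunScan body are placeholders, not your actual scanning code):

using System;
using System.Timers;
using Topshelf;

// Illustrative long-running service: runs a scan every 4 hours.
public class ScanService
{
    private readonly Timer _timer = new Timer(TimeSpan.FromHours(4).TotalMilliseconds);

    public void Start()
    {
        _timer.Elapsed += (s, e) => RunScan();
        _timer.Start();
        RunScan(); // also scan once at startup
    }

    public void Stop()
    {
        _timer.Stop();
    }

    private void RunScan()
    {
        // run the PowerCLI/PowerShell scan here and email the results
    }
}

public class Program
{
    public static void Main()
    {
        HostFactory.Run(x =>
        {
            x.Service<ScanService>(s =>
            {
                s.ConstructUsing(name => new ScanService());
                s.WhenStarted(svc => svc.Start());
                s.WhenStopped(svc => svc.Stop());
            });
            x.RunAsLocalSystem();
            x.SetServiceName("NetworkScanService");
            x.SetDescription("Scans the network for servers and VMs every 4 hours.");
        });
    }
}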
I'm running an ASP.NET web application that should start a PowerShell script on the server. A lot of domain rights are needed to run this PowerShell script, so I run the app pool under a user that has all the rights.
But when I start the PowerShell script, I always get an error that access is denied.
Does anyone have an idea how to solve the problem?
When I start a process as described, does the process run under the user context of the app pool or under the user context of the user who is logged in to the ASP.NET web application?
I've tried two methods:
1.
// Requires the System.Management.Automation assembly (PowerShell SDK).
string cmdArg = "C:\\Scripts\\test.ps1 " + username;

Runspace runspace = RunspaceFactory.CreateRunspace();
runspace.Open();

Pipeline pipeline = runspace.CreatePipeline();
pipeline.Commands.AddScript(cmdArg);
// Merge the error stream into the output stream so errors are captured too.
pipeline.Commands[0].MergeMyResults(PipelineResultTypes.Error, PipelineResultTypes.Output);

Collection<PSObject> results = pipeline.Invoke();
runspace.Close();

StringBuilder stringBuilder = new StringBuilder();
foreach (PSObject obj in results)
{
    stringBuilder.AppendLine(obj.ToString());
    string test = Environment.UserName; // check which user the code runs as
}
return results[0].ToString();
2.
string cmdArg = "C:\\Scripts\\test.ps1 " + username;

Process myProcess = new Process();
ProcessStartInfo myProcessStartInfo = new ProcessStartInfo("powershell.exe", cmdArg);
myProcessStartInfo.UseShellExecute = false;
myProcessStartInfo.RedirectStandardOutput = true;
myProcess.StartInfo = myProcessStartInfo;
myProcess.Start();

StreamReader myStreamReader = myProcess.StandardOutput;
myProcess.WaitForExit();
string myString = myStreamReader.ReadLine();
return myString;
OK, you think running the app pool with such broad permissions is not best practice.
What about putting a web service in between? The web service would run in an application pool that is only reachable from localhost.
Update
OK, I've written an ASP.NET web service. The web service runs in an application pool with all rights but is only reachable from localhost. The web service contains the code to start the script. The ASP.NET MVC 3 web application runs in an application pool with nearly no rights.
But when the web method is executed, I always get an error telling me that I don't have enough rights. I tried setting impersonation to false in the web.config, but without success.
Does anyone know how to solve this problem?
Update:
I've read out the current user who executes the PowerShell script when I start it from the web service. It says it is the user who has all the rights. But the script throws errors like: "You cannot call a method on a null-valued expression."
Then I tried to run the script with runas as a low-privileged user. I get the same errors.
Then I tried to run the script with runas as the same user as in the web service, and everything worked!
Can anyone explain this phenomenon?
And what is the difference between my code above and a runas (same user context)?
Thanks a lot!
Starting a new process in an HTTP request is not great for performance, and it may also be a security risk. However, before ASP.NET and other modern web servers were available, the only way to serve content (besides static files) was to execute a program in a separate process.
The API for doing this is called the Common Gateway Interface (CGI) and is still supported by IIS. If configured correctly, you can execute a program on each request.
I'm not sure that you can use CGI to execute a script directly, but you can create an ISAPI filter that will execute PowerShell for files having the extension .ps1. This is basically how, for instance, php.exe is executed in a new process when a file with the extension .php is requested.
Enabling executable content can and should be done on a folder-by-folder basis to limit the security risk. In general, you should avoid mixing different kinds of content, i.e., it should not be possible to both view a script and execute it.
If your intention is to be able to remotely run PowerShell scripts and not much else, it should also be easy to write a small HTTP server in PowerShell, completely removing IIS and ASP.NET from the equation.
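To make the self-hosting idea concrete, here is a rough sketch. The answer suggests writing the server in PowerShell itself; for consistency with the rest of this thread, the same idea is shown in C# using HttpListener. The port, URL prefix, and script path are placeholders. Run the process under the account that has the required domain rights, and keep it bound to localhost.

using System;
using System.Diagnostics;
using System.Net;
using System.Text;

// Tiny self-hosted HTTP server that runs a fixed PowerShell script and returns
// its output, bypassing IIS and ASP.NET entirely.
class PsHttpServer
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8081/run/"); // localhost-only binding
        listener.Start();
        Console.WriteLine("Listening on http://localhost:8081/run/ ...");

        while (true)
        {
            HttpListenerContext ctx = listener.GetContext(); // blocks until a request arrives

            var psi = new ProcessStartInfo("powershell.exe",
                "-NoProfile -ExecutionPolicy Bypass -File C:\\Scripts\\test.ps1")
            {
                UseShellExecute = false,
                RedirectStandardOutput = true
            };

            string output;
            using (Process p = Process.Start(psi))
            {
                output = p.StandardOutput.ReadToEnd();
                p.WaitForExit();
            }

            byte[] body = Encoding.UTF8.GetBytes(output);
            ctx.Response.ContentType = "text/plain";
            ctx.Response.OutputStream.Write(body, 0, body.Length);
            ctx.Response.Close();
        }
    }
}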
I suppose this merely depends on the impersonation settings: if impersonation is enabled (<identity impersonate="true" /> in web.config), the currently logged-in user is used; otherwise, the app pool user.
Below is my code from an ASP.NET service which tries to run an external exe. It works fine from Visual Studio on Windows 7, but fails on my server (Server 2008).
MyApp.exe reports back an error that the account under which it runs doesn't have sufficient privileges.
List<ProcInfo> allProcesses = new List<ProcInfo>();

ProcessStartInfo pInfo = new ProcessStartInfo();
pInfo.FileName = binPath + @"\myApp.exe";
pInfo.WindowStyle = ProcessWindowStyle.Hidden;
pInfo.CreateNoWindow = true;
pInfo.UseShellExecute = false;
pInfo.RedirectStandardOutput = true;

string exitMsg = "";
int exitCode = 1;

try
{
    using (Process proc = Process.Start(pInfo))
    {
        // Read the output before waiting to avoid deadlocking on a full pipe.
        exitMsg = proc.StandardOutput.ReadToEnd();
        proc.WaitForExit(1000);
        exitCode = proc.ExitCode;
    }
}
catch (Exception ex) // added so the fragment compiles; handle as appropriate
{
    exitMsg = ex.Message;
}
The application pool on the server runs under an account with sufficient privileges, and I also tried using that same account in code to start the process with those credentials, and still nothing.
I have been told that the account under which the ASP.NET worker process runs imposes some additional limitations, so even if the application pool runs under an appropriate account, you still won't have sufficient privileges.
I also found something about using P/Invoke and Win32 API calls as the only way to run external code from an ASP.NET service, but I don't have any Win32 API knowledge, nor did I find examples of this.
I would be very grateful for any tip/example of how to run an external exe under a specified account from an ASP.NET service.
If the account the worker process is running under lacks sufficient privileges, then you have a few options.
You can either use impersonation in your code:
WindowsIdentity.Impersonate Method
Or configure IIS to run the application under a user account with the required privileges.
Here is an article which explains different methods of impersonation security:
Understanding ASP.NET Impersonation Security
If you do not feel confident implementing the user impersonation code yourself, here is a link to a codeproject article:
A small C# Class for impersonating a User
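As a rough illustration of the impersonation approach on .NET Framework (not the code from the linked articles): log on as a specific user via the Win32 LogonUser API, then execute work inside the impersonation context. The user name, domain, password, and exe path below are placeholders.

using System;
using System.Runtime.InteropServices;
using System.Security.Principal;

class ImpersonationExample
{
    // Win32 API for obtaining a user token (advapi32.dll).
    [DllImport("advapi32.dll", SetLastError = true)]
    static extern bool LogonUser(string user, string domain, string password,
        int logonType, int logonProvider, out IntPtr token);

    const int LOGON32_LOGON_INTERACTIVE = 2;
    const int LOGON32_PROVIDER_DEFAULT = 0;

    public static void RunAs(string user, string domain, string password)
    {
        IntPtr token;
        if (!LogonUser(user, domain, password,
                LOGON32_LOGON_INTERACTIVE, LOGON32_PROVIDER_DEFAULT, out token))
        {
            throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
        }

        var identity = new WindowsIdentity(token);
        using (WindowsImpersonationContext ctx = identity.Impersonate())
        {
            // Code here runs as the impersonated user. Note that a child process
            // started with Process.Start still inherits the original process token;
            // to start the exe itself as another user, set ProcessStartInfo.UserName,
            // .Domain and .Password instead of relying on impersonation.
            System.Diagnostics.Process.Start(@"C:\path\myApp.exe");
        } // impersonation is reverted when the context is disposed
    }
}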