I have a command-line program which runs fine from the DOS window interactively. When I try to execute the same command and arguments from within an ASP.NET app, it fails. The web app has Script & Execute permissions. For testing purposes, I gave full control permissions to the Everyone group for the whole web app folder and for the folder containing the .exe file (which resides in Program Files). The web app is using Windows integrated security (no anonymous), so it should use my credentials and I am an admin. Running on Windows XP.
The code below runs the command. The filename and arguments look good. However, p.ExitCode is 1 instead of 0. How do I troubleshoot and find out why the process is failing? I looked at the process in the debugger and I see some 'System.InvalidOperationException' exceptions, but that doesn't tell me much. Why InvalidOperationException?
Process p = new Process();
p.StartInfo.FileName = filename;
p.StartInfo.Arguments = arguments;
p.StartInfo.CreateNoWindow = true;
p.StartInfo.UseShellExecute = true;
p.StartInfo.RedirectStandardOutput = false;
p.StartInfo.RedirectStandardError = false;
p.Start();
p.WaitForExit();
if (0 != p.ExitCode)
{
    ApplicationException ex = new ApplicationException(
        string.Format("\"{0} {1}\" returned ({2})", filename, arguments, p.ExitCode));
    throw ex;
}
Unfortunately, there is a variety of things an app can do that cause problems when it is run by another app; with no further info there are simply too many possibilities.
You'll probably have to redirect standard output and/or standard error to find out what's going wrong (or, if the command-line program is .NET, you'll likely have errors in your Windows Event Log).
For a simple test you could do something like:
p.StartInfo.UseShellExecute = false;   // redirection only works when this is false
p.StartInfo.RedirectStandardError = true;
p.Start();
string err = p.StandardError.ReadToEnd();   // read before WaitForExit to avoid a full-buffer deadlock
p.WaitForExit();
Note that you'll need a more involved threaded solution for real use: if the console app writes too much data to standard output or standard error, it will fill the buffer and hang. I can help you out with a more robust solution for console output if you need it, but this will likely get you your debug info.
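For reference, here is a minimal sketch of that event-based approach, assuming the same filename and arguments variables as above:
using System.Diagnostics;
using System.Text;

Process p = new Process();
p.StartInfo.FileName = filename;
p.StartInfo.Arguments = arguments;
p.StartInfo.UseShellExecute = false;        // required for redirection
p.StartInfo.CreateNoWindow = true;
p.StartInfo.RedirectStandardOutput = true;
p.StartInfo.RedirectStandardError = true;

StringBuilder output = new StringBuilder();
StringBuilder error = new StringBuilder();

// The event handlers drain both streams as the child writes,
// so neither buffer can fill up and block the child process.
p.OutputDataReceived += delegate(object sender, DataReceivedEventArgs e)
    { if (e.Data != null) output.AppendLine(e.Data); };
p.ErrorDataReceived += delegate(object sender, DataReceivedEventArgs e)
    { if (e.Data != null) error.AppendLine(e.Data); };

p.Start();
p.BeginOutputReadLine();
p.BeginErrorReadLine();
p.WaitForExit();
// output and error now hold everything the child wrote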
"The web app is using Windows integrated security (no anonymous) so it should use my credentials and I am an admin"
No, that's not correct. With the default settings on Windows XP the worker process still runs under the ASPNET account, unless you enable ASP.NET impersonation.
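If you really do want requests to run under the caller's Windows account, impersonation is enabled in web.config, roughly like this:
<!-- Sketch: with Integrated Windows authentication plus impersonation,
     the request thread runs as the authenticated caller instead of ASPNET. -->
<system.web>
  <authentication mode="Windows" />
  <identity impersonate="true" />
</system.web>
Note that even then, a child process started with Process.Start will normally inherit the worker-process identity rather than the impersonated thread token, unless you supply explicit credentials on the ProcessStartInfo.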
Related
I have a SignalR server which is hosted in IIS.
There is a function in the hub which starts 600 processes in Windows and then kills them.
// Assumes proclist is a List<Process>, startInfo a prepared ProcessStartInfo,
// and feedback a string field.

// start 600 processes
for (int i = 0; i < 600; i++)
{
    try
    {
        Process myProcess = Process.Start(startInfo);
        proclist.Add(myProcess);
        Task.Delay(10).Wait();
    }
    catch (Exception e)
    {
        feedback = "Process " + i + " cannot be started: " + e.Message;
        break;
    }
    feedback = "All processes are running.";
}

// kill them
foreach (var proc in proclist)
{
    try
    {
        proc.Kill();
        Task.Delay(10).Wait();
    }
    catch (Exception e)
    {
        feedback = "Process " + proclist.IndexOf(proc) + " cannot be killed: " + e.Message;
        break;
    }
    feedback = "All Processes are killed.";
}
However, when I call this function from the client I get an exception while killing the processes:
Process 104 cannot be killed: Die Anforderung kann nicht verarbeitet werden, da der Prozess beendet wurde (The request cannot be processed because the process has already terminated.)
It seems that I can only keep 104 processes running; the rest of them terminate immediately after starting.
I tried the same thing in a Console Application, and all processes can be started and killed.
I tried to consume a lot of memory using another application and I could also keep 104 processes running.
I also checked all possible IIS configuration settings and could not find any setting related to this issue.
So I would like to ask whether anyone knows how to start more processes in an ASP.NET application.
I will appreciate it very much if someone can help me. Thanks!
I strongly suggest you do not execute 600 (or anywhere near that many) processes under ASP.NET. You will really strain the resources of the Aspnet_wp.exe process, which could hurt the performance of the IIS box.
You need to re-think the design.
If this was me, I would consider creating an external process outside of ASP.NET which could do the hard work for you. For example, you could create a Windows service (or even just a .NET console application running on the server) that waits (i.e. listens) on a file system folder for a file to be created (you can name the file anything you like, e.g. start.txt), which your website does when a request is made. That service then executes the 600 exe files for you.
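A minimal sketch of that listener, assuming an agreed drop folder (C:\jobs here) and a hypothetical StartAllProcesses() helper that launches the exe files:
using System;
using System.IO;

class JobWatcher
{
    static void Main()
    {
        // Watch the agreed folder for the trigger file the website creates.
        FileSystemWatcher watcher = new FileSystemWatcher(@"C:\jobs", "start.txt");
        watcher.Created += (sender, e) =>
        {
            File.Delete(e.FullPath);   // consume the trigger
            StartAllProcesses();       // hypothetical helper that starts the worker exe files
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Waiting for start.txt; press Enter to quit.");
        Console.ReadLine();
    }

    static void StartAllProcesses()
    {
        // launch (and later clean up) the worker processes here,
        // outside the IIS worker process
    }
}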
I'm not familiar with lasttest, so my suggestion might not be adequate. However, I do not believe you will achieve what you are looking for with your current design. It's going to hurt performance, and in fact I'm not surprised that a limit on running processes has been reached. I do not know of any documentation that states how many exe files you can run inside Aspnet_wp.exe, but that's likely because the ASP.NET team never expected anyone to attempt this.
I am trying to call an .exe file from a web application.
But I want the file to be called by the user that is impersonated via Windows authentication on the website.
Process process = new Process();
try
{
    process.StartInfo.UseShellExecute = false;
    process.StartInfo.FileName = ConfigData.PVDToBudgetDBexePath;
    process.StartInfo.CreateNoWindow = false;
    process.Start();
    log.Info("Process started by " + WindowsIdentity.GetCurrent().Name + " with ID: " + process.Id);
    process.WaitForExit();
    log.Info("After WaitForExit Process ID: " + process.Id);
}
catch (Exception ex)
{
    log.Error("Error executing file with message " + ex.Message);
}
Both info log texts are logged correctly. There is no error occurring.
But the called program does not do anything: no logging, no writing to the database.
The user has execute rights on the file.
When I call the same code from the development server it works fine.
I use .NET 4.5 and IIS 7.
I found posts concerning this topic only for very old versions of .NET and IIS, and those could not help me.
What am I doing wrong?
Or how can I find out what's going wrong?
Many thanks.
EDIT:
To make clearer what I intend:
I have a (self-made) exe file that imports data from Excel sheets into a database. That takes some time. While doing this it logs its progress with log4net, also into the database.
I want a UI (web application) where the user can trigger the import.
On this UI there is also an AJAX progress bar that shows the progress of the import, taken from the log table in the database.
I want at most one instance of this import process to run at the same time, so I have a function that checks whether the process is still running.
If so, it does not allow another one to be started. If not, you can start it again.
private bool IsRunning(string name)
{
    // True if at least one process with this name is currently running.
    return Process.GetProcessesByName(name).Length > 0;
}
I have now solved the problem by starting the exe file via the Task Scheduler.
path = file path to the exe file
arguments = arguments to start the exe file with
using System.Linq;
using Microsoft.Win32.TaskScheduler;

using (TaskService taskService = new TaskService())
{
    var taskDefinition = taskService.NewTask();
    taskDefinition.RegistrationInfo.Author = WindowsIdentity.GetCurrent().Name;
    taskDefinition.RegistrationInfo.Description = "Runs exe file";

    var action = new ExecAction(path, arguments);
    taskDefinition.Actions.Add(action);
    taskService.RootFolder.RegisterTaskDefinition("NameOfTask", taskDefinition);

    // get the registered task back:
    var task = taskService.RootFolder.GetTasks().Where(a => a.Name == "NameOfTask").FirstOrDefault();
    try
    {
        task.Run();
    }
    catch (Exception ex)
    {
        log.Error("Error starting task in TaskScheduler with message: " + ex.Message);
    }
}
If by development server you mean the web server that is launched by Visual Studio, then this gives you a false test case, since that server is launched by Visual Studio and runs under your Windows account, while a standard configured IIS does not run under a "user" account but under a very limited system account (luckily!). Even if the user is logged in with a domain account on your website, the IIS process will not run under this account (that wouldn't make sense anyway). That is the reason why this code runs on your development server but not in IIS. Even if you get the exe to launch, it will run under the system account of IIS, since you didn't supply any account; that limited account will again run the exe differently than you expected.
You will have to use impersonation if you really want to go this way, but then you have to launch that process "impersonating" the user that is logged in on the website, assuming the account used to log in even makes sense at that point. E.g. if it is a domain account this might work, but if you use some other kind of authentication, like forms authentication, it has no meaning at the OS level and thus you cannot use those credentials for impersonation in IIS.
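For completeness, a minimal sketch (not the poster's code) of launching the exe under an explicit Windows account instead of the IIS identity; the domain, user name and password source are placeholders, and this only makes sense for a real Windows/domain account:
using System.Diagnostics;
using System.Security;

ProcessStartInfo psi = new ProcessStartInfo();
psi.FileName = ConfigData.PVDToBudgetDBexePath;  // same path as in the question
psi.UseShellExecute = false;                     // required when supplying credentials
psi.LoadUserProfile = true;
psi.Domain = "YOURDOMAIN";                       // placeholder
psi.UserName = "importUser";                     // placeholder

SecureString pwd = new SecureString();
foreach (char c in "secret")                     // placeholder; never hard-code a real password
    pwd.AppendChar(c);
psi.Password = pwd;

using (Process process = Process.Start(psi))
{
    process.WaitForExit();
}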
In my experience, and I have done this a few times, impersonation in IIS is always a bad thing and always creates issues; the same goes for launching command-line processes, by the way. Also, waiting for a process to end in your code is not really good practice. What if the process blocks? It will block the website.
Luckily there is always a better/alternative solution when you think about it. A better/possible solution here is to use message queuing, for example: you just push a message to execute the task, and on the other end an application processes the messages, which might then use this command-line tool. That application can run under any user account you want, without you having to let IIS run under a different account. Later on you must of course come back for the result of the operation, but that can be done with a callback in the background of your website. Though this solution is a little bigger than what you are trying to do, it gives a better result on almost every front (responsiveness of your site, maintainability, scalability, ...); the only thing where it is worse is the number of lines of code you will need, but that is seldom a valid factor to take into account.
If you write the application for Excel processing yourself, you can use a table in the DB as a kind of queue instead of using a message bus. Your web application then just needs to add a row with all the necessary info for the process to that table, the status and progress being part of it. Extend your processing application to monitor this table continuously; as soon as it detects a new record, it can start the necessary task and update the DB accordingly (progress, status and end result). This avoids the messaging sub-system, works equally well, and avoids you having to launch a process with impersonation, which was the evil thing to start with.
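A minimal sketch of that idea, assuming a hypothetical ImportQueue table with ExcelPath, Status and Progress columns; the web side only inserts a row, and the import application polls the table:
using System.Data.SqlClient;

// Web application: enqueue an import request and return immediately.
void EnqueueImport(string connectionString, string excelPath)
{
    using (SqlConnection con = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "INSERT INTO ImportQueue (ExcelPath, Status, Progress) VALUES (@path, 'Pending', 0)", con))
    {
        cmd.Parameters.AddWithValue("@path", excelPath);
        con.Open();
        cmd.ExecuteNonQuery();
    }
}

// Import application (console app or Windows service), in outline:
//   loop forever:
//     pick the oldest 'Pending' row and mark it 'Running'
//     import the Excel file, updating Progress as you go
//     set Status to 'Done' (or 'Failed'), then sleep a few seconds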
You can turn the Excel-processing application into a Windows service so that it runs continuously and starts with the system, but if you don't want to, there are also tools to run any command-line application as a Windows service.
This technique is much easier than impersonation and lets your website keep running in its protected environment.
I have a scheduled task set up to run Scan.aspx every 3 minutes in IE7. Scan.aspx reads data from 10 files in sequence. These files are constantly being updated. The values from the file are inserted into a database.
Sporadically, the value being read is truncated or distorted. For example, if the value in the file was "Hello World", random entries such as "Hello W", "Hel", etc. will be in the database. The timestamps on these entries appear completely random. Sometimes at 1:00 am, sometimes at 3:30 am. And some nights, this doesn't occur at all.
I'm unable to reproduce this issue when I debug the code. So I know under "normal" circumstances, the code executes correctly.
UPDATE:
Here is the aspx codebehind (in Page_Load) to read a text file (this is called for each of the 10 text files):
Dim filename As String = location
If File.Exists(filename) Then
    Using MyParser As New FileIO.TextFieldParser(filename)
        MyParser.TextFieldType = FileIO.FieldType.Delimited
        MyParser.SetDelimiters("~")
        Dim currentrow As String()
        Dim valueA, valueB As String
        While Not MyParser.EndOfData
            Try
                currentrow = MyParser.ReadFields()
                valueA = currentrow(0).ToUpper
                valueB = currentrow(1).ToUpper
                ' Insert values as a record into the DB if it does not already exist
            Catch ex As Exception
            End Try
        End While
    End Using
End If
Any ideas why this might cause issues when running multiple times throughout the day (via scheduled task)?
First, implement a logger such as log4net in your ASP.NET solution and log method entry and exit points in Scan.aspx as well as in your method for updating the DB. There is a chance this may provide some hint of what is going on. You should also check the system event log to see whether any other event is associated with your failed DB entries.
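As an illustration, a minimal sketch of that entry/exit logging (shown in C#; the log4net calls are identical in VB, and "Scan" stands in for whatever the page class is actually called):
using System;
using log4net;

public partial class Scan : System.Web.UI.Page
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(Scan));

    protected void Page_Load(object sender, EventArgs e)
    {
        Log.Info("Scan.aspx Page_Load: start");
        try
        {
            // read the 10 files and update the DB here
            Log.Info("Scan.aspx Page_Load: finished");
        }
        catch (Exception ex)
        {
            Log.Error("Scan.aspx Page_Load: failed", ex);
            throw;
        }
    }
}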
ASP.NET is not the best tool for this scenario, especially when paired with a Windows scheduled task; this is not a robust design. A more robust system would run on a timer inside a Windows Service application (a bare-bones sketch follows the list of links below). Your code for reading the files and updating the DB could be ported across. If you have access to the server and can install a Windows service, make sure you add logging to the Windows service too!
Make sure you read the "How to: Debug" article below.
The Windows Service Applications intro on MSDN has further links to:
How to: Create Windows Services
How to: Install and Uninstall Services
How to: Start Services
How to: Debug Windows Service Applications
Walkthrough: Creating a Windows Service Application in the Component Designer
How to: Add Installers to Your Service Application
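A bare-bones sketch of the timer-driven service mentioned above, assuming a ScanFilesAndUpdateDb() method ported from the Scan.aspx code; the installer classes and the Main/ServiceBase.Run call are omitted:
using System.ServiceProcess;
using System.Timers;

public class ScanService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        _timer = new Timer(3 * 60 * 1000);   // fire every 3 minutes, like the scheduled task
        _timer.Elapsed += (sender, e) => ScanFilesAndUpdateDb();
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }

    private void ScanFilesAndUpdateDb()
    {
        // read the 10 delimited files and insert the values into the database,
        // exactly as the Page_Load code in Scan.aspx does today
    }
}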
Regarding your follow-up comment about the apparent random entries that sometimes occur at 1 am and 3:30 am, you should:
Investigate the IIS log for the site when these occur and find out what hit (visited) the page at that time.
Check if there is an indexing service on the server which is visiting your aspx page.
Check if anti-virus software is installed and ascertain whether it is visiting your aspx page or impacting the ASP.NET cache; this can cause compilation issues such as file locks on the aspx page in the ASP.NET cache (a scenario for ASP.NET websites as opposed to ASP.NET web applications), which could give weird behaviour.
Find out if the truncated entries coincide with the time the files are updated: cross-reference your DB entry timestamps or logger timestamps with the time the files are updated.
Update your logger to log the entire contents of the file being read, to verify you've not got a 'junk in > junk out' scenario. Be careful with disk space on the server; run this for one night only.
Find out when the app pool that your web app runs under is recycled and cross-reference this with the time of your truncated entries; you can do this with web.config only via ASP.NET Health Monitoring.
Your code is written with a try/catch that will bury errors. If you are not going to do something useful with the caught error, do not catch it. Handle your edge cases in code, not with a try/catch.
See this try-catch question on this site.
I have the following code in a web method of a .NET 2.0 VB web service application:
<WebMethod()> _
Public Function UpdateCall(<System.Xml.Serialization.XmlElement("CallChangeRequest")> ByVal callChange As CallChangeMessage) As Boolean
    Dim result As Boolean = False
    Dim mq As System.Messaging.MessageQueue = Nothing
    Dim msg As System.Messaging.Message = Nothing
    Try
        mq = New System.Messaging.MessageQueue(System.Configuration.ConfigurationManager.AppSettings("queueName"))
        mq.Formatter = New System.Messaging.XmlMessageFormatter(New Type() {GetType(CallChangeMessage)})
        msg = New System.Messaging.Message(callChange)
        msg.Recoverable = Convert.ToBoolean(System.Configuration.ConfigurationManager.AppSettings("recoverableMessages"))
        mq.Send(msg)
        result = True
    Catch ex As Exception
    Finally
        If Not (msg Is Nothing) Then msg.Dispose()
        If Not (mq Is Nothing) Then mq.Dispose()
    End Try
    Return result
End Function
This code correctly serializes my object and writes an XML message to the message queue on my old local development machine, which was running Windows XP Pro 32-bit. It also performs correctly on the production server. After upgrading to Windows 7 64-bit, this code no longer writes the XML message to the message queue. The call to mq.Send(msg) does not raise an exception, so my web method returns true to the caller. However, no message is written to the queue; the code is just quietly failing.
The messages are marked as Recoverable, and the queue name is FormatName:DIRECT=OS:.\private$\inbounddata. I've opened up the permissions for basically everyone to do everything. If there were a permission problem, I would expect to get some sort of exception, so I don't think I have a permission problem. The web service is running in an application pool targeting the 2.0 framework with managed pipeline mode set to Classic.
Up to now, my recourse has been to deploy the code on my old development box, which I still have hanging around, but I cannot do that forever. Is there a problem with my code that is causing this to fail? Does anyone know of any issues with writing to MSMQ on Windows 7 64-bit from a 32-bit .NET 2.0 framework app?
Thanks for any help!
Don't know what happened, but I deleted the existing queue, created a new queue, and now all is well. Thanks for the suggestions.
I have a collection of sites running on a single server. The server also runs a console application which collects data and distributes this data to the websites.
I am not always able to check whether the application is running, and I would like to give the end user (a select few users!) the option to start/restart this application on the server by using a web form (click a button and the application starts).
I have got the console application to start by using the following code:
ProcessStartInfo info = new ProcessStartInfo(FileName);
Process App1 = null;
App1 = Process.Start(info);
But no console window appears and I would like the console to open a window so that if I log onto the server I can check that the application is running.
I have tried adding:
info.CreateNoWindow = false;
and a few other things, but this is not my area so I am struggling.
Any ideas how I can get the console to open in a normal window? Or am I going about this all the wrong way?
Also, is there a way of finding out whether the application is already running, and then either killing it before trying to start it, just restarting it, or not allowing the end user to do anything?
Many thanks
T
As Aristos says, the console app will open on the server...not the client.
Look here for a start on how to open a process from ASP.NET and the security implications.
If you need the client to be able to view anything I suggest having the service write to a log in a database and having the aspx page read this log.
Also, maybe write your console app as a Windows service, not just an application?
Good example here
Hope this helps.
ProcessStartInfo info = new ProcessStartInfo(FileName);
Process App1 = null;
info.CreateNoWindow = true;          // no window will be shown on the server; capture output instead
info.UseShellExecute = false;        // required for redirection
info.RedirectStandardOutput = true;
App1 = Process.Start(info);
Try using the info.UseShellExecute property.