What is the right approach when users (authenticated domain admins) should be able to start batch jobs (usually exe files) from an IIS (7.x) aspx (C#) page? This is an intranet site. The batch jobs have to run on the web server as Domain Admins, while the website's application pool runs as Network Service or a similarly restricted AD account.
Approaches I can think of (and their disadvantages):
1. Start the exe with System.Diagnostics.Process.Start under another account. This feature is disabled in IIS 7.x; how do I enable it?
2. Create a scheduled task and call the Task Scheduler API. This unmanaged DLL produces Visual Studio compiler warnings because it is unsafe to call from managed code.
3. I suspect there is a better approach, because neither of the previous suggestions appears safe or robust.
I would suggest that you have either a Scheduled Task or a Windows Service that polls a common storage repository to see whether a batch job should be run, as batch jobs are typically long-running processes.
You could persist the details of which batch file and arguments to run from your ASP.NET website into a database (MySQL, Oracle, SQL Server, etc.) and then have your Windows Service / Scheduled Task poll that database at regular intervals, as in the sketch below.
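Here is a minimal sketch of the polling side, assuming a hypothetical PendingJobs table and connection string (both made up for illustration); the website's only job is to INSERT rows into that table.

```csharp
// Minimal sketch of the polling approach; call PollOnce() from a Windows
// Service timer or a scheduled console app. The PendingJobs table, its
// columns, and the connection string are assumptions for illustration.
using System;
using System.Data.SqlClient;
using System.Diagnostics;

class JobPoller
{
    const string ConnectionString = @"Server=.;Database=Jobs;Integrated Security=true";

    public static void PollOnce()
    {
        using (var conn = new SqlConnection(ConnectionString))
        {
            conn.Open();
            var cmd = new SqlCommand(
                "SELECT Id, ExePath, Arguments FROM PendingJobs WHERE Status = 'Queued'", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // The service itself runs under the privileged account,
                    // so the child process inherits those rights; the web
                    // app never needs elevated permissions.
                    Process.Start(reader.GetString(1), reader.GetString(2));
                    // Marking the row as started is omitted for brevity.
                }
            }
        }
    }
}
```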
I agree with Kane. However, if you must do it this way, see:
http://www.dotnetscraps.com/dotnetscraps/post/Run-a-batch-file-as-a-specific-User-(or-Administrator)-from-ASPNET.aspx
I'm new to ASP.NET MVC, so I'm teaching myself in order to develop a web portal for an intranet. The portal receives requests from users to deploy Virtual Machines in Azure; each request goes to an administrator, who can run a script from the portal to create the Virtual Machine.
The idea is to store the scripts in a database, so that when the administrator chooses to create "Virtual Machine 01" (from a limited set of Virtual Machine configurations), the software runs script "01" stored in the database.
Is that possible? I hope I have explained the idea well.
Also, if it is possible, can I show any error messages if something goes wrong?
Instead of using Powershell, why not manage it directly from your .NET code?
Azure provides APIs that can be called from .NET.
http://azure.microsoft.com/en-us/documentation/api/
You'd probably want to look at their Compute Management API for handling virtual machines.
http://azure.microsoft.com/en-us/documentation/api/management-compute-sdk-net/
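If it helps, here is a rough sketch of what calling those libraries looks like. It assumes the Microsoft.WindowsAzure.Management.Compute NuGet package (the Service Management libraries); type and method names vary between SDK versions, so treat them as assumptions and verify against the documentation linked above.

```csharp
// Sketch only: assumes the Microsoft.WindowsAzure.Management.Compute
// package and a management certificate uploaded to the subscription.
// Names may differ by SDK version; check the linked documentation.
using System;
using System.Security.Cryptography.X509Certificates;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Management.Compute;

class AzureVmExample
{
    static void Main()
    {
        var credentials = new CertificateCloudCredentials(
            "your-subscription-id",
            new X509Certificate2(@"C:\certs\management.pfx", "pfx-password"));

        using (var client = new ComputeManagementClient(credentials))
        {
            // List the cloud services in the subscription; creating a VM
            // goes through client.VirtualMachines.CreateDeployment(...),
            // whose many parameters are omitted here.
            foreach (var service in client.HostedServices.List())
            {
                Console.WriteLine(service.ServiceName);
            }
        }
    }
}
```

Exceptions thrown by these calls carry the error details returned by Azure, so you can catch them and surface the message to the administrator, which also covers the error-reporting part of the question.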
We have DLLs that contain hundreds of custom client processes that are kicked off from an ASP.NET application. Our clients run these processes while performing data entry, and typically there's only 1 process per client. On any given day, we might update 2 or 3 of these processes.
Currently these are all housed in a series of DLLs, which means that we are publishing our application a couple times per day. As a result, any logged-in clients get booted out of the system since the publish causes an app restart.
Is there a way that we can update these DLLs without requiring a full publish each time?
If your client processes share a common API, then you could host the DLLs separately in a WCF (or similar) service and call the client processes remotely. So basically, consider moving to a service-oriented architecture.
Check out the Managed Extensibility Framework (MEF) from Microsoft. It provides not only dependency management but also plug-in-style library loading. Most likely it's exactly what you're looking for.
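A minimal sketch of what that could look like, assuming a hypothetical IClientProcess contract and plug-in folder (both invented for illustration):

```csharp
// Plug-in loading with MEF (System.ComponentModel.Composition).
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

public interface IClientProcess
{
    string ClientId { get; }
    void Run();
}

// Each client process lives in its own DLL and exports the contract:
[Export(typeof(IClientProcess))]
public class SampleClientProcess : IClientProcess
{
    public string ClientId { get { return "client-001"; } }
    public void Run() { /* custom processing here */ }
}

public class ProcessHost
{
    [ImportMany(typeof(IClientProcess), AllowRecomposition = true)]
    public IEnumerable<IClientProcess> Processes { get; set; }

    public void Compose(string pluginFolder)
    {
        // DirectoryCatalog picks up every DLL in the folder, so updating
        // one client process means dropping in one new DLL rather than
        // republishing the whole site.
        var catalog = new DirectoryCatalog(pluginFolder);
        var container = new CompositionContainer(catalog);
        container.ComposeParts(this);
    }
}
```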
You can switch to SQL Server or State Server session state in order to preserve sessions and logged-in users across an app restart. Or store these DLLs in App_Data and load them dynamically; then, of course, you have to think of some refresh system to replace the loaded DLLs with newly uploaded ones. A sketch of the dynamic-loading part follows.
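A rough sketch of the dynamic-loading part, with the file and type names invented for illustration:

```csharp
// Loads a DLL from App_Data without taking a file lock, so a newer copy
// can be dropped in later. Note the caveat in the comments: old assembly
// versions stay in memory until the AppDomain recycles.
using System;
using System.IO;
using System.Reflection;
using System.Web;

public static class DynamicLoader
{
    public static object CreateProcessor(string assemblyFile, string typeName)
    {
        string path = Path.Combine(
            HttpRuntime.AppDomainAppPath, "App_Data", assemblyFile);

        // Loading from bytes avoids locking the file, but the CLR cannot
        // unload the old assembly, so repeated refreshes accumulate
        // assemblies until the next app restart.
        var assembly = Assembly.Load(File.ReadAllBytes(path));
        return Activator.CreateInstance(assembly.GetType(typeName, throwOnError: true));
    }
}
```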
There is no sensible way to avoid an application restart. Please note the emphasis on the word sensible.
A web application works with the database. Once a day, the database should be scanned and alerts should be sent to users.
From what I've seen, an additional project has to be created, which will be installed on the server and will work with the same database. The executable produced by this project has to be registered with the Windows Task Scheduler so that it is activated once a day.
This seems complicated and inefficient: starting an additional executable that works on the same database.
Is this the best possible way to do this?
Well, you have several options. The Windows Task Scheduler with an executable is a good one. Another possibility is to write a Windows Service that executes the task in the background; Quartz.NET is a good framework for this (see the sketch below), although the Task Scheduler might be sufficient for your scenario. One thing is for sure: it is better to perform these tasks outside of your ASP.NET application.
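For completeness, a minimal Quartz.NET sketch of a once-a-day job (2.x-style API, where IJob.Execute returns void; the job body and schedule are assumptions):

```csharp
using Quartz;
using Quartz.Impl;

public class DailyAlertJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Scan the database and send the alerts here.
    }
}

public static class SchedulerSetup
{
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<DailyAlertJob>()
            .WithIdentity("dailyAlerts")
            .Build();

        // Cron expression: fire every day at 06:00.
        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 6 * * ?")
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}
```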
I would like to run a bat script on one of the machines on the domain from my ASP.NET application. The machine with the batch script is a temporary storage machine, and the script synchronizes it with the permanent storage machine. What is the optimal way of doing this?
The thing I tried is using PsExec to run the script on the remote machine. I create a process that makes the PsExec call, and it actually does its job pretty well. However, since the ASP.NET worker process runs under an ASP.NET account with restricted privileges, I have to hard-code my domain user credentials into the PsExec call, and that's something I'd rather not do.
Is there a way to overcome this problem, or maybe some other approach that I could try?
Thanks...
You can use the <identity impersonate="true" /> setting in your Web.config to have the application run as IUSR, or you can set a username/password on the identity tag for an account you'd like to use to run the BAT file.
I had previously found some details on Impersonate over at: http://www.aspdev.org/articles/web.config/
But I'm sure a quick web search will find you even more info on Impersonate.
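For reference, a sketch of what the identity tag looks like (account name and password are placeholders):

```xml
<configuration>
  <system.web>
    <!-- Impersonate a fixed account instead of the app pool identity: -->
    <identity impersonate="true" userName="DOMAIN\BatchRunner" password="..." />
  </system.web>
</configuration>
```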
The answer above doesn't work: new processes will still be spawned under the ASP.NET account by default.
See http://support.microsoft.com/kb/889251:
"To spawn a process that runs under the context of the impersonated user, you cannot use the System.Diagnostics.Process.Start method. This is because in ASP.NET, impersonation is performed at the thread level and not at the process level. Therefore, any process that you spawn from ASP.NET will run under the context of the ASP.NET worker process and not under the impersonated context."
What are the advantages/disadvantages of running time-based jobs using:
1. Windows Services
2. Application_BeginRequest to start separate threads/timers
One disadvantage of running the jobs in the context of an ASP.NET web application is that after an app pool recycle everything has to be set up again. Any others?
To my mind, there's no real benefit to doing time-based things in a web app. Go straight to a Windows Service; you know the process should be up and running all the time.
The ASP.NET site may simply unload, and will only operate again once someone starts browsing. The lifecycle is all wrong for this; it's much 'choppier' than a service.
Lastly, services aren't very hard to create.
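To illustrate that last point, a minimal timer-based service (all names are illustrative):

```csharp
using System;
using System.ServiceProcess;
using System.Threading;

public class JobService : ServiceBase
{
    private Timer _timer;

    public JobService()
    {
        ServiceName = "JobService";
    }

    protected override void OnStart(string[] args)
    {
        // Run the job immediately, then once every hour.
        _timer = new Timer(_ => RunJob(), null, TimeSpan.Zero, TimeSpan.FromHours(1));
    }

    protected override void OnStop()
    {
        _timer.Dispose();
    }

    private void RunJob()
    {
        // Time-based work goes here; unlike an ASP.NET app, nothing
        // unloads this process when the site is idle.
    }

    public static void Main()
    {
        ServiceBase.Run(new JobService());
    }
}
```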
If you have administrative access to the server, I would either run a Windows Service or a scheduled SQL job depending on what you are trying to achieve.
It is nice to be able to stop/start and log these jobs independent of your web application. Also, if you have problems or errors in the job, it could adversely affect your website.
Finally, you are forcing the web application to go through code at every request to see if the timer has elapsed, which is an unnecessary overhead.
As I said to start with, the implementation depends on what the job is. If it is simply to update a number of database records, I'd use a scheduled job in SQL Server. If you need file I/O or access to external services, then a Windows Service might be more appropriate.
It is worth noting that you need to build your own scheduling and thread safety into a Windows Service. An alternative is to build a console application and use a tool such as FireDaemon for the scheduling.