I have a very urgent requirement. I have an ASP.NET application on .NET Framework 4.0 built with the MVC architecture. In the application I call a Perl script to copy data from MySQL to SQL Server in the backend. When I run the application from Visual Studio 2010, the Perl script runs successfully and the data is copied. But when I deploy the same application on IIS 7.5, nothing changes and the Perl script does not appear to run. I tried printing each step of the code and found that all the paths are correct. The Perl script is run via a batch file.
Below is the code to start the process which runs a batch file which in turn runs the Perl script:
string strPath = string.Empty;
string strDirectory = string.Empty;
try
{
    // "/k" makes cmd.exe run the command and then stay open; the path of the batch file comes from configuration.
    strPath = "/k " + ConfigurationManager.AppSettings["UploadTLInfo"];
    strDirectory = ConfigurationManager.AppSettings["WorkingDirectory"];

    ProcessStartInfo psi = new ProcessStartInfo("cmd.exe", strPath);
    psi.UseShellExecute = false;
    psi.WorkingDirectory = strDirectory;
    //psi.CreateNoWindow = true;
    //psi.WindowStyle = ProcessWindowStyle.Hidden;

    Process p = new Process();
    p.StartInfo = psi;
    p.Start();
}
catch (Exception)
{
    throw; // rethrow without resetting the stack trace ("throw ex;" would lose it)
}
Most probably, your issue is related to user permissions: when you run from VS, you are probably using the ASP.NET Development Server, which uses the current user's credentials, while under IIS the application runs as NETWORK SERVICE (or a similar system account) that may have limited permissions, causing the issue. Another (but less likely) cause is that the batch file and/or Perl rely on environment variables that are not defined at the machine level.
One solution would be to run the process impersonating another user that has the required permissions and/or environment (see this to run the process as another user).
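As a rough sketch of that approach (not the asker's code): ProcessStartInfo can carry explicit credentials for the child process. The account name, the password handling, and the switch to /c are assumptions here; a dedicated low-privilege account with rights to the batch file, Perl, and both databases would be the usual choice.

using System.Configuration;
using System.Diagnostics;
using System.Security;

// Build a SecureString for the password. The literal below is a placeholder;
// in practice read it from a protected config section or a secrets store.
var password = new SecureString();
foreach (char c in "placeholder-password")
{
    password.AppendChar(c);
}

var psi = new ProcessStartInfo("cmd.exe", "/c " + ConfigurationManager.AppSettings["UploadTLInfo"])
{
    UseShellExecute = false,   // required when supplying credentials
    WorkingDirectory = ConfigurationManager.AppSettings["WorkingDirectory"],
    Domain = "YOURDOMAIN",     // placeholder domain
    UserName = "BatchRunner",  // placeholder account
    Password = password,
    LoadUserProfile = true,    // pick up the account's per-user environment and settings
    CreateNoWindow = true
};

using (var p = Process.Start(psi))
{
    p.WaitForExit();           // wait so a failing script surfaces before the request ends
}

Note the switch is /c rather than /k, so cmd.exe exits when the batch file finishes; on a server there is no interactive window worth keeping open.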
Related
I have a PowerBuilder.NET console application that uses SQLNCLI10 to connect to a SQL Server 2008 R2 instance. This application needs to be executed from an ASP.NET MVC website hosted on the same server, and its output read.
This is the code for executing the application from the MVC site:
var proc = new System.Diagnostics.Process
{
StartInfo = new System.Diagnostics.ProcessStartInfo
{
FileName = ConfigurationManager.AppSettings["PB_EXE"],
Arguments = Credentials.ClientId + " " + reference,
UseShellExecute = false,
RedirectStandardOutput = true,
WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden
}
};
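For completeness, the StartInfo above only configures the process; a minimal sketch of how it might be started and its output read (the usual Start/ReadToEnd/WaitForExit pattern, nothing PowerBuilder-specific):

// Start the configured process, capture everything it writes to stdout,
// then wait for it to exit so the exit code is meaningful.
proc.Start();
string output = proc.StandardOutput.ReadToEnd();
proc.WaitForExit();

// The exit code and raw output are often the quickest clue when an
// executable behaves differently under IIS than it does from a .bat file.
System.Diagnostics.Debug.WriteLine("Exit code: " + proc.ExitCode);
System.Diagnostics.Debug.WriteLine(output);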
When the application is executed from the MVC site, the PBTransaction object returns the generic "Transaction not connected" SQL error.
The thing is, the application runs fine from a .bat file that calls the executable with parameters. What's more, a simple WinForms app using the same code above gets the transaction connected successfully.
I have already tried setting the application pool of the website to the server's administrator account, with the same results.
Here is what I'd check first to troubleshoot connectivity issues in a PB.NET console application:
- Put some logging into the console application and examine the Transaction object immediately before and after attempting the connection. It will likely provide the information needed to identify the culprit.
- Make sure the PowerBuilder database connectivity run-time libraries (e.g. PBORA010.DLL) are installed on the server and accessible to the account running the application.
- If the connection uses ODBC/JDBC, make sure any DSNs needed on the server are accessible to the console application.
I have a Web API application that needs to run a Python script, which in turn runs a Perl script, does some other stuff, and returns its output results.
The way I do this is by starting a Process:
var start = new ProcessStartInfo()
{
    FileName = _pythonPath,   // e.g. @"C:\Python27\python.exe"
    Arguments = arguments,    // e.g. @"D:\apps\scripts\Process.py"
    UseShellExecute = false,
    RedirectStandardOutput = true,
    RedirectStandardError = true
};
using (Process process = Process.Start(start))
{
    using (StreamReader reader = process.StandardOutput)
    {
        var result = reader.ReadToEnd();
        var err = process.StandardError.ReadToEnd();
        process.WaitForExit();
        return result;
    }
}
The script tries to connect to a Perforce server using the P4 Python API, and the Perl script calls a P4 command as well. When running this code from a console application, everything goes fine: the program automatically picks up the Perforce settings (I've got a P4V client with all the settings specified). But when running from ASP.NET Web API, it doesn't get the settings and says that it cannot connect to the perforce:1666 server (I guess this is the default value when no setting is specified).
I understand that not many people use Perforce, especially in this way, and can help here, but I would like to know what the difference is between running this script from a console app and from a Web API app that might cause this different behaviour.
One of the most obvious differences between running code from a console application and running code in IIS* is that, usually, the two pieces of code will be running under different user accounts.
Frequently, if you're experiencing issues where code works in one of those situations and not the other, it's a permissions or per-user-settings issue. You can verify whether this is the case by either running the console application under the same user account that is configured for the appropriate IIS application pool, or vice versa, configuring the application pool to use your own user account, and then seeing whether the problem persists.
If you've confirmed that it's a permissions issue and/or per-user-settings, then you need to decide how you want to fix it. I'd usually recommend not running IIS application pools under your own user account - if you cannot seem to get the correct settings configured for the existing application pool user, I'd usually recommend creating a separate user account (either locally on the machine or as part of your domain, depending on what's required) and giving it just the permissions/settings that it requires to do the job.
*IIS Express being the exception here because it doesn't do application pools and the code does end up running under your own user account.
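In this particular case, the per-user settings are most likely the Perforce connection settings (P4PORT, P4USER, P4CLIENT and so on), which P4V stores for the interactive user but which the application pool account never sees. One hedged alternative to switching accounts is to pass them to the child process explicitly; the server address and names below are placeholders:

var start = new ProcessStartInfo()
{
    FileName = _pythonPath,
    Arguments = arguments,
    UseShellExecute = false,          // must be false for EnvironmentVariables to take effect
    RedirectStandardOutput = true,
    RedirectStandardError = true
};

// Supply the Perforce settings the console app normally inherits from the
// logged-in user's environment (the values here are placeholders).
start.EnvironmentVariables["P4PORT"] = "perforce-server:1666";
start.EnvironmentVariables["P4USER"] = "build_user";
start.EnvironmentVariables["P4CLIENT"] = "build_workspace";

Whether this is enough depends on how the script authenticates; per-user login tickets stored under the profile would still be missing, so treat it as a starting point rather than a full fix.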
I am currently developing a bridge between a SQL Server database and an Android database (SQLite).
To do this, I use a web service and filtered replication with SQL CE databases stored on an IIS 7 server. I developed this against IIS Express on my PC. Everything works fine on my PC, but when deployed, the replicated databases create a new subscription for each synchronization (while on IIS Express the sync works without creating a new subscription); no exceptions are thrown, nothing.
I don't understand why IIS 7 and IIS Express handle the database sync differently.
Any idea what the cause of the problem might be?
I use SQL Server CE 3.5 SP2 replication, SQL Server 2008 R2 as the publisher database, and IIS 7.
I have tried to:
-Replicate the publisher database with a small test program on the web server: it works.
-Use SqlCeReplication.LoadProperties to load my SQL CE database parameters and avoid creating a new subscription: it doesn't work.
Other details:
-When it is not the first sync, the web service does not call repl.AddSubscription(AddOption.CreateDatabase); I have already checked that.
Here is the code used to sync :
' Define the server, publication, and database names.
Dim repl As SqlCeReplication = Nothing
Try
repl = New SqlCeReplication
repl.Publisher = PublisherName
repl.PublisherLogin = login
repl.PublisherPassword = password
repl.PublisherSecurityMode = SecurityType.DBAuthentication
repl.PublisherDatabase = My.Settings.DBName
repl.Publication = PublicationName
repl.InternetUrl = "https://iis.mydomain.com:444/sql/sqlcesa35.dll"
repl.InternetLogin = "sync"
repl.InternetPassword = "dfssd"
repl.Subscriber = "MobileApp - " & login
repl.SubscriberConnectionString = "Data Source=" & My.Settings.folderDB & "\" & subscriptionName & ".sdf"
If Not File.Exists(My.Settings.folderDB & "\" & subscriptionName & ".sdf") Then
repl.AddSubscription(AddOption.CreateDatabase)
End If
repl.Synchronize()
Catch err As SqlCeException
' Note: the exception is swallowed here, so any sync failure is silent.
Finally
repl.Dispose()
repl = Nothing
End Try
Thanks in advance
I found the answer myself...
I don't really know why, but if you want to do merge replication with SQL CE you need to enable "Load User Profile" on the IIS website's application pool; otherwise a new subscription is created on every sync...
I have been trying to run a PowerShell script from ASP.NET, with no success, for a few days now.
The C# is:
using (var process = new Process())
{
    ProcessStartInfo startInfo = new ProcessStartInfo();
    startInfo.FileName = @"powershell.exe";
    startInfo.Arguments = "arguments that call the script here";
    startInfo.RedirectStandardOutput = false;
    startInfo.RedirectStandardError = false;
    startInfo.UseShellExecute = true;
    startInfo.CreateNoWindow = true;
    process.StartInfo = startInfo;
    process.Start();
}
The PowerShell script it calls contains the following:
robocopy "\\network-server\location" "C:\localfolder" "testmovefile.txt"
Obviously the problem here would be having the proper credentials. I have tried all sorts of impersonation approaches, both from C# and at the script level. I tried doing this in the PowerShell script:
$username = "domain\user"
$password = ConvertTo-SecureString -String "password" -AsPlainText -Force
$pp = new-object -typename System.Management.Automation.PSCredential -argumentlist $username,$password
start-process powershell -argument "C:\localfolder\CopyFile.ps1" -Credential $pp
It works when I run the script in the PowerShell console locally, even under an account that has no permissions on the network location. However, when it is called from the web app, nothing happens.
The app pool identity is just set to the default app pool identity, though. I found out that if I change the identity to a custom account with the proper rights, it works.
I am still looking for a different solution. I want a scenario where you can change anything in the script and it will still run. Anything is OK as long as it does not change the app pool identity.
I tried these as well:
http://huddledmasses.org/using-alternate-credentials-with-the-filesystem-in-powershell/
using a runspace in C# instead of a Process, together with an impersonator: How do you impersonate an Active Directory user in Powershell?
But it still doesn't work; I keep getting access denied. The question is: is it possible to make this work by impersonating someone inside PowerShell?
App pool identities have very limited access to the local file system (and none outside the local computer). You will need to modify ACLs on the file system to give the identities the access they need.
In Server 2008 (or Vista) this has to be done from the command line (e.g. icacls.exe) as the permissions GUI does not support app pool identities; in later versions it can be done with the GUI.
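If you would rather grant that access from code (for example in an elevated deployment step) than remember the icacls syntax, the same ACL change can be made with System.Security.AccessControl. A sketch, where the app pool name and target folder are assumptions:

using System.IO;
using System.Security.AccessControl;

// Grant the app pool's virtual identity modify rights on a folder.
// "IIS AppPool\MyAppPool" and the path are placeholders; substitute your own.
var dir = new DirectoryInfo(@"C:\inetpub\app-data");
DirectorySecurity security = dir.GetAccessControl();
security.AddAccessRule(new FileSystemAccessRule(
    @"IIS AppPool\MyAppPool",
    FileSystemRights.Modify,
    InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
    PropagationFlags.None,
    AccessControlType.Allow));
dir.SetAccessControl(security);

This must run under an account that is allowed to change the folder's ACL, not under the app pool itself.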
Process Monitor is a good tool for working out where access is being blocked.
However if you need to access network resources this will not work. App pool identities are purely local, they have no meaning on the network. You need to use a domain account with the applicable access (or multiple local accounts with the same name and password).
I want to use SDelete after some code is run on an ASP.NET page. SDelete is a command-line tool. My specific question is: has anyone been able to run SDelete from an ASP.NET page? The more generic question would be: how do you run a command-line utility from an ASP.NET page?
thanks
You can run a command-line utility using the Process class:
Process myProcess = new Process();
myProcess.StartInfo.UseShellExecute = false;
// You can start any process, HelloWorld is a do-nothing example.
myProcess.StartInfo.FileName = "C:\\HelloWorld.exe";
myProcess.StartInfo.CreateNoWindow = true;
myProcess.Start();
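For the SDelete case specifically, the same pattern might look like the sketch below; the install path, the -accepteula switch, and the -p pass count are assumptions to verify against your SDelete version (the EULA point is discussed further down).

// Securely delete a file with SDelete; path, switches and target file are placeholders.
Process sdelete = new Process();
sdelete.StartInfo.FileName = @"C:\Tools\sdelete.exe";
sdelete.StartInfo.Arguments = "-accepteula -p 3 \"C:\\Temp\\secret.txt\"";
sdelete.StartInfo.UseShellExecute = false;
sdelete.StartInfo.CreateNoWindow = true;
sdelete.StartInfo.RedirectStandardOutput = true;
sdelete.Start();
string sdeleteOutput = sdelete.StandardOutput.ReadToEnd(); // SDelete reports what it processed
sdelete.WaitForExit();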
One more example, closer to ASP.NET, where I wait for the process to end and read its output:
Process compiler = new Process();
compiler.StartInfo.FileName = "c:\\hello.exe";
compiler.StartInfo.StandardOutputEncoding = System.Text.Encoding.UTF8;
compiler.StartInfo.UseShellExecute = false;
compiler.StartInfo.CreateNoWindow = false;
compiler.StartInfo.RedirectStandardError = true;
compiler.StartInfo.RedirectStandardOutput = true;
compiler.StartInfo.RedirectStandardInput = true;
// here I run it
compiler.Start();
compiler.StandardInput.Flush();
compiler.StandardInput.Close();
// here I get the output
string cReadOutput = compiler.StandardOutput.ReadToEnd();
compiler.WaitForExit();
compiler.Close();
Aristos' answer will work in cases where the user privileges are in order and the Sysinternals EULA has been acknowledged. By that, I mean the sdelete.exe utility from Sysinternals will run under the ASP.NET account assigned in IIS; if that account doesn't have the proper permissions and hasn't accepted the pop-up EULA, the file isn't deleted. I'm facing that very issue right now.
You can specify the domain/user/password used to run the process. That is outlined here:
http://social.msdn.microsoft.com/Forums/hu-HU/netfxbcl/thread/70b2419e-cb1a-4678-b2ae-cedcfe08d06f
The author of that thread had similar problems, which he cleared up by changing the ownership of the sdelete.exe file.
This thread also has some information about logging in as the user used to execute the process and accepting the SysInternals EULA:
sdelete.exe is not working with cfexecute
However, that isn't feasible if you plan on using the built-in ASP.NET system accounts, since those accounts don't allow that type of login. I may be forced to create a separate user that I can log in with to accept the EULA, then specify those credentials to run the process. Unfortunately, in my case I may not have the option of creating users on the production server.
There are ways to force the EULA acceptance with a command line parameter, or a simple registry entry. But I think that only works for "regular" users, not the built-in system accounts.
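For reference, the registry entry mentioned above is per user: Sysinternals tools record acceptance in the current user's hive. A hedged sketch of setting it from code (the key and value names follow the usual Sysinternals convention; run the tool once interactively and check the registry to confirm them), with the caveat above that the built-in identities may not have a normal HKCU profile:

using Microsoft.Win32;

// Mark the SDelete EULA as accepted for whichever account runs this code.
using (RegistryKey key = Registry.CurrentUser.CreateSubKey(@"Software\Sysinternals\SDelete"))
{
    key.SetValue("EulaAccepted", 1, RegistryValueKind.DWord);
}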