1) Is it possible to start an R session on Linux (e.g. Rsession1) and submit multiple jobs to it in batch mode (e.g. submit job1 to Rsession1 and then, later, based on a user action, submit job2 to Rsession1)?
This is equivalent to opening an interactive R session, submitting job1, and having the user submit job2 in the same session (which remains available until the user closes it).
2) Is it possible to start two R sessions on Linux (e.g. Rsession1 and Rsession2) and submit multiple jobs in batch mode, specifying a session id at submission time?
This is equivalent to opening two interactive R sessions and submitting jobs to the different sessions by clicking on the appropriate window and submitting the job manually.
I'm not sure what your end goal is, but have you considered something like the future package, which lets R send work off to a separate background process to be completed? That way the work gets done without locking up the main R session. Via the main R session, you could launch job1 and then, while that is still being worked on, launch job2.
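A minimal sketch of that pattern (job1() and job2() here are placeholders for your own functions):

library(future)
plan(multisession)       # run futures in separate background R processes

f1 <- future(job1())     # launch job1 without blocking the main session
f2 <- future(job2())     # launch job2 while job1 may still be running

result1 <- value(f1)     # value() blocks until the result is ready
result2 <- value(f2)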
You can use save.image at the end of each job to store the workspace, and load at the beginning of the following job to restore it.
By choosing different file names, you can effectively specify a session id.
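For example (a sketch; the workspace file name plays the role of the session id):

# end of job1, logically part of "Rsession1"
save.image(file = "Rsession1.RData")

# start of job2, submitted later to the same logical session
load("Rsession1.RData")

# a second logical session simply uses a different file
save.image(file = "Rsession2.RData")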
I have a workflow with many sessions that run in parallel. When one of the sessions fails, the workflow waits for the other sessions to complete and only then fails as a whole. We have selected the option "Fail parent if this task fails", but we want the workflow to fail and stop immediately when any session fails, without waiting for the other sessions to finish.
PS: We have a Unix shell script that calls all the workflows one by one, so a solution using Unix shell scripting would be fine as well.
Does anyone have a solution for this?
The best thing you can do in Informatica is use a Control Task to abort the workflow, and have it connected from all sessions with an OR condition. Something like:
start--S1--S2--S3
        \   \   \
         \---\---\-(OR)-CTL
I am running a few scheduled jobs on an RStudio Server that I host on my home computer.
It seems there is no straightforward way to set up an email warning when something goes bust.
I am using package cronR for scheduling jobs and emayili for sending emails.
I can see that it is possible from within Linux, but I already have it all set up in R.
Is there any way I can combine these two and make this work from within R?
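One way to combine them (a sketch, assuming run_job() stands in for the body of your scheduled script and the SMTP details are replaced with your own) is to wrap the job in tryCatch() and send the alert from the error handler:

library(emayili)

# placeholder SMTP account details -- fill in your own
smtp <- server(host = "smtp.example.com", port = 465,
               username = "me@example.com", password = "secret")

tryCatch(
  run_job(),   # run_job() is a placeholder for your actual job code
  error = function(e) {
    msg <- envelope() |>
      from("me@example.com") |>
      to("me@example.com") |>
      subject("Scheduled job failed") |>
      text(conditionMessage(e))
    smtp(msg)  # a server() object is itself a function that sends the message
  }
)

Point cronR at this wrapper script instead of the bare job, and any uncaught error will trigger the email.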
How do I stop my running Control-M jobs from executing? Basically I want them to stop running and be removed from the Monitoring view.
In Control-M version 8 you can un-schedule a whole folder by selecting Manual Order as its Order Method. If it's just one specific job, you can edit the scheduling properties of that job directly and select Manual Order there too.
Right-click on your job and select "Hold". More actions, including delete, should then become available.
In Control-M V7 and V8/9, open the Desktop/Planning environment, load the scheduling table, select each job individually, un-select Months on its Scheduling tab, then Verify and Check In. This prevents the jobs from loading and executing (so they will not show in the monitoring window).
Note: in Planning, always load the whole scheduling table, not individual jobs or sub-applications, because the load overwrites the CtmServer DB.
I run SAS batch jobs on a UNIX server and usually hit the problem that, in batch, I cannot overwrite SAS datasets that were created locally by my user, without changing the authorization level of each file in Windows. Is it possible to sign on with my user id and password when initializing the batch job, so that the job gets full authorization to my own files?
Another issue is that I am not authorized to run UNIX commands via PIPE in a local remote session on the server, so I cannot terminate my own sessions. It is, on the other hand, possible to run PIPE in batch, but that only lets me terminate batch jobs. So I also wonder whether a pipe command can be run in batch under my own id and password, since the batch user is not authorized to cancel "local remote sessions" belonging to my user.
Example code for terminating process:
%let processid = 6938710;
%let unixcmd = "kill &processid";
%put executing &unixcmd;
filename unixcmd pipe &unixcmd.;

/* the pipe command only executes when the fileref is read */
data _null_; infile unixcmd; input; run;
There's a good and complete answer to your first point on the following SAS support page.
You can use the umask Unix command to set the default file-permission policy for the permanent datasets created during a SAS session (be it batch or not).
If you are launching a Unix script which invokes a SAS batch session, you can put a umask command just before the sas invocation.
Otherwise, you can adopt a more permanent solution by including the umask command in one of the places specified in the SAS support article above.
You are probably interested in something like:
umask 002
This will assign rw-rw-r-- permissions to all newly created datasets.
We have an SSIS package that is run via a SQL Agent job. We initiate the job (via sp_startjob) from within an ASP.NET web page. The user who is logged into the UI needs to be recorded by the SSIS package that they initiate, so we need to pass the user id to the SSIS package. The issue is that we cannot pass parameters to sp_startjob.
Does anyone know how this can be achieved, or of an alternative to the above approach?
It cannot be done through sp_startjob: you can't pass a parameter to a job step, so that option is out.
If you have no concern about concurrency, and given that you can't have the same job running twice at the same time anyway, you could probably hack it by changing your job step from type SQL Server Integration Services to something like an OS command. Have the OS command call a batch script that the web page creates/modifies, so that the package is started along the lines of:

dtexec.exe /file MyPackage /Set \Package.Variables[User::DomainUser].Properties[Value];\"Domain\MyUser\"

At that point, the variable DomainUser in your package would have the value Domain\MyUser.
I don't know your requirements, so perhaps you can just call into the .NET framework and start your package from the web page. You'd probably want to make that call asynchronously, though; otherwise, unless your SSIS package is very fast, users might try to navigate away, spam refresh, etc. while waiting for the page to "work".
All of this, by the way, simply pushes a value into an SSIS package; in this case, a user name. It doesn't pass along the user's credentials, so calls to things like SYSTEM_USER would still report the SQL Agent service account (or whichever account the job step runs under).