How to get the logs from a continuously running child process in PowerShell? - unix

I am calling a child process from a process in PowerShell.
The child process will not end; it runs continuously in the background.
I need the log output produced by the child process continuously.
Process.StandardOutput.ReadToEnd() only writes the logs to a file once the child process ends.
But I need the logs continuously.
Please help me in this regard.

You can redirect the log to files as below:
start-process your_executable -ArgumentList "your arguments" -RedirectStandardOutput the_path_you_want_to_put_your_log -RedirectStandardError the_path_to_put_error_log
The logs of the child process will be written to the log files continuously; you can open the files to check them from time to time.
Edit
I also think Unix tail equivalent command in windows Powershell will help you even more: it explains how to monitor a log file in real time, for example with the Get-FileTail cmdlet from the PowerShell Community Extensions.
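For example, a minimal sketch combining the redirect with live monitoring (the executable name, arguments, and log paths are placeholders, not from the original question):

# Start the long-running child process and redirect both output streams to files.
Start-Process -FilePath "C:\tools\my_service.exe" -ArgumentList "--verbose" `
    -RedirectStandardOutput "C:\logs\child_stdout.log" `
    -RedirectStandardError "C:\logs\child_stderr.log"

# Follow the stdout log in real time, similar to Unix tail -f.
Get-Content "C:\logs\child_stdout.log" -Wait -Tail 10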

Related

How to stop a workflow immediately when a session fails?

I have a workflow that has many sessions that run in parallel to each other. When one of the sessions fails, the workflow waits for the other sessions to complete and only then does the entire workflow fail. We have selected the option "fail parent if this task fails", but we want the workflow to fail and stop immediately if any of the sessions fails, without waiting for the other sessions to finish.
PS: We have a unix shell script that calls all the workflows one by one, so if we can solve it using unix shell scripting that would be fine as well.
Does anyone have a solution for it?
The best thing you can do in Informatica is to use a Control Task to abort the workflow, and have it connected from all sessions with an OR condition. Something like:
start--S1--S2--S3
        \   \   \
         \---\---\---(OR)---CTL
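A rough sketch of how that might be wired up (the session names, the link-condition syntax shown, and the exact option labels are assumptions about a typical PowerCenter setup, not from the original answer):

link S1 -> CTL : condition  $S1.Status = FAILED
link S2 -> CTL : condition  $S2.Status = FAILED
link S3 -> CTL : condition  $S3.Status = FAILED
CTL (Control Task): "Treat the input links as" = OR, control option = "Abort top-level workflow"

With that wiring, the Control Task fires as soon as any one session fails and aborts the whole workflow without waiting for the others.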

When is a bash script passed to 'openstack server create --user-data ...' exactly executed?

I have a bash script that I want to be executed before a user can log in to the server. I cannot find any information on when this script is exactly executed for different images. Can I assume that it runs before a user is able to log in using ssh? I'm using cirros.
openstack server create --user-data before_login.sh ...
As soon as your instance boots up, the user-data script "before_login.sh" executes on it before any user logs in to the instance.
User scripts run at cloud-init's final stage, which runs as late in boot as possible. Any scripts that a user is accustomed to running after logging into a system should run correctly here.
You can check the link below for more information on cloud-init's boot stages:
https://cloudinit.readthedocs.io/en/latest/topics/boot.html
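A minimal sketch of the whole flow (the script contents, image, flavor, and server name are placeholder assumptions, not from the original question):

#!/bin/bash
# before_login.sh -- runs once at cloud-init's final boot stage, before anyone can log in
echo "provisioned at $(date)" >> /var/log/before_login.log

Then boot the instance with it:

openstack server create --image cirros --flavor m1.tiny --user-data before_login.sh my-instance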

Q: User termination of a script leads to loss of logs?

I've been trying to terminate my scripts, but I've found that I usually lose the logs and I don't know why. Specifically, they are the .xml, .log, and .report files that are produced after running a script. Is there some way I can ensure that the logs aren't deleted?
It looks like you are pressing CTRL+C twice, which forces an exit from the test run. If you press it only once, the logs will still be created.

Password-less File transfer using RCP and Expect

I have a requirement to transfer files from one server to another. I used the rcp command to do this and it was working fine. Please find the code below:
rcp tst.txt usrname#hostname:/home/username/destination_folder
I tried to automate this using Expect, so I created the shell script below:
#!/bin/bash
/usr/bin/expect -d<<EOD
spawn rcp tst.txt usrname#hostname:/home/username/destination_folder
expect "*userid#hostname's password:*"
send "mypassword\n"
EOD
I didn't get any errors while executing the shell script, but the file was not transferred. Can someone help me figure out what the issue is?
I have tried password-less transfer through key generation but with no luck, so I am trying the RCP approach.
Thanks in advance,
Vijay
After sending the password, wait for rcp to complete by expecting eof.
send "mypassword\r"
expect eof
If it is password-less, then after spawning rcp, just expect eof.
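A minimal sketch of the corrected script (the user@host form, path, and password are placeholders; the prompt pattern is an assumption about what the remote side prints):

#!/bin/bash
/usr/bin/expect -d <<EOD
# Start the copy and answer the password prompt.
spawn rcp tst.txt username@hostname:/home/username/destination_folder
expect "*password:*"
send "mypassword\r"
# Wait for rcp to finish; without this the script exits before the transfer completes.
expect eof
EOD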

Gearman worker start with the ulabox Symfony plugin

Again, I'm stuck on Gearman. I was implementing the ulabox GearmanBundle, which works nicely. But there are two things which I don't understand yet.
How do I start a worker?
In the documentation, I should first execute a worker and then start the client code.
https://github.com/ulabox/GearmanBundle/blob/master/README.md
Open the first console and run:
$ php app/console gearman:worker:execute --worker=AcmeDemoBundle:AcmeWorker
Now open another console and run:
$ php app/console gearman:client:execute --client=UlaboxGearmanBundle:GearmanClient:hello_world --worker=AcmeDemoBundle:AcmeWorker --params="{\"foo\": \"bar\" }"
So, if I don't start the worker manually, will the job be done by itself? If I start the worker, everything is fine. But it seems a bit strange to start it manually, even if an iteration count of x is set so that the worker kills itself after that number of jobs.
So please, can anyone help me out of this?
Thanks in advance and kind regards,
Phil
Yes, to run tasks in the background, not only Gearman needs to be running but also the workers.
So you have the Gearman server running, waiting for commands (e.g. send an email).
Additionally, you have workers waiting.
When Gearman sees a new command, it looks for the first free worker and passes the command to it.
The worker then executes the command and, when finished, reports back to the Gearman server that it is done and ready to process a new command.
The more workers you have, the faster the commands in the queue are processed.
You can use "supervisor" to keep the workers running automatically; a minimal example config is sketched after the links below.
Below you can find a few links with more information:
http://www.daredevel.com/php-jobs-with-gearman-and-supervisor/
http://www.masnun.com/2011/11/02/gearman-php-and-supervisor-processing-background-jobs-with-sanity.html
Running Gearman Workers in the Background
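A minimal supervisord program sketch for the worker above (the config path, project directory, and process count are assumptions, not from the original answer):

; /etc/supervisor/conf.d/gearman_worker.conf
[program:gearman_worker]
command=php app/console gearman:worker:execute --worker=AcmeDemoBundle:AcmeWorker
directory=/var/www/your_project
numprocs=2
process_name=%(program_name)s_%(process_num)02d
autostart=true
autorestart=true

With this in place, supervisord starts the workers at boot and restarts them if they exit, so you no longer have to launch them by hand.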
