When is a bash script passed to 'openstack server create --user-data ...' exactly executed? - openstack

I have a bash script that I want to be executed before a user can log in to the server. I cannot find any information on when exactly this script is executed for different images. Can I assume that it runs before a user is able to log in using SSH? I'm using CirrOS.
openstack server create --user-data before_login.sh ...

As soon as your instance boots up, this user-data script "before_login.sh" executes on it, before any user logs in to the instance.

User scripts run at the final stage, which runs as late in boot as possible. Any scripts that a user is accustomed to running after logging into a system should run correctly here.
You can check the link below for more information on cloud-init boot stages:
https://cloudinit.readthedocs.io/en/latest/topics/boot.html
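For illustration, here is a minimal sketch of such a user-data script (the file name and marker path are placeholders), assuming the image's init system (cloud-init, or CirrOS's minimal equivalent) processes user-data scripts; it runs once at the final boot stage described above:
#!/bin/sh
# before_login.sh -- illustrative sketch only
# Executed once at the "final" boot stage of cloud-init.
echo "user-data ran at $(date)" > /var/tmp/before_login_marker
# Any other provisioning you need finished before users connect goes here.
After booting, you can check /var/tmp/before_login_marker from your first SSH session to confirm when the script ran.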

Related

Testing GameLift locally, can only create 1 game session?

I am testing our application locally, and it seems like I can only create 1 game session using GameLift Local.
So what I did is run GameLift Local:
java -jar GameLiftLocal.jar -p 9080
run the custom GameLift server I wrote in C# and Unity,
and use the CLI to create a game session:
aws gamelift create-game-session --endpoint-url http://localhost:9080 --maximum-player-session-count 2 --fleet-id fleet-123d
On the first run, it succeeds and creates the game session.
When I create another game session by issuing the same command above, it results in
HTTP-Dispatcher - No available process.
Why is this? Can we only create one game session locally?
If you are trying to create another game session, you need to run multiple game server processes.
GameLift tracks each game session's status by receiving server-side API calls from the game server process.
I think this diagram can help you. :)
https://docs.aws.amazon.com/gamelift/latest/developerguide/gamelift-sdk-server-api-interaction-vsd.html
According to the docs:
Each server process should only host a single game session.
...
When testing locally with GameLift Local, you can start multiple server processes. Each process will connect to GameLift Local.
Sounds like you need to run multiple server processes, each connecting to GameLift Local.
Source: https://docs.aws.amazon.com/gamelift/latest/developerguide/integration-testing-local.html
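To make that concrete, here is a rough command-line sketch of the local setup (the game server executable name and ports are placeholders; each server process registers itself with GameLift Local via InitSDK()/ProcessReady() on its own port):
# start GameLift Local once
java -jar GameLiftLocal.jar -p 9080
# launch two copies of your game server build, each on its own port (names/ports are placeholders)
MyGameServer.exe -port 33430
MyGameServer.exe -port 33431
# each available process can now host one game session, so both calls succeed
aws gamelift create-game-session --endpoint-url http://localhost:9080 --maximum-player-session-count 2 --fleet-id fleet-123d
aws gamelift create-game-session --endpoint-url http://localhost:9080 --maximum-player-session-count 2 --fleet-id fleet-123d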

Control-M batch job is spawning multiple instances of a singleton ActiveX server

As part of a batch job I create 4 command lines through Control-M which invoke a legacy console application written in VB6. The console application invokes an ActiveX server which performs a set of analytic jobs calculating outputs. The ActiveX server was coded as a singleton, but when invoked through Control-M I get 4 instances running. The ActiveX server does not tear down once the job has completed and the command line has closed itself.
I created 4 .bat files which, once launched manually on the server, simulate the calls made through Control-M, and the ActiveX server behaves as expected, i.e. there is only ever 1 instance running and once complete it closes down gracefully.
What am I doing wrong?
Control-M jobs run under a service account; it is the same as logging in as that user and executing the job. How did you test this? Did you manually execute each batch job one after another, or did you execute all the batch jobs at the same time from different terminals? You can try one thing: run the Control-M jobs with a time interval, e.g. the first at 09:00, the second at 09:05, the third at 09:10 and the fourth at 09:15, and see if that fixes your issue.
Maybe your job cannot use the Desktop environment.
Check your agent service settings:
Log on As:
User account under which Control‑M Agent service will run.
Valid values:
Local System Account – Service logs on as the system account.
Allow Service to Interact with Desktop – This option is valid only if the service is running as a local system account.
Selected – the service provides a user interface on a desktop that can be used by whoever is logged in when the service is started. Default.
Unselected – the service does not provide a user interface.
This Account – User account under which Control‑M Agent service will run.
NOTE: If the owner of any Control-M/Server jobs has a "roaming profile" or if job output (OUTPUT) will be copied to or from other computers, the Log in mode must be set to This Account.
Default: Local System Account
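If you want to confirm which account the Control-M Agent service is actually running under, you can query the service configuration from a command prompt; the service name below is a guess, so check the exact name in services.msc first:
sc qc "ctmag"
REM The SERVICE_START_NAME line in the output shows the logon account
REM (LocalSystem or a named account), i.e. the "Log on As" setting described above.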

QlikView fails to reload on server when executing stored procedure

I cannot seem to figure out this issue. I have a QlikView document that pulls in a bunch of data and aggregates/joins it up. Typical QlikView stuff. At the end of my process I have an Oracle stored procedure call. I am not retrieving anything back; it is a simple call to the database to trigger a process. I have set up my ODBC connection and User DSN on my local machine for the connection. When I run my qvw file on my local machine everything works just fine. The proc call is made and the script executes without any errors.
However, when I put the document on our reload server and set up a reload task for it, the process throws a general script error when the SQL proc is called. What could cause this? The user running the document has execute permissions. Do I need to set up a DSN on the reload server?
Really not sure at all here. Hopefully someone here can help me out. Thanks.
Unfortunately QlikView's SQL error messages are not that helpful for debugging purposes. In this case you can try turning on ODBC logging (http://support2.microsoft.com/kb/274551) and then reload the script to try and capture the cause of the error.
Finally, if your script refers to a "local" DSN then this also needs to be present on the machine that will perform the reload, in this case the QlikView server.
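For context, the relevant part of the load script typically looks something like this sketch (the DSN, schema and procedure names are placeholders; the exact CALL syntax depends on your ODBC driver); the DSN named in the CONNECT statement must also be defined on whichever machine performs the reload, usually as a System DSN so the account running the reload task can see it:
ODBC CONNECT TO [MyOracleDSN];            // this DSN must also exist on the reload server
SQL CALL my_schema.trigger_my_process();  // pass-through call to the Oracle stored procedure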

Symfony2 calling console command in controller from vendor

I want to use a console command from this bundle within my controller: http://knpbundles.com/dizda/CloudBackupBundle
The developer proposes cron jobs; however, I want to use the command to back up my database from within my controller.
How would I do that?
I am getting this error message when I simply try to register this command as a service:
You have requested a non-existent service "backupcommandservice".
Thanks for the help!
Commands don't quite work that way. Per the note on http://symfony.com/doc/current/cookbook/console/console_command.html#register-commands-in-the-service-container,
registering a command as a service doesn't do much other than control its location and dependency injection.
If you want to call a command: http://symfony.com/doc/current/components/console/introduction.html#calling-an-existing-command
That being said, you shouldn't call commands from within a controller, since you're basically asking the request to wait for this command to finish executing before you return a response. You'd be better off sending a message to a queue (for example Beanstalkd) and having a worker perform the job.
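If you still want to run it inline, the documented "call a command from a controller" pattern looks roughly like the sketch below; the dizda:backup:start command name is an assumption based on the bundle, so confirm it with app/console list:
<?php
// src/AppBundle/Controller/BackupController.php -- a sketch, not the bundle's official usage
namespace AppBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\BufferedOutput;
use Symfony\Component\HttpFoundation\Response;

class BackupController extends Controller
{
    public function backupAction()
    {
        $application = new Application($this->get('kernel'));
        $application->setAutoExit(false);

        // command name is assumed -- run "app/console list" to confirm it
        $input  = new ArrayInput(array('command' => 'dizda:backup:start'));
        $output = new BufferedOutput();
        $application->run($input, $output);

        return new Response('<pre>'.$output->fetch().'</pre>');
    }
}
Note this blocks the HTTP request until the backup finishes, which is exactly why the queue-and-worker approach above is usually the better choice.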

How to get the logs from a continuously running child process in PowerShell?

I am calling a child process from a process in PowerShell.
The child process will not end; it runs continuously in the background.
I need the logs written by the child process continuously.
process.StandardOutput.ReadToEnd() only writes the logs to a file when the child process ends.
But I need the logs continuously.
Please help me in this regard.
You can redirect the output to log files as below:
Start-Process your_executable -ArgumentList "your arguments" -RedirectStandardOutput the_path_you_want_to_put_your_log -RedirectStandardError the_path_to_put_error_log
The logs of the child process will be written to the log files continuously; you can open the files to check them from time to time.
Edit
And I think the Unix tail equivalent command in Windows PowerShell question will help you more; it shows how to monitor a log file in real time, for example using the Get-FileTail cmdlet from PowerShell Community Extensions.
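Putting the two together, a rough PowerShell sketch (the executable and log paths are placeholders); Get-Content -Wait follows the file as it grows, much like tail -f, and -Tail needs PowerShell 3.0 or later:
# start the long-running child process with its streams redirected to files
Start-Process -FilePath "C:\tools\your_executable.exe" -ArgumentList "your arguments" `
    -RedirectStandardOutput "C:\logs\child_stdout.log" -RedirectStandardError "C:\logs\child_stderr.log"

# follow the stdout log in real time
Get-Content -Path "C:\logs\child_stdout.log" -Wait -Tail 20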
