Is there any way to call an XQuery script stored in eXist which, for instance, uses transform:transform on a large number of files, and let it run in the background? Right now I can call my script via the browser and it takes several minutes; it would be nice to have it run in the background. Is this possible?
---Edit
It seems that XQuery tasks always run in the background; there is no need to wait for them to finish when calling them via a web interface in eXist. So what I'm looking for is merely a way to forward to another URL when a query is called.
Can you explain what you mean by background? If you call the script, it will run. If running the script makes eXist unresponsive, you need to increase its memory. You can run multiple scripts, and you can schedule a cron job to execute a query at specific intervals. You can also execute a script via the REST endpoint, in case you don't want to use a browser.
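For the redirect itself, here is a minimal XQuery sketch using eXist's response module; local:do-heavy-work() and done.html are hypothetical placeholders, and note that the redirect is only sent once the query returns:

    xquery version "3.0";
    (: Minimal redirect sketch, assuming eXist's standard response module.
       local:do-heavy-work() and done.html are hypothetical placeholders. :)
    import module namespace response = "http://exist-db.org/xquery/response";

    declare function local:do-heavy-work() {
        (: placeholder for your transform:transform loop over many files :)
        ()
    };

    let $done := local:do-heavy-work()
    return response:redirect-to(xs:anyURI("done.html"))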
Related
I am new to Autosys and looking for a way to achieve the reverse of file watching.
I am looking for a job similar to a file watcher, but one that keeps running as long as the file is present and only succeeds once the file is not present. The dependent job should run only if the file is not present.
There are a few questions:
1) I am not sure if I can achieve this with FileWatcher.
2) Does a FileWatcher job stop running after it finds the file?
3) Is there any way to negate the success condition for a FileWatcher job?
Or if anyone can provide some good, comprehensive documentation on FileWatcher, that would be a help too.
Thanks
You cannot achieve this with a filewatcher job alone.
A filewatcher job stops running and goes to the SUCCESS state as soon as it finds the file in the defined path. There is no way to negate its success state.
This is because it's assumed that such functionality can easily be implemented with scripts.
You can achieve what you want with a batch script (Windows) or a shell script (Unix/Linux). The Autosys job triggers a script that checks for the file at the intended location, sleeps for a while (say 20 seconds), and checks again, finally exiting with code 0 if the file is no longer found, or with a non-zero code if the file still hasn't moved after a certain number of checks.
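A minimal shell sketch of that loop; the path, interval, and check count are assumed values to adjust:

    #!/bin/sh
    # Reverse file watcher sketch: succeed once the file is gone,
    # fail if it is still present after MAX_CHECKS checks.
    # FILE, SLEEP_SECS and MAX_CHECKS are assumed values -- adjust as needed.
    FILE="/path/to/watched/file"
    SLEEP_SECS=20
    MAX_CHECKS=30

    i=0
    while [ "$i" -lt "$MAX_CHECKS" ]; do
        if [ ! -e "$FILE" ]; then
            exit 0    # file is gone -> Autosys job goes to SUCCESS
        fi
        sleep "$SLEEP_SECS"
        i=$((i + 1))
    done
    exit 1            # file never moved -> job fails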
You can keep downstream jobs dependent on this Autosys job as per your requirements.
Let me know if more clarification is needed on this.
I am struggling to write a script that checks whether a particular service is running on my server and then sends me mail.
Should this script be part of the bash profile so that it's always running?
regards
rick
The .profile, .bashrc and friends are run on login, so they are of no use for background monitoring. Two solutions come to mind:
Either use cron to run your script at predefined intervals
Or make it loop and use your system's init environment (SysV, Upstart, systemd, ...) to control it
My recommendation is to stick with cron - it even makes the mailing of results dead easy: cron mails any output your script produces to you, so just print something when the service is down.
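A minimal sketch of the cron approach; the service name, script path, schedule, and mail address are all placeholder assumptions:

    # /etc/cron.d/check-service (hypothetical): run every 5 minutes;
    # cron mails any output the script produces to MAILTO.
    MAILTO=rick@example.com
    */5 * * * * rick /usr/local/bin/check-service.sh

    # /usr/local/bin/check-service.sh
    #!/bin/sh
    # Print a line (and thus trigger a mail) only when the service is down.
    if ! pgrep -x myservice > /dev/null; then
        echo "myservice is not running on $(hostname)"
    fi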
We have a box scheduled in Autosys. When the box is triggered at the scheduled time, the PDFs generated by one of the steps are not all getting copied, yet the job is not failing either. When we put the box ON HOLD and run it step by step, all outputs get copied.
A good troubleshooting step would be to add a short sleep/delay step between the generation of the files and the downstream jobs.
A better way might be to use a file trigger or file watcher that only lets the downstream steps proceed once the files are all there (you can trigger on the number of files or whatever stat is appropriate).
If your copy step is a simple copy command without any validation (like copy abc_file_*.pdf), then it won't have any trouble copying whatever files it sees, even if they are not as many as you intend.
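As an illustration, a shell sketch of a copy step that validates the file count before copying; EXPECTED, SRC, and DEST are assumed values:

    #!/bin/sh
    # Copy step that refuses to run on a partial set of PDFs.
    # EXPECTED, SRC and DEST are assumptions -- set them for your box.
    EXPECTED=12
    SRC=/data/out
    DEST=/data/archive

    count=$(ls "$SRC"/abc_file_*.pdf 2>/dev/null | wc -l)
    if [ "$count" -lt "$EXPECTED" ]; then
        echo "only $count of $EXPECTED PDFs present, not copying" >&2
        exit 1    # fail the job instead of silently copying a partial set
    fi
    cp "$SRC"/abc_file_*.pdf "$DEST"/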
I am trying to perform UI testing with Protractor. The application I am testing has a UI where updates happen in real time. These updates are driven by items being placed on a queue from another service. In order to test the updating of the screen I plan to write a small utility that will place items onto the queue to simulate the functionality under test.
In order to do this in a controlled and testable manner I need to be able to trigger when an item is placed on the queue. Ideally I would trigger this during a test.
Is there a mechanism in Protractor where I can call a command line utility from inside a Protractor test, execute a batch file, or otherwise interact with an external application? If so could someone provide an example of such behaviour?
Protractor is written in Node.js, so you can use any Node.js library. The library you should be interested in is child_process. The documentation at http://nodejs.org/api/child_process.html contains a number of examples.
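For instance, a minimal sketch of shelling out from inside a spec; enqueue-item.sh is a hypothetical utility that places an item on your queue:

    // Minimal sketch: run an external command from a Protractor spec.
    // enqueue-item.sh is a hypothetical queue-feeding utility.
    var exec = require('child_process').exec;

    it('updates the screen when an item is queued', function (done) {
        exec('./enqueue-item.sh test-payload', function (err, stdout, stderr) {
            if (err) { return done(err); }
            // ...assert on the UI here once the item has been processed...
            done();
        });
    });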
I use the Scrum methodology and deploy functionality in builds every sprint.
I need to perform various changes to the stored data (meaning data in the database and on the filesystem). I'd like to implement these as PHP scripts invoked from the console, but they should be executed only once, during deployment.
Is there any way to implement this through app/console without the scripts showing up in the list of registered console commands? Or is there any other way to implement run-once scripts?
DoctrineMigrations covers part of my requirements, but it's hard to implement complex changes to the model, and it does not cover changes to files on the filesystem.
I don't think Symfony has a facility for that, and besides, hiding the command is not the same as securing the command.
Instead, I would make the script determine whether it has already been run (this could be as simple as writing a version number to a file and checking that number before running) and stop if it detects that it has run before.
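A minimal PHP sketch of that guard; the version constant and marker path are assumptions:

    <?php
    // Run-once guard sketch: record the applied version in a marker file
    // and bail out if this script's version has already been applied.
    // VERSION and the marker path are assumed values.
    const VERSION = 5;
    $marker = __DIR__ . '/.last_applied_version';

    $applied = file_exists($marker) ? (int) file_get_contents($marker) : 0;
    if ($applied >= VERSION) {
        echo "Already run (version $applied), nothing to do.\n";
        exit(0);
    }

    // ...perform the one-off database/filesystem changes here...

    file_put_contents($marker, VERSION);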