How to execute a Python script through Informatica Cloud

I have a Python script that I need to execute and automate via IICS. The output of the script is a CSV file, which should then be loaded to the target. How can I achieve this via Informatica Cloud? Any pointers to documentation would be appreciated.
Thanks

There are two ways to do this.
You can create an executable (using py2exe or a similar tool) from your .py script, put that file on the Informatica Cloud agent server, and call it with a shell command. Note that with this approach you do not need to install Python or any packages on the agent server.
Alternatively, you can put the .py file on the agent server and run it from the shell, e.g. $PYTHON_HOME/python your_script.py. In that case you need to make sure the Python version is compatible and that all required packages are installed on the agent server.
You can refer to the screenshot below for how to set up the shell command. You can then run it as part of a workflow and schedule it if needed.
https://i.stack.imgur.com/wnDOV.png
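For context, a minimal sketch of the kind of script the shell command would call; the file name export_data.py, the output path, and the sample rows are placeholders, not from the question. The command task would invoke it with something like $PYTHON_HOME/python export_data.py, and a subsequent mapping task would pick up the CSV and load it to the target.

```python
# export_data.py -- hypothetical script name; writes the CSV a mapping can load
import csv

OUTPUT_PATH = "output.csv"  # placeholder; point this at a directory the agent can read

# Stand-in for the real extract/transform logic
rows = [("id", "name"), (1, "alice"), (2, "bob")]

with open(OUTPUT_PATH, "w", newline="") as f:
    csv.writer(f).writerows(rows)
```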

Related

Using Git to deploy R Package to remote server

I'm fairly new to CI/CD pipelines with Git and hope that my question relates to that functionality. I want to achieve the following:
Whenever a new commit (or tag) of a specific package is pushed to our company's local Git server, I want to execute a shell script on a remote server. That script should download the new version of the package and install it into the global R installation, making it available to the various processes that use this R environment. So far this is a manual step, but I would really like to automate the process. Is this possible?
Thanks!
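A minimal sketch of such a deploy step, assuming a Python helper that the CI job or a Git hook runs on the remote server; the repository URL is a placeholder:

```python
#!/usr/bin/env python3
"""Hypothetical deploy step: fetch the newly pushed package and install it
into the global R library. R CMD INSTALL installs into the default library,
so the user running this needs write access to it."""
import subprocess
import tempfile

REPO_URL = "git@git.example.com:analytics/mypackage.git"  # placeholder

with tempfile.TemporaryDirectory() as workdir:
    # Fetch the newly pushed state of the package
    subprocess.run(["git", "clone", "--depth", "1", REPO_URL, workdir], check=True)
    # Install it into the global R library
    subprocess.run(["R", "CMD", "INSTALL", workdir], check=True)
```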

Azure Databricks: How do we access R Scripts present on DBFS?

I'm new to Databricks. I am trying to access a .R file that is present in DBFS storage, but I cannot figure out how to do so. Any help is really appreciated.
I can read data from the storage using the /dbfs file path, and I can also source the script, but I want to make edits to it.
You need an editor to do that - for example, you can set up RStudio on your cluster and connect to it via the RStudio UI; in that case you can edit R files directly on DBFS.
But really, the simplest option is to use the Databricks CLI fs command to copy the file to your local machine, make your changes in the editor of your choice, and upload the file back, as in the sketch below.
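A hypothetical round trip with the Databricks CLI, wrapped in Python here for illustration; the DBFS path and local file name are placeholders:

```python
import subprocess

DBFS_PATH = "dbfs:/scripts/analysis.R"  # placeholder
LOCAL_PATH = "analysis.R"

# Copy the script down from DBFS
subprocess.run(["databricks", "fs", "cp", DBFS_PATH, LOCAL_PATH], check=True)

# ... edit analysis.R locally in the editor of your choice ...

# Upload the edited file back, overwriting the original
subprocess.run(["databricks", "fs", "cp", LOCAL_PATH, DBFS_PATH, "--overwrite"], check=True)
```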

Using Airflow to Run .bat file or PowerShell program located in remote Windows Box

Currently some of the jobs run on different Windows VMs, for example:
- Task Scheduler jobs that run PowerShell files, .bat files, and Python files
- SQL Agent jobs that run SSIS packages
We are planning to use Airflow to trigger all these jobs, to get better visibility and to manage dependencies.
Our Airflow runs on Ubuntu.
I would like to know if there is any way to trigger the jobs on the Windows machines via Airflow.
Can I get some examples of how to achieve this? Please suggest which packages/libraries/plugins/operators I can use.
Yes, there is. I would start by looking into the WinRM operator and hook that you can find under Microsoft in the Airflow providers documentation:
http://airflow.apache.org/docs/apache-airflow-providers-microsoft-winrm/stable/index.html
and maybe also:
https://github.com/diyan/pywinrm
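A minimal DAG sketch using that provider, assuming an Airflow connection named winrm_windows_vm pointing at the Windows VM; the DAG id and the .bat path are placeholders. It requires the apache-airflow-providers-microsoft-winrm package.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.winrm.hooks.winrm import WinRMHook
from airflow.providers.microsoft.winrm.operators.winrm import WinRMOperator

with DAG(
    dag_id="run_windows_jobs",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,  # trigger manually, or set a cron expression
    catchup=False,
) as dag:
    # Airflow connection holding the Windows host, user, and password
    winrm_hook = WinRMHook(ssh_conn_id="winrm_windows_vm")

    run_bat = WinRMOperator(
        task_id="run_nightly_bat",
        winrm_hook=winrm_hook,
        command=r"C:\jobs\nightly.bat",  # placeholder path on the Windows box
    )
```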

Execute Shell command in tideSDK

I need to run a shell command that will call a script file I have written in Ruby.
For example, let's say I have a file.sh in my working directory. How can I execute this file using TideSDK? (I tried using Process.)
Titanium Desktop createProcess to run shell script
Are there any alternatives for this kind of thing in TideSDK?
Thanks
UPDATE:
I need to run the Ruby script after a button click.
I don't think you need to spawn a process for that. TideSDK allows you to run other languages such as PHP, Ruby, and Python directly in script tags; you just set the language in the type attribute, e.g. text/ruby.
There are several ways to include scripts in TideSDK. Take a look at the documentation at www.tidesdk.org - there is a section on how to get started, and the guide below covers the Ruby case (a short sketch follows the link).
http://tidesdk.multipart.net/docs/user-dev/generated/#!/guide/using_ruby
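A minimal sketch of the script-tag approach, assuming TideSDK's usual cross-language binding so the button handler can call the Ruby function; the function name and button are illustrative only:

```html
<!-- Hypothetical page snippet: the Ruby in the script tag runs inside the
     TideSDK runtime; function name and button are placeholders. -->
<script type="text/ruby">
  def run_my_script
    puts "button clicked"  # your Ruby code goes here
  end
</script>

<button onclick="run_my_script()">Run script</button>
```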

Script runs in Unix but not in Informatica command task

My script runs successfully in Unix but not in a command task of an Informatica workflow. The permissions are fine, and the parameter file and variables have been declared in the workflow. Why is this happening?
Make sure that the machine Informatica is running on is a Unix box. If it is a Windows machine, you will have to use the DOS equivalent of your command.
Also check whether the Informatica server is pointing to the same UNIX machine on which the script to be executed is present.
I faced the same situation - check the permissions of the script file and set them to 755 (chmod 755 your_script.sh). It should work.
Regards,
Rama
This could be a permission issue. If you are executing a shell command from Informatica, right-click the file in your SFTP client and open "Properties" (I am using WinSCP). Give full permissions to the file and it should work.
