nginx - run shell/python script on wget request - http

I want to run a shell/Python script on each wget request to my nginx server.
For example, if I do wget http:///text.txt?example=100,
I want to call a script that generates a new file called text100.txt and returns it.
In other words, can I pass the GET params to a script and have it return arbitrary files, so the wget client will download them?
Thank you!

What you're asking about is CGI (commonly served from a cgi-bin directory).
However, I wouldn't recommend using it;
I would advise against using shell scripts for web serving altogether, and
for Python I'd suggest one of the web micro-frameworks such as Flask.
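To give a concrete idea, here is a minimal sketch along those lines (assuming Flask 2.x; the route, parameter handling, and generated contents are illustrative, and nginx would sit in front of it via proxy_pass):
from flask import Flask, request, send_file
import io

app = Flask(__name__)

@app.route("/text.txt")
def generate_text():
    # Read the ?example=100 query parameter; default to "0" if it is missing.
    example = request.args.get("example", "0")
    # Build the file contents in memory; replace this with your real generation logic.
    payload = io.BytesIO(f"generated for example={example}\n".encode())
    # Return it as an attachment named text100.txt (for example=100).
    return send_file(payload, as_attachment=True,
                     download_name=f"text{example}.txt",
                     mimetype="text/plain")

if __name__ == "__main__":
    app.run()
The client can then fetch it with wget --content-disposition 'http://yourserver/text.txt?example=100' (host is a placeholder) so that wget honours the suggested filename; without that flag wget names the file after the URL.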

Related

How to execute Python Script through Informatica Cloud

I have a Python script that I need to execute and automate via IICS. The output of the script is a CSV file, and this output should be loaded to the Target. How can I achieve this via Informatica Cloud? Please point me to some information and documentation on this.
Thanks
There are two ways to do this.
You can create an executable (using py2exe or a similar tool) from your .py script, put that file on the Informatica Cloud agent server, and call it using a shell command. Note that with this approach you do not need to install Python or any packages on the agent.
You can also put the .py file on the agent server and run it with a shell command like $PYTHON_HOME/python your_script.py. You need to make sure the Python version is compatible and that all required packages are installed on the agent server.
You can refer to the screenshot below for how to set up the shell command. You can then run it as part of a workflow and schedule it if needed.
https://i.stack.imgur.com/wnDOV.png
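For illustration only, the .py file on the agent server could be as small as the sketch below (the output path and columns are made up; the real script writes whatever CSV your Target expects), invoked from the shell command described above:
import csv

# Hypothetical output location on the agent server; point this wherever your
# Target or downstream mapping expects to read the file.
OUTPUT_PATH = "/opt/infaagent/output/extract.csv"

rows = [
    {"id": 1, "name": "alpha"},
    {"id": 2, "name": "beta"},
]

with open(OUTPUT_PATH, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name"])
    writer.writeheader()
    writer.writerows(rows)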

How to implement gRPC Gateway in a node backend

I'm trying to implement gRPC Gateway in a Node.js server following the installation guide.
The steps are written for Go, but I tried to follow them in Node.js.
I found this response, and also some packages like this one, so it is possible to implement it in Node.
After reading the installation guide, the main problem seems to be getting the binary files:
protoc-gen-grpc-gateway
protoc-gen-openapiv2
protoc-gen-go
protoc-gen-go-grpc
I have downloaded the first two binaries from here, so the problem is probably the last two.
I assume "go" refers to Golang, so I would have to look for something like protoc-gen-node-grpc. I've seen this npm package, but I want to implement as much as possible by myself; I don't want to depend on third parties.
At this point I have the first two binaries in my path, but not the last two.
Once the gRPC service is defined, the next step is to generate the gRPC stubs. I have this line:
RUN protoc -I=. mygrpc.proto --js_out=import_style=commonjs,binary:./my/folder/ --grpc-web_out=import_style=commonjs,mode=grpcwebtext:./my/folder/
And this generates the files fine. I don't know whether I have to use --js_out and --grpc-web_out to create the client and service files.
Then the next step is to generate the reverse proxy using protoc-gen-grpc-gateway.
I run (as the guide says):
protoc -I=./my/path/ myproto.proto \
--grpc-gateway_out ./my/path/ \
--grpc-gateway_opt logtostderr=true \
--grpc-gateway_opt paths=source_relative \
--grpc-gateway_opt generate_unbound_methods=true
And this generates a .go file: myproto.pb.gw.go.
Inside, the file says:
// Code generated by protoc-gen-grpc-gateway. DO NOT EDIT.
// source: myproto.proto
/*
Package myproto is a reverse proxy.
It translates gRPC into RESTful JSON APIs.
*/
So I assume the steps were done correctly, but: how can I execute this in my Node.js server?
I have a Node project using an Express API, and I want to use grpc-gateway instead of the Express API endpoints... but I only have a .go file.
My proto version is:
libprotoc 3.14.0
Thanks in advance.
As the issue comment says, grpc-gateway only generates Go code. Feel free to use Go; you only need to generate the code from the proto and add the service. You can refer to my sample code in helloworlde/grpc-gateway. Also, buf is easier to use than protoc for generating code. If you want to write everything yourself, you can look at ReflectionCall.java, written in Java, which calls the server without generated code.

Pentaho Kettle Download File from URL

I want to download a file from a URL. (e.g. http://www.webadress.com/service/servicedata?ID=xxxxxx)
I found the HTTP step for Job executables, but I am forced to define a target file name instead of just accepting the filename the web download offers (e.g. ServiceData20200101.PDF).
The other problem is that it creates a file even when the web call doesn't actually supply one.
Is the REST Client or HTTP Client step in Transformations able to download a file over a URL call and accept the file as is?
The HTTP steps in Pentaho are somewhat limited. In similar use cases in the past I've done this by using an external shell script with arguments that then calls wget or curl and saves the result. Then Pentaho picks up the file in the temp dir and processes it from there.
The Shell job step allows you to specify a script file and pass fields from the stream as arguments.
Note that if you paste shell commands directly into the step on the second tab, they will execute in the embedded shell with older versions of curl and wget. You will also be missing environment config and certificates/keys.
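If it helps, here is a sketch of that helper written in Python with requests instead of wget/curl (the URL and download directory are placeholders); the two points it addresses are keeping the server-supplied filename from Content-Disposition and not creating a file when the call fails:
import os
import re
import requests

url = "http://www.webadress.com/service/servicedata?ID=xxxxxx"
target_dir = "/tmp/pentaho_downloads"  # temp dir that the later Pentaho step reads from

resp = requests.get(url, timeout=60)
resp.raise_for_status()  # do not create a file on an error response

# Try to pull the filename the server offers, e.g. ServiceData20200101.PDF
cd = resp.headers.get("Content-Disposition", "")
match = re.search(r'filename="?([^";]+)"?', cd)
filename = match.group(1) if match else "download.bin"

os.makedirs(target_dir, exist_ok=True)
with open(os.path.join(target_dir, filename), "wb") as f:
    f.write(resp.content)
The Shell job step would then call this script, and a later step can pick the file up from the download directory as before.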

Execute Shell command in tideSDK

I need to run a shell command that will call a script file I have written in Ruby.
For example: let's say I have a file.sh in my working directory. How can I execute this file using TideSDK? (I tried using Process.)
Titanium Desktop createProcess to run shell script
Are there other alternatives for this kind of thing in TideSDK?
Thanks
UPDATE:
I need to run the Ruby script when a button is clicked.
I don't think you need to run a process for that. TideSDK allows you to run other scripts like PHP, Ruby, and Python directly in script tags. On the tag you just need to declare the script language in the type attribute: text/ruby
There are several ways to include scripts in TideSDK. Take a look at the documentation at www.tidesdk.org; there is a section on how to get started.
http://tidesdk.multipart.net/docs/user-dev/generated/#!/guide/using_ruby

Auto triggering a UNIX shell script

I have a main script called main.ksh (in /home/pkawar/folder), and its input file inputfile.xls (in /home/pkawar/folder/ipfile).
When I run main.ksh, it uses inputfile.xls and delivers the output to a mail address.
The inputfile.xls is loaded into /home/pkawar/folder/ipfile via FTP commands.
Is it possible to run main.ksh automatically, so that the output is sent via mail as soon as inputfile.xls has been loaded successfully?
The first option would be to use cron, but from your question it doesn't seem that you want to go that route.
My question would be: what is creating the *.xls file? Is it possible for whatever is creating that file to know when it's finished and then call the shell script, or better yet, to have the file streamed to the shell script on the fly?
The first thing you should do is write a script that does whatever it is you want done. If your script performs correctly, you can use cron via a crontab file to have the script executed on whatever schedule you desire.
See man crontab for details.
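As a rough sketch of such a script (the wrapper name and the marker-file idea are mine, not from the answer), something like the following could be run from cron every few minutes and would invoke main.ksh only when a new copy of inputfile.xls has arrived:
import os
import subprocess

# Paths taken from the question; the marker file is an illustrative way to
# remember which copy of the input has already been processed.
INPUT = "/home/pkawar/folder/ipfile/inputfile.xls"
SCRIPT = "/home/pkawar/folder/main.ksh"
MARKER = "/home/pkawar/folder/ipfile/.last_processed"

def already_processed():
    return (os.path.exists(MARKER)
            and os.path.getmtime(MARKER) >= os.path.getmtime(INPUT))

if os.path.exists(INPUT) and not already_processed():
    # main.ksh already mails its output, so the wrapper only has to run it.
    subprocess.run(["ksh", SCRIPT], check=True)
    # Record that this copy of the file has been handled.
    open(MARKER, "w").close()
A crontab entry such as */5 * * * * /usr/bin/python /home/pkawar/folder/check_and_run.py (interval and script path are only examples) then gives you the automatic trigger.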
