I'm using cloud-functions-emulator, 1 Param is OK, but 2 Param? - firebase

I'm trying to use cloud-functions-emulator. When I call a function with one parameter,
like
functions call auth --data {\"token\":\"1234ssss\"}
everything's fine, but when I try to call with two parameters,
like
functions call auth --functions call hell --data '{\"names\":\"test.txt\",\"buket\":\"my-bucketssssssss\"}'
I get an error: Error: "data" must be a valid JSON string!
How can I make it work with two params?

Try the following two commands:
echo "{\"hello\": \"world\"}"
and
echo '{\"hello\": \"world\"}'
The outputs will be:
{"hello": "world"}
and
{\"hello\": \"world\"}
Notice that wrapping the data in single quotes preserves the backslashes literally, so the escaped double quotes are never unescaped and the result is not valid JSON. Either use double quotes with escaped inner quotes, or single quotes with unescaped inner quotes. Try your command as:
functions call auth --functions call hell --data "{\"names\":\"test.txt\",\"buket\":\"my-bucketssssssss\"}"
or
functions call auth --functions call hell --data '{"names":"test.txt","buket":"my-bucketssssssss"}'
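As a quick sanity check outside the shell, Python's shlex module (which follows POSIX quoting rules) shows how each form is tokenized. This is just an illustrative sketch, not part of the emulator:

```python
import json
import shlex

# How a POSIX shell tokenizes the two forms of the command.
double_quoted = shlex.split('echo "{\\"hello\\": \\"world\\"}"')
single_quoted = shlex.split("echo '{\\\"hello\\\": \\\"world\\\"}'")

print(double_quoted[1])  # {"hello": "world"}  -> valid JSON
print(single_quoted[1])  # {\"hello\": \"world\"}  -> backslashes kept, invalid JSON

# Only the double-quoted form parses as JSON.
json.loads(double_quoted[1])
```

Inside double quotes the shell collapses \" to ", while inside single quotes every character, including the backslash, is literal.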

Related

How to pass a parameter to a shell script in nginx lua

I'm doing a simple test to see if I can run a shell script in nginx easily. Lua seems a good solution. So I added
content_by_lua_block {
    local myvar = "abcd"
    local file = io.popen("myshellscript.sh myvar")
    local result = file:read("*a")
    ngx.say(result)
}
in my nginx.conf. In myshellscript.sh, I use echo "$1" to print the value of the first parameter. But instead of "abcd", I see "myvar" in the output. So what's the proper way to pass a variable as a parameter to a shell script in nginx Lua?
I know os.execute("myshellscript.sh myvar") can do a similar job to popen, but then I don't know how to get the script's output to stdout, or its exit code.
You need to substitute myvar with the value of your variable - otherwise you're passing the literal string myvar. Since abcd does not need to be quoted, simply concatenating myvar will work for this example:
local file = io.popen("myshellscript.sh " .. myvar)
or for a more robust solution that quotes strings using single quotes:
local function quote(param)
    return "'" .. param:gsub("'", [['\'']]) .. "'"
end
local file = io.popen("myshellscript.sh " .. quote(myvar))
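The same single-quote escaping trick can be sketched in Python. The quote() below mirrors the Lua helper above (Python's standard library also ships shlex.quote for exactly this job):

```python
import shlex

def quote(param):
    # Mirror the Lua helper: wrap in single quotes and replace each
    # embedded ' with '\'' (close quote, escaped quote, reopen quote).
    return "'" + param.replace("'", "'\\''") + "'"

arg = "it's a; rm -rf test"
quoted = quote(arg)
# A POSIX shell would see the original string as one argument again:
print(shlex.split(quoted))
```

The round trip through shlex.split confirms the quoting is safe even for values containing single quotes and shell metacharacters.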

jq: error (at <stdin>:0): Cannot iterate over null (null)

I've been working with an API call to structure it in JSON format so I might later push it into a database. Then code looks like this:
getPage() {
  curl --fail -X GET 'https://api.app.com/v1/test?page=1&pageSize=1000&sort=desc' \
    -H 'Authorization: Bearer 123abc456pickupsticks789' \
    -H 'cache-control: no-cache'
}
getPage \
| jq -c '.items | .[] | {landing_id: .landing_id, submitted_at: .submitted_at, answers: .answers, email: .hidden.email}' \
> testpush.json
When I run it though, it produces this error: jq: error (at <stdin>:0): Cannot iterate over null (null)
I've looked at solutions such as this one, or this one from this site, and this response.
The common solution seemed to be using a ? in front of [] and I tried it in the jq line towards the bottom, but it still does not work. It just produces an empty json file.
Am I misreading the takeaway from those other answers and not putting my ? in the right place?
To protect against the possibility that .items is not an array, you could write:
.items | .[]?
or even more robustly:
try .items[]
which is equivalent to (.items[])?.
In summary:
try E is equivalent to try E catch empty
try E is equivalent to (E)?
(Note that the expressions .items[]? and (.items[])? are not identical.)
However none of these will provide protection against input that is invalid JSON.
p.s. In future, please follow the mcve guidelines (http://stackoverflow.com/help/mcve); in the present case, it would have helped if you had provided an illustrative JSON snippet based on the output produced by the curl command.
It is necessary to let jq know that it can continue after hitting an unexpected value while iterating over that array; try and ? are the right tools for that.
Bear in mind that you must either guarantee the shape of the data or tell the interpreter that it is OK to continue. It may sound redundant, but it is a fail-safe approach that prevents unexpected results which are harder to track down.
Also, be aware of the difference between ? and try.
Assuming $sample is valid JSON, the code below will always work:
sample='{"myvar":1,"var2":"foo"}'
jq '{newVar: ((.op[]? | .item) // 0)}' <<< $sample
Here the op array is null for $sample, but jq knows it can continue without your intervention.
But if you assume ? behaves the same as try, you may still get an error (this took me a while to learn, and it is not obvious from the documentation). As an example of improper use of ?:
sample='{"myvar":1,"var2":"foo"}'
jq '{newVar: (.op[].item? // 0)}' <<< $sample
Since op is null, this still fails: the ? tells jq to ignore an error while retrieving .item, but the error actually happens earlier, while iterating over null (.op[]), before the .item lookup is ever reached.
On the other hand, try works in this case:
sample='{"myvar":1,"var2":"foo"}'
jq '{newVar: (try .op[].item catch 0)}' <<< $sample
This small difference in usage can lead to a large difference in the result.
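The failure mode is analogous in other languages. As a rough Python sketch (the field names are just placeholders mirroring the jq examples), iterating over a missing field raises unless you guard the iteration itself, not just the element access:

```python
import json

doc = json.loads('{"myvar": 1, "var2": "foo"}')

# Guarding only the element lookup is like .op[].item? -- too late,
# because accessing/iterating the missing "op" already fails.
try:
    items = [entry["item"] for entry in doc["op"]]
except (KeyError, TypeError):
    items = []  # roughly like jq's `try .op[].item catch empty`

# Guarding the iteration itself is like .op[]? :
safe = [entry["item"] for entry in (doc.get("op") or [])]

print(items, safe)  # [] []
```

In both languages the key question is *where* the error occurs in the chain, and whether the guard covers that point.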

Premature exit from script after fork call in Python (creating pipeline)

Code fragment inside call(argv) function
if '|' in argv:
    # Split argv into two commands, lst[0] and lst[1]
    r, w = os.pipe()
    pid = os.fork()
    # Parent
    if pid > 0:
        os.close(w)
        os.dup2(r, 0)
        run(lst[0])
        os.close(r)
        os.wait()
    # Child
    if pid == 0:
        os.close(r)
        os.dup2(w, 1)
        run(lst[1])
        os.close(w)
        os._exit(1)
The code above gives the result from a simple pipeline of only two commands, but it causes the shell to exit prematurely. How can I get this code to stop exiting my script and have it return to command prompt?
How the program works
The child executes the second command. Its output is sent into the pipe by using dup2() to redirect standard output (file descriptor 1, i.e. sys.stdout) to the pipe's write end.
The parent then uses dup2() for input redirection, reading from the pipe. This produces the final output, which is displayed on screen, but immediately afterwards the script exits.
The run function takes a command and its arguments and executes it. It also handles globbing and input/output redirection.
It's probably something simple, but I can't seem to spot what's causing the problem...
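For reference, here is a minimal self-contained sketch of a two-command pipeline in which both sides run in child processes (using os.execvp as a stand-in for your run(), whose internals aren't shown). Because the parent never calls dup2() or exec on its own descriptors, control returns to its prompt loop after waiting:

```python
import os

def run_pipeline(left, right):
    """Run `left | right`; return the children's raw wait statuses."""
    r, w = os.pipe()

    writer = os.fork()
    if writer == 0:       # first child: left side, writes into the pipe
        os.close(r)
        os.dup2(w, 1)     # stdout -> pipe write end
        os.close(w)
        try:
            os.execvp(left[0], left)
        finally:
            os._exit(127)  # only reached if exec failed

    reader = os.fork()
    if reader == 0:       # second child: right side, reads from the pipe
        os.close(w)
        os.dup2(r, 0)     # stdin <- pipe read end
        os.close(r)
        try:
            os.execvp(right[0], right)
        finally:
            os._exit(127)  # only reached if exec failed

    # Parent: close both ends so the reader sees EOF, then reap children.
    os.close(r)
    os.close(w)
    return [os.waitpid(writer, 0)[1], os.waitpid(reader, 0)[1]]

statuses = run_pipeline(["echo", "hello world"], ["tr", "a-z", "A-Z"])
```

The key difference from the snippet above: the parent keeps its own stdin/stdout untouched, so nothing in the shell's read-eval loop inherits a redirected or closed descriptor.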

How to pass parameters in a call to test using quality-center?

Is it possible to pass a parameter of a test case, in the Test Plan module of HP Quality Center 10, to a "call to test"? If I add the parameter as <<<parameter_name>>> in the call, the test runner won't evaluate the parameter to its value.
I think you should use triple <<< and >>> signs, have a look at this tutorial video:
http://www.youtube.com/watch?v=vCrJcHrosio

Using RCurl/httr for Github Basic Authorization

I am trying to create an OAuth token from the command line using the instructions here. I am able to use curl from the command line, and get the correct response
curl -u 'username:pwd' -d '{"scopes":["user", "gist"]}' \
https://api.github.com/authorizations
Now, I want to replicate the same in R using RCurl or httr. Here is what I tried, but both commands return an error. Can anyone point out what I am doing wrong here?
httr::POST(
'https://api.github.com/authorizations',
authenticate('username', 'pwd'),
body = list(scopes = list("user", "gist"))
)
RCurl::postForm(
uri = 'https://api.github.com/authorizations',
.opts = list(
postFields = '{"scopes": ["user", "gist"]}',
userpwd = 'username:pwd'
)
)
The question is ages old, but maybe still helpful to some: the problem should be that the .opts arguments are passed the wrong way (there is no curlOptions function call wrapping them). The following worked for me in a different context:
result <- getURL(url, .opts = curlOptions(postfields = postFields))
(and yes, as far as I know you can use the getURL function for POST requests).
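To see what the curl call actually sends, a small Python sketch (hypothetical credentials, no network access) builds the same two pieces: a JSON *string* body, not a nested list/form encoding, and a Basic auth header, which is what the R code must ultimately reproduce:

```python
import base64
import json

url = "https://api.github.com/authorizations"

# The -d payload must be a JSON string; curl sends it verbatim.
body = json.dumps({"scopes": ["user", "gist"]})

# curl -u 'username:pwd' becomes this request header.
auth_header = "Basic " + base64.b64encode(b"username:pwd").decode("ascii")

print(body)
print(auth_header)
```

The common pitfall is letting the HTTP library re-encode the body as form data instead of sending the JSON string as-is.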
