Convert string to map in Lua - nginx

I am quite new to Lua. I am trying to convert a string of the form
{"result": "success", "data": {"shouldLoad": "true"}}
into a Lua table, so that I can access it like JSON, e.g. someMap["data"]["shouldLoad"] => "true".
I don't have any JSON bindings in Lua. I also tried loadstring to convert a string of the form {"result" = "success", "data" = {"shouldLoad" = "true"}}, which is not working.
Following is the code snippet where I am calling the getLocation hook, which in turn returns a JSON-stringified map. I want to access some keys from this response body and take decisions accordingly.
access_by_lua "
    local res = ngx.location.capture('/getLocation')
    -- res.body = {"result" = "success", "data" = {"shouldLoad" = "true"}}
    local resData = loadstring('return '..res.body)()
    local shouldLoad = resData['data']['shouldLoad']
"
When I try to read the shouldLoad value, the nginx error log reports an error saying "attempt to index a nil value".
How do I access the key's value with either of the string formats? Please help.

The best answer is to consider a pre-existing JSON module, as suggested by Alexey Ten. Here's the list of JSON modules from Alexey.
I also wrote a short pure-Lua json module that you are free to use however you like. It's public domain, so you can use it, modify it, sell it, and don't need to provide any credit for it. To use that module you would write code like this:
local json = require 'json' -- At the top of your script.
local jsonStr = '{"result": "success", "data": {"shouldLoad": "true"}}'
local myTable = json.parse(jsonStr)
-- Now you can access your table in the usual ways:
if myTable.result == 'success' then
    print('shouldLoad =', myTable.data.shouldLoad)
end
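If your nginx bundles the cjson module (OpenResty ships with it), you can also decode the response right inside access_by_lua. Here is a minimal sketch, assuming the same /getLocation subrequest as in the question:
access_by_lua "
    local cjson = require('cjson')
    local res = ngx.location.capture('/getLocation')
    -- Decode the JSON body into a plain Lua table.
    local resData = cjson.decode(res.body)
    local shouldLoad = resData['data']['shouldLoad']
    ngx.log(ngx.INFO, 'shouldLoad = ', shouldLoad)
"
Note that cjson.decode raises an error on malformed input, so you may want to wrap the call in pcall.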

Related

Web2py: Sending JSON Data via a Rest API post call in Web2py scheduler

I have a form in which one field uses the IS_JSON validator:
db.define_table('vmPowerOpsTable',
    Field('launchId', label=T('Launch ID'), default=datetime.datetime.now().strftime("%d%m%y%H%M%S")),
    Field('launchDate', label=T('Launched On'), default=datetime.datetime.now()),
    Field('launchBy', label=T('Launched By'), default=auth.user.email if auth.user else "Anonymous"),
    Field('inputJson', 'text', label=T('Input JSON*'),
          requires=[IS_NOT_EMPTY(error_message='Input JSON is required'), IS_JSON(error_message='Invalid JSON')]),
    migrate=True)
When the user submits this form, the data is also simultaneously inserted into another table:
db.opStatus.insert(launchId=vmops_launchid, launchDate=vmops_launchdate,
                   launchBy=vmops_launchBy, opsType=operation_type,
                   opsData=vmops_inputJson,
                   statusDetail="Pending")
db.commit()
Now from the scheduler, I am trying to retrieve this data and make a POST request.
vm_power_opStatus_row_data = vm_power_opStatus_row.opsData
Note that in the above step I am able to retrieve the data. (I inserted it into a DB and saw that the field exactly matches what the user entered.)
Then from the scheduler, I do a POST.
power_response = requests.post(vm_power_op_url, json=vm_power_opStatus_row_data)
The POST request is handled by a function in my controller.
Controller Function:
@request.restful()
def vmPowerOperation():
    response.view = 'generic.json'
    si = None
    def POST(*args, **vars):
        jsonBody = request.vars
        print "Debug 1" + str(jsonBody)  # -> jsonBody is blank here.
But if I do the same request from outside (a Postman client or even Python requests) I get the desired result.
Is anything going wrong with the data type when I am trying to fetch it from the table?
power_response = requests.post(vm_power_op_url,
                               json=vm_power_opStatus_row_data)
It appears that vm_power_opStatus_row_data is already a JSON-encoded string. However, the json argument to requests.post() should be a Python object, not a string (requests will automatically encode the Python object to JSON and set the content type appropriately). So, the above should be:
power_response = requests.post(vm_power_op_url,
                               json=json.loads(vm_power_opStatus_row_data))
Alternatively, you can use the data argument and set the content type to JSON:
power_response = requests.post(vm_power_op_url,
                               data=vm_power_opStatus_row_data,
                               headers={'Content-Type': 'application/json'})
Also, note that in your REST POST function, request.vars is already passed to the function as **vars, so within the function, you can simply refer to vars rather than request.vars.
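For illustration, here is a minimal sketch of the controller with that simplification applied (the names follow the question's code; the return locals() line is the usual web2py restful pattern and is an assumption about the omitted rest of the controller):
@request.restful()
def vmPowerOperation():
    response.view = 'generic.json'
    def POST(*args, **vars):
        # 'vars' already holds the parsed request variables,
        # so there is no need to read request.vars again.
        print "Debug 1" + str(vars)
        return dict(received=vars)
    return locals()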

How to traverse object with dynamic keys in Paw?

Let's say we have the following JSON response:
{
"abcd1234": {
"foo": "bar"
}
}
How would "bar" be accessed in a response parsed body value? In the response, "abcd1234" could be anything. But we want the first key in the object (in JavaScript this would be Object.keys(res)[0]).
Paw makes it easy to parse JSON (and XML) responses and access subfields via their key-path.
This documentation article may help: https://paw.cloud/docs/advanced/reuse-values-from-previous-responses
Insert the Response Parsed Body dynamic value
Set the input request and extract the needed value
In your example, the key path will be:
abcd1234.foo
However, it seems like you need to access the path without knowing the key beforehand. If so, one way would be to use a JavaScript snippet to achieve the behavior you want.
On any field, you may right-click and pick Extensions > JS Script.
Here's a snippet that may fit your needs:
function evaluate(context){
    var request = context.getCurrentRequest();
    var exchange = request.getLastExchange();
    var body = JSON.parse(exchange.responseBody);
    var key = Object.keys(body)[0];
    var value = body[key].foo;
    return value;
};

How to write an Axios query where I don't know a parent value?

I have a simple Firebase DB which looks like this:
someNode: {
    pushId-A: {param1: 'some string'},
    pushId-B: {param1: 'some other string'}
}
Using Axios GET, is there a way to query someNode for the value of param1 where I don't know the value of the pushId?
I want it to return the pushId of the node that contains param1: 'some string'.
[EDIT]
I understand now that this is not an Axios question, but rather a Firebase question.
I've read the firebase docs here:
Filtering Data
But when I send the GET request with any parameters other than the auth token, I get back a 400 code, which tells me the syntax is incorrect.
Here is the last part of the DB URL:
a8/data/houses/-L4OiszP7IOzkfh1f1NY/houseName
where houseName = "Aubergine"
Trying to filter for houseName I am passing:
axios.get('/houses.json/' + '?orderBy="houseName"&startAt="A"' + '&auth=' + token)
I'm keeping the params separate so I can more easily read and change them. Concatenating the strings has no effect.
No matter what combination of params I pass I get the 400 error code. If I leave them off, then the data comes through as expected.
What am I doing wrong?

How to parse a collection's sub-object to find a unique result from many possibilities?

In my user's schema, I have a TokAuth array with token sub-objects (like multiple mail addresses).
So in a method, when I search for the tokens of the current user:
var id = Meteor.userId();
var usercurrent = Meteor.users.findOne({_id: id}, {fields: {"TokAuth": 1}});
var userToken = usercurrent.TokAuth.token;
I get this in console.log(userToken):
[ 'fyAyXkXYrQdAlNpjuQfJ8RLU2TpfVGLnptlBs-m1h7xk',
  'YTwtUbhNTgiEfzFbJq7mESnOoOHeLYxWlqEeJJIG_GiV',
  'ViA4ydDITJtHDi2c_sArkNtpRYTjFqGL1ju2v00_-rFJ',
  '51ImZcxRADLJr-FPCUL7EFGnTZYjHSZk3XxdqtBV2_fd',
  'S5aEvqjJ5zTUJqLFCPY1aZ1ZhsQppZTJtYKULM9aS2B3',
  'mhBs3oxHf2SxZfu2vCZhtiyPfg25fKMY8bKMZD8fx6IG',
  '-rv0FiP-lxoqe8INyCJASV6rZpbgy3euEqB9sO9HsZSV',
  'zacr6_VBjHTsArov1LmQyZFLwI40fx4J7sygpLosTrli' ]
Besides, I've got a var that is equal to the last token in the userToken sub-object (what's expected, of course, is not that it be the last one, but that it be somewhere in the sub-object).
console.log(editAuth);
zacr6_VBjHTsArov1LmQyZFLwI40fx4J7sygpLosTrli
So how can I parse userToken to find a match with editAuth? If userToken were just a String, it would be simple, but here...
Thanks
Is there a reason you are storing all the tokens as an array as opposed to just updating a single string each time?
That aside, you can check if an array contains a value by using the handy underscore function _.contains
Example:
_.contains( userToken, editAuth ); //returns true or false
In this case, you are simply trying to search for a string within an array of strings. @Sean already provided one solution.
If you are using the meteor ecmascript package then you can just simply use the native Array.includes method.
userToken.includes(editAuth);
On a side note, after using ECMAScript 2015+ for some time now, I find that I can use the native API for almost everything that I used to use underscore or lodash for. Check it out!

How to get the table name in AWS dynamodb trigger function?

I am new to AWS and working on creating a Lambda function in Python. The function will get the DynamoDB table stream and write it to a file in S3, where the name of the file should be the name of the table.
Can someone please tell me how to get the table name from the trigger that is invoking the Lambda function?
Thanks for the help.
Since you mentioned you are new to AWS, I am going to answer descriptively.
I am assuming that you have set 'Stream enabled' setting for your DynamoDB table to 'Yes', and have set up this as an event source to your lambda function.
This is how I got the table name from the stream that invoked my lambda function -
import json

def lambda_handler(event, context):
    print(json.dumps(event, indent=2))  # Shows what's in the event object
    for record in event['Records']:
        ddbARN = record['eventSourceARN']
        ddbTable = ddbARN.split(':')[5].split('/')[1]
        print("DynamoDB table name: " + ddbTable)
    return 'Successfully processed records.'
Basically, the event object, which contains all the information about the particular DynamoDB stream that was responsible for this Lambda invocation, includes a parameter eventSourceARN. This eventSourceARN is the ARN (Amazon Resource Name) that uniquely identifies the DynamoDB table from which the event occurred.
This is a sample value for eventSourceARN -
arn:aws:dynamodb:us-east-1:111111111111:table/test/stream/2020-10-10T08:18:22.385
Notice the table name in the ARN above, test; this is the table name you are looking for.
In the line ddbTable = ddbARN.split(':')[5].split('/')[1] above, I split the entire ARN by ':' first, and then by '/', in order to get the value test. Once you have this value, you can call the S3 APIs to write to a file in S3 with the same name.
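For instance, here is a minimal sketch of that write-to-S3 step using boto3; the bucket name my-stream-dump-bucket and the .json key suffix are placeholders, not something from the question:
import json
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    for record in event['Records']:
        ddbARN = record['eventSourceARN']
        ddbTable = ddbARN.split(':')[5].split('/')[1]
        # Write this record to a file named after its table.
        # 'my-stream-dump-bucket' is a placeholder bucket name.
        s3.put_object(Bucket='my-stream-dump-bucket',
                      Key=ddbTable + '.json',
                      Body=json.dumps(record))
    return 'Successfully processed records.'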
Hope this helps.
Please note that eventSourceARN is not always provided. In my testing today, I didn't see eventSourceARN present in the record. You can also refer to these links:
Issue: https://github.com/aws/aws-sdk-js/issues/2226
API: https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_streams_Record.html
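Given that, it may be safer to guard the lookup; a short sketch:
for record in event['Records']:
    ddbARN = record.get('eventSourceARN')
    if ddbARN is None:
        # Some records may arrive without eventSourceARN; skip them.
        continue
    ddbTable = ddbARN.split(':')[5].split('/')[1]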
One way to do it would be via pattern matching in Scala using a regex:
import scala.util.matching.Regex

val ddbArnRegex: Regex = """arn:aws:dynamodb:(.+):(.+):table/(.+)/stream/(.+)""".r

def parseTableName(ddbARN: String): Option[String] = {
  if (ddbARN == null) None
  else ddbARN match {
    case ddbArnRegex(_, _, table, _) => Some(table)
    case _ => None
  }
}
