I'm working on a simple web service in Prolog and want to respond to my users with data formatted as JSON. A nice facility is reply_json_dict/1, which takes a dictionary and converts it into an HTTP response with a well-formatted JSON body.
My trouble is that building the response dictionary itself seems a little cumbersome. For example, when I return some data, I have a data id but may or may not have data properties (possibly an unbound variable). At the moment I do the following:
OutDict0 = _{ id : DataId },
( nonvar(Props) -> OutDict1 = OutDict0.put(_{ attributes : Props }) ; OutDict1 = OutDict0 ),
reply_json_dict(OutDict1)
This works fine, so the output is { "id" : "III" } or { "id" : "III", "attributes" : "AAA" } depending on whether or not Props is bound, but... I'm looking for an easier approach, primarily because if I need to add more optional key/value pairs, I end up with multiple implications like:
OutDict0 = _{ id : DataId },
( nonvar(Props) -> OutDict1 = OutDict0.put(_{ attributes : Props }) ; OutDict1 = OutDict0 ),
( nonvar(Time) -> OutDict2 = OutDict1.put(_{ time : Time }) ; OutDict2 = OutDict1 ),
( nonvar(UserName) -> OutDict3 = OutDict2.put(_{ userName : UserName }) ; OutDict3 = OutDict2 ),
reply_json_dict(OutDict3)
And that seems just wrong. Is there a simpler way?
Cheers,
Jacek
Instead of messing with dictionaries, my recommendation in this case is to use a different predicate to emit JSON.
For example, consider json_write/2, which lets you emit JSON on the current output, as the HTTP libraries require.
Suppose your representation of data fields is the common Name(Value) notation that is used throughout the HTTP libraries for option processing:
Fields0 = [attributes(Props),time(Time),userName(UserName)],
Using the meta-predicate include/3, your whole example becomes:
main :-
Fields0 = [id(DataId),attributes(Props),time(Time),userName(UserName)],
include(ground, Fields0, Fields),
json_write(current_output, json(Fields)).
You can try it out yourself, by plugging in suitable values for the individual elements that are singleton variables in the snippet above.
For example, we can (arbitrarily) use:
Fields0 = [id(i9),attributes(_),time('12:00'),userName(_)],
yielding:
?- main.
{"id":"i9", "time":"12:00"}
true.
You only need to emit a suitable Content-Type header, and you have the same output that reply_json_dict/1 would have given you.
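For completeness, a minimal handler sketch, assuming SWI-Prolog's library(http/json) and library(apply); the predicate name reply_fields/1 and the sample field values are made up for illustration:

:- use_module(library(apply)).
:- use_module(library(http/json)).

% Sketch only: emit the Content-Type header ourselves, then write the
% ground fields as a JSON object, mirroring the include(ground) idea above.
reply_fields(_Request) :-
    Fields0 = [id(i9), attributes(_Props), time('12:00'), userName(_User)],
    include(ground, Fields0, Fields),
    format("Content-type: application/json~n~n"),
    json_write(current_output, json(Fields)).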
You can do it in one step if you use a list to represent all values that need to go into the dict.
?- Props = [a,b,c], get_time(Time),
D0 = _{id:001},
include(ground, [props:Props,time:Time,user:UserName], Fs),
D = D0.put(Fs).
D0 = _17726{id:1},
Fs = [props:[a, b, c], time:1477557597.205908],
D = _17726{id:1, props:[a, b, c], time:1477557597.205908}.
This borrows the idea in mat's answer to use include(ground).
Many thanks mat and Boris for suggestions! I ended up with a combination of your ideas:
dict_filter_vars(DictIn, DictOut) :-
findall(Key=Value, (get_dict(Key, DictIn, Value), nonvar(Value)), Pairs),
dict_create(DictOut, _, Pairs).
Which I can then use as simply as this:
DictWithVars = _{ id : DataId, attributes : Props, time : Time, userName : UserName },
dict_filter_vars(DictWithVars, DictOut),
reply_json_dict(DictOut)
I am writing some code where I have multiple dictionaries for my data. The reason is that I have multiple core objects and multiple smaller assets, and the user must be able to choose a smaller asset and have some function elsewhere run the code with the parent object noted.
An example of one of the dictionaries: (I'm working in ROBLOX Lua 5.1 but the syntax for the problem should be identical)
local data = {
character = workspace.Stores.NPCs.Thom,
name = "Thom", npcId = 9,
npcDialog = workspace.Stores.NPCs.Thom.Dialog
}
local items = {
item1 = {
model = workspace.Stores.Items.Item1.Main,
npcName = "Thom",
}
}
This is my function:
local function function1(item)
if not items[item] and data[items[item[npcName]]] then return false end
end
As you can see, I try to index the dictionary using a key from another dictionary. Usually this is no problem.
local thisIsAVariable = item[item1[npcName]]
but the method I use above tries to index the data dictionary for data that is in the items dictionary.
Without a ton of local variables and clutter, is there a way to do this? I had an idea to wrap the conflicting dictionary reference in a tostring() function to separate them - would that work?
Thank you.
As I see it, your issue is that:
data[items[item[npcName]]]
is looking for data["Thom"]... but you do not have such a key in the data table. You have a "name" key that has a "Thom" value. You could reverse the name key and value in the data table, so that "Thom" is the key.
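A sketch of that idea, using made-up table names (npcsByName is not in your code), so an item can look its parent NPC up directly:

-- Hypothetical sketch: key the NPC data by name instead of storing the
-- name only as a value, then index it with the item's npcName field.
local npcsByName = {
    Thom = {
        character = workspace.Stores.NPCs.Thom,
        npcId = 9,
        npcDialog = workspace.Stores.NPCs.Thom.Dialog,
    },
}

local function function1(itemName)
    local item = items[itemName]
    if not item then return false end
    local npc = npcsByName[item.npcName]   -- e.g. npcsByName["Thom"]
    if not npc then return false end
    -- ... work with npc and item here ...
    return true
end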
I have a need to crawl a set of data of indeterminate size and build a table key/value index of it. Since I don't know the dimensions in advance, seems I have to use a recursive function. My Lua skills are very new and superficial. I'm having difficulty understanding how to deal with returning a table from the function call.
Note this is for a Lua 5.1 script processor
API = function(tbl)
local table_api = {}
-- do stuff here with target data and add to table_api
table_api["key"] = value
-- then later there is a need to recurse deeper into target data
table_api["lower"] = API(var)
return table_api
end
result = API(source_data)
In most other languages I know there would be some way to make the recursion line table_api["lower"] = API(var) work, but since Lua handles table variables by reference, my returned sub-table just keeps getting overwritten and my result is a tiny last bit of what it should be.
Just for background on my purpose: there's a commercial application I'm working with that has a weakly documented Lua scripting interface. It's running Lua 5.1, and the API is frequently updated. As I understand it, everything is stored in _G, so I wanted to write something to reverse engineer the API. I have a working recursive function (not shown here) that enumerates all of _G. For that return value, it just builds up an annotated string and progressively builds on the string. That all works fine and is really useful, but it shows much more than the API interface; all the actual data elements are included, so I have to sift through some 30,000 records to determine an API set of about 500 terms. In order to determine the API, I am trying to use the sub-table-returning recursive function being discussed in this question. The code I'm showing here is just a small distilled subset of the larger function.
I'll go ahead and include the full code here. I was hoping to incrementally build a largish table of each API level, any sublevels, and finally whatever keys are used at the lowest level.
In the end, I was expecting to have a table that I could address like this:
result["api"]["label"]["api"]["sublabel"]["value"]["valuename"]
Full code:
tableAPIShow = function(tbl, table_track)
table_track = table_track or {}
local table_api = {}
if type(tbl) == 'table' then
-- Check if values are tables.
local parent_table_flag = true
for ind,val in pairs(tbl) do
if type(val) ~= 'table' then
parent_table_flag = false
break
end
end
-- If all children are table type, check each of them for subordinate commonality
local api_flag = false
if parent_table_flag == true then
local child_table = {}
local child_table_flag = false
api_flag = true
for ind,val in pairs(tbl) do
-- For each child table, store the names of the indexes.
for sub_ind,sub_val in pairs(val) do
if child_table_flag == false then -- First time though, create starting template view of typical child table.
child_table[sub_ind] = true -- Store the indexes as a template table.
elseif child_table[sub_ind] == nil then -- Otherwise, test this child table compared to the reference template.
api_flag = false
break
end
end
if api_flag == false then -- need to break out of nested loop
break
end
child_table_flag = true
end
end
if api_flag == true then
-- If everything gets to here, then this level is an API with matching child tables below.
for ind,val in pairs(tbl) do
if table_api["api"] == nil then
table_api["api"] = {}
end
table_api["api"][ind] = tableAPIShow(val, table_track)
end
else
-- This level is not an API level, determine how to process otherwise.
for ind,val in pairs(tbl) do
if type(val) == 'table' then
if table_track[val] ~= nil then -- Have we already recursed this table?
else
table_track[val] = true
if table_api["table"] == nil then
table_api["table"] = {}
end
table_api["table"][ind] = tableAPIShow(val, table_track)
end
else -- The children are not tables, they are values
if table_api["value"] == nil then
table_api["value"] = {}
end
table_api["value"][ind] = val
end
end
end
else
-- It's not a table, just return it.
-- Probably never use this portion because it's caught on upper level recurse and not called
return tbl
end
return table_api
end
And I was calling this function in the main script like this:
local str = tableAPIShow(_G)
I've got another function that recursively shows a table, so I can look inside my results, and I see that the return value contains only the values of the top level of _G (I have it exclude built-in Lua functions/values because I'm only interested in the application API):
{
[value] = table: 00000000F22CB700 {
[value][_VERSION] = Application/5.8.1 (x86_64; Windows NT 10.0.16299),
[value][tableAPIShow] = "function: 00000000F22C6DE0, defined in (121-231) C:\\Users\\user\\AppData\\Local\\Temp\\APP\\/~mis00002690 ",
[value][_FINAL_VERSION] = true,
[value][Path] = ./Scripts/Database/elements/,
[value][class] = "function: 00000000F1953C40, defined in (68-81) Scripts/Common/Class.lua ",
[value][db_path] = ./Scripts/Database/,
[value][merge_all_units] = "function: 00000000F20D20C8, defined in (2242-2250) Scripts/Database/db_merge.lua ",
}
You just need to localize the variable you store your table in and it will work as you expect:
local table_api = {}
(Note that you are passing a table variable that conflicts with the global table variable and is not currently used in the function.)
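To see why the global version loses data during recursion, here is a minimal sketch of the same pattern (not your original API code):

-- Broken version: table_api is global, so every recursion level shares it.
API = function(tbl)
    table_api = {}                 -- reassigns the single shared global
    for k, v in pairs(tbl) do
        if type(v) == "table" then
            table_api[k] = API(v)  -- the inner call resets table_api, so the
                                   -- entries the outer level collected are lost
        else
            table_api[k] = v
        end
    end
    return table_api
end

-- Writing "local table_api = {}" instead gives each call its own table,
-- so the nested results survive.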
I am inclined to believe that your tableAPIShow function code is working correctly and the
another function that recursively shows a table
fails to serialize your table fully. Hence you don't see the deeper levels of the table returned by tableAPIShow().
I got your initial and current code (tableAPIShow) to work with my simple table serialize function: the whole _G table is exported completely and formatted as you implemented it in your tableAPIShow().
Code for testing:
apiMassiveTable = {
api = {
get = {
profile = {"profileID", "format"},
name = {"profileID", "encoding"},
number = {"profileID", "binary"}
},
set = {
name = {"apikey", "profileID", "encoding", "newname"},
number = {"apikey", "profileID", "binary", "newnumber"}
},
retrieve = {}
},
metadata = {version="1.4.2", build="nightly"}
}
-- tableAPIShow implementation here
table.serialize = dofile("serialize.lua")
print(table.serialize("myNameForAHugeTable", tableAPIShow(_G)))
PS: Whatever serialize function you're using, it should enquote strings like Application/5.8.1 (x86_64; Windows NT 10.0.16299) which it does not.
Like @Paul Kulchenko said, you need to learn to use locals (https://www.lua.org/pil/4.2.html). Global variables persist until a new lua_State is loaded (a new environment, which could be a new process depending on what interpreter you are using). So a tip is to always use local variables for anything you don't want to escape the function or the compilation unit.
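As a tiny illustration of that scoping point (a made-up snippet, not from the question):

-- Globals live for the whole lua_State; locals end with their scope.
local function f()
    leaked = 1        -- global: still visible after f() returns
    local kept = 2    -- local: gone once f() returns
    return kept
end
f()
print(leaked)  --> 1
print(kept)    --> nil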
Think of tables like dictionaries: a word is attached to a definition. Thus the definition is the data.
What I think you are trying to do is serialization of a table of data. However, that isn't really necessary. You can either shallow copy or deep copy the given table. A shallow copy is when you don't delve into the depths of tables found in the keys, etc. A deep copy is when you copy tables in keys of tables in keys of tables... etc.
local shallow_copy = function(tab)
local rep_tab = {}
for index, value in pairs(tab)do
rep_tab[index] = value
end
return rep_tab
end
-- The local is declared on its own line first so that the recursive call
-- to deep_copy inside the function body refers to this local rather than
-- to an undefined global.
local deep_copy
deep_copy = function(tab)
local rep_tab = {}
for index, value in pairs(tab)do
if(type(value) == "table")then
rep_tab[index] = deep_copy(value)
else
rep_tab[index] = value
end
end
return rep_tab
end
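For example, a quick (hypothetical) check that the deep copy is independent of the original:

local original = { a = 1, nested = { b = 2 } }
local copy = deep_copy(original)
copy.nested.b = 99
print(original.nested.b)  --> 2, the original is untouched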
Deco's deepcopy.lua
https://gist.github.com/Deco/3985043
You can also index tables using periods:
local tab = {}
tab.abc = 123
tab["def"] = 456
print(tab.abc, tab["def"])
To serialize the entire _G, you would just filter out the junk you don't need and recurse into each table encountered. Watch out for _G, package, and _ENV (if defined), because they will recurse back to the start.
-- use cap as a whitelist
enumerate_dir = function(dir, cap)
local base = {}
for i, v in pairs(dir) do
-- skip trouble
if(i ~= "_G" and i ~= "_ENV" and i ~= "package")then
if(type(v) == "table")then -- if we have a table
base[i] = enumerate_dir(v, cap)
else
for k, n in pairs(cap) do
if(type(v) == n)then -- if whitelisted
base[i] = tostring(v)
end
end
end
end
end
return base
end
-- only functions and tables from _G please
local enumeration = enumerate_dir(_G, {"function", "table"})
-- do some cool quips to get a basic tree system
prefix = ""
recursive_print = function(dir)
for i, v in pairs(dir)do
if(type(v) == "table")then
print(prefix, i, v)
prefix = prefix .. ">"
recursive_print(v)
else
print(prefix, i, v)
end
end
if(#prefix > 0)then
prefix = prefix:sub(1, #prefix-1)
end
end
recursive_print(enumeration)
I'm currently implementing an SBT plugin for Gatling.
One of its features will be to open the last generated report in a new browser tab from SBT.
As each run can have a different "simulation ID" (basically a simple string), I'd like to offer tab completion on simulation ids.
An example :
Running the Gatling SBT plugin will produce several folders (named from the simulation ID plus the date of report generation) in target/gatling, for example mysim-20140204234534, myothersim-20140203124534 and yetanothersim-20140204234534.
Let's call the task lastReport.
If someone starts typing lastReport my, I'd like to filter tab completion to only suggest mysim and myothersim.
Getting the simulation ID is a breeze, but how can I help the parser and filter the suggestions so that it only suggests an existing simulation ID?
To sum up, I'd like to do what testOnly does, in a way: I only want to suggest things that make sense in my context.
Thanks in advance for your answers,
Pierre
Edit: As I got a bit stuck after my latest tries, here is the code of my inputTask, in its current state:
package io.gatling.sbt
import sbt._
import sbt.complete.{ DefaultParsers, Parser }
import io.gatling.sbt.Utils._
object GatlingTasks {
val lastReport = inputKey[Unit]("Open last report in browser")
val allSimulationIds = taskKey[Set[String]]("List of simulation ids found in reports folder")
val allReports = taskKey[List[Report]]("List of all reports by simulation id and timestamp")
def findAllReports(reportsFolder: File): List[Report] = {
val allDirectories = (reportsFolder ** DirectoryFilter.&&(new PatternFilter(reportFolderRegex.pattern))).get
allDirectories.map(file => (file, reportFolderRegex.findFirstMatchIn(file.getPath).get)).map {
case (file, regexMatch) => Report(file, regexMatch.group(1), regexMatch.group(2))
}.toList
}
def findAllSimulationIds(allReports: Seq[Report]): Set[String] = allReports.map(_.simulationId).distinct.toSet
def openLastReport(allReports: List[Report], allSimulationIds: Set[String]): Unit = {
def simulationIdParser(allSimulationIds: Set[String]): Parser[Option[String]] =
DefaultParsers.ID.examples(allSimulationIds, check = true).?
def filterReportsIfSimulationIdSelected(allReports: List[Report], simulationId: Option[String]): List[Report] =
simulationId match {
case Some(id) => allReports.filter(_.simulationId == id)
case None => allReports
}
Def.inputTaskDyn {
val selectedSimulationId = simulationIdParser(allSimulationIds).parsed
val filteredReports = filterReportsIfSimulationIdSelected(allReports, selectedSimulationId)
val reportsSortedByDate = filteredReports.sorted.map(_.path)
Def.task(reportsSortedByDate.headOption.foreach(file => openInBrowser((file / "index.html").toURI)))
}
}
}
Of course, openLastReport is called using the results of the allReports and allSimulationIds tasks.
I think I'm close to a functioning input task but I'm still missing something...
Def.inputTaskDyn returns a value of type InputTask[T] and doesn't perform any side effects. The result needs to be bound to an InputKey, like lastReport. The return type of openLastReport is Unit, which means that openLastReport will construct a value that will be discarded, effectively doing nothing useful. Instead, have:
def openLastReport(...): InputTask[...] = ...
lastReport := openLastReport(...).evaluated
(Or, the implementation of openLastReport can be inlined into the right hand side of :=)
You probably don't need inputTaskDyn, but just inputTask. You only need inputTaskDyn if you need to return a task. Otherwise, use inputTask and drop the Def.task.
I'm trying to learn captures in Lasso 9, but I am struggling to figure out how to access the #1 local variable from within a conditional that's inside an array->forEach capture. Maybe my approach is all wrong. Is there a reference to the parent capture that I need to use? Following is the working code:
define paramstovars() => {
local(p = web_request->params)
#p->foreach => {
local(i = #1)
if(#i->type == 'pair') => {
var(#i->first->asstring = #i->second->asstring)
}
}
}
Following is the code I am trying to get working without relying on a redundant local variable definition:
define paramstovars() => {
local(p = web_request->params)
#p->foreach => {
if(#1->type == 'pair') => {
var(#1->first->asstring = #1->second->asstring)
}
}
}
In this second example, I receive an error that Position was out of range: 1 max is 0 (Error Code -1) on the line calling var().
Obvious security concerns with this custom method aside, what's the most efficient way to make #1 available inside nested conditionals?
#1 is replaced within each capture — so yes, you will need to assign it to another local in order to use it in deeper captures. If you need to work with the local again try using query expressions instead:
with i in web_request->params do {
if(#i->type == 'pair') => {
var(#i->first->asstring = #i->second->asstring)
}
}
Also, I wouldn't recommend setting variables in this fashion; it poses a security risk. It would be better to store the parameters in a single variable and then potentially set specific variables from that. There's a set of tags that does something similar here: getparam / postparam
It is my experience that #1 is consumed when it is first called. At least I have never been able to call it twice in the same capture.
If I need the value more than once, I make it a local first, as you do in your first example.
Some experiments later.
You can call #1 several times, contrary to what I wrote, but what trips up your effort is that you have a capture inside the capture (the conditional).
The second capture will have its own input params.
Here's a working, tested example to do what you want to do:
local(
myarray = array(1, 2 = 'two', 3 = 'four', 4),
mypairs = map
)
#myarray -> foreach => {
if(#1-> isa(::pair)) => {
#mypairs -> insert(#1 -> first -> asstring = #1 -> second -> asstring)
}(#1)
}
#mypairs
The result will be map(2 = two, 3 = four)
The trick is passing the foreach param into the conditional capture: `{some code}(#1)`
Now, with all that worked out, I recommend that you take a look at Ke Carlton's latest addition to tagswap. It will solve the same problem far better than creating a bunch of dynamic vars as you are attempting to do:
www.lassosoft.com/tagswap/detail/web_request_params
I'm having trouble retrieving a filtered list from google app engine datastore (using python for server side). My data entity is defined as the following
class Course_Table(db.Model):
course_name = db.StringProperty(required=True, indexed=True)
....
head_tags_1=db.ListProperty(db.Key)
So the head_tags_1 property is a list of keys (which are the keys to a different entity called Headings_1).
In the handler below I spin through my Course_Table entity to filter the courses that have a particular Headings_1 key as a member of the head_tags_1 property. However, it doesn't seem to retrieve anything, even though I know there is data to fulfill the request, since it never emits the log lines below when I iterate through the results of my query. Any ideas of what I'm doing wrong?
def get(self,level_num,h_key):
path = []
if level_num == "1":
q = Course_Table.all().filter("head_tags_1 =", h_key)
for each in q:
logging.info('going through courses with this heading name')
logging.info("course name filtered is %s ", each.course_name)
MANY MANY THANK YOUS
I assume h_key is the key of a Headings_1 entity. Since head_tags_1 is a list, I believe what you need is the IN operator: https://developers.google.com/appengine/docs/python/datastore/queries
Note: your indentation inside the for loop does not seem correct.
My bad, apparently '=' on a list property already checks membership. Using = to check membership works for me; can you make sure h_key is really a datastore Key instance?
Here is my example: the first get produces a result, whereas the second one does not.
import webapp2
from google.appengine.ext import db
class Greeting(db.Model):
author = db.StringProperty()
x = db.ListProperty(db.Key)
class C(db.Model):
    name = db.StringProperty()
class MainPage(webapp2.RequestHandler):
def get(self):
ckey = db.Key.from_path('C', 'abc')
dkey = db.Key.from_path('C', 'def')
ekey = db.Key.from_path('C', 'ghi')
Greeting(author='xxx', x=[ckey, dkey]).put()
x = Greeting.all().filter('x =',ckey).get()
self.response.write(x and x.author or 'None')
x = Greeting.all().filter('x =',ekey).get()
self.response.write(x and x.author or 'None')
app = webapp2.WSGIApplication([('/', MainPage)],
debug=True)
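Coming back to your handler: if h_key arrives from the URL as a plain string, my guess (an assumption, not something visible in your snippet) is that it needs to be turned back into a Key before filtering. A minimal sketch of that, assuming h_key is a URL-safe encoded key string and keeping your Course_Table model; the handler class name is hypothetical:

import logging
import webapp2
from google.appengine.ext import db

class LevelHandler(webapp2.RequestHandler):  # hypothetical handler name
    def get(self, level_num, h_key):
        if level_num == "1":
            # Assumption: h_key is the url-safe encoded string of a
            # Headings_1 key, so rebuild a real Key object before
            # filtering on the head_tags_1 list property.
            heading_key = db.Key(h_key)
            q = Course_Table.all().filter("head_tags_1 =", heading_key)
            for each in q:
                logging.info("course name filtered is %s", each.course_name)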