Groovy Map Issue with Variable Properties and String Interpolation

I have been navigating map structures fine for a long time now. Yet, for some reason, the root of this problem escapes me. I've tried bracket notation as well, with no luck.
Why does the final println output null instead of "[serverinfo:[listenPort:19001]]"?
If I replace the two instances of ' "$instanceName" ' with simply ' services ', it works.
String instanceName = "Services"
Map serverNode = [
    instances: [
        "$instanceName": [
            serverinfo: [
                listenPort: 19001
            ]
        ]
    ]
]
println "$instanceName"
println serverNode.instances
println serverNode.instances."$instanceName"
//output
Services
[Services:[serverinfo:[listenPort:19001]]]
null

The type of "$instanceName" is GStringImpl, not String, so the key stored in the map never matches the String used for the lookup. It's a common mistake (and hard to find!).
def serverNode = [
    instances: [
        ("$instanceName" as String): [
            serverinfo: [
                listenPort: 19001
            ]
        ]
    ]
]
As stated by @tim_yates in a comment, if your interpolated string is as simple as in this example (i.e. "${property}"), then you can use the (property) syntax: Groovy puts the value of the property as the key, not the word "property".
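For example, a minimal sketch of that parenthesized-key form (same structure as the snippets above; the assert is only there to show the lookup now succeeding):
String instanceName = "Services"
Map serverNode = [
    instances: [
        (instanceName): [
            serverinfo: [
                listenPort: 19001
            ]
        ]
    ]
]
// the key is now a plain String, so the interpolated lookup finds it
assert serverNode.instances."$instanceName".serverinfo.listenPort == 19001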

Related

Kusto extractjson not working with email address

I am attempting to use the extractjson() method on source data that includes email addresses (specifically the @ symbol).
let T = datatable(MyString:string)
[
    '{"user@domain.com": {"value":10}, "userdomain.com": { "value": 5}}'
];
T
| project extractjson('$.["user@domain.com"].value', MyString)
This results in null being returned; changing the JSONPath to '$.["userdomain.com"].value' does return the correct result.
I know the @ sign is used as the current node in a filter expression; does it need to be escaped when used with KQL?
Just as a side note, I ran the same test using Node's 'jsonpath' package and it worked as expected.
const jp = require('jsonpath');
const data = {"user@domain.com": {"value":10}, "name2": { "value": 5}};
console.log(jp.query(data, '$["user@domain.com"].value'));
You can use the parse_json() function instead; then you don't have to use extractjson() at all:
print MyString = '{"user@domain.com": {"value":10}, "userdomain.com": { "value": 5}}'
| project parse_json(MyString)["user@domain.com"].value
MyString_user@domain.com_value
10
From the documentation: https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/extractjsonfunction
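For reference, the same approach applied to the datatable from the question (a small sketch; the alias Value is just an arbitrary column name):
let T = datatable(MyString:string)
[
    '{"user@domain.com": {"value":10}, "userdomain.com": { "value": 5}}'
];
T
| project Value = parse_json(MyString)["user@domain.com"].value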

Passing JSON file as string in environment variable for composer airflow from terraform script

I am creating a Cloud Composer environment from Terraform, where I want to pass a JSON value in as an environment variable.
Terraform code:
software_config {
    env_variables {
        AIRFLOW_VAR_MYJSON = "{'__comment1__': 'This the global section', 'project_id':'testproject', 'gce_zone':'us-east1-c', 'gce_region':'us-east1','networkname':'vpc1', 'subnetwork':'https://www.googleapis.com/compute/v1/projects/testproject/regions/us-east1/subnetworks/subnet1'}"
    }
}
I am trying to read the value of AIRFLOW_VAR_MYJSON in the DAG, but it is not working because the value is not recognized as JSON.
I tried converting it and then deserializing it with the following code:
JSONList = Variable.get("MYJSON")
jsonvar = json.dumps(JSONList)
setting_var = Variable.set("settings", jsonvar)
dag_config = Variable.get("settings", deserialize_json=True)
but it is not working.
I have also tried using
dag_config =json.loads(jsonvar)
then reading value as
project_id = dag_config["project_id"]
but I get the error "string indices must be integers".
Please suggest a way to resolve this.
NOTE: I know the gcloud command to set variables from a JSON file, but that is not working in my case because the project is in a VPC and the Kubernetes clusters are giving timeout or handshake errors, so I have ruled out that option.
Valid JSON uses double quotes ("), not single quotes ('). Try switching the quotes.
A value can be a string in double quotes, or a number, or true or false or null, or an object or an array.
software_config {
    env_variables {
        AIRFLOW_VAR_MYJSON = "{\"__comment1__\": \"This the global section\", \"project_id\":\"testproject\", \"gce_zone\":\"us-east1-c\", \"gce_region\":\"us-east1\",\"networkname\":\"vpc1\", \"subnetwork\":\"https://www.googleapis.com/compute/v1/projects/testproject/regions/us-east1/subnetworks/subnet1\"}"
    }
}
Or a little nicer way:
software_config {
    env_variables {
        AIRFLOW_VAR_MYJSON = jsonencode({
            "__comment1__" = "This the global section",
            "project_id" = "testproject",
            "gce_zone" = "us-east1-c",
            "gce_region" = "us-east1",
            "networkname" = "vpc1",
            "subnetwork" = "https://www.googleapis.com/compute/v1/projects/testproject/regions/us-east1/subnetworks/subnet1",
        })
    }
}
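With a valid JSON value in the environment variable, the DAG side should not need the extra dumps/loads round trip. A minimal sketch of the read (assuming the variable name MYJSON from above; environment variables named AIRFLOW_VAR_<NAME> surface as Airflow Variables named <NAME>):
from airflow.models import Variable

# AIRFLOW_VAR_MYJSON is exposed as the Airflow Variable "MYJSON"
dag_config = Variable.get("MYJSON", deserialize_json=True)
project_id = dag_config["project_id"]
gce_zone = dag_config["gce_zone"]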

Getting .key from Firebase using VueFire

I am trying to get a single key from my Firebase nodes and I can't get any results with the code I have right now. Here is my code:
let app = Firebase.initializeApp(config)
let db = app.database()
let bdRef = db.ref()

export default {
    name: 'hello',
    firebase: {
        businesses: bdRef.orderByChild('.key').equalTo('306')
    }
}
I get this error when doing this:
validation.js?5c80:234 Uncaught Error: Query.orderByChild failed: First argument was an invalid path = ".key". Paths must be non-empty strings and can't contain ".", "#", "$", "[", or "]"
When I do this with my code:
businesses: bdRef.orderByChild('title').equalTo('Feather Animation Wood Carving Supplies')
It comes with this array:
0:Object
.key:"3021"
address:"Hello Avenue"
city:""
description:"Wood carving tools and supplies. Please contact us by phone or internet."
email:"hi#gmail.com"
employees:"1"
How do I get the .key property?
Did you try using this:
businesses['.key']
It's quite simple: your JSON has ".key" as a key, with "3021" as its value. But ".key" contains a ".", which means you are effectively giving an empty or invalid path.
So naming it "key", or any other name like "keyid", would work, as long as you don't include the characters your error lists: ".", "#", "$", "[", or "]".
Hope this explanation helped!
As stated, you cannot query a property with a dot in its name. From the documentation, you need to use the built-in orderByKey() filter instead:
export default {
    name: 'hello',
    firebase: {
        businesses: bdRef.equalTo('306').orderByKey()
    }
}

using dictionaries in swi-prolog

I'm working on a simple web service in Prolog and wanted to respond to my users with data formatted as JSON. A nice facility is reply_json_dict/1, which takes a dictionary and converts it into an HTTP response with a well-formatted JSON body.
My trouble is that building the response dictionary itself seems a little cumbersome. For example, when I return some data, I have a data id but may or may not have data properties (possibly an unbound variable). At the moment I do the following:
OutDict0 = _{ id : DataId },
( nonvar(Props) -> OutDict1 = OutDict0.put(_{ attributes : Props }) ; OutDict1 = OutDict0 ),
reply_json_dict(OutDict1)
Which works fine, so the output is { "id" : "III" } or { "id" : "III", "attributes" : "AAA" } depending on whether or not Props is bound, but... I'm looking for an easier approach. Primarily because if I need to add more optional key/value pairs, I end up with multiple implications like:
OutDict0 = _{ id : DataId },
( nonvar(Props) -> OutDict1 = OutDict0.put(_{ attributes : Props }) ; OutDict1 = OutDict0 ),
( nonvar(Time) -> OutDict2 = OutDict1.put(_{ time : Time }) ; OutDict2 = OutDict1 ),
( nonvar(UserName) -> OutDict3 = OutDict2.put(_{ userName : UserName }) ; OutDict3 = OutDict2 ),
reply_json_dict(OutDict3)
And that seems just wrong. Is there a simpler way?
Cheers,
Jacek
Instead of messing with dictionaries, my recommendation in this case is to use a different predicate to emit JSON.
For example, consider json_write/2, which lets you emit JSON, also on current output as the HTTP libraries require.
Suppose your representation of data fields is the common Name(Value) notation that is used throughout the HTTP libraries for option processing:
Fields0 = [attributes(Props),time(Time),userName(UserName)],
Using the meta-predicate include/3, your whole example becomes:
main :-
    Fields0 = [id(DataId),attributes(Props),time(Time),userName(UserName)],
    include(ground, Fields0, Fields),
    json_write(current_output, json(Fields)).
You can try it out yourself, by plugging in suitable values for the individual elements that are singleton variables in the snippet above.
For example, we can (arbitrarily) use:
Fields0 = [id(i9),attributes(_),time('12:00'),userName(_)],
yielding:
?- main.
{"id":"i9", "time":"12:00"}
true.
You only need to emit a suitable Content-Type header, and you have the same output that reply_json_dict/1 would have given you.
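A small sketch of that last step (the predicate name reply_fields is made up for illustration; it assumes SWI-Prolog's library(http/json) and library(apply)):
:- use_module(library(http/json)).
:- use_module(library(apply)).

% Emit the header by hand, then the JSON body built from the ground fields only.
reply_fields(Fields0) :-
    include(ground, Fields0, Fields),
    format("Content-type: application/json~n~n"),
    json_write(current_output, json(Fields)).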
You can do it in one step if you use a list to represent all values that need to go into the dict.
?- Props = [a,b,c], get_time(Time),
D0 = _{id:001},
include(ground, [props:Props,time:Time,user:UserName], Fs),
D = D0.put(Fs).
D0 = _17726{id:1},
Fs = [props:[a, b, c], time:1477557597.205908],
D = _17726{id:1, props:[a, b, c], time:1477557597.205908}.
This borrows the idea in mat's answer to use include(ground).
Many thanks mat and Boris for suggestions! I ended up with a combination of your ideas:
dict_filter_vars(DictIn, DictOut) :-
    findall(Key=Value, (get_dict(Key, DictIn, Value), nonvar(Value)), Pairs),
    dict_create(DictOut, _, Pairs).
Which I can then use as simply as this:
DictWithVars = _{ id : DataId, attributes : Props, time : Time, userName : UserName },
dict_filter_vars(DictWithVars, DictOut),
reply_json_dict(DictOut)

One HTTP Delimiter to Rule Them All

I have a configuration file in the format of blah = foo. I would like to have entries like:
http = https://stackoverflow.com/questions,header keys and values,string to search for.
I'm okay with requiring that the URL be urlencoded. Is there any ASCII character I can use as a delimiter that won't be a valid value anywhere in the above example (after splitting once on =)? My example uses a comma, but I think that is valid in a header value.
After poring over some RFCs, I figure someone more familiar with this can save me some pain.
Also, my project is in Go, in case there is an existing standard library package that might help with this...
You can use a non-ASCII character together with urlencoding, for example the middle dot (Compose + ^ + . on Linux):
import "strings"

const sep = `·`
const t = `http = https://stackoverflow.com/questions·string to search for·header=value·header=value`

func parseLine(line string) (name, url, search string, headers []string) {
    idx := strings.Index(line, " = ")
    if idx == -1 {
        return
    }
    name = line[:idx]
    parts := strings.Split(line[idx+3:], sep)
    if len(parts) < 3 {
        // handle the invalid line, e.g. bail out early
        return
    }
    url, search = parts[0], parts[1]
    headers = parts[2:]
    return
}
Although, using JSON is probably the best and most long-term maintainable option.
For completeness' sake, a JSON version would look like:
import (
    "encoding/json"
    "fmt"
)

type Site struct {
    Url     string
    Query   string
    Headers map[string]string
}

const t = `[
    {
        "url": "https://stackoverflow.com/questions",
        "query": "string to search for",
        "headers": {"header": "value", "header2": "value"}
    },
    {
        "url": "https://google.com",
        "query": "string to search for",
        "headers": {"header": "value", "header2": "value"}
    }
]`

func main() {
    var sites []Site
    err := json.Unmarshal([]byte(t), &sites)
    fmt.Printf("%+v (%v)\n", sites, err)
}
Essentially you have to look at RFC 3986, RFC 7230 and friends to see what can occur.
URIs are simple if you insist that they be valid: just use the space character or "<" and ">" as delimiters.
Field values can be almost anything; HTTP forbids control characters though, so you might be able to use horizontal TABs (if you're okay with getting into trouble over invalid field values).
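A hedged sketch of the TAB approach in Go (splitOnTab is a hypothetical helper; it assumes the value after the first " = " split is TAB-separated and that no field contains a TAB itself):
package main

import (
    "fmt"
    "strings"
)

// splitOnTab splits a config value like "url\tsearch\theader: value" on TAB characters.
func splitOnTab(value string) (url, search string, headers []string) {
    parts := strings.Split(value, "\t")
    if len(parts) < 2 {
        return
    }
    url, search = parts[0], parts[1]
    headers = parts[2:]
    return
}

func main() {
    u, q, h := splitOnTab("https://stackoverflow.com/questions\tstring to search for\theader: value")
    fmt.Println(u, q, h)
}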
