ANTLR3 mutually left-recursive Rule - recursion

Every solution I found on SO was "switch to ANTLR4", which isn't really an option because I am using antlr4ruby (which is actually ANTLR3; the 4 is meant as "for").
I want to build a rule for property access, it should match something like this:
variable
variable.property
variable.prop.prop
etc.
Here's what I have:
variable
    : NAME -> ^(VARIABLE NAME)
    | variable DOT NAME -> ^(ACCESS variable NAME)
    ;
(VARIABLE and ACCESS are parser tokens for use later; NAME is a kind of string.)
This is obviously left-recursive, but I have no idea how to fix this.
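One common ANTLR3 workaround is to turn the left recursion into a loop and build the tree one step at a time; inside a rewrite, $variable refers to the tree the rule has produced so far. A sketch (untested, keeping your token names):

variable
    : (NAME -> ^(VARIABLE NAME))
      (DOT n=NAME -> ^(ACCESS $variable $n))*
    ;

Each loop iteration wraps the previous result in a new ACCESS node, so variable.prop.prop would end up as ^(ACCESS ^(ACCESS ^(VARIABLE variable) prop) prop).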

Related

In KQL how can I use bag_unpack to turn a serialized dictionary object in customDimensions into columns?

I'm trying to write a KQL query that will, among other things, display the contents of a serialized dictionary called Tags which has been added to the Application Insights traces table customDimensions column by application logging.
An example of the serialized Tags dictionary is:
{
"Source": "SAP",
"Destination": "TC",
"SAPDeliveryNo": "0012345678",
"PalletID": "(00)312340123456789012(02)21234987654(05)123456(06)1234567890"
}
I'd like to use evaluate bag_unpack(...) to evaluate the JSON and turn the keys into columns. We're likely to add more keys to the dictionary as the project develops and it would be handy not to have to explicitly list every column name in the query.
However, I'm already using project to reduce the number of other columns I display. How can I use both a project statement, to only display some of the other columns, and evaluate bag_unpack(...) to automatically unpack the Tags dictionary into columns?
Or is that not possible?
This is what I have so far, which doesn't work:
traces
| where datetime_part("dayOfYear", timestamp) == datetime_part("dayOfYear", now())
and message has "SendPalletData"
| extend TagsRaw = parse_json(customDimensions.["Tags"])
| evaluate bag_unpack(TagsRaw)
| project timestamp, message, ActionName = customDimensions.["ActionName"], TagsRaw
| order by timestamp desc
When it runs it displays only the columns listed in the project statement (including TagsRaw, so I know the Tags exist in customDimensions).
evaluate bag_unpack(TagsRaw) doesn't automatically add extra columns to the result set unpacked from the Tags in customDimensions.
EDIT: To clarify what I want to achieve, these are the columns I want to output:
timestamp
message
ActionName
TagsRaw
Source
Destination
SAPDeliveryNo
PalletID
EDIT 2: It turned out that a major part of my problem was that double quotes within the Tags data were escaped. While the Tags as viewed in the Azure portal looked like normal JSON, and copied out as normal JSON, when I copied out the whole of a customDimensions record the Tags looked like "Tags": "{\"Source\":\"SAP\",\"Destination\":\"TC\", ... with the double quotes escaped with backslashes.
The accepted answer from David Markovitz handles this situation in the line:
TagsRaw = todynamic(tostring(customDimensions["Tags"]))
A few comments:
When filtering on timestamp, it is better to use the timestamp column as is and do the manipulations on the other side of the comparison.
When using the has[...] operators, prefer the case-sensitive one (if feasible).
Everything extracted from a dynamic value is also dynamic, and when given a dynamic value, parse_json() (or its equivalent, todynamic()) simply returns it as is.
Therefore, we need to treat customDimensions.["Tags"] in two steps:
First, convert it to a string; second, convert the result to dynamic.
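In the context of the query above, that is the difference between these two extend calls (the first is what the original query did; the second is the two-step fix used in the solution below):

| extend TagsRaw = parse_json(customDimensions.["Tags"])          // still a dynamic holding the serialized string
| extend TagsRaw = todynamic(tostring(customDimensions["Tags"]))  // convert to string first, then parse into a property bag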
To reference a field within a dynamic type you can use X.Y, X["Y"], or X['Y'].
No need to combine them as you did with customDimensions.["Tags"].
As the bag_unpack plugin doc states:
"The specified input column (Column) is removed."
In other words, TagsRaw does not exist following the bag_unpack operation.
Please note that you can add a prefix to the columns generated by bag_unpack. That might make it easier to differentiate them from the rest of the columns.
While you can use project, using project-away is sometimes easier.
// Data sample generation. Not part of the solution.
let traces =
print c1 = "some columns"
,c2 = "we"
,c3 = "don't need"
,timestamp = ago(now()%1d * rand())
,message = "abc SendPalletData xyz"
,customDimensions = dynamic
(
{
"Tags":"{\"Source\":\"SAP\",\"Destination\":\"TC\",\"SAPDeliveryNo\":\"0012345678\",\"PalletID\":\"(00)312340123456789012(02)21234987654(05)123456(06)1234567890\"}"
,"ActionName":"Action1"
}
)
;
// Solution starts here
traces
| where timestamp >= startofday(now())
and message has_cs "SendPalletData"
| extend TagsRaw = todynamic(tostring(customDimensions["Tags"]))
,ActionName = customDimensions.["ActionName"]
| project-away c*
| evaluate bag_unpack(TagsRaw, "TR_")
| order by timestamp desc
timestamp: 2022-08-27T04:15:07.9337681Z
message: abc SendPalletData xyz
ActionName: Action1
TR_Destination: TC
TR_PalletID: (00)312340123456789012(02)21234987654(05)123456(06)1234567890
TR_SAPDeliveryNo: 0012345678
TR_Source: SAP
If I understand correctly, you want to use project to limit the number of columns that are displayed, but you also want to include all of the unpacked columns from TagsRaw, without naming all of the tags explicitly.
The easiest way to achieve this is to switch the order of your steps, so that you first do the project (including the TagsRaw column) and then you unpack the tags. If desired, you can then use project-away to specifically remove the TagsRaw column after you've unpacked it.
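Applied to the query in the question, that reordering would look roughly like this (a sketch; it also uses the todynamic(tostring(...)) conversion from the accepted answer):

traces
| where datetime_part("dayOfYear", timestamp) == datetime_part("dayOfYear", now())
    and message has "SendPalletData"
| extend TagsRaw = todynamic(tostring(customDimensions["Tags"]))
| project timestamp, message, ActionName = customDimensions["ActionName"], TagsRaw
| evaluate bag_unpack(TagsRaw)
| order by timestamp desc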

Prepared SQLite statement with named parameter that contains spaces

In SQLite you can use named parameters in statements, like this (Python example):
cur.execute("insert into lang values (:foo, :bar)", {'foo': 'a', 'bar': 2})
Is there any way to have parameter names containing spaces? I.e.:
cur.execute("insert into lang values (:'foo bar')", {'foo bar': 'a'})
The documentation suggests not but you never know.
Apparently for the $AAA form you can:
The identifier name in this case can include one or more occurrences of "::" and a suffix enclosed in "(...)" containing any text at all.
But that doesn't let you have an arbitrary name since the brackets are still part of the name. So the answer appears to be no.
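If the keys you have naturally contain spaces, one practical workaround (just a sketch, not something from the SQLite docs) is to normalize them into plain identifiers before binding:

import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("create table lang (name text, year integer)")

params = {'foo bar': 'a', 'first appeared': 1972}

# Named parameters can't contain spaces, so bind via space-free aliases instead.
safe = {key.replace(' ', '_'): value for key, value in params.items()}
cur.execute("insert into lang values (:foo_bar, :first_appeared)", safe)
con.commit()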

How to access map in template string?

I want to use values from gradle.properties that should go into a template string.
A naive first attempt:
println("${project.properties[somekey]}")
doesn't work: Unresolved reference: somekey
So, quotes required?
println("${project.properties[\"somekey\"]}")
is completely broken syntax; the compiler complains with Expecting an expression at the first escaped quote.
I couldn't find any example of how to do this, yet the official documentation says arbitrary expressions are allowed inside ${}.
Question: is it possible to access a map in string template, and if so, how?
Yes and as follows:
"${project.properties["someKey"]}"
assuming the Map has the following signature: Map<String, Any?> (or Map<Any...)
Alternatives:
"${project.properties.getValue("someKey")}"
"${project.properties.getOrElse("someKey") { "lazy-evaluation-default-value" }}"
"${project.properties.getOrDefault("someKey", "someFixedDefaultValue")}"
Basically, all the code you put inside ${} is just plain Kotlin code; no further quoting/escaping is required, except for the dollar sign $ itself. For example, use "\$test" if you do not want it substituted with a variable named test, or """${"$"}test""" if you are using a raw string.
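A quick, self-contained check (using a plain mapOf as a stand-in for project.properties):

fun main() {
    val properties = mapOf("someKey" to "someValue")
    println("${properties["someKey"]}")   // prints: someValue
    println("\$test")                     // prints: $test
    println("""${"$"}test""")             // prints: $test (raw string, so no backslash escapes)
}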
Note that in this println case the following would have sufficed as well (the same goes for all the alternatives shown above: you may omit the outer quotes and the ${} wrapper altogether):
println(project.properties["someKey"])
See also Basic types - String templates

Programmatically getting a list of variables

Is it possible to get a list of declared variables with a VimL (aka VimScript) expression? I'd like to get the same set of values that will be presented for a command using -complete=expression. The goal is to augment that list for use in a user-defined command completion function.
You can use g: as a dictionary that holds all global variables, so:
let globals = keys(g:)
will give you all the names. The same applies to the other scopes: b:, s:, w:, etc. See :help internal-variables for the complete list.
You can get something similar using the keys of the g:, b:, t:, w:, and v: dictionaries, but beware of the following facts:
There is no equivalent to these dictionaries if you want to complete options.
Some variables, like count (but not g:count or l:count), b:changedtick, and maybe others, are not present in these dictionaries.
Some Vim hacker may add the key ### to the g: dictionary, but that won't make the expression g:### a valid variable name (adding 000 there will, though). However, g:["###"] will be a valid expression.
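To plug such a list into a user-defined command, one option is a customlist completion function along these lines (a sketch; the command name, function name, and extra candidates are made up):

" Offer global variable names, plus a few candidates that keys(g:) misses, for :ShowVar.
function! CompleteVarNames(ArgLead, CmdLine, CursorPos)
    let names = map(keys(g:), '"g:" . v:val') + ['v:count', 'b:changedtick']
    return filter(names, 'v:val =~# "^" . a:ArgLead')
endfunction
command! -nargs=1 -complete=customlist,CompleteVarNames ShowVar echo eval(<q-args>)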

"Variable variable" syntax

This is a question related to getting Drupal CCK fields (just in case that happens to change anything).
I have several Drupal CCK fields with similar names: they have the same name with a number at the end, and I'd like to pull values from these fields (ten fields total). This is the syntax for accessing the fields' values:
$node->cck_field_1[0]['value']
$node->cck_field_2[0]['value']
$node->cck_field_3[0]['value']
…etc.
Since they're all separate fields but they're numbered, I'd like to just loop through them incrementally to write out what I need (there's a lot more to what I'm writing than just accessing these fields' data, but they're the determining factor for the rest). However, I can't figure out how to insert a variable into that part of the code.
e.g., (if $i were the incremental number variable), I'd like to be able to write the following string as a variable:
'$node->cck_field_' . $i . '[0]["value"]'
I understand about using the curly brackets to create a variable name from a string, but the part I need the variable in needs to be outside of the string. e.g. this works:
${node}->cck_field_1[0]['value']
but this doesn't:
${node->cck_field_1}[0]['value']
(so I can't write ${'node->cck_field'.$i}[0]['value'] )
So how can I write this so that I can use $i in place of the number?
This should work:
$node->{'cck_field_' . $i}[0]['value']
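For example, to loop over the ten numbered fields (a sketch; adjust the range and what you do with each value):

// Collect the values of cck_field_1 through cck_field_10 from the node.
$values = array();
for ($i = 1; $i <= 10; $i++) {
    $field = 'cck_field_' . $i;
    $values[$i] = $node->{$field}[0]['value'];
}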
