Why is the expression `super()` in the Java AST in Rascal?

One of the expressions in the Java AST declaration is Expression::super().
For which Java expression(s) is super() used?
Take this example class:
import java.util.ArrayList;
import java.util.List;

public class SuperTests<T> extends ArrayList<T> {
    public SuperTests() {
        super();
    }
    public SuperTests(int capacity) {
        super(capacity);
    }
    @Override
    public void clear() {
        super.clear();
    }
    public <T extends Integer> void addSupers(List<? super T> list) {
    }
}
The AST in Rascal is:
compilationUnit(
[
import("java.util.ArrayList")[
@src=|project://TestThing/src/SuperTests.java|(0,27,<1,0>,<1,27>)
],
import("java.util.List")[
@src=|project://TestThing/src/SuperTests.java|(28,22,<2,0>,<2,22>)
]
],
[class(
"SuperTests",
[parameterizedType(simpleType(simpleName("ArrayList")[
@src=|project://TestThing/src/SuperTests.java|(87,9,<4,35>,<4,44>),
@decl=|java+class:///java/util/ArrayList|,
@typ=class(
|java+class:///java/util/ArrayList|,
[typeArgument(|java+typeVariable:///SuperTests/T|)])
]))],
[],
[
constructor(
"SuperTests",
[],
[],
block([constructorCall(
true,
[])[
@src=|project://TestThing/src/SuperTests.java|(128,8,<7,2>,<7,10>),
@decl=|java+constructor:///java/util/ArrayList/ArrayList()|
]])[
@src=|project://TestThing/src/SuperTests.java|(124,15,<6,21>,<8,2>)
])[
@modifiers=[public()],
@src=|project://TestThing/src/SuperTests.java|(104,35,<6,1>,<8,2>),
@decl=|java+constructor:///SuperTests/SuperTests()|,
@typ=constructor(
|java+constructor:///SuperTests/SuperTests()|,
[])
],
constructor(
"SuperTests",
[parameter(
int(),
"capacity",
0)[
@src=|project://TestThing/src/SuperTests.java|(159,12,<9,19>,<9,31>),
@decl=|java+parameter:///SuperTests/SuperTests(int)/capacity|,
@typ=int()
]],
[],
block([constructorCall(
true,
[simpleName("capacity")[
@src=|project://TestThing/src/SuperTests.java|(183,8,<10,8>,<10,16>),
@decl=|java+parameter:///SuperTests/SuperTests(int)/capacity|,
@typ=int()
]])[
@src=|project://TestThing/src/SuperTests.java|(177,16,<10,2>,<10,18>),
@decl=|java+constructor:///java/util/ArrayList/ArrayList(int)|
]])[
@src=|project://TestThing/src/SuperTests.java|(173,23,<9,33>,<11,2>)
])[
@modifiers=[public()],
@src=|project://TestThing/src/SuperTests.java|(141,55,<9,1>,<11,2>),
@decl=|java+constructor:///SuperTests/SuperTests(int)|,
@typ=constructor(
|java+constructor:///SuperTests/SuperTests(int)|,
[int()])
],
method(
void(),
"clear",
[],
[],
block([expressionStatement(methodCall(
true,
"clear",
[])[
@src=|project://TestThing/src/SuperTests.java|(234,13,<15,2>,<15,15>),
@decl=|java+method:///java/util/ArrayList/clear()|,
@typ=void()
])[
@src=|project://TestThing/src/SuperTests.java|(234,14,<15,2>,<15,16>)
]])[
@src=|project://TestThing/src/SuperTests.java|(230,21,<14,21>,<16,2>)
])[
@modifiers=[
annotation(markerAnnotation("Override")[
@src=|project://TestThing/src/SuperTests.java|(199,9,<13,1>,<13,10>),
@typ=interface(
|java+interface:///java/lang/Override|,
[])
]),
public()
],
@src=|project://TestThing/src/SuperTests.java|(199,52,<13,1>,<16,2>),
@decl=|java+method:///SuperTests/clear()|,
@typ=method(
|java+method:///SuperTests/clear()|,
[],
void(),
[])
],
method(
void(),
"addSupers",
[parameter(
parameterizedType(simpleType(simpleName("List")[
@src=|project://TestThing/src/SuperTests.java|(297,4,<18,43>,<18,47>),
@decl=|java+interface:///java/util/List|,
@typ=interface(
|java+interface:///java/util/List|,
[wildcard(super([typeArgument(|java+typeVariable:///SuperTests/addSupers(java/util/List)/T|)]))])
])),
"list",
0)[
@src=|project://TestThing/src/SuperTests.java|(297,20,<18,43>,<18,63>),
@decl=|java+parameter:///SuperTests/addSupers(java.util.List)/list|,
@typ=interface(
|java+interface:///java/util/List|,
[wildcard(super([typeArgument(|java+typeVariable:///SuperTests/addSupers(java/util/List)/T|)]))])
]],
[],
block([])[
@src=|project://TestThing/src/SuperTests.java|(319,7,<18,65>,<20,2>)
])[
@modifiers=[public()],
@src=|project://TestThing/src/SuperTests.java|(255,71,<18,1>,<20,2>),
@decl=|java+method:///SuperTests/addSupers(java.util.List)|,
@typ=method(
|java+method:///SuperTests/addSupers(java.util.List)|,
[typeParameter(
|java+typeVariable:///SuperTests/addSupers(java/util/List)/T|,
extends([class(
|java+class:///java/lang/Integer|,
[])]))],
void(),
[interface(
|java+interface:///java/util/List|,
[wildcard(super([typeArgument(|java+typeVariable:///SuperTests/addSupers(java/util/List)/T|)]))])])
]
])[
@modifiers=[public()],
@src=|project://TestThing/src/SuperTests.java|(52,278,<4,0>,<23,1>),
@decl=|java+class:///SuperTests|,
@typ=class(
|java+class:///SuperTests|,
[typeParameter(
|java+typeVariable:///SuperTests/T|,
unbounded())])
]])[
@src=|project://TestThing/src/SuperTests.java|(0,331,<1,0>,<23,2>),
@decl=|java+compilationUnit:///src/SuperTests.java|,
@messages=[
warning(
"The serializable class SuperTests does not declare a static final serialVersionUID field of type long",
|project://TestThing/src/SuperTests.java|(65,10,<4,0>,<4,0>)),
warning(
"The type parameter T is hiding the type T",
|project://TestThing/src/SuperTests.java|(263,1,<18,0>,<18,0>)),
warning(
"The type parameter T should not be bounded by the final type Integer. Final types cannot be further extended",
|project://TestThing/src/SuperTests.java|(273,7,<18,0>,<18,0>))
]
]
It does not contain the super() constructor call with no arguments. Only in the case of the wildcard bound, wildcard(super(...)), is there a super(_), but that one takes an argument and is described in the lang::java::m3::TypeSymbol module.
So what is happening: is there a Java construct I'm missing, or is the Rascal ADT definition incorrect with respect to what is actually generated in the AST?

It isn't used to represent any Java expression anymore. It is a leftover from before we decided to change how super constructor invocations and super method invocations are represented in the Java AST in Rascal.
All constructor calls are represented by constructorCall(bool isSuper, _). A value of true for isSuper indicates that it is a super call. Similarly, all method calls are represented by methodCall(bool isSuper, _), with the same logic.
The wildcard(_) and super(_) from lang::java::m3::TypeSymbol are used to represent type information, which appears in both ASTs and M3 models.

Related

After Custom transformer, typescript still uses reference to old import

I'm using a CustomTransformer to update imports from:
import { global_spacer_form_element } from '@patternfly/react-tokens';
export const disabledLabelClassNameEx = global_spacer_form_element.var;
to
import global_spacer_form_element from '@patternfly/react-tokens/dist/js/global_spacer_form_element';
export const disabledLabelClassNameEx = global_spacer_form_element.var;
However, when using it with ts-loader I get the following output (directly from ts-loader):
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.disabledLabelClassNameEx = void 0;
const global_spacer_form_element_1 = __importDefault(require("#patternfly/react-tokens/dist/js/global_spacer_form_element"));
exports.disabledLabelClassNameEx = react_tokens_1.global_spacer_form_element.var;
//# sourceMappingURL=Recipient2.js.map
Instead of using global_spacer_form_element directly, it is using react_tokens_1.global_spacer_form_element.
I suppose something is missing in the transformer, because the TypeScript compiler is still building that react_tokens_1 variable.
The transformer does the following in its visitor (I'm simplifying the transformer code for the sake of showing the path it takes; the full code can be seen here):
const visitor: ts.Visitor = (node) => {
  if (ts.isSourceFile(node)) {
    return ts.visitEachChild(node, visitor, context)
  }
  if (!ts.isImportDeclaration(node) /* or if the lib name is not '@patternfly/react-tokens' */) {
    return node
  }
  // for simplicity assume we take all NamedImports and the only one found is...
  const elements = ['global_spacer_form_element']
  const importPath = '@patternfly/react-tokens/dist/js/global_spacer_form_element'
  return elements.map((e) => {
    return ts.factory.createImportDeclaration(
      undefined,
      undefined,
      ts.factory.createImportClause(
        false,
        ts.factory.createIdentifier(e),
        undefined,
      ),
      ts.factory.createStringLiteral(importPath),
    )
  })
}
My tsconfig.json
{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es6",
    "allowJs": true,
    "checkJs": false,
    "jsx": "react",
    "outDir": "./build",
    "removeComments": true,
    "pretty": true,
    "skipLibCheck": true,
    "strict": true,
    "moduleResolution": "node",
    "esModuleInterop": true,
    "noImplicitAny": false,
    "sourceMap": true,
    "resolveJsonModule": true
  },
  "include": [
    "./src/**/*"
  ],
  "exclude": [
    "./node_modules/*",
    "**/*.js"
  ]
}
and finally the ts-loader config:
{
  test: /src\/.*\.tsx?$/,
  loader: 'ts-loader',
  exclude: /(node_modules)/i,
  options: {
    getCustomTransformers: () => ({
      before: [
        tsImportPluginFactory({
          libraryName: '@patternfly/react-tokens',
          libraryDirectory: 'dist/js',
          camel2DashComponentName: false
        })
      ]
    })
  }
}
Any idea of what else I need to update or what I could check to ensure this transformer works as I am expecting?
Edit: the reference to the old import is gone, but I hadn't noticed before that the new import also gets transformed, e.g. from foobar to foobar_1.
The TypeScript compiler has four main phases: parsing, binding, type checking, and emitting. Binding is where the relationships between identifiers are resolved, but transformation happens during the emitting phase. So by the time your transformer runs it's too late: the compiler has already figured out which identifiers it's going to rewrite.
One way to do what you want is to traverse all the nodes in the file, find the identifiers that match one of the ones in your import, then recreate those identifiers by returning context.factory.createIdentifier(node.escapedText) in the visitor for that node. That will make the compiler leave the node as-is when emitting.
The trouble, though, may be figuring out which identifiers in a file reference the named import identifier. Generally I don't recommend using the type checker in transforms because it can lead to unexpected results when there are multiple transformations happening on a file, but you might be able to get away with first checking whether the identifier's escapedText matches, then checking whether typeChecker.getSymbolAtLocation(node)?.declarations[0] equals the named import identifier found in the original import declaration. Otherwise, I think you would have to implement your own scope analysis.
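A minimal sketch of that identifier-recreating approach (an illustration, not the exact code from the answer; recreateIdentifiers and importedNames are hypothetical names, and the set would be filled in while visiting the original import declaration):
import * as ts from 'typescript';

// Hypothetical sketch: recreate matching identifiers so the emitter leaves them alone.
// importedNames is assumed to be collected while visiting the original import declaration.
function recreateIdentifiers(context: ts.TransformationContext, importedNames: Set<string>): ts.Visitor {
  const visitor: ts.Visitor = (node) => {
    if (ts.isIdentifier(node) && importedNames.has(node.text)) {
      // A freshly created identifier carries no binding information, so the emitter
      // will not rewrite it to react_tokens_1.<name> when printing the file.
      return context.factory.createIdentifier(node.text);
    }
    return ts.visitEachChild(node, visitor, context);
  };
  return visitor;
}
The returned visitor would be applied to the source file, for example with ts.visitNode(sourceFile, recreateIdentifiers(context, importedNames)), after the import declarations have been rewritten.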

PHPUnit: testing is_a on multiple conditions using data provider

I'm trying to write a test for the following method:
/**
 * @dataProvider attributesValuesProvider
 */
public function myFunction($entityObject, $diffArr, $prevArr)
{
    ....
    ....
    if (is_a($entityObject, Customer::class)) {
        $entityType = CustomerMetadataInterface::ENTITY_TYPE_CUSTOMER;
    } elseif (is_a($entityObject, Address::class)) {
        $entityType = AddressMetadataInterface::ENTITY_TYPE_ADDRESS;
    } else {
        $entityType = null;
    }
    ....
    ....
    return $entityType;
}
I have defined the following data provider:
public function attributesValuesProvider()
{
    return [
        [null, [], []],
        [Customer::class, [], []],
        [Address::class, [], []],
    ];
}
I've looked at this from all sides and I still can't think of a way to write this test. I don't have much experience with unit tests, so I might be on the wrong path.
Your data provider needs to provide the expected result as well as the method parameters. You can see a simple example in the PHPUnit documentation.
public function attributesValuesProvider()
{
    return [
        [null, [], [], null],
        [new Customer, [], [], CustomerMetadataInterface::ENTITY_TYPE_CUSTOMER],
        [new Address, [], [], AddressMetadataInterface::ENTITY_TYPE_ADDRESS],
    ];
}
The test that uses the data provider will be executed once for each row in the provider, with all the values in the row passed as its arguments. So your test just needs to take all four arguments, call the method and verify that the expected result was returned.
/**
 * @dataProvider attributesValuesProvider
 */
public function testMyFunction($object, $diff, $prev, $expected_result) {
    $example = new YourClass();
    // or maybe you already created this object in your setUp method?
    $actual_result = $example->myFunction($object, $diff, $prev);
    $this->assertSame($expected_result, $actual_result);
}

Flow function declaration syntax

I am trying to document a factory function in my declaration file.
My goal is to make Flow aware of my simple factory.
It's used in Koa v2 routes and it's a way to inject some options into my service.
Here is the factory:
ctx.compose = function Compose<T: *>(service: Class<T>, options: ?Object): T {
  return new service(_.extend({}, ctx._requestOptions, options));
};
Because I use Koa v2, I created a type KoaCtx in a declaration that looks like this:
declare type KoaCtx = {
  params: { [key: string]: string },
  request: {
    query: { [key: string]: string },
    body: { [key: string]: string | boolean | Array<any> | Number | Date | Object },
  },
  body: any,
  req: any,
  res: any,
  state: any,
  ...
  compose: function <T: *>(service: Class<T>, options: ?Object): T
}
I tried different syntaxes but I keep getting errors.
compose: function <T: *>(service: Class<T>, options: ?Object): T
^ Unexpected token :
If I put the first snippet of code inside my Koa route, it works fine!
I tried adding the file with the ctx.compose definition to the [include] section of my Flow config, but it's not working.
Update
Tried with this declaration:
declare function Compose<T: *>(service: Class<T>, options: ?Object): T;
declare type KoaCtx = {
  ...
  compose: Compose<Class<*>>
};
But unfortunately it's still not working.
A function type looks like
(argName1: Type1, argName2: Type2, ...restName: Array<RestType>) => ReturnType
a function type with generics looks like
<T>(argName1: Type1, argName2: Type2, ...restName: Array<RestType>) => ReturnType
So you probably should write
declare type KoaCtx = {
  ...
  compose: <T: *>(service: Class<T>, options: ?Object) => T
}
That said, you're using the existential type * as an upper bound for T. I'm not sure if that makes much sense here. So I'd just recommend writing
declare type KoaCtx = {
  ...
  compose: <T>(service: Class<T>, options: ?Object) => T
}
As for your update, you can only explicitly instantiate type parameters for type aliases, interfaces, and classes. So you could write
declare type Compose = <T>(service: Class<T>, options: ?Object) => T;
declare type KoaCtx = {
  ...
  compose: Compose<MyClass>
};
but you can't do that if Compose is a generic function.

Aurelia how to do async work in attached method of custom component

This seems like a basic flow, but I am unable to find examples.
I have this custom component that loads a list of items from a backend service.
I tried writing the async code below, but I get an 'Unexpected token' error in the browser at let.
import {customElement, bindable, inject} from 'aurelia-framework';
import {ItemsService} from 'Services/ItemsService';

@customElement('itemslist')
export class ItemsList {
  static inject() { return [Element, ItemsService]; }
  constructor(element, itemsService) {
    this.element = element;
    this.itemsService = itemsService;
  }
  async attached() {
    let this.items = await this.itemsService.getItemList();
  }
}
How should I do the async work to load the items and set them on my view-model's items property?
Thanks
The let keyword declares local variables; you can't use it on a property like this.items. Just remove let. Declare your items property in the constructor or with ES7 class property syntax.
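For example, a minimal sketch of the corrected method (assuming itemsService.getItemList() returns a promise, as in the question):
async attached() {
  // assign straight to the view-model property; no let declaration is needed
  this.items = await this.itemsService.getItemList();
}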
Also, Babel's async/await transformer must be enabled. Change this:
config.js
"babelOptions": {
"optional": [
"es7.decorators",
"es7.classProperties"
]
},
To this:
"babelOptions": {
"optional": [
"es7.decorators",
"es7.classProperties",
"es7.asyncFunctions"
]
},
Or this:
"babelOptions": {
"stage": 0
"optional": ["runtime"]
},

Build a Map from a Set in Groovy

We have the following legacy data structure of Parent and Child objects of the same type:
Parent1(name1,code1,null)
Child11(name11,code11,Parent1)
Child12(name12,code12,Parent1)
Parent2(name2,code2,null)
Child21(name21,code21,Parent2)
Child22(name22,code22,Parent2)
etc.
We have a legacy service available that returns a Set of all the Child objects. The Parent objects are not returned, but we can call a getParent() getter on a particular Child in the Set to get its Parent. We need to call this service from a Groovy class and afterwards build a Map that reflects the original structure:
def dataMap = [data:[["name":"name1", "code":"code1",
"children":[["name":"name11", "code":"code11"],
["name":"name12", "code":"code12"]]],
["name":"name2", "code":"code2",
"children":[["name":"name21", "code":"code21"],
["name":"name22", "code":"code22"]]]]]
So basically the Map keys are the Parent (name,code) pairs, and the values are Lists of the respective Child objects' (name,code) pairs (the Map will be rendered to JSON afterwards actually)
Being quite new to Groovy, I could probably solve this using Java syntax, but I wonder whether there is a more concise solution using Groovy-specific features. Any ideas are appreciated.
So as I understand it, this is the setup you have:
import groovy.transform.*
import groovy.json.*

@TupleConstructor(includeFields=true)
class Node {
    String name
    String code
    private Node parent

    String getParentName() { parent?.name }
    String getParentCode() { parent?.code }
}

def parent1 = new Node( 'name1', 'code1', null )
def child11 = new Node( 'name11', 'code11', parent1 )
def child12 = new Node( 'name12', 'code12', parent1 )
def parent2 = new Node( 'name2', 'code2', null )
def child21 = new Node( 'name21', 'code21', parent2 )
def child22 = new Node( 'name22', 'code22', parent2 )

// This is returned by a call to your API
Set nodes = [ child11, child12, child21, child22 ]
Then, you can do the following (there are probably other routes, and this will only work for single-depth trees):
// Get a set of parent nodes
Set parents = nodes.collect { [ name:it.parentName, code:it.parentCode ] }

// Utility closure to return a name and code in a Map
def format = { Node n ->
    [ name: n.name, code: n.code ]
}

// Collect the formatted parents with their formatted children into a Map
def dataMap = [ data:parents.collect { p ->
    p + [ children:nodes.findAll {
        it.parentName == p.name && it.parentCode == p.code
    }.collect { format( it ) } ]
} ]

// Print the JSON representation of this
println new JsonBuilder( dataMap ).toPrettyString()
That should print:
{
    "data": [
        {
            "name": "name2",
            "code": "code2",
            "children": [
                {
                    "name": "name21",
                    "code": "code21"
                },
                {
                    "name": "name22",
                    "code": "code22"
                }
            ]
        },
        {
            "name": "name1",
            "code": "code1",
            "children": [
                {
                    "name": "name11",
                    "code": "code11"
                },
                {
                    "name": "name12",
                    "code": "code12"
                }
            ]
        }
    ]
}
