Can I use JSDoc annotations with Flow to enforce typing on a per-file basis? - flowtype

What I want is for Flow to understand JSDoc syntax, without any transpiling.
Example:
// @flow
/**
 * @param {string} str
 * @return {string}
 */
function foo(str) {
  return str + str;
}
foo(1); // Flow shows error
And then using flow check-contents < foo.js to get the error.
Possible somehow? Preferably without using Babel or any other transpiler. (And more importantly - why would Flow choose to invent its own syntax when there's already a standard available? This will basically force us to choose TypeScript over Flow.)
Related question: Include Flow Types into JSDoc

No, but you can use inline type declarations like described here: https://flow.org/en/docs/types/comments/
Example:
// @flow
/*::
type MyAlias = {
  foo: number,
  bar: boolean,
  baz: string,
};
*/
function method(value /*: MyAlias */) /*: boolean */ {
  return value.bar;
}
method({ foo: 1, bar: true, baz: ["oops"] });
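Applied to the foo example from the question, a minimal sketch using the same comment syntax looks like this; flow check-contents should then flag the bad call without any transpiling:
// @flow
function foo(str /*: string */) /*: string */ {
  return str + str;
}

foo(1); // Flow flags this call: number is incompatible with string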

Related

What is a good practice for adding flow type annotations on project specific named exports

Please correct me if I am wrong. As far as I understand it so far, type annotations can be added in a file itself or in libdefs (for shareable code).
For example in a project specific file helpers.js
// @flow
export function square(value: number): number {
  return value * value
}
export function someOtherFunction(arg: string): string {
  // ...
}
etc...
And in a libdef helpers.js
declare module 'helpers' {
  declare export function square(value: number): number;
  declare export function someOtherFunction(arg: string): string;
}
What would be a good practice for writing Flow annotations on project-specific code, especially lots of code? For example, a helpers file exposing 20+ named exports, as this is the point where I am starting to think a libdef would be clearer to reason about.
And is it at all possible to use that libdef file as the single entry point? I've fooled around a bit, and I always had to annotate in the file itself, even though I had added the libdef and told Flow through the config to include these libdefs.
In our project, we use the following approach:
// @flow
export const square: SquareType = (value) => {
  return value * value;
};
So you can declare SquareType in the helpers.js file just above the function, or you can move it to a separate file and then import it into helpers.js.
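For example, a minimal sketch of such an alias (SquareType is an assumed function type here, since it is not shown in the snippet above):
// @flow
// Assumed shape for SquareType: a function from number to number.
type SquareType = (value: number) => number;

export const square: SquareType = (value) => {
  return value * value;
};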
Many third-party modules don't have types, or only have TypeScript types.
And libdefs are needed for one reason: to declare types for untyped modules!
More info: https://flow.org/en/docs/libdefs/
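For instance, a minimal sketch of a libdef for a hypothetical untyped module (the module name 'untyped-lib' and its doWork function are made up for illustration):
// flow-typed/untyped-lib.js
declare module 'untyped-lib' {
  // Hypothetical API surface; the real module ships no types, so we describe it here.
  declare export function doWork(input: string): number;
}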

How to create a Flow Union runtime refinement without embedding literals

Hello kind Stackoverflow folks,
I'm trying to create a function to guard off code from being executed at run-time with an incorrect Flow type present.
My understanding is that the way to do this at run-time is by refining, or checking, that the type matches what is required and using Flow to keep an eye that no cases are missed along the way.
A simple case is where I have a string input that I would like to confirm matches an enum/union type. I have this working as I would expect with literals, e.g.
/* @flow */
type typeFooOrBaa = "foo" | "baa"

const catchType = (toCheck: string): void => {
  // Working check
  if (toCheck === "foo" || toCheck === "baa") {
    // No Flow errors
    const checkedValue: typeFooOrBaa = toCheck
    // ... do something with the checkedValue
  }
};
Try it over here
Naturally, I would like to avoid embedding literals.
One of the things I've tried is the equivalent object key test, which doesn't work :-( e.g.
/* @flow */
type typeFooOrBaa = "foo" | "baa"

const fooOrBaaObj = {"foo": 1, "baa": 2}

const catchType = (toCheck: string): void => {
  // Non-working check
  if (fooOrBaaObj[toCheck]) {
    /*
    The next assignment generates the following Flow error:
    Cannot assign `toCheck` to `checkedVariable` because: Either string [1] is incompatible
    with string literal `foo` [2]. Or string [1] is incompatible with string literal `baa` [3].
    */
    const checkedVariable: typeFooOrBaa = toCheck
  }
};
Try it over here
Is it possible to achieve something like this without having to go down the full flow-runtime route? If so how is it best done?
Thanks for your help.
One approach that appears to work is to use the const object which defines the allowed values to:
Generate a union type using the $Keys utility.
Use that union type to create a map object where the keys are the desired input (in our case strings) and the values are "maybe"s of the type that needs refining.
Here's the example from earlier reworked so that it:
Sets the type up as we'd expect to allow either "foo" or "baa" but nothing else.
Detects when a string is suitably refined so that it only contains "foo" or "baa".
Detects when a string might contain something other than what's expected.
Credit to @vkurchatkin for his answer that helped me crack this (finally).
/* @flow */
// Example of how to persuade Flow to detect safe, adequately refined usage of a union type
// at runtime, and its unsafe, inadequately refined counterparts.
const fooOrBaaObj = {foo: 'foo', baa: 'baa'}

type typeFooOrBaa = $Keys<typeof fooOrBaaObj>
// NB: $Keys is used so that the type definition avoids aliasing typeFooOrBaa === string,
// which allows things like the line below to correctly spot problems.
//const testFlowSpotsBadDefinition: typeFooOrBaa = "make_flow_barf"

const fooOrBaaMap: { [key: string]: ?typeFooOrBaa } = fooOrBaaObj;
// NB: Use of the "?" maybe signifier in the definition is essential to inform Flow that indexing into
// the map "might" produce a "null". Without it the subsequent correct detection of unsafe
// unrefined variables fails.

const catchType = (toCheck: string): void => {
  const myValue = fooOrBaaMap[toCheck];
  if (myValue) {
    // Detects refined safe usage
    const checkedVariable: typeFooOrBaa = myValue
  }
  // Uncommenting the following line correctly causes Flow to flag the unsafe type. Must have the
  // "?" in the map definition to get Flow to spot this.
  //const testFlowSpotsUnrefinedUsage: typeFooOrBaa = myValue
}
Have a play with it over here
You can type the object as {[typeFooOrBaa]: number}, but Flow will not enforce that all members of typeFooOrBaa exist in the object.
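A minimal sketch of that indexer approach, reusing the names from the question:
/* @flow */
type typeFooOrBaa = "foo" | "baa"

// Keys are constrained to the union, but Flow does not check that every
// member of typeFooOrBaa is actually present in the object.
const fooOrBaaObj: { [typeFooOrBaa]: number } = { foo: 1 }; // "baa" missing, no error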

How to declare a flowtype library definition for polymorphic functions

What is the proper way to specify the type definitions of a polymorphic method that, depending on the parameter types, has different return types?
index.js:
// @flow
import {func1} from './lib1';
const s: string = func1('string');
const b: boolean = func1(); // should cause type error but does not!
lib1.js:
export function func1(p) {
  return (typeof p === 'string') ? p : 0;
}
defs/lib1.js.flow
// @flow
declare module "lib1" {
  declare export function func1(p: string): string;
  declare export function func1(_: void): number;
}
.flowconfig:
[libs]
defs/
I would have hoped to receive an error message in index.js(4), but Flow does not complain!
Yes, the example you gave is how to declare an overloaded function. However, you may want to change the second line to:
declare export function func1(_: void): number;
Since Flow allows a function to be called with too many arguments (though not for much longer), it may select the second overload even if the function is called with a string. The modification I suggest makes it so the argument must be undefined (which is what is implicitly passed if you just leave off an argument).
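With the `_: void` overload in place, the calls in index.js should resolve along these lines (a sketch of the expected behaviour):
// @flow
import {func1} from './lib1';

const s: string = func1('string'); // matches the (p: string) => string overload
const n: number = func1();         // matches the (_: void) => number overload
const b: boolean = func1();        // error: number is incompatible with boolean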

How to avoid using literal strings to narrow disjoint unions in flow

All the examples I find online for narrowing a disjoint union in flowtype use string literals, like the official one. I would like to know if there is a way to check against a value from an enum, like:
const ACTION_A = 'LITERAL_STRING_A';
const ACTION_B = 'LITERAL_STRING_B';

type ActionA = {
  // This is not allowed
  type: ACTION_A,
  // type: 'LITERAL_STRING_A' is allowed
  dataA: ActionAData,
}

type ActionB = {
  // This is not allowed
  type: ACTION_B,
  // type: 'LITERAL_STRING_B' is allowed
  dataB: ActionBData,
}

type Action = ActionA | ActionB;

function reducer(state: State, action: Action): State {
  // Want to narrow Action to ActionA or ActionB based on type
  switch (action.type) {
    // case 'LITERAL_STRING_A': -- successfully narrows the type
    case ACTION_A: // doesn't work
      // action.dataA is accessible
      ...
  }
  ...
}
Unfortunately you can't do this, because strings are ineligible as type annotations.
If there is any other way around this that doesn't force typing the string literals everywhere, I would love to know.
If there isn't a way around this, I would also accept suggestions, on a higher level, for how to avoid needing to define these disjoint sets for redux actions.
I'm not in my best shape right now, so sorry if I read your question wrong. I'll try to help anyway. Is this what you're looking for?
const actionTypes = {
  FOO: 'FOO',
  BAR: 'BAR'
}

type ActionType = $Keys<typeof actionTypes> // one of FOO, BAR

function buzz(actionType: ActionType) {
  switch (actionType) {
    case actionTypes.FOO:
      // blah
  }
}
This should work. Sorry if my syntax is a bit off.
If you're asking how to avoid listing all action types in type Action = ActionA | ActionB then sorry, I don't know, I think this is the way you do it. If I recall correctly, a slightly nicer syntax for defining long unions was recently introduced in Flow:
type Action =
  | ActionA
  | ActionB
  | ActionC
Also, if you don't need individual action types, you can just do
type Action =
  | {type: ACTION_A; dataA: ActionAData;}
  | {type: ACTION_B; dataB: ActionBData;}
The better way would be to use string literal types for const values:
Try flow...
const ACTION_A: 'LITERAL_STRING_A' = 'LITERAL_STRING_A';
const ACTION_B: 'LITERAL_STRING_B' = 'LITERAL_STRING_B';
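With the consts typed as string literals like this, the reducer from the question can narrow on them directly. A minimal sketch (dataA/dataB use simple stand-in types in place of ActionAData/ActionBData):
/* @flow */
const ACTION_A: 'LITERAL_STRING_A' = 'LITERAL_STRING_A';
const ACTION_B: 'LITERAL_STRING_B' = 'LITERAL_STRING_B';

type ActionA = { type: typeof ACTION_A, dataA: string };
type ActionB = { type: typeof ACTION_B, dataB: number };
type Action = ActionA | ActionB;

function reducer(action: Action): string {
  switch (action.type) {
    case ACTION_A:
      // Flow narrows `action` to ActionA here, so dataA is accessible.
      return action.dataA;
    case ACTION_B:
      return String(action.dataB);
    default:
      return '';
  }
}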

Issues with Paw's DynamicValueInput type JSON and Checkbox

I want to create a DynamicValue plugin for Paw that generates JSON Web Tokens. The full source can be found here: https://github.com/choffmeister/Paw-JsonWebTokenDynamicValue
Relevant file:
// JsonWebTokenDynamicValue.js
import jsrsasign from 'jsrsasign';
@registerDynamicValueClass
class JsonWebTokenDynamicValue {
  static identifier = 'de.choffmeister.PawExtensions.JsonWebTokenDynamicValue';
  static title = 'Json Web Token';
  static help = 'https://github.com/choffmeister/Paw-JsonWebTokenDynamicValue';
  static inputs = [
    DynamicValueInput('signatureSecret', 'Secret', 'SecureValue'),
    DynamicValueInput('signatureSecretIsBase64', 'Secret is Base64', 'Checkbox'),
    DynamicValueInput('payload', 'Payload', 'JSON')
  ];

  evaluate() {
    console.log(JSON.stringify(this.payload, null, 2));
    console.log(JSON.stringify(this.signatureSecretIsBase64, null, 2));

    const now = Math.floor((new Date()).getTime() / 1000);
    const header = {
      typ: 'JWT',
      alg: 'HS256'
    };
    const payload = {
      ...this.payload,
      exp: now + (60 * 60 * 24 * 7),
      iat: now
    };
    const secret = this.signatureSecretIsBase64
      ? {b64: jsrsasign.b64utob64(this.signatureSecret)}
      : this.signatureSecret;

    return jsrsasign.jws.JWS.sign(null, header, payload, secret);
  }
}
How it looks in the GUI:
I searched https://luckymarmot.com/paw/doc/extensions/create-dynamic-value, the surrounding documentation and all plugin examples I could find on the web, but I still have two problems I cannot solve:
When using a DynamicValueInput of type Checkbox, the input field is not visible (see screenshot). I get a value (an empty string), but just cannot see it. How can I make the checkbox appear?
When using a DynamicValueInput of type JSON, dynamic values used inside the JSON (see screenshot) are not resolved; instead I get a kind of stringified description object of what the dynamic value is. Logging the this.payload object looks like this:
{
  "foo": "[{\"data\":{\"environmentVariable\":\"2925ABDA-8AAC-440B-B2CA-DA216CD37A09\"},\"identifier\":\"com.luckymarmot.EnvironmentVariableDynamicValue\"}]"
}
Maybe it is worth noting: when using a DynamicValueInput of type KeyValueList, the inner dynamic values are resolved properly. How can I achieve this with the JSON type, too?
This issue has been solved in Paw 2.3.3, and @Thekwasti's extension has actually been published here: https://luckymarmot.com/paw/extensions/JsonWebTokenDynamicValue

Resources