Ramda applySpec - keep unmodified props - functional-programming

Let's say I have an object const foo = { a: 1, b: 2 } and I want to add a prop c which is based on b.
I could do:
applySpec({
a: prop('a'),
b: prop('b'),
c: ({ b }) => b + 1
})(foo)
and get an object like: { a: 1, b: 2, c: 3 }
Is there a nicer way to do this?
I've looked at evolve, assoc and applySpec, but none of them seems fit for purpose.

You can use R.chain to create a function that applies the spec and then merges the new object with the original one.
When R.chain is used with functions (f and g):
chain(f, g)(x) is equivalent to f(g(x), x)
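For instance, with a couple of throwaway functions (note that Ramda supplies f with g(x) and then x one argument at a time, so f should be curried, as mergeLeft is):
const f = a => x => ({ doubled: a, original: x }) // a curried binary function, like mergeLeft
const g = x => x * 2
R.chain(f, g)(10) // => { doubled: 20, original: 10 }, i.e. f(g(10))(10)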
In this case, chain(mergeLeft, applySpec(spec))(originalObject) is equivalent to mergeLeft(applySpec(spec)(originalObject), originalObject).
const { chain, mergeLeft, applySpec } = R
const fn = chain(mergeLeft, applySpec({
c: ({ b }) => b + 1
}))
const foo = { a: 1, b: 2 }
const result = fn(foo)
console.log(result)
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.27.1/ramda.min.js" integrity="sha512-rZHvUXcc1zWKsxm7rJ8lVQuIr1oOmm7cShlvpV0gWf0RvbcJN6x96al/Rp2L2BI4a4ZkT2/YfVe/8YvB2UHzQw==" crossorigin="anonymous"></script>
You can make this a generic function that allows adding to an existing object by using R.pipe to pass a curried R.applySpec to the chain:
const { pipe, chain, mergeLeft, applySpec } = R
const fn = pipe(applySpec, chain(mergeLeft))
const addCProp = fn({
c: ({ b }) => b + 1
})
const foo = { a: 1, b: 2 }
const result = addCProp(foo)
console.log(result)
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.27.1/ramda.min.js" integrity="sha512-rZHvUXcc1zWKsxm7rJ8lVQuIr1oOmm7cShlvpV0gWf0RvbcJN6x96al/Rp2L2BI4a4ZkT2/YfVe/8YvB2UHzQw==" crossorigin="anonymous"></script>
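For completeness: if a point-free Ramda solution isn't a hard requirement, plain object spread produces the same result without any library:
const addC = obj => ({ ...obj, c: obj.b + 1 })
addC({ a: 1, b: 2 }) // => { a: 1, b: 2, c: 3 }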

Related

How to pass data to a later stage of a compose pipeline

If I have a bunch of functions which strictly chain together, then it's easy enough to use compose to combine them:
f1 : A -> B
f2 : B -> C
f3 : C -> D
pipe(f1, f2, f3) : A -> D
Often I find that things aren't quite so perfect, and the information contained in A is needed again in a later stage:
f1 : A -> B
f2 : B -> C
f3 : (A, C) -> D
How do I elegantly compose these functions? I feel like I want some kind of "stash" to tuck the A into a Pair or something, map pipe(f1,f2) over the second element, and then I have everything ready for f3. I can't come up with a very elegant way of doing this though, and it feels like a common enough situation that there must be an established pattern!
As a concrete example, say I have a string and I want to return it if it has an even length; otherwise I want to return None.
f1 = len
f2 = mod(2)
f3 = (s, m) => m == 0 ? Just(s) : None
How do I compose these together?
The type of function composition doesn't allow this. I think that a lambda along with currying is straightforward and more explicit than passing a tuple type through the composition:
const f1 = s => s.length;
const f2 = n => n % 2;
const f3 = s => m => m === 0 ? s : null;
const comp3 = f => g => h => x => f(g(h(x)));
const main = s => comp3(f3(s)) (f2) (f1) (s);
console.log(main("hallo"));
console.log(main("halloo"));
If you absolutely want it point free you can also utilize the fact that function composition may yield another function:
const f1 = s => s.length;
const f2 = n => n % 2;
const f3 = s => m => m === 0 ? s : null;
const comp3 = f => g => h => x => f(g(h(x)));
const join = f => x => f(x) (x); // monadic join
const flip = f => y => x => f(x) (y);
const main = join(comp3(flip(f3)) (f2) (f1));
console.log(main("hallo"));
console.log(main("halloo"));
Pretty hard to read though.
Just to elaborate a little on my comment on the original question - I have found a way of achieving what I want in quite a (IMO) nice style. It still feels like I'm reinventing the wheel though, so another way to rephrase the original question might be: do you recognise the function signatures below?
// stash :: A => [A,A]
const stash = x => [x, x];
// map :: (A => C) => [A,B] => [A,C]
const map = f => ([a, b]) => [a, f(b)];
// unstash :: ((A,B) => C) => [A,B] => C
const unstash = f => ([a, b]) => f(a, b);
const f1 = s => s.length;
const f2 = n => n % 2 === 0;
const f3 = (s, x) => x ? Option.some(s) : Option.none;
const getEvenName =
pipe(
stash,
map(f1),
map(f2),
unstash(f3)
);
getEvenName("Lucy") // Some("Lucy");
getEvenName("Tom") // None

flattening an array via the AST [duplicate]

I have a JavaScript array like:
[["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]]
How would I go about merging the separate inner arrays into one like:
["$6", "$12", "$25", ...]
ES2019
ES2019 introduced the Array.prototype.flat() method which you could use to flatten the arrays. It is compatible with most environments, although it is only available in Node.js starting with version 11, and not at all in Internet Explorer.
const arrays = [
["$6"],
["$12"],
["$25"],
["$25"],
["$18"],
["$22"],
["$10"]
];
const merge3 = arrays.flat(1); //The depth level specifying how deep a nested array structure should be flattened. Defaults to 1.
console.log(merge3);
Older browsers
For older browsers, you can use Array.prototype.concat to merge arrays:
var arrays = [
["$6"],
["$12"],
["$25"],
["$25"],
["$18"],
["$22"],
["$10"]
];
var merged = [].concat.apply([], arrays);
console.log(merged);
Using the apply method of concat will just take the second parameter as an array, so the last line is identical to this:
var merged = [].concat(["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]);
Here's a short function that uses some of the newer JavaScript array methods to flatten an n-dimensional array.
function flatten(arr) {
return arr.reduce(function (flat, toFlatten) {
return flat.concat(Array.isArray(toFlatten) ? flatten(toFlatten) : toFlatten);
}, []);
}
Usage:
flatten([[1, 2, 3], [4, 5]]); // [1, 2, 3, 4, 5]
flatten([[[1, [1.1]], 2, 3], [4, 5]]); // [1, 1.1, 2, 3, 4, 5]
There is a confusingly hidden method, which constructs a new array without mutating the original one:
var oldArray = [[1],[2,3],[4]];
var newArray = Array.prototype.concat.apply([], oldArray);
console.log(newArray); // [ 1, 2, 3, 4 ]
It can best be done with JavaScript's reduce function.
var arrays = [["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"], ["$0"], ["$15"],["$3"], ["$75"], ["$5"], ["$100"], ["$7"], ["$3"], ["$75"], ["$5"]];
arrays = arrays.reduce(function(a, b){
return a.concat(b);
}, []);
Or, with ES2015:
arrays = arrays.reduce((a, b) => a.concat(b), []);
There's a new native method called flat to do this exactly.
(As of late 2019, flat is now published in the ECMA 2019 standard, and core-js#3 (babel's library) includes it in their polyfill library)
const arr1 = [1, 2, [3, 4]];
arr1.flat();
// [1, 2, 3, 4]
const arr2 = [1, 2, [3, 4, [5, 6]]];
arr2.flat();
// [1, 2, 3, 4, [5, 6]]
// Flatten 2 levels deep
const arr3 = [2, 2, 5, [5, [5, [6]], 7]];
arr3.flat(2);
// [2, 2, 5, 5, 5, [6], 7];
// Flatten all levels
const arr4 = [2, 2, 5, [5, [5, [6]], 7]];
arr4.flat(Infinity);
// [2, 2, 5, 5, 5, 6, 7];
Most of the answers here don't work on huge (e.g. 200 000 elements) arrays, and even if they do, they're slow.
Here is the fastest solution, which works also on arrays with multiple levels of nesting:
const flatten = function(arr, result = []) {
for (let i = 0, length = arr.length; i < length; i++) {
const value = arr[i];
if (Array.isArray(value)) {
flatten(value, result);
} else {
result.push(value);
}
}
return result;
};
Examples
Huge arrays
flatten(Array(200000).fill([1]));
It handles huge arrays just fine. On my machine this code takes about 14 ms to execute.
Nested arrays
flatten(Array(2).fill(Array(2).fill(Array(2).fill([1]))));
It works with nested arrays. This code produces [1, 1, 1, 1, 1, 1, 1, 1].
Arrays with different levels of nesting
flatten([1, [1], [[1]]]);
It doesn't have any problems with flattening arrays like this one.
Update: it turned out that this solution doesn't work with large arrays. If you're looking for a better, faster solution, check out this answer.
function flatten(arr) {
return [].concat(...arr)
}
It simply expands arr and passes its elements as arguments to concat(), which merges all the arrays into one. It's equivalent to [].concat.apply([], arr).
You can also try this for deep flattening:
function deepFlatten(arr) {
return flatten( // return shalowly flattened array
arr.map(x=> // with each x in array
Array.isArray(x) // is x an array?
? deepFlatten(x) // if yes, return deeply flattened x
: x // if no, return just x
)
)
}
See demo on JSBin.
References for ECMAScript 6 elements used in this answer:
Spread operator
Arrow functions
Side note: methods like find() and arrow functions are not supported by all browsers, but it doesn't mean that you can't use these features right now. Just use Babel — it transforms ES6 code into ES5.
You can use Underscore:
var x = [[1], [2], [3, 4]];
_.flatten(x); // => [1, 2, 3, 4]
Generic procedures mean we don't have to rewrite complexity each time we need to utilize a specific behaviour.
concatMap (or flatMap) is exactly what we need in this situation.
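For comparison, ES2019's native Array.prototype.flatMap is the built-in concatMap, and with the identity function it likewise flattens exactly one level:
const sample = [["$6"], ["$12"], ["$25"]]
console.log (sample.flatMap (x => x)) // => [ '$6', '$12', '$25' ]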
// concat :: ([a],[a]) -> [a]
const concat = (xs,ys) =>
xs.concat (ys)
// concatMap :: (a -> [b]) -> [a] -> [b]
const concatMap = f => xs =>
xs.map(f).reduce(concat, [])
// id :: a -> a
const id = x =>
x
// flatten :: [[a]] -> [a]
const flatten =
concatMap (id)
// your sample data
const data =
[["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]]
console.log (flatten (data))
foresight
And yes, you guessed it correctly, it only flattens one level, which is exactly how it should work
Imagine some data set like this
// Player :: (String, Number) -> Player
const Player = (name,number) =>
[ name, number ]
// Team :: (...Player) -> Team
const Team = (...players) =>
players
// Game :: (Team, Team) -> Game
const Game = (teamA, teamB) =>
[ teamA, teamB ]
// sample data
const teamA =
Team (Player ('bob', 5), Player ('alice', 6))
const teamB =
Team (Player ('ricky', 4), Player ('julian', 2))
const game =
Game (teamA, teamB)
console.log (game)
// [ [ [ 'bob', 5 ], [ 'alice', 6 ] ],
// [ [ 'ricky', 4 ], [ 'julian', 2 ] ] ]
Ok, now say we want to print a roster that shows all the players that will be participating in game …
const gamePlayers = game =>
flatten (game)
gamePlayers (game)
// => [ [ 'bob', 5 ], [ 'alice', 6 ], [ 'ricky', 4 ], [ 'julian', 2 ] ]
If our flatten procedure flattened nested arrays too, we'd end up with this garbage result …
const gamePlayers = game =>
badGenericFlatten(game)
gamePlayers (game)
// => [ 'bob', 5, 'alice', 6, 'ricky', 4, 'julian', 2 ]
rollin' deep, baby
That's not to say sometimes you don't want to flatten nested arrays, too – only that shouldn't be the default behaviour.
We can make a deepFlatten procedure with ease …
// concat :: ([a],[a]) -> [a]
const concat = (xs,ys) =>
xs.concat (ys)
// concatMap :: (a -> [b]) -> [a] -> [b]
const concatMap = f => xs =>
xs.map(f).reduce(concat, [])
// id :: a -> a
const id = x =>
x
// flatten :: [[a]] -> [a]
const flatten =
concatMap (id)
// deepFlatten :: [[a]] -> [a]
const deepFlatten =
concatMap (x =>
Array.isArray (x) ? deepFlatten (x) : x)
// your sample data
const data =
[0, [1, [2, [3, [4, 5], 6]]], [7, [8]], 9]
console.log (flatten (data))
// [ 0, 1, [ 2, [ 3, [ 4, 5 ], 6 ] ], 7, [ 8 ], 9 ]
console.log (deepFlatten (data))
// [ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 ]
There. Now you have a tool for each job – one for squashing one level of nesting, flatten, and one for obliterating all nesting deepFlatten.
Maybe you can call it obliterate or nuke if you don't like the name deepFlatten.
Don't iterate twice !
Of course the above implementations are clever and concise, but using a .map followed by a call to .reduce means we're actually doing more iterations than necessary.
Using a trusty combinator I'm calling mapReduce helps keep the iterations to a minimum; it takes a mapping function m :: a -> b and a reducing function r :: (b,a) -> b, and returns a new reducing function. This combinator is at the heart of transducers; if you're interested, I've written other answers about them.
// mapReduce :: (a -> b, (b,a) -> b) -> (b,a) -> b
const mapReduce = (m,r) =>
(acc,x) => r (acc, m (x))
// concatMap :: (a -> [b]) -> [a] -> [b]
const concatMap = f => xs =>
xs.reduce (mapReduce (f, concat), [])
// concat :: ([a],[a]) -> [a]
const concat = (xs,ys) =>
xs.concat (ys)
// id :: a -> a
const id = x =>
x
// flatten :: [[a]] -> [a]
const flatten =
concatMap (id)
// deepFlatten :: [[a]] -> [a]
const deepFlatten =
concatMap (x =>
Array.isArray (x) ? deepFlatten (x) : x)
// your sample data
const data =
[ [ [ 1, 2 ],
[ 3, 4 ] ],
[ [ 5, 6 ],
[ 7, 8 ] ] ]
console.log (flatten (data))
// [ [ 1, 2 ], [ 3, 4 ], [ 5, 6 ], [ 7, 8 ] ]
console.log (deepFlatten (data))
// [ 1, 2, 3, 4, 5, 6, 7, 8 ]
To flatten an array of single-element arrays, you don't need to import a library; a simple loop is both the simplest and most efficient solution:
for (var i = 0; i < a.length; i++) {
a[i] = a[i][0];
}
To downvoters: please read the question, don't downvote because it doesn't suit your very different problem. This solution is both the fastest and simplest for the asked question.
Another ECMAScript 6 solution in functional style:
Declare a function:
const flatten = arr => arr.reduce(
(a, b) => a.concat(Array.isArray(b) ? flatten(b) : b), []
);
and use it:
flatten( [1, [2,3], [4,[5,[6]]]] ) // -> [1,2,3,4,5,6]
const flatten = arr => arr.reduce(
(a, b) => a.concat(Array.isArray(b) ? flatten(b) : b), []
);
console.log( flatten([1, [2,3], [4,[5],[6,[7,8,9],10],11],[12],13]) )
Consider also the native Array.prototype.flat() method (initially an ES proposal, now part of ES2019), available in recent releases of modern browsers. Thanks to Константин Ван and Mark Amery for mentioning it in the comments.
The flat function has one parameter, specifying the expected depth of array nesting, which equals 1 by default.
[1, 2, [3, 4]].flat(); // -> [1, 2, 3, 4]
[1, 2, [3, 4, [5, 6]]].flat(); // -> [1, 2, 3, 4, [5, 6]]
[1, 2, [3, 4, [5, 6]]].flat(2); // -> [1, 2, 3, 4, 5, 6]
[1, 2, [3, 4, [5, 6]]].flat(Infinity); // -> [1, 2, 3, 4, 5, 6]
let arr = [1, 2, [3, 4]];
console.log( arr.flat() );
arr = [1, 2, [3, 4, [5, 6]]];
console.log( arr.flat() );
console.log( arr.flat(1) );
console.log( arr.flat(2) );
console.log( arr.flat(Infinity) );
You can also try the new Array.flat() method. It works in the following manner:
let arr = [["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]].flat()
console.log(arr);
The flat() method creates a new array with all sub-array elements concatenated into it, by default up to one level of depth (i.e. arrays directly inside arrays).
If you also want to flatten three-dimensional or even higher-dimensional arrays, you can simply call the flat method multiple times (or pass a depth argument). For example (3 dimensions):
let arr = [1,2,[3,4,[5,6]]].flat().flat().flat();
console.log(arr);
Be careful!
The Array.flat() method is relatively new, and older browsers like IE might not implement it. If you want your code to work on all browsers, you might have to transpile your JS to an older version. Check the MDN web docs for current browser compatibility.
A solution for the more general case, when you may have some non-array elements in your array.
function flattenArrayOfArrays(a, r){
if(!r){ r = []}
for(var i=0; i<a.length; i++){
if(a[i].constructor == Array){
flattenArrayOfArrays(a[i], r);
}else{
r.push(a[i]);
}
}
return r;
}
What about using the reduce(callback[, initialValue]) method from JavaScript 1.8?
list.reduce((p,n) => p.concat(n),[]);
Would do the job.
const common = arr.reduce((a, b) => [...a, ...b], [])
You can use Array.flat() with Infinity for any depth of nested array.
var arr = [ [1,2,3,4], [1,2,[1,2,3]], [1,2,3,4,5,[1,2,3,4,[1,2,3,4]]], [[1,2,3,4], [1,2,[1,2,3]], [1,2,3,4,5,[1,2,3,4,[1,2,3,4]]]] ];
let flatten = arr.flat(Infinity)
console.log(flatten)
check here for browser compatibility
Please note: When Function.prototype.apply ([].concat.apply([], arrays)) or the spread operator ([].concat(...arrays)) is used in order to flatten an array, both can cause stack overflows for large arrays, because every argument of a function is stored on the stack.
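A quick way to see the limitation (the exact threshold is engine-dependent, so treat this purely as an illustration):
const huge = Array.from({ length: 500000 }, () => [1]);
// [].concat.apply([], huge); // may throw "RangeError: Maximum call stack size exceeded"
// [].concat(...huge);        // same issue: every element becomes an argument on the call stack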
Here is a stack-safe implementation in functional style that weighs up the most important requirements against one another:
reusability
readability
conciseness
performance
// small, reusable auxiliary functions:
const foldl = f => acc => xs => xs.reduce(uncurry(f), acc); // aka reduce
const uncurry = f => (a, b) => f(a) (b);
const concat = xs => y => xs.concat(y);
// the actual function to flatten an array - a self-explanatory one-line:
const flatten = xs => foldl(concat) ([]) (xs);
// arbitrary array sizes (until the heap blows up :D)
const xs = [[1,2,3],[4,5,6],[7,8,9]];
console.log(flatten(xs));
// Deriving a recursive solution for deeply nested arrays is trivially now
// yet more small, reusable auxiliary functions:
const map = f => xs => xs.map(apply(f));
const apply = f => a => f(a);
const isArray = Array.isArray;
// the derived recursive function:
const flattenr = xs => flatten(map(x => isArray(x) ? flattenr(x) : x) (xs));
const ys = [1,[2,[3,[4,[5],6,],7],8],9];
console.log(flattenr(ys));
As soon as you get used to small arrow functions in curried form, function composition and higher order functions, this code reads like prose. Programming then merely consists of putting together small building blocks that always work as expected, because they don't contain any side effects.
ES6 One Line Flatten
See lodash flatten, underscore flatten (shallow true)
function flatten(arr) {
return arr.reduce((acc, e) => acc.concat(e), []);
}
or
function flatten(arr) {
return [].concat.apply([], arr);
}
Tested with
test('already flatted', () => {
expect(flatten([1, 2, 3, 4, 5])).toEqual([1, 2, 3, 4, 5]);
});
test('flats first level', () => {
expect(flatten([1, [2, [3, [4]], 5]])).toEqual([1, 2, [3, [4]], 5]);
});
ES6 One Line Deep Flatten
See lodash flattenDeep, underscore flatten
function flattenDeep(arr) {
return arr.reduce((acc, e) => Array.isArray(e) ? acc.concat(flattenDeep(e)) : acc.concat(e), []);
}
Tested with
test('already flatted', () => {
expect(flattenDeep([1, 2, 3, 4, 5])).toEqual([1, 2, 3, 4, 5]);
});
test('flats', () => {
expect(flattenDeep([1, [2, [3, [4]], 5]])).toEqual([1, 2, 3, 4, 5]);
});
Using the spread operator:
const input = [["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]];
const output = [].concat(...input);
console.log(output); // --> ["$6", "$12", "$25", "$25", "$18", "$22", "$10"]
I recommend a space-efficient generator function:
function* flatten(arr) {
if (!Array.isArray(arr)) yield arr;
else for (let el of arr) yield* flatten(el);
}
// Example:
console.log(...flatten([1,[2,[3,[4]]]])); // 1 2 3 4
If desired, create an array of flattened values as follows:
let flattened = [...flatten([1,[2,[3,[4]]]])]; // [1, 2, 3, 4]
If you only have arrays with 1 string element:
[["$6"], ["$12"], ["$25"], ["$25"]].join(',').split(',');
will do the job. But that specifically matches your code example.
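To make that caveat concrete: the trick relies on every inner array holding a single comma-free string, and it coerces everything to strings:
[["$6"], ["$12"]].join(',').split(','); // => ["$6", "$12"]
[["a,b"], ["c"]].join(',').split(',');  // => ["a", "b", "c"] (three items, not two)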
I have done it using recursion and closures
function flatten(arr) {
var temp = [];
function recursiveFlatten(arr) {
for(var i = 0; i < arr.length; i++) {
if(Array.isArray(arr[i])) {
recursiveFlatten(arr[i]);
} else {
temp.push(arr[i]);
}
}
}
recursiveFlatten(arr);
return temp;
}
A Haskellesque approach
function flatArray(arr){
if (arr.length === 0) return []; // base case, like Haskell's pattern match on []
const [x, ...xs] = arr; // head/tail split; also safe when the head is falsy (0, '', false)
return [...(Array.isArray(x) ? flatArray(x) : [x]), ...flatArray(xs)];
}
var na = [[1,2],[3,[4,5]],[6,7,[[[8],9]]],10];
fa = flatArray(na);
console.log(fa);
ES6 way:
const flatten = arr => arr.reduce((acc, next) => acc.concat(Array.isArray(next) ? flatten(next) : next), [])
const a = [1, [2, [3, [4, [5]]]]]
console.log(flatten(a))
ES5 way for flatten function with ES3 fallback for N-times nested arrays:
var flatten = (function() {
if (!!Array.prototype.reduce && !!Array.isArray) {
return function(array) {
return array.reduce(function(prev, next) {
return prev.concat(Array.isArray(next) ? flatten(next) : next);
}, []);
};
} else {
return function(array) {
var arr = [];
var i = 0;
var len = array.length;
var target;
for (; i < len; i++) {
target = array[i];
arr = arr.concat(
(Object.prototype.toString.call(target) === '[object Array]') ? flatten(target) : target
);
}
return arr;
};
}
}());
var a = [1, [2, [3, [4, [5]]]]];
console.log(flatten(a));
If you use lodash, you can just use its flatten method: https://lodash.com/docs/4.17.14#flatten
The nice thing about lodash is that it also has methods to flatten arrays:
i) recursively: https://lodash.com/docs/4.17.14#flattenDeep
ii) upto n levels of nesting: https://lodash.com/docs/4.17.14#flattenDepth
For example
const _ = require("lodash");
const pancake = _.flatten(array)
I was goofing around with ES6 generators the other day and wrote this gist, which contains...
function flatten(arrayOfArrays=[]){
function* flatgen() {
for( let item of arrayOfArrays ) {
if ( Array.isArray( item )) {
yield* flatten(item)
} else {
yield item
}
}
}
return [...flatgen()];
}
var flatArray = flatten([[1, [4]],[2],[3]]);
console.log(flatArray);
Basically I'm creating a generator that loops over the original input array, if it finds an array it uses the yield* operator in combination with recursion to continually flatten the internal arrays. If the item is not an array it just yields the single item. Then using the ES6 Spread operator (aka splat operator) I flatten out the generator into a new array instance.
I haven't tested the performance of this, but I figure it is a nice simple example of using generators and the yield* operator.
But again, I was just goofing so I'm sure there are more performant ways to do this.
Just the best solution, without lodash:
let flatten = arr => [].concat.apply([], arr.map(item => Array.isArray(item) ? flatten(item) : item))
I would rather transform the whole array, as-is, to a string, but unlike other answers, I would do that using JSON.stringify and not the toString() method, which produces an unwanted result.
With that JSON.stringify output, all that's left is to remove all brackets, wrap the result in start and end brackets once again, and feed the result to JSON.parse, which brings the string back to "life".
Can handle infinite nested arrays without any speed costs.
Can rightly handle Array items which are strings containing commas.
var arr = ["abc",[[[6]]],["3,4"],"2"];
var s = "[" + JSON.stringify(arr).replace(/\[|]/g,'') +"]";
var flattened = JSON.parse(s);
console.log(flattened)
Only for multidimensional Array of Strings/Numbers (not Objects)
Ways for making flatten array
using Es6 flat()
using Es6 reduce()
using recursion
using string manipulation
[1,[2,[3,[4,[5,[6,7],8],9],10]]] becomes [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
// using Es6 flat()
let arr = [1,[2,[3,[4,[5,[6,7],8],9],10]]]
console.log(arr.flat(Infinity))
// using Es6 reduce()
let flatIt = (array) => array.reduce(
(x, y) => x.concat(Array.isArray(y) ? flatIt(y) : y), []
)
console.log(flatIt(arr))
// using recursion
function myFlat(array) {
let flat = [].concat(...array);
return flat.some(Array.isArray) ? myFlat(flat) : flat;
}
console.log(myFlat(arr));
// using string manipulation
let strArr = arr.toString().split(',');
for(let i=0;i<strArr.length;i++)
strArr[i]=parseInt(strArr[i]);
console.log(strArr)
I think array.flat(Infinity) is a perfect solution, but flat() is a relatively new function and may not run in older browsers. We can use a recursive function to solve this.
const arr = ["A", ["B", [["B11", "B12", ["B131", "B132"]], "B2"]], "C", ["D", "E", "F", ["G", "H", "I"]]]
const flatArray = (arr) => {
const res = []
for (const item of arr) {
if (Array.isArray(item)) {
const subRes = flatArray(item)
res.push(...subRes)
} else {
res.push(item)
}
}
return res
}
console.log(flatArray(arr))

Recursion call async func with promises gets Possible Unhandled Promise Rejection

const PAGESIZE = 1000;
const DEFAULTLINK = `${URL}/stuff?pageSize=${PAGESIZE}&apiKey=${APIKEY}`;
export const getAllStuff = (initialLink = DEFAULTLINK) => {
let allStuff = {};
return getSuffPage(initialLink)
.then(stuff => {
allStuff = stuff;
if (stuff.next) {
return getAllStuff(stuff.next)
.then(nextStuff => {
allStuff = Object.assign({}, stuff, nextStuff);
return allStuff;
});
} else {
return allStuff;
}
});
};
const getSuffPage = nextPageLink => {
fetch(nextPageLink).then(res => {
return res.json();
});
};
Calling getAllStuff throws:
Possible Unhandled Promise Rejection (id: 0):
TypeError: Cannot read property 'then' of undefined
TypeError: Cannot read property 'then' of undefined
at getAllStuff
I think it is usually because I do not return from a promise's then somewhere, but where don't I?
I've been working with anamorphisms (unfolds) in JavaScript lately, and I thought I might share them with you, using your program as a context to learn them in.
const getAllStuff = async (initUrl = '/0') =>
asyncUnfold
( async (next, done, stuff) =>
stuff.next
? next (stuff, await get (stuff.next))
: done (stuff)
, await get (initUrl)
)
const get = async (url = '') =>
fetch (url) .then (res => res.json ())
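As an aside, the error in the original code almost certainly comes from getSuffPage never returning its promise, so it resolves to undefined and .then cannot be called on it. The minimal fix to the original helper would be:
const getSuffPage = nextPageLink =>
fetch(nextPageLink).then(res => res.json()); // the arrow's expression body returns the promise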
To demonstrate that this works, we introduce a fake fetch and database DB with a fake delay of 250ms per request
const fetch = (url = '') =>
Promise.resolve ({ json: () => DB[url] }) .then (delay)
const delay = (x, ms = 250) =>
new Promise (r => setTimeout (r, ms, x))
const DB =
{ '/0': { a: 1, next: '/1' }
, '/1': { b: 2, next: '/2' }
, '/2': { c: 3, d: 4, next: '/3' }
, '/3': { e: 5 }
}
Now we just run our program like this
getAllStuff () .then (console.log, console.error)
// [ { a: 1, next: '/1' }
// , { b: 2, next: '/2' }
// , { c: 3, d: 4, next: '/3' }
// , { e: 5 }
// ]
And finally, here's asyncUnfold
const asyncUnfold = async (f, init) =>
f ( async (x, acc) => [ x, ...await asyncUnfold (f, acc) ]
, async (x) => [ x ]
, init
)
Program demonstration 1
const asyncUnfold = async (f, init) =>
f ( async (x, acc) => [ x, ...await asyncUnfold (f, acc) ]
, async (x) => [ x ]
, init
)
const getAllStuff = async (initUrl = '/0') =>
asyncUnfold
( async (next, done, stuff) =>
stuff.next
? next (stuff, await get (stuff.next))
: done (stuff)
, await get (initUrl)
)
const get = async (url = '') =>
fetch (url).then (res => res.json ())
const fetch = (url = '') =>
Promise.resolve ({ json: () => DB[url] }) .then (delay)
const delay = (x, ms = 250) =>
new Promise (r => setTimeout (r, ms, x))
const DB =
{ '/0': { a: 1, next: '/1' }
, '/1': { b: 2, next: '/2' }
, '/2': { c: 3, d: 4, next: '/3' }
, '/3': { e: 5 }
}
getAllStuff () .then (console.log, console.error)
// [ { a: 1, next: '/1' }
// , { b: 2, next: '/2' }
// , { c: 3, d: 4, next: '/3' }
// , { e: 5 }
// ]
Now say you wanted to collapse the result into a single object, we could do so with a reduce – this is closer to what your original program does. Note how the next property honors the last value when a key collision happens
getAllStuff ()
.then (res => res.reduce ((x, y) => Object.assign (x, y), {}))
.then (console.log, console.error)
// { a: 1, next: '/3', b: 2, c: 3, d: 4, e: 5 }
If you're sharp, you'll see that asyncUnfold could be changed to output our object directly. I chose to output an array because the sequence of the unfold result is generally important. If you're thinking about this from a type perspective, each foldable type's fold has an isomorphic unfold.
Below we rename asyncUnfold to asyncUnfoldArray and introduce asyncUnfoldObject. Now we see that the direct result is achievable without the intermediate reduce step
// asyncUnfold, renamed to make room for the object version:
const asyncUnfoldArray = async (f, init) =>
f ( async (x, acc) => [ x, ...await asyncUnfoldArray (f, acc) ]
, async (x) => [ x ]
, init
)
const asyncUnfoldObject = async (f, init) =>
f ( async (x, acc) => ({ ...x, ...await asyncUnfoldObject (f, acc) })
, async (x) => x
, init
)
const getAllStuff = async (initUrl = '/0') =>
asyncUnfoldObject // instead of asyncUnfold
( async (next, done, stuff) =>
, ...
)
getAllStuff ()
// .then (res => res.reduce ((x, y) => Object.assign (x, y), {})) // no longer needed
.then (console.log, console.error)
// { a: 1, next: '/3', b: 2, c: 3, d: 4, e: 5 }
But having functions with names like asyncUnfoldArray and asyncUnfoldObject is completely unacceptable, you'll say - and I'll agree. The entire process can be made generic by supplying a type t as an argument
const asyncUnfold = async (t, f, init) =>
f ( async (x, acc) => t.concat (t.of (x), await asyncUnfold (t, f, acc))
, async (x) => t.of (x)
, init
)
const getAllStuff = async (initUrl = '/0') =>
asyncUnfold // generic again, instead of asyncUnfoldObject
( Object
, ...
, ...
)
getAllStuff () .then (console.log, console.error)
// { a: 1, next: '/3', b: 2, c: 3, d: 4, e: 5 }
Now if we want to build an array instead, just pass Array instead of Object
const getAllStuff = async (initUrl = '/0') =>
asyncUnfold
( Array
, ...
, ...
)
getAllStuff () .then (console.log, console.error)
// [ { a: 1, next: '/1' }
// , { b: 2, next: '/2' }
// , { c: 3, d: 4, next: '/3' }
// , { e: 5 }
// ]
Of course we have to concede JavaScript's deficiencies as a functional language at this point, as it does not provide consistent interfaces for even its own native types. That's OK, they're pretty easy to add!
Array.of = x =>
[ x ]
Array.concat = (x, y) =>
[ ...x, ...y ]
Object.of = x =>
Object (x)
Object.concat = (x, y) =>
({ ...x, ...y })
Program demonstration 2
Array.of = x =>
[ x ]
Array.concat = (x, y) =>
[ ...x, ...y ]
Object.of = x =>
Object (x)
Object.concat = (x, y) =>
({ ...x, ...y })
const asyncUnfold = async (t, f, init) =>
f ( async (x, acc) => t.concat (t.of (x), await asyncUnfold (t, f, acc))
, async (x) => t.of (x)
, init
)
const getAllStuff = async (initUrl = '/0') =>
asyncUnfold
( Object // <-- change this to Array for an array result
, async (next, done, stuff) =>
stuff.next
? next (stuff, await get (stuff.next))
: done (stuff)
, await get (initUrl)
)
const get = async (url = '') =>
fetch (url).then (res => res.json ())
const fetch = (url = '') =>
Promise.resolve ({ json: () => DB[url] }) .then (delay)
const delay = (x, ms = 250) =>
new Promise (r => setTimeout (r, ms, x))
const DB =
{ '/0': { a: 1, next: '/1' }
, '/1': { b: 2, next: '/2' }
, '/2': { c: 3, d: 4, next: '/3' }
, '/3': { e: 5 }
}
getAllStuff () .then (console.log, console.error)
// { a: 1, next: '/3', b: 2, c: 3, d: 4, e: 5 }
Finally, if you're fussing about touching properties on the native Array or Object, you can skip that and instead pass a generic descriptor in directly
const getAllStuff = async (initUrl = '/0') =>
asyncUnfold
( { of: x => [ x ], concat: (x, y) => [ ...x, ...y ] }
, ...
)
getAllStuff () .then (console.log, console.error)
// [ { a: 1, next: '/1' }
// , { b: 2, next: '/2' }
// , { c: 3, d: 4, next: '/3' }
// , { e: 5 }
// ]

Why does the compiler warn about an uninitialized variable even though I've assigned each field of that variable?

I'm completely assigning the fields of the MyStruct instance named x in every possible arm of the match:
enum MyEnum {
One,
Two,
Three,
}
struct MyStruct {
a: u32,
b: u32,
}
fn main() {
f(MyEnum::One);
f(MyEnum::Two);
f(MyEnum::Three);
}
fn f(y: MyEnum) -> MyStruct {
let mut x: MyStruct;
match y {
MyEnum::One => {
x.a = 1;
x.b = 1;
}
MyEnum::Two => {
x.a = 2;
x.b = 2;
}
MyEnum::Three => {
x.a = 3;
x.b = 3;
}
}
x
}
Why does the compiler return the following error?
error[E0381]: use of possibly uninitialized variable: `x`
--> src/main.rs:37:5
|
37 | x
| ^ use of possibly uninitialized `x`
I think this is a known issue (see also its related issue).
let x: MyStruct; doesn't set x to an empty value, it declares a variable. You still need to assign a value to it.
fn f(y: MyEnum) -> MyStruct {
let x;
match y {
MyEnum::One => {
x = MyStruct { a: 1, b: 1 };
}
MyEnum::Two => {
x = MyStruct { a: 2, b: 2 };
}
MyEnum::Three => {
x = MyStruct { a: 3, b: 3 };
}
}
x
}
In other words, let x; creates an unbound variable, a variable which doesn't have a value associated with it. Thus you need to bind some value to it later.
If you only want to return a value from the function, you can take advantage of the fact that almost everything in Rust is an expression that produces a value, and the value of the final expression in a function body is the function's return value.
fn f(y: MyEnum) -> MyStruct {
use MyEnum::*;
let x = match y {
One => MyStruct { a: 1, b: 1 },
Two => MyStruct { a: 2, b: 2 },
Three => MyStruct { a: 3, b: 3 },
};
x
}
You can also completely eliminate x, if you so choose.
fn f(y: MyEnum) -> MyStruct {
use MyEnum::*;
match y {
One => MyStruct { a: 1, b: 1 },
Two => MyStruct { a: 2, b: 2 },
Three => MyStruct { a: 3, b: 3 },
}
}

Curried functions: how to optimize them

I'm relatively new to functional programming and libraries such as ramda.js but one thing I found very useful is the possibility of currying functions.
Using curried functions, I very often write things like the following
const myFun = R.curry(
(arg1, arg2) => {
let calculated = anotherFun(arg1)
//do something with calculated and arg2
return calculated * 5 + arg2
}
)
const anotherFun = (arg) => {
console.log("calling anotherFun");
return arg + 1
}
var partial = myFun(1)
console.log(partial(2))
console.log(partial(3))
<script src="//cdn.jsdelivr.net/ramda/0.22.1/ramda.min.js"></script>
but clearly in this situation anotherFun is called every time I call partial, even though arg1, and as a consequence calculated, are always the same.
Is there a way to optimize this behaviour and call anotherFun only when its args change?
The only way that crosses my mind is this
const myFun = R.curry(
(calculated, arg2) => {
return calculated * 5 + arg2
}
)
const anotherFun = (arg) => {
console.log("calling anotherFun");
return arg + 1
}
var calculated = anotherFun(1)
var partial = myFun(calculated)
console.log(partial(2))
console.log(partial(3))
<script src="//cdn.jsdelivr.net/ramda/0.22.1/ramda.min.js"></script>
but in this way I have to change the arguments passed to myFun and this complicates the external API
If you do the currying manually like this
const myFun = arg1 => arg2 => {
let calculated = anotherFun(arg1)
// do something with calculated and arg2
return calculated * 5 + arg2
};
you can also make this optimisation:
const myFun = arg1 => {
let calculated = anotherFun(arg1);
return arg2 => {
// do something with calculated and arg2
return calculated * 5 + arg2
};
};
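With that version, anotherFun runs once when the first argument is supplied, and the returned closure reuses calculated:
const partial = myFun(1); // logs "calling anotherFun" once
console.log(partial(2));  // 12
console.log(partial(3));  // 13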
I don't think Ramda will help you here with anything; and JavaScript compilers certainly are not doing this kind of optimisation.
@Bergi is right that Ramda will not offer you any help with this. If you want a Ramda-style result, where you can call with one parameter to get a function back or both to get the result, you can do this:
const myFun = function(arg1, arg2) {
let calculated = anotherFun(arg1);
const newFunc = arg2 => {
return calculated * 5 + arg2
};
return (arguments.length < 2) ? newFunc : newFunc(arg2);
};
const with3 = myFun(3);
//: calling anotherFun
with3(1); //=> 21
with3(2); //=> 22
with3(4); //=> 24
myFun(2, 7);
//: calling anotherFun
//=> 22
myFun(2, 8);
//: calling anotherFun
//=> 23
This comes at the cost of not being able to use ES2015 arrow functions. But it might be worth it to you.
You can also rework this slightly to not build the internal function if both parameters are supplied, if that is important to you.
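That rework might look something like this (a sketch along the same lines, still using the question's anotherFun):
const myFun = function(arg1, arg2) {
const calculated = anotherFun(arg1);
if (arguments.length >= 2) return calculated * 5 + arg2; // both arguments supplied: compute directly
return b => calculated * 5 + b; // partial application: close over calculated
};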
How about useWith and memoize from Ramda?
const myFun = R.useWith(
(a, b) => a * 5 + b,
[R.memoize(anotherFun), R.identity]
);
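One caveat: R.memoize was removed from later Ramda releases; with a current Ramda the equivalent is R.memoizeWith, which takes a key-generating function:
const myFun = R.useWith(
(a, b) => a * 5 + b,
[R.memoizeWith(R.identity, anotherFun), R.identity]
);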
