How do I pass in multiple parameters into a Ramda compose chain? - functional-programming

Here are four functions I am trying to compose into a single endpoint string:
const endpoint = str => `${str}` || 'default'
const protocol = str => `https://${str}`
const params = str => `${str}?sort=desc&part=true&`
const query = str => `${str}query={ some:'value', another:'value'}`
let finalEndpoint = R.compose(query, params, protocol, endpoint)
var result = finalEndpoint('api.content.io')
This composition works and returns the result I want which is:
https://api.content.io?sort=desc&part=true&query={ some:'value', another:'value'}
But notice how I have hard-coded the values for params and query inside their function bodies. I see only one value being passed along this R.compose chain.
How and where exactly do I pass in parameters to the params and query functions?
UPDATE:
What I did was curry those functions like this:
var R = require('ramda');
const endpoint = str => `${str}` || 'default'
const protocol = str => `https://${str}`
const setParams = R.curry ( (str, params) => `${str}?${params}` )
const setQuery = R.curry ( (str, query) => `${str}&query=${JSON.stringify(query)}` )
and then
let finalEndpoint = R.compose(protocol, endpoint)
var result = setQuery(setParams(finalEndpoint('api.content.io'), 'sort=desc&part=true'), { some:'value', another:'value'})
console.log(result);
But the final call to get result still seems pretty hacked and inelegant. Is there any way to improve this?

How and where exactly do I pass in parameters to the params and query parameters?
Honestly, you don't, not when you're building a compose or pipe pipeline with Ramda or similar libraries.
Ramda (disclaimer: I'm one of the authors) allows the first function to receive multiple arguments -- some other libraries do, some don't -- but subsequent ones will only receive the result of the previous calls. There is one function in Sanctuary, meld, which might be helpful with this, but it does have a fairly complex API.
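For illustration, a minimal sketch of that rule: only the right-most function in the chain sees all the arguments; every later function receives just the previous result.
// R.add (right-most) gets both arguments; Math.sqrt only gets the previous result
R.compose(Math.sqrt, R.add)(9, 16); //=> 5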
However, I don't really understand why you are building this function in this manner in the first place. Are those intermediate functions actually reusable, or are you building them on spec? The reason I ask is that this seems a more sensible version of the same idea:
const finalEndpoint = useWith(
  (endpoint, params, query) => `https://${endpoint}?${params}&query=${query}`,
  [
    endpoint => endpoint || 'default',
    pipe(toPairs, map(join('=')), join('&')),
    pipe(JSON.stringify, encodeURIComponent)
  ]
);
finalEndpoint(
  'api.content.io',
  {sort: 'desc', part: true},
  {some: 'value', another: 'value'}
);
//=> "https://api.content.io?sort=desc&part=true&query=%7B%22some%22%3A%22value%22%2C%22another%22%3A%22value%22%7D"
I don't really know your requirements for that last parameter. It looked strange to me without that encodeURIComponent, but perhaps you don't need it. And I also took liberties with the second parameter, assuming that you would prefer actual data in the API to a string encapsulating that data. But if you want to pass 'sort=desc&part=true', then replace pipe(toPairs, map(join('=')), join('&')) with identity.
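For reference, here is what that second transformation does to a params object:
pipe(toPairs, map(join('=')), join('&'))({sort: 'desc', part: true});
//=> "sort=desc&part=true"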
Since the whole thing is far from points-free, I did not use a points-free version of the first function, perhaps or(__, 'default'), as I think what's there is more readable.
Update
You can see a version of this on the Ramda REPL, one that adds some console.log statements with tap.
This does raise an interesting question for Ramda. If those intermediate functions really are desirable, Ramda offers no way to combine them. Obviously Ramda could offer something like meld, but is there a middle ground? I'm wondering if there is a useful function (curried, of course) that we should include that works something like
someFunc([f0], [a0]); //=> f0(a0)
someFunc([f0, f1], [a0, a1]); //=> f1(f0(a0), a1)
someFunc([f0, f1, f2], [a0, a1, a2]); //=> f2(f1(f0(a0), a1), a2)
someFunc([f0, f1, f2, f3], [a0, a1, a2, a3]); //=> f3(f2(f1(f0(a0), a1), a2), a3)
// ...
There are some serious objections: What if the lists are of different lengths? Why is the initial call unary, and should we fix that by adding a separate accumulator parameter to the function? Nonetheless, this is an intriguing function, and I will probably raise it for discussion on the Ramda boards.
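A minimal sketch of such a function, assuming for simplicity that both lists have the same length, could be built on a plain reduce (this is an illustration, not part of Ramda):
const someFunc = (fns, args) =>
  fns.slice(1).reduce((acc, fn, i) => fn(acc, args[i + 1]), fns[0](args[0]));
someFunc([x => x + 1, (acc, y) => acc * y], [2, 5]); //=> (2 + 1) * 5 => 15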

I wrote a little helper function for situations like this.
It is like compose, but with the rest params also passed in. The first param is the return value of the previous function. The rest params remain unchanged.
With it, you could rewrite your code as follows:
const compound = require('compound-util')
const endpoint = str => `${str}` || 'default'
const protocol = str => `https://${str}`
const params = (str, { params }) => `${str}?${params}`
const query = (str, { query }) => `${str}query=${query}`
const finalEndpoint = compound(query, params, protocol, endpoint)
const result = finalEndpoint('api.content.io', {
  params: 'sort=desc&part=true&',
  query: JSON.stringify({ some: 'value', another: 'value' })
})
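For readers who don't want to pull in a package, a minimal sketch of what such a helper might look like (an illustration only, not the actual compound-util implementation):
const compound = (...fns) => (input, ...rest) =>
  fns.reduceRight((acc, fn) => fn(acc, ...rest), input);
// each function receives the previous result plus the unchanged rest params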

If you have params and query as curried functions then you can:
EDIT: code with all the bells and whistles; I needed to change the parameter order (or use R.__) and stringify the object
const endpoint = R.curry( str => `${str}` || 'default' )
const protocol = R.curry( str => `https://${str}` )
const params = R.curry( (p, str) => `${str}?${p}` )
const query = R.curry( (q, str) => `${str}&query=${q}` )
let finalEndpoint =
  R.compose(
    query(JSON.stringify({ some:'value', another:'value' })),
    params('sort=desc&part=true'),
    protocol,
    endpoint
  )
var result = finalEndpoint('api.content.io')
console.log(result)

Related

Generate a predicate out of two predicates (job for monoid, fold?)

I have two predicates
interface Foo {}
interface Bar {}
declare const isFoo: (a: unknown) => a is Foo
declare const isBar: (a: unknown) => a is Bar
What is the functional way to combine two predicates to create a new predicate (for simplicity, let's assume it's a => isFoo(a) && isBar(a))?
With fp-ts, I initially thought I could fold(monoidAll)([isFoo, isBar]), but fold expects the array to be of booleans, not of functions that evaluate to boolean.
This works
import { monoid as M, function as F, apply as A, identity as I, reader as R } from 'fp-ts'
interface Foo{}
interface Bar{}
declare const isFoo:(a:unknown) => a is Foo
declare const isBar:(a:unknown) => a is Bar
const isFooAndBar = F.pipe(A.sequenceT(R.reader)(isFoo, isBar), R.map(M.fold(M.monoidAll)))
But boy howdy is that convoluted. I thought there could be another way. I ended up writing my own monoid that takes two predicates and combines them, calling it monoidPredicateAll:
const monoidPredicateAll: M.Monoid<Predicate<unknown>> = {
  empty: () => true,
  concat: (x, y) => _ => x(_) && y(_)
}
Is there a canonical FP way of combining two predicates? I know I could do something like
xs.filter(x => isFoo(x) && isBar(x))
But it can get complicated with more predicates, and re-using a monoid makes it less likely I'll make a typo like isFoo(x) || isBar(x) && isBaz(x) when I meant all && (and that's where a xs.filter(fold(monoidPredicateAll)([isFoo, isBar, isBaz])) would help out).
I found a discussion about this on SO, but it was about Java and a built-in Predicate type, so didn't directly address my question.
Yes, I'm overthinking this :)
I ended up doing this:
export const monoidPredicateAll: Monoid<Predicate<unknown>> = {
  empty: () => true,
  concat: (x, y) => _ => x(_) && y(_)
}
Then I could do
import {monoid as M} from 'fp-ts'
declare const isFoo: Predicate<number>
declare const isBar: Predicate<number>
const isFooAndBar = M.fold(monoidPredicateAll)([isFoo,isBar])
For others looking for a working solution, based on user1713450's answer:
import * as P from 'fp-ts/lib/Predicate';
import * as M from 'fp-ts/Monoid';
const createMonoidPredicateAll = <T>(): M.Monoid<P.Predicate<T>> => ({
  empty: () => true,
  concat: (x, y) => (_) => x(_) && y(_),
});

export const combine = <T>(predicates: P.Predicate<T>[]) =>
  M.concatAll(createMonoidPredicateAll<T>())(predicates);
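A small usage sketch (isPositive and isEven are made-up example predicates, not from the question):
const isPositive: P.Predicate<number> = (n) => n > 0;
const isEven: P.Predicate<number> = (n) => n % 2 === 0;

const isPositiveAndEven = combine([isPositive, isEven]);
[1, 2, 3, 4].filter(isPositiveAndEven); //=> [2, 4]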

Can impurity affect the associativity of an operation?

Associativity is a desirable property and quite common for many operations in FP. Now I was wondering if an impure function can interfere with it. The only example I found isn't really convincing, because I'm not sure whether nullary functions count as proper functions (rigorously speaking), and additionally, the example looks rather contrived.
The following is written in JS but is hopefully self-explanatory:
// function composition
const comp = f => g => x => f(g(x));
const foo = () => 1;
const bar = () => Math.round(Math.random() * 100);
// Set functor (ignore the hideous implementation)
const map = f => s => {
  const r = new Set();
  s.forEach(x => r.add(f(x)));
  return r;
};
const lhs = map(comp(bar) (foo));
const rhs = comp(map(bar)) (map(foo));
const set1 = lhs(new Set([1, 2, 3]));
const set2 = rhs(new Set([1, 2, 3]));
console.log(Array.from(set1)); // yields an array filled with up to three random integers
console.log(Array.from(set2)); // yields an array filled with a single random integer
I'm not sure whether this example can be considered as evidence. Are there more convincing examples?

Creating a composePipe function for Futures from Fluture

I wanted to make a compose function for piping and I'm stuck. I managed to make a point-free pipe but can't figure out composing.
// pointfree
const pipe = fn => future => future.pipe(fn)
// compose pipes // not working
const composePipe = (...fns) => (...args) => fns.reduceRight( (future, fn) => future.pipe(fn), args)[0];
I'll answer your question eventually, but let's take a step back first.
An important thing to understand is that the pipe method is just function application. In other terms: future.pipe (f) == f (future).
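A quick sketch of that equivalence (assuming Fluture's resolve and map exports):
import {resolve, map} from 'fluture';

const future = resolve(2);
const viaMethod = future.pipe(map(x => x * 2)); // method form
const viaApply = map(x => x * 2)(future);       // plain function application, same Future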
This means that your pipe function can be redefined as such:
const pipe = fn => future => future.pipe(fn)
//to:
const pipe = fn => value => fn (value)
This new version of pipe works exactly the same way, except that it works on any values, not just Futures. But let's take a step back further even.
The signature of this function is as follows: pipe :: (a -> b) -> a -> b. It takes a function from A to B, and returns a function from A to B.
Wait a minute....
const pipe = fn => value => fn (value)
//to:
const pipe = fn => fn
That new definition does the same thing. Except that it works on anything, not just Functions. Actually it's just the identity function. So a curried (you said point-free, but I think you meant curried) version of future.pipe is just the identity function.
So why is this? Because all .pipe does is function application. And you can apply your functions yourself.
Now to answer your next question about composing pipes. What you're actually looking for is something that takes a number of functions, and applies them in sequence.
If you're using Ramda, that's pipe. We can implement this ourselves though:
const pipe = (...fns) => (...args) => fns.reduce ((args, f) => [f (...args)], args)[0]
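For example, with this hand-rolled pipe:
const add1 = x => x + 1;
const double = x => x * 2;
pipe(add1, double)(3); //=> 8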

Make parameters available to all functions inside Ramda's pipe function

I'm using Ramda in node with express. I have a standard route:
app.get('/api/v1/tours', (req, res) => {
})
Where I'd like to compose functions using Ramda, but I write these functions outside the route (so they will be reusable in other routes).
For example:
function extractParams() {
  return req.query.id;
}

function findXById(id) {
  return xs.find(el => el.id == id);
}

function success(answer) {
  return res.status(200).json(answer);
}
Now I want to compose those functions inside several routers. One of them will be:
app.get('/api/v1/tours', (req, res) => {
  return R.pipe(extractParams, findXById, success)();
})
Is there any way I can prepare a generic wrapper that wraps the request and response objects on the routers so they are available to these functions? I guess I'll also have to change their signatures.
I think what's really needed here is a version of pipe that accepts some initial arguments and returns a new function that will accept the remaining ones, with all the functions having such a dual-application signature. I came up with the following doublePipe implementation that does this:
const doublePipe = (...fns) => (...initialArgs) =>
  pipe (...(map (pipe (apply, applyTo (initialArgs)), fns)))

const foo = (x, y) => (z) => (x + y) * z
const bar = (x, y) => (z) => (x + y) * (z + 1)

const baz = doublePipe (foo, bar)

console .log (
  baz (2, 4) (1) //=> (2 + 4) * (((2 + 4) * 1) + 1) => 42
  //               bar receives x = 2 and y = 4, and its z is foo (2, 4) (1)
)
<script src="//cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.js"></script>
<script>const {pipe, map, apply, applyTo} = R </script>
Note that the functions foo and bar will both receive the same x and y arguments, and that foo (x, y) will receive the z argument supplied from the outside, with its result passed as z to bar (x, y).
This is an interesting function, and it's a fairly useful generic solution to this sort of problem. But it won't work in your Express environment, because the handlers need to have the signature (req, res) => ... and not (req, res) => (...args) => ....
So below is an alternative, which mimics a trivial Express-like environment and uses a slightly different doublePipe version, one that does not take an additional invocation: it simply calls the first function with no parameters and then sequentially passes the results through the others as expected. This means the first function given to doublePipe must have the signature (req, res) => () => ..., while the others have (req, res) => (val) => .... While we could fix it so that the first one was just (req, res) => ..., it seems to me that this inconsistency would not be helpful.
const doublePipe = (...fns) => (...initialArgs) =>
  reduce (applyTo, void 0, map (apply (__, initialArgs), fns))
const xs = [{id: 1, val: 'abc'}, {id: 2, val: 'def'},{id: 3, val: 'ghi'}, {id: 4, val: 'jkl'}]
const extractParams = (req, res) => () => req .query .id
const findXById = (xs) => (req, res) => (id) => xs .find (el => el .id == id)
const success = (req, res) => (answer) => res .status (200) .json (answer)
app .get ('/api/v1/tours', doublePipe (extractParams, findXById (xs), success))
console .log (
  app .invoke ('get', '/api/v1/tours?foo=bar&id=3')
)
<script src="//cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.js"></script>
<script>
const {__, map, reduce, applyTo, apply, head, compose, split, objOf, fromPairs, last} = R
// Minimal version of Express, only enough for this demo
const base = compose (head, split ('?'))
const makeRequest = compose (objOf ('query'), fromPairs, map (split ('=')), split ('&'), last, split ('?'))
const makeResponse = () => {
  const response = {
    status: (val) => {response .status = val; return response},
    json: (val) => {response.body = JSON .stringify (val); delete response.json; return response}
  }
  return response
}

const app = {
  handlers: {get: {}, post: {}},
  get: (route, handler) => app .handlers .get [route] = handler,
  invoke: (method, route) =>
    app .handlers [method] [base (route)] (makeRequest (route), makeResponse ())
}
</script>
findXById does not have the required signature, but findXById(xs) does, so that's what we pass to doublePipe.
Finally, note that Ramda and Express may never play particularly well together, as the handlers sent to Express are meant to modify their parameters, and Ramda is designed to never mutate input data. That said, this seems to work reasonably well for these requirements.
Update: explanation of doublePipe
A comment seemed to indicate that a more complete description of doublePipe was in order. I will only discuss the second version,
const doublePipe = (...fns) => (...initialArgs) =>
  reduce (applyTo, void 0, map (apply (__, initialArgs), fns))
Here are two possible calls:
// foo :: (a, b) -> f
const foo = doublePipe (
  f1, // :: (a, b) -> Void -> c
  f2, // :: (a, b) -> c -> d
  f3, // :: (a, b) -> d -> e
  f4, // :: (a, b) -> e -> f
)

// bar :: (a, b, c) -> f
const bar = doublePipe (
  g1, // :: (a, b, c) -> Void -> d
  g2, // :: (a, b, c) -> d -> e
  g3, // :: (a, b, c) -> e -> f
)
If you're not familiar with the Hindley-Milner signatures (such as (a, b) -> c -> d above), I wrote a long article on the Ramda wiki about their uses in Ramda. The foo function is built by passing f1 - f4 to doublePipe. The resulting function takes parameters of types a and b (req and res in your example) and returns a value of type f. Similarly bar is created by supplying g1 - g3 to doublePipe, returning a function that accepts parameters of types a, b, and c and returning a value of type f.
We can rewrite doublePipe a bit more imperatively to show the steps taken:
const doublePipe = (...fns) => (...initialArgs) => {
  const resultFns = map (apply (__, initialArgs), fns)
  return reduce (applyTo, void 0, resultFns)
}
and expanding that a bit, this might also look like
const doublePipe = (...fns) => (...initialArgs) => {
  const resultFns = map (fn => fn(...initialArgs), fns)
  return reduce ((value, fn) => fn (value), undefined, resultFns)
}
In the first line, we partially apply the initial arguments to each of the supplied functions, giving us a list of simpler functions. For foo resultFns would look like [f1(req, res), f2(req, res), f3(req, res), f4(req, res)], which would have signatures [Void -> c, c -> d, d -> e, e -> f]. We could now choose to pipe those functions and call the resulting function (return pipe(...resultFns)()), but I didn't see a good reason to create the piped function only to call it a single time and throw it away, so I reduce over that list, starting with undefined and passing the result of each one to the next.
I tend to think in terms of Ramda functions, but you could write this easily enough without them:
const doublePipe = (...fns) => (...initialArgs) =>
  fns
    .map (fn => fn (...initialArgs))
    .reduce ((value, fn) => fn (value), void 0)
I hope this made that clearer!
Your three functions do not have the things they need in their declared scope. You need to modify their signature first:
function extractParams(req) { //<-- added `req`
  return req.query.id;
}

function findXById(id, xs) { //<-- added `xs`
  return xs.find(el => el.id == id);
}

function success(res, answer) { //<-- added `res`
  return res.status(200).json(answer);
}
Note that the order of the parameters isn't "random". The data you need to operate on should come last, as that allows for a nicer function composition experience. It's one of the tenets of Ramda:
The parameters to Ramda functions are arranged to make it convenient for currying. The data to be operated on is generally supplied last.
Source: https://ramdajs.com/
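For instance, a data-last, curried function can be specialized before the data ever arrives (a small illustrative sketch):
const getIds = R.map(R.prop('id')); // no data supplied yet
getIds([{id: 1}, {id: 2}]); //=> [1, 2]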
This is not enough though. You need to curry some of them. Why? While the "recipe" of your function composition looks the same, each individual function operates on specific data. This will make sense later; let's curry first:
const extractParams = (req) => req.query.id;
const findXById = R.curry((id, xs) => xs.find(el => el.id == id));
const success = R.curry((res, answer) => res.status(200).json(answer));
Now you can build a function composition whilst supplying some specific parameter to your functions in the composition:
app.get('/api/v1/tours', (req, res) =>
  R.pipe(
    extractParams,
    findXById(42),
    success(res))
  (req));
It's important to note that while there is nothing "wrong" with this, it's also missing the point:
R.pipe(extractParams, findXById, success)()
Why? R.pipe or R.compose (or R.o) returns a function composition which is itself a function that you call with parameters (just one with R.o but let's ignore that for now). So you need to think about the data that goes through your function composition. In your case it's probably req:
R.pipe(extractParams, findXById, success)(req)
Each function in your function composition receives as its parameter, the result of the previous function. If something in between doesn't depend on that, then perhaps that function shouldn't be part of the composition. (Take that advice with a pinch of salt; special conditions may apply; just think about it ;)

Why do I get an error when I use transduce?

I am still new to functional programming and have been trying to learn how to use transducers. I thought I had a good use case but every time I attempt to write a transducer with Ramda for it, I get the following error:
reduce: list must be array or iterable
I have tried rewriting it several ways and looked at several explanations on the web of transduction but to no avail. Any suggestions?
const data = [{cost:2,quantity:3},{cost:4,quantity:5},{cost:1,quantity:1}];
const transducer = R.compose(R.map(R.product), R.map(R.props(['cost', 'quantity'])));
const result = R.transduce(transducer, R.add, 0)(data);
console.log(result)
In the context of a transducer, compose reads left to right. You just need to swap the order of the product and props steps:
const data = [
  {cost: 2, quantity: 3},
  {cost: 4, quantity: 5},
  {cost: 1, quantity: 1}];

const transducer =
  compose(
    map(props(['cost', 'quantity'])),
    map(product));

console.log(
  transduce(transducer, add, 0, data)
)
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.min.js"></script>
<script>const {compose, map, props, product, transduce, add} = R;</script>
The reason why the order reverses is that transducers utilize a property of function composition that is sometimes called abstraction from arity. It simply means that a function composition can return, well, another function:
const comp = f => g => x => f(g(x));
const mapTrace = tag => f => (console.log(tag), xs => (console.log(tag), xs.map(f)));
const sqr = x => x * x;
const main = comp(mapTrace("a")) (mapTrace("b")) (sqr); // returns another function
console.log(main); // logs the 2nd map and then the 1st one (normal order)
// pass an additional argument to that function
console.log(
  main([[1,2,3]])); // logs in reverse order
Why does the composition return another function? Because map is a binary function that expects a function argument as its first argument. So when the composition is evaluated it yields another composition of two partially applied maps. It is this additional iteration that reverses the order. I'll stop at this point without illustrating the evaluation steps, because I think it would get too complicated otherwise.
Additionally, you can see now how transducers fuse two iterations together: They simply use function composition. Can you do this by hand? Yes, you can absolutely do that.
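For instance, a single hand-written reduce performs the same fused traversal as the transducer above (a sketch, not from the original answer):
const data = [{cost: 2, quantity: 3}, {cost: 4, quantity: 5}, {cost: 1, quantity: 1}];
data.reduce((sum, x) => sum + x.cost * x.quantity, 0); //=> 27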
