How to limit mergeMap inner subscriptions to the N latest or a sliding window queue - meteor

I have a source stream merged from two streams. When the source stream emits an event I'd like to call a subscription function (Meteor.subscribe) and keep it open, so I use mergeMap. When the subscription is ready I pipe into another mergeMap to populate the data. It works well until I do 100 clicks and memory consumption skyrockets. The question is: how can I limit mergeMap, not to the first N subscriptions via concurrent: Number, but to the N most recent ones, like a sliding window?
function paginationCache$(): Observable<any> {
  return merge(this.pageParamsChanged$, this.routerParamsChanged$)
    .pipe(
      mergeMap((newParams) => {
        // The first mergeMap subscribes to the data and unsubscribes when the second mergeMap unsubscribes
        return Observable.create((observer: Subscriber<any>) => {
          let subscription = Meteor.subscribe('my/best/data/sub', newParams,
            () => {
              observer.next(subscription);
              observer.complete();
            });
        });
      }),
      mergeMap((subscription: any) => {
        // The second subscription just populates the data
        return Observable.create((observer: Subscriber<Meteor.Error | any>) => {
          const collection = new Mongo.Collection(subscription.collectionName);
          const { selector, options } = this.mongoParams();
          collection.find(selector, options).dataChanges((data) => {
            observer.next({ data });
          });
          return () => {
            subscription.stop();
          };
        });
      })
    );
}
I'd like to give a more detailed explanation of what is happening in that code.
In my example, the source stream (the merge before pipe) never completes as long as I keep clicking buttons in my web interface; it emits changes whenever I click the next or previous button. The first mergeMap takes the changes from the source stream and sends them to the backend API (which, confusingly, also uses the publication/subscription naming). When the data is available on the client I call observer.next(subscription) to move on to the second mergeMap, but I can't destroy or stop Meteor's subscription, for two reasons: 1. I'd like to receive realtime changes to the selected data; 2. if I stop Meteor's subscription, the data on the client side is removed. So now the second mergeMap continuously updates the selected data whenever it is updated on the server.
So after each UI button click (next, previous) I have a new chain of subscriptions. That is okay if the original data table is not big (1,000 records) and I only click a couple of times. But I can have more than 30,000 records and I can click my buttons many times.
So the idea is to make mergeMap behave like a limited-size queue that holds just the last N subscriptions, a queue that keeps changing as I click the button.
LAST EDIT: working code:
function paginationCache$(): Observable<any> {
  const N = 3;
  const subscriptionsSubject = new Subject();
  return merge(this.pageParamsChanged$, this.routerParamsChanged$)
    .pipe(
      mergeMap((newParams) => {
        // The first mergeMap subscribes to the data and unsubscribes when the second mergeMap unsubscribes
        subscriptionsSubject.next();
        return Observable.create((observer: Subscriber<any>) => {
          let subscription = Meteor.subscribe('my/best/data/sub', newParams,
            () => {
              observer.next(subscription);
              observer.complete();
            });
        });
      }),
      mergeMap((subscription: any) => {
        // The second subscription just populates the data
        return Observable.create((observer: Subscriber<Meteor.Error | any>) => {
          const collection = new Mongo.Collection(subscription.collectionName);
          const { selector, options } = this.mongoParams();
          collection.find(selector, options).dataChanges((data) => {
            observer.next({ data });
          });
          return () => {
            subscription.stop();
          };
        }).pipe(
          takeUntil(subscriptionsSubject
            .pipe(
              take(N),
              filter((_, idx) => idx === N - 1)
            )
          )
        );
      })
    );
}
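A note on the completion trigger: take(N) followed by filter((_, idx) => idx === N - 1) lets only the N-th notification of subscriptionsSubject through, so each inner observable is torn down once N newer subscriptions exist. An equivalent and arguably more direct way to express the same trigger, as a small sketch (assuming the same subscriptionsSubject and N as above):

import { Subject } from 'rxjs';
import { skip, take } from 'rxjs/operators';

const N = 3;
const subscriptionsSubject = new Subject<void>();

// Emits exactly once, on the N-th call to subscriptionsSubject.next() after
// this observable has been subscribed; that is the moment at which
// takeUntil(...) should tear down the corresponding inner observable.
const stopAfterN$ = subscriptionsSubject.pipe(skip(N - 1), take(1));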

Without considering your snippet, here's how I'd go about this:
not to the first N subscriptions by concurrent: Number, but to the N recent ones, like a sliding window
If I understood correctly, you'd want something like this (assuming N = 3):
N = 3
Crt | 1 | 2 | 3 |
Subscriptions | S1 | S2 | S3 |
When Crt = 4
Crt | 2 | 3 | 4 |
Subscriptions | S2 | S3 | S4 |
If that's the case, here's how I'd solve it:
const subscriptionsSubject = new Subject();

src$.pipe(
  mergeMap(
    data => (new Observable(s => { /* ... */ subscriptionsSubject.next(null) /* Notify about a new subscription when it's the case */ }))
      .pipe(
        takeUntil(
          subscriptionsSubject.pipe(
            take(N), // After `N` subscriptions, it will complete
            filter((_, idx) => idx === N - 1) // Do not complete immediately, but only when exactly `N` subscriptions have been reached
          )
        )
      )
  )
)
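To see the sliding window in action without Meteor, here is a minimal, self-contained sketch (everything here is made up purely for illustration: the simulated click stream, the inner interval standing in for Meteor.subscribe, and the tick limit):

import { interval, Subject } from 'rxjs';
import { filter, finalize, map, mergeMap, take, takeUntil } from 'rxjs/operators';

const N = 3;
const subscriptionsSubject = new Subject<void>();

// Simulated "button clicks": one new page request per second, five in total
const clicks$ = interval(1000).pipe(take(5));

clicks$
  .pipe(
    mergeMap((clickIndex) => {
      // Announce a new inner subscription so that older ones can count towards N
      subscriptionsSubject.next();
      // Long-lived inner stream standing in for Meteor.subscribe + dataChanges;
      // take(30) only bounds the demo's runtime
      return interval(300).pipe(
        take(30),
        map((tick) => `inner #${clickIndex}, tick ${tick}`),
        finalize(() => console.log(`inner #${clickIndex} stopped`)),
        // Tear this inner stream down once N newer subscriptions exist
        takeUntil(
          subscriptionsSubject.pipe(
            take(N),
            filter((_, idx) => idx === N - 1)
          )
        )
      );
    })
  )
  .subscribe(console.log);

At any moment at most N inner streams are live: when click #3 arrives, inner #0 logs that it stopped, and so on.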

I have two ideas here:
You're not completing the second inner Observable. I guess this shouldn't be the source of your problem but it's better to complete observers if you can:
return () => {
  subscription.stop();
  observer.complete();
};
You can use bufferCount to make a sliding window of Observables and then subscribe to them with switchMap(). Something along these lines:
import { merge, of, range } from 'rxjs';
import { map, bufferCount, switchMap, shareReplay, tap } from 'rxjs/operators';

range(10)
  .pipe(
    // turn each value into an Observable
    // `refCount: false` is used so that when `switchMap` switches to a new window
    // it won't trigger a resubscribe to its sources and make more requests.
    map(v => of(v).pipe(shareReplay({ refCount: false, bufferSize: 1 }))),
    bufferCount(3, 1),
    tap(console.log), // for debugging purposes only
    switchMap(sourcesArray => merge(...sourcesArray)),
  )
  .subscribe(console.log);
Live demo: https://stackblitz.com/edit/rxjs-kuybbs?devtoolsheight=60
I'm not completely sure this simulates your use-case but I tried to include also shareReplay so that it won't trigger multiple Meteor.subscribe calls for the same Observable. I'd have to have a working demo of your code to test it myself.

Related

NGRX - How to subscribe to multiple requests that update a list, but prevent it from loading multiple times on the first load?

I'm working with Angular-NGRX and I have 3 components that can modify the data of a list. In the main component I listen for all of the changes to update the listing, and it works. The problem is that on the first load the request is triggered 3 times.
This is the ngOnInit of the main component:
ngOnInit(): void {
  // Change order
  this.store.select(state => state.shared.orderBy).pipe(
    takeUntil(this.onDestroy$),
    tap(resp => {
      this.orderBy = resp;
      this._cmsFunctions.deselectAll('Resource');
    }),
    switchMap(resp => this._dynamicComponentsService.loadResources(this.itemsToShow, this.tabCounter, this.orderBy, this.filters, this.searchTerm, this.company, this.workgroup, (this.paramCode) ? this.paramCode : 0))
  ).subscribe();

  // Change filters
  this.store.select(state => state.shared.filters).pipe(
    takeUntil(this.onDestroy$),
    tap(resp => {
      this.filters = resp;
      this._cmsFunctions.deselectAll('Resource');
    }),
    switchMap(resp => this._dynamicComponentsService.loadResources(this.itemsToShow, this.tabCounter, this.orderBy, this.filters, this.searchTerm, this.company, this.workgroup, (this.paramCode) ? this.paramCode : 0))
  ).subscribe();

  // Change search term
  this.store.select(state => state.shared.searchTerm).pipe(
    takeUntil(this.onDestroy$),
    tap(resp => {
      this.searchTerm = resp;
      this._cmsFunctions.deselectAll('Resource');
    }),
    switchMap(resp => this._dynamicComponentsService.loadResources(this.itemsToShow, this.tabCounter, this.orderBy, this.filters, this.searchTerm, this.company, this.workgroup, (this.paramCode) ? this.paramCode : 0))
  ).subscribe();
}
And all I need is for it to trigger the load only once on startup.
How can I improve this code?
Thanks!
You shouldn't be making calls to a data service from the component, and certainly not based on selector calls. You should be dispatching an Action which is captured and handled by an Effect. You can use dataLoading and dataLoaded flags in your store, for example, to prevent multiple loads.
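For example, a rough sketch of that approach with a loading flag (all names here, including the actions, the state shape, and the of([]) placeholder for the data service, are made up for illustration and not taken from the question):

import { Injectable } from '@angular/core';
import { Actions, createEffect, ofType } from '@ngrx/effects';
import { Store, createAction, props } from '@ngrx/store';
import { of } from 'rxjs';
import { filter, map, switchMap, withLatestFrom } from 'rxjs/operators';

// Hypothetical actions and state shape, only for illustration
export const loadResources = createAction('[Resources] Load');
export const loadResourcesSuccess = createAction(
  '[Resources] Load Success',
  props<{ resources: unknown[] }>()
);

interface AppState {
  resources: { loading: boolean; loaded: boolean };
}

@Injectable()
export class ResourcesEffects {
  // Any component that needs fresh data just dispatches loadResources();
  // the effect decides whether a request actually goes out.
  loadResources$ = createEffect(() =>
    this.actions$.pipe(
      ofType(loadResources),
      withLatestFrom(this.store.select(s => s.resources.loading)),
      filter(([, loading]) => !loading), // dataLoading flag prevents duplicate loads
      switchMap(() =>
        // Replace `of([])` with the real data-service call
        of([] as unknown[]).pipe(
          map(resources => loadResourcesSuccess({ resources }))
        )
      )
    )
  );

  constructor(private actions$: Actions, private store: Store<AppState>) {}
}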

How do I return an entire paged set from the Jira API using Ramda?

I'm using the Nodejs library for talking to Jira called jira-connector. I can get all of the boards on my jira instance by calling
jira.board.getAllBoards({ type: "scrum" })
  .then(boards => { /* ...not important stuff... */ })
the return set looks something like the following:
{
  maxResults: 50,
  startAt: 0,
  isLast: false,
  values: [ { id: ... } ]
}
Then, while isLast === false, I keep calling like so:
jira.board.getAllBoards({ type: "scrum", startAt: XXX })
until isLast is true. Then I can collect all of the results from the promises and be done with it.
I'm trying to work out how I can get all of the paged data with Ramda; I have a feeling it's possible, I just can't seem to sort out how.
Any help? Is this possible using Ramda?
Here's my Rx attempt to make this better:
const pagedCalls = new Subject();

pagedCalls.subscribe(value => {
  jira.board.getAllBoards({ type: "scrum", startAt: value })
    .then(boards => {
      console.log('calling: ' + value);
      allBoards.push(boards.values);
      if (boards.isLast) {
        pagedCalls.complete();
      } else {
        pagedCalls.next(boards.startAt + 50);
      }
    });
});

pagedCalls.next(0);
Seems pretty terrible. Here's the simplest solution I have so far with a do/while loop:
let returnResult = [];
let result;
let startAt = -50;
do {
  result = await jira.board.getAllBoards({ type: "scrum", startAt: startAt += 50 });
  returnResult.push(result.values); // there's an array of results under the values prop.
} while (!result.isLast);
Many of the interactions with Jira use this model and I am trying to avoid writing this kind of loop every time I make a call.
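One way to stop rewriting that loop for every endpoint is to extract it once into a small helper that takes the page-fetching call as an argument. A sketch (the Page shape mirrors the response above; fetchAllPages itself is made up for illustration):

// Generic paging helper: keeps calling `fetchPage` until a page reports isLast,
// concatenating all `values` arrays along the way.
interface Page<T> {
  values: T[];
  isLast: boolean;
  maxResults: number;
  startAt: number;
}

async function fetchAllPages<T>(
  fetchPage: (startAt: number) => Promise<Page<T>>
): Promise<T[]> {
  const all: T[] = [];
  let startAt = 0;
  let page: Page<T>;
  do {
    page = await fetchPage(startAt);
    all.push(...page.values);
    startAt = page.startAt + page.values.length;
  } while (!page.isLast);
  return all;
}

// Usage with jira-connector (same call as in the question):
// const boards = await fetchAllPages(startAt =>
//   jira.board.getAllBoards({ type: 'scrum', startAt })
// );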
I had to do something similar today, calling the Gitlab API repeatedly until I had retrieved the entire folder/file structure of the project. I did it with a recursive call inside a .then, and it seems to work all right. I have not tried to convert the code to handle your case.
Here's what I wrote, if it will help:
const getAll = (project, perPage = 10, page = 1, res = []) =>
  fetch(`https://gitlab.com/api/v4/projects/${encodeURIComponent(project)}/repository/tree?recursive=true&per_page=${perPage}&page=${page}`)
    .then(resp => resp.json())
    .then(xs => xs.length < perPage
      ? res.concat(xs)
      : getAll(project, perPage, page + 1, res.concat(xs))
    )

getAll('gitlab-examples/nodejs')
  .then(console.log)
  .catch(console.warn)
The technique is pretty simple: Our function accepts whatever parameters are necessary to be able to fetch a particular page and an additional one to hold the results, defaulting it to an empty array. We make the asynchronous call to fetch the page, and in the then, we use the result to see if we need to make another call. If we do, we call the function again, passing in the other parameters needed, the incremented page number, and the merge of the current results and the ones just received. If we don't need to make another call, then we just return that merged list.
Here, the repository contains 21 files and folders. Calling for ten at a time, we make three fetches and when the third one is complete, we resolve our returned Promise with that list of 21 items.
This recursive method definitely feels more functional than your versions above. There is no assignment except for the parameter defaulting, and nothing is mutated along the way.
I think it should be relatively easy to adapt this to your needs.
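Adapted to the Jira case, the same recursive shape might look like this (a sketch; it assumes getAllBoards resolves to the { values, isLast, startAt } shape shown in the question):

// Recursively fetch every page of boards, accumulating `values` in `res`
const getAllScrumBoards = (startAt = 0, res = []) =>
  jira.board.getAllBoards({ type: 'scrum', startAt })
    .then(page => page.isLast
      ? res.concat(page.values)
      : getAllScrumBoards(startAt + page.values.length, res.concat(page.values))
    );

getAllScrumBoards()
  .then(allBoards => console.log(allBoards.length))
  .catch(console.warn);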
Here is a way to get all the boards using rubico:
import { pipe, fork, switchCase, get } from 'rubico'

const getAllBoards = boards => pipe([
  fork({
    type: () => 'scrum',
    startAt: get('startAt'),
  }),
  jira.board.getAllBoards,
  switchCase([
    get('isLast'),
    response => boards.concat(response.values),
    response => getAllBoards(boards.concat(response.values))({
      startAt: response.startAt + response.values.length,
    }),
  ]),
])

getAllBoards([])({ startAt: 0 }) // => [...boards]
getAllBoards will recursively get more boards and append to boards until isLast is true, then it will return the aggregated boards.

Data validation using RxJS

I have the following function that validates that rangeFrom is not greater than rangeTo and that rangeFrom does not already exist in the list of ranges.
How can I rewrite this using RxJS?
const isTagAlreadyExist = (tags, currentTag) => _(tags)
  .filter(x => x.id !== currentTag.id)
  .some(x => _.inRange(currentTag.rangeTo, x.rangeFrom, x.rangeTo))
  .value();

const validateRangeFrom = (tags, currentTag) => {
  const errors = {};
  if (isNumeric(currentTag.rangeFrom)) {
    if (!_.inRange(currentTag.rangeFrom, 0, currentTag.rangeTo)) {
      errors.rangeFrom = 'FROM_TAG_CANNOT_BE_GREATER_THAN_TO_TAG';
    } else if (isTagAlreadyExist(tags, currentTag)) {
      errors.rangeFrom = 'TAG_ALREADY_EXISTS';
    }
  }
  return {
    errors
  };
};
The question is: which parts do you want to rewrite in rxjs? From what I can see, those are two pure functions that run synchronously, so I do not really see much of a use case for rxjs here. Of course, you could always use your functions within an rxjs stream:
const validateRangeFrom$ = (tags, currentTag) => {
  return Observable.of(currentTag)
    .map(tag => validateRangeFrom(tags, tag));
}

validateRangeFrom$(myTags, currentTag)
  .subscribe(errors => console.log(errors));
But as you can see, this does not make much sense if you simply wrap it inside a stream. The essence of useful reactive programming is that everything is reactive, not just some small parts, so for your example you should start by having tags$ and currentTag$ as observables. Let's assume you have that; then you could do something like:
const tags$: Observable<ITag[]>...    // is set somewhere, and emits a new array whenever it is changed
const currentTag$: Observable<ITag>... // is set somewhere and emits the tag whenever a new currentTag is set

const validateRangeFrom$ = Observable
  .combineLatest(tags$, currentTag$, (tags, tag) => ({ tags, tag }))
  .map(({ tags, tag }) => validateRangeFrom(tags, tag));

validateRangeFrom$.subscribe(errors => console.log(errors));
This will automatically trigger the validation for you whenever a new tags array is emitted or a new currentTag is selected/set. But again, your validation method stays the same: even in reactive programming you have to do validation and logic operations at some point; the reactive part usually just concerns the flow of the data (see tags$ and currentTag$).
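For completeness, here is one way those two source streams could be wired up, using BehaviorSubjects as stand-ins for wherever the tags and the current tag are actually set (a sketch in the RxJS 6 pipeable-operator style rather than the prototype-patching style above):

import { BehaviorSubject, combineLatest } from 'rxjs';
import { map } from 'rxjs/operators';

interface ITag {
  id: number;
  rangeFrom: number;
  rangeTo: number;
}

// Stand-ins for "is set somewhere": push new values with .next(...)
const tags$ = new BehaviorSubject<ITag[]>([]);
const currentTag$ = new BehaviorSubject<ITag>({ id: 1, rangeFrom: 0, rangeTo: 10 });

// Re-runs the (unchanged, synchronous) validation whenever either source emits
const validationErrors$ = combineLatest([tags$, currentTag$]).pipe(
  map(([tags, tag]) => validateRangeFrom(tags, tag))
);

validationErrors$.subscribe(({ errors }) => console.log(errors));

// Example: selecting a new current tag triggers validation automatically
currentTag$.next({ id: 2, rangeFrom: 5, rangeTo: 3 });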

Rxjs: add data to elements of array returned from http response

Following this question: Add data to http response using rxjs
I've tried to adapt this code to my use case where the result of the first http call yields an array instead of a value... but I can't get my head around it.
How do I write in rxjs (Typescript) the following pseudo code?
call my server
obtain an array of objects with the following properties: (external id, name)
for each object, call another server passing the external id
for each response from the external server, obtain another object and merge some of its properties into the object from my server with the same id
finally, subscribe and obtain an array of augmented objects with the following structure: (external id, name, augmented prop1, augmented prop2, ...)
So far the only thing I was able to do is:
this._appService
.getUserGames()
.subscribe(games => {
this._userGames = _.map(games, game => ({ id: game.id, externalGameId: game.externalGameId, name: game.name }));
_.forEach(this._userGames, game => {
this._externalService
.getExternalGameById(game.externalGameId)
.subscribe(externalThing => {
(<any>game).thumbnail = externalThing.thumbnail;
(<any>game).name = externalThing.name;
});
});
});
Thanks in advance
I found a way to make it work. I'll comment the code to better explain what it does, especially to myself :D
this._appService
.getUserGames() // Here we have an observable that emits only 1 value: an any[]
.mergeMap(games => _.map(games, game => this._externalService.augmentGame(game))) // Here we map the any[] to an Observable<any>[]. The external service takes the object and enriches it with more properties
.concatAll() // This will take n observables (the Observable<any>[]) and return an Observable<any> that emits n values
.toArray() // But I want a single emission of an any[], so I turn that n emissions to a single emission of an array
.subscribe(games => { ... }); // TA-DAAAAA!
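As an aside, with pipeable operators (RxJS 6+) the same "fan out per item, then collect back into one array" step is often written with forkJoin; a sketch under the same assumptions about the two services:

import { forkJoin, of } from 'rxjs';
import { map, mergeMap } from 'rxjs/operators';

this._appService
  .getUserGames()
  .pipe(
    mergeMap((games: any[]) =>
      games.length
        ? forkJoin(
            games.map(game =>
              this._externalService.getExternalGameById(game.externalGameId).pipe(
                map(externalThing => ({
                  ...game,
                  thumbnail: externalThing.thumbnail,
                  name: externalThing.name,
                }))
              )
            )
          )
        : of([]) // forkJoin of an empty array never emits, so guard it
    )
  )
  .subscribe((augmentedGames: any[]) => {
    // One emission: the full array of augmented games
  });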
Don't use subscribe. Use map instead.
Can't test it, but should look more like this:
this._appService
  .getUserGames()
  .map(games => {
    this._userGames = _.map(games, game => ({ id: game.id, externalGameId: game.externalGameId, name: game.name }));
    return this._userGames.map(game => { /* this should return an array of observables.. */
      return this._externalService
        .getExternalGameById(game.externalGameId)
        .map(externalThing => {
          (<any>game).thumbnail = externalThing.thumbnail;
          (<any>game).name = externalThing.name;
          return game;
        });
    });
  })
  .mergeAll()
  .subscribe(xx => ...); // here you can subscribe..

Redux: How to do something other than reducing or rendering when an action is sent

I am trying to understand the concepts behind redux.
Consider this very simple UI:
+-------+-------+-------+
| Tab A | Tab B | Tab C |
| +--------------------+
| one ................ $1.22 |
| two ................ $3.22 |
| three ............ $211.99 |
The state has an object itemsById and an array currentlyVisibleItemIds and is easily rendered.
I also have a websocket open which delivers constant price updates and forwards them using store.dispatch(). The reducer creates a new updated itemsById object and prices re-render. All good.
However, for efficiency reasons I only want to listen to price updates of currently visible items so I have to send a subscription command through the socket whenever currentlyVisibleItemIds changes.
For the life of me I cannot find a good place to put this logic. If I do store.subscribe() in the websocket code I get all changes and have to manually figure out if currentlyVisibleItemIds changed. Calling into the websocket from the reducer feels very wrong. Should this go into a thunk so that the websocket code is called from within the action?
Any suggestions appreciated.
UPDATE
Here's what I currently have. Seems a bit clunky because I have to manually figure out if currentlyVisibleItemIds has changed:
function PriceFeed(port, store) {
  var webSocket, isConnected, previousItems, unsubscribeFunc;

  function onOpen() {
    isConnected = true;
  }

  function onMessage(event) {
    var json = JSON.parse(event.data);
    if (json.type === 'updates') {
      store.dispatch({
        type: 'PRICE_UPDATES',
        data: json.prices
      });
    }
  }

  function storeChangeHandler() {
    if (!isConnected) {
      return;
    }
    var currentItems = store.getState().currentlyVisibleItemIds;
    if (currentItems != previousItems) {
      webSocket.send(JSON.stringify({
        type: 'subscription',
        ids: currentItems
      }));
      previousItems = currentItems;
    }
  }

  webSocket = new WebSocket(`ws://${location.hostname}:${port}`);
  webSocket.addEventListener('message', onMessage);
  webSocket.addEventListener('open', onOpen);
  unsubscribeFunc = store.subscribe(storeChangeHandler);
}
redux-thunk should handle this pretty well, assuming you are tracking your visible items in the store. In your message-received WS callback, you can do this:
store.dispatch(PriceUpdate(itemID, price))
where PriceUpdate is defined as
function PriceUpdate(itemID, price) {
  return (dispatch, getState) => {
    if (getState().visibleItems.includes(itemID)) {
      dispatch({ type: PRICE_UPDATE, payload: { itemID, price } });
    }
  };
}
That said, I would be wary of prematurely optimizing here. In most cases, even up to a very large number of transactions, it is probably worth it to just keep all your local data fresh, regardless of whether it's in view. But of course, I don't know the context here, and there are definitely situations where this makes sense.
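As for the original question of where to put the socket-subscription logic: a common pattern is a small observeStore helper that watches one slice of the state and only invokes a callback when that slice changes, which removes the manual previousItems bookkeeping from storeChangeHandler. A sketch (not tied to any library):

// Watches one selected slice of the store and calls onChange only when it changes.
function observeStore(store, select, onChange) {
  let currentState;

  function handleChange() {
    const nextState = select(store.getState());
    if (nextState !== currentState) {
      currentState = nextState;
      onChange(nextState);
    }
  }

  const unsubscribe = store.subscribe(handleChange);
  handleChange(); // emit the initial value
  return unsubscribe;
}

// In PriceFeed, instead of comparing previousItems by hand:
// observeStore(store, state => state.currentlyVisibleItemIds, ids => {
//   if (isConnected) {
//     webSocket.send(JSON.stringify({ type: 'subscription', ids }));
//   }
// });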
