Issue with Observable forkJoin - SQLite

Hi, I have 3 tables, each one a child of another. I wrote a method to fetch from the SQLite DB as follows:
public downloadFromOfflineDB(db, testSO) {
  var observableBatch = [];
  observableBatch.push(db.executeSql("select * from TMP_AUD WHERE CRE_BY=? AND AUD_NUMBER=? ",
    [localStorage.getItem("user_name"), testSO.auditNumber]).then(
    response => {
      this._util.logData('In downloadPendingInstancesForSyncFromOfflineDB- folder' + response.rows.length + 'ID= ' + response.rows.item(0).FLD_NUMBER);
      if (response && response.rows && response.rows.length > 0) {
        var FLD_NUMBER = response.rows.item(0).FLD_NUMBER;
        var folderArray = [];
        observableBatch.push(db.executeSql("select * from TMP_FOLDER WHERE CRE_BY=? AND FLD_NUMBER=? ",
          [localStorage.getItem("user_name"), FLD_NUMBER]).then(
          a => {
            this._util.logData('In downloadPendingInstancesForSyncFromOfflineDB-TMP_FOLDER' + a.rows.length);
            if (a && a.rows && a.rows.length > 0) {
              for (let i = 0; i < a.rows.length; i++) {
                var folderObj = {
                  folderName: a.rows.item(i).FLD_NAME,
                  files: []
                };
                var FLD_NAME = a.rows.item(i).FLD_NAME;
                this._util.logData('In downloadPendingInstancesForSyncFromOfflineDB-TMP_FOLDER ' + FLD_NAME);
                observableBatch.push(db.executeSql("select * from TMP_FILES WHERE CRE_BY=? AND FLD_NAME=? ",
                  [localStorage.getItem("user_name"), FLD_NAME]).then(
                  b => {
                    this._util.logData('In downloadPendingInstancesForSyncFromOfflineDB-TMP_FILES' + b.rows.length);
                    var fileArray = [];
                    if (b && b.rows && b.rows.length > 0) {
                      for (let j = 0; j < b.rows.length; j++) {
                        var fileSO = {
                          compliance: b.rows.item(j).COMPLIANCE,
                          remarks: b.rows.item(j).REMARKS,
                          fileName: b.rows.item(j).FILE_NAME,
                          title: b.rows.item(j).TITLE
                        };
                        fileArray.push(fileSO);
                      }
                    }
                    folderObj.files = fileArray;
                  }).catch(e => {
                    this._util.logData('For sync error' + JSON.stringify(e));
                    return Observable.throw("An error occurred during sync");
                  }));
                folderArray.push(folderObj);
              }
            }
          }).catch(e => {
            this._util.logData('For sync error' + JSON.stringify(e));
            return Observable.throw("An error occurred during sync");
          }));
      }
      testSO.folderArray = folderArray;
      this._util.logData('Candidate for selected for sync' + JSON.stringify(testSO));
    }));
  return Observable.forkJoin(observableBatch);
}
The issue is that the method below does not wait for all of the calls above to finish:
public getFiles(testSO) {
  return Observable.create(observer => {
    this.platform.ready().then(() => {
      this.sqlite.create({
        name: 'offline.db',
        location: 'default'
      }).then((db: SQLiteObject) => {
        this.downloadFromOfflineDB(db, testSO).subscribe(c => {
          observer.next(c[0]); // This is undefined
          observer.complete();
        },
        error => {
          observer.error("An error occurred syncing files.");
        });
      });
    });
  });
}
The first method executes, but the second method returns before the first has finished, so my testSO object is never populated. Can someone please guide me and tell me what I am doing wrong here? I used Observable.forkJoin.

It looks like you are calling Observable.forkJoin(observableBatch) with only one item: the result of the first db.executeSql call. The items you push later, inside the .then callbacks, are added after forkJoin has already subscribed, so they do not affect it.
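One way to restructure the original method is to resolve each level of the hierarchy before fanning out the next, and only combine once the full array of queries has been built. This is a sketch, not the exact Ionic SQLite API: `db.executeSql` is assumed here to resolve with a plain array of row objects so the control flow can be shown without a device.

```javascript
// Sketch: await each level before fanning out the next, so nothing is
// pushed into a batch after the combining call has already subscribed.
// `db.executeSql` is a stub that resolves with an array of rows.
async function downloadFromOfflineDB(db, testSO) {
  const aud = await db.executeSql("select * from TMP_AUD where AUD_NUMBER=?", [testSO.auditNumber]);
  if (aud.length === 0) return testSO;
  const folders = await db.executeSql("select * from TMP_FOLDER where FLD_NUMBER=?", [aud[0].FLD_NUMBER]);
  // Fan out one query per folder, and wait for ALL of them right here
  testSO.folderArray = await Promise.all(folders.map(async f => {
    const files = await db.executeSql("select * from TMP_FILES where FLD_NAME=?", [f.FLD_NAME]);
    return { folderName: f.FLD_NAME, files };
  }));
  return testSO;
}
```

The same shape works with RxJS by wrapping the promises (e.g. `Observable.fromPromise`) once the array is complete, which is what forkJoin needs.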

Related

AngularFire2 Firebase Observable never ends (list is empty)

I'm trying to query an empty Firebase list. The problem is that the observable's subscribe never finishes, so I can't show the user that the database list is empty.
The function getUserAppointmentsByDate(...) calls getUserAppointments(...), where this.database.list('/appointment/users/' + user_uid) is an empty Firebase list for the input user (user_uid).
How should I manage an empty query to Firebase?
Thanks in advance!
getUserAppointmentsByDate(user_uid: string, start: string, end: string) {
if (typeof (user_uid) == "undefined" || typeof (start) == "undefined" || typeof (end) == "undefined") {
console.error("invalid argument for getPatientReport");
return;
}
return this.getUserAppointments(user_uid)
.map(
(appointment) => {
return appointment
.filter((appointment) => {
var appointmentStart = new Date(appointment.start);
var startFilter = new Date(start);
var endFilter = new Date(end);
//Filter old, not cancelled and not deleted
return (appointmentStart.getTime() < endFilter.getTime())
&& (appointmentStart.getTime() > startFilter.getTime())
&& (appointment.status != AppointmentStatus.CANCELLED);
});
})
}
getUserAppointments(user_uid: string): any {
return this.database.list('/appointment/users/' + user_uid) //*THIS IS AN EMPTY LIST
.mergeMap((appointments) => {
return Observable.forkJoin(appointments.map(
(appointment) => this.database.object('/appointment/list/' + appointment.$key)
.take(1)))
})
}
Since this.database.list('/appointment/users/' + user_uid) returns an empty array, Observable.forkJoin(appointments.map(...)) completes without emitting any value (this is how forkJoin is expected to work). In this case you have two options. The first is to handle it in the complete callback:
.subscribe(
res => console.log('I got values'),
err => console.log('I got errors'),
// do it whatever you want here
() => console.log('I complete with any values')
)
or to handle it with an if statement:
import { of } from 'rxjs/observable/of';
...
return this.database.list('/appointment/users/' + user_uid)
.mergeMap((appointments) => {
if (appointments.length === 0) return of([]);
return Observable.forkJoin(appointments.map(
(appointment) => this.database.object('/appointment/list/' + appointment.$key)
.take(1)))
})
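To see why the empty case needs that guard, here is a toy, RxJS-free model of forkJoin's completion logic (an illustration only, not the real operator): `next` fires only after every source has yielded, so with zero sources it is never scheduled, and only `complete` runs.

```javascript
// Toy model of forkJoin over an array of promises. With an empty input,
// the observer's next() is never called; only complete() fires.
function toyForkJoin(sources, observer) {
  const results = new Array(sources.length);
  let remaining = sources.length;
  if (remaining === 0) {            // empty input: complete immediately,
    observer.complete();            // without ever emitting a value
    return;
  }
  sources.forEach((p, i) => p.then(v => {
    results[i] = v;
    if (--remaining === 0) {        // last source just yielded
      observer.next(results);
      observer.complete();
    }
  }));
}
```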

How to use batchWriteItem to write more than 25 items into DynamoDB Table using PHP

I am using AWS SDK for PHP 3.x
A single call to BatchWriteItem can write up to 16 MB of data, which can comprise as many as 25 put or delete requests. Individual items to be written can be as large as 400 KB.
$result = $dynamodbClient->batchWriteItem([
'RequestItems' => [
$tableName => [
[
'PutRequest' => [
'Item' => [
'Id' => ['N' => '1'],
'AlbumTitle' => [
'S' => 'Somewhat Famous',
],
'Artist' => [
'S' => 'No One You Know',
],
'SongTitle' => [
'S' => 'Call Me Today',
],
],
],
],
],
],
]);
For a single item it works fine. How can I write more than 25 items?
To write more than 25 items, you have to call BatchWriteItem repeatedly, adding items from your collection 25 at a time.
Something along these lines (pseudo-code):
requests = []; // use an array to stage your put item requests
foreach(item in SourceCollection) {
addItem(item, requests); // add this item to the array
if(count(requests) == 25) { // when you have 25 ready..
// result = dynamodbClient->batchWriteItem(...)
requests = []; // clean up the array of put item requests
// handle the failed items from the result object
}
}
Make sure to handle failed items from each batchWriteItem result (returned in the response's UnprocessedItems field) by re-adding them to the requests array.
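The batching step in the pseudo-code above amounts to splitting the source collection into groups of at most 25. A minimal JavaScript helper for just that step (the batchWriteItem call itself is left out):

```javascript
// Split an array of put requests into DynamoDB-sized batches (max 25 each).
function chunkInto(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Each batch would then be sent as one batchWriteItem call:
// for (const batch of chunkInto(requests, 25)) { /* send batch */ }
```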
Here is my way for a lambda function:
exports.handler = (event, context, callback) => {
console.log(`EVENT: ${JSON.stringify(event)}`);
var AWS = require('aws-sdk');
AWS.config.update({ region: process.env.REGION })
var docClient = new AWS.DynamoDB.DocumentClient();
const {data, table, cb} = JSON.parse(event.body);
console.log('{data, table, cb}:', {data, table, cb});
// Build the batches
var batches = [];
var current_batch = [];
var item_count = 0;
for (var i = 0; i < data.length; i++) {
// Add the item to the current batch
item_count++
current_batch.push({
PutRequest: {
Item: data[i],
},
})
// If we've added 25 items, add the current batch to the batches array
// and reset it
if (item_count % 25 === 0) {
batches.push(current_batch)
current_batch = []
}
}
// Add the last batch if it has records and is not equal to 25
if (current_batch.length > 0 && current_batch.length !== 25) {
batches.push(current_batch)
}
// Handler for the database operations
var completed_requests = 0
var errors = false
function requestHandler (request) {
console.log('in the handler: ', request)
return function (err, data) {
// Increment the completed requests
completed_requests++;
// Set the errors flag
errors = (errors) ? true : err;
// Log the error if we got one
if(err) {
console.error(JSON.stringify(err, null, 2));
console.error("Request that caused database error:");
console.error(JSON.stringify(request, null, 2));
callback(err);
}else {
var response = {
statusCode: 200,
headers: {
'Content-Type': 'application/json',
'Access-Control-Allow-Methods': 'GET,POST,OPTIONS',
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Credentials': true
},
body: JSON.stringify(data),
isBase64Encoded: false
};
console.log(`success: returned ${data}`);
callback(null, response);
}
// Make the callback if we've completed all the requests
if(completed_requests === batches.length) {
cb(errors);
}
}
}
// Make the requests
var params;
for (var j = 0; j < batches.length; j++) {
// Items go in params.RequestItems.id array
// Format for the items is {PutRequest: {Item: ITEM_OBJECT}}
params = '{"RequestItems": {"' + table + '": []}}'
params = JSON.parse(params)
params.RequestItems[table] = batches[j]
console.log('before db.batchWrite: ', params)
// Perform the batchWrite operation
docClient.batchWrite(params, requestHandler(params))
}
};
I am using the following code to add data using batchWriteItem. Suggest a better way if there is one.
// Build the batches
$albums= "// collection of album json";
$batches = [];
$current_batch = [];
$item_count = 0;
foreach ($albums as $album) {
// Add the item to the current batch
$item_count++;
$json = json_encode($album);
$data['PutRequest'] = array('Item' => $marshaler->marshalJson($json));
array_push($current_batch, $data);
// If we've added 25 items, add the current batch to the batches array
// and reset it
if ($item_count % 25 == 0) {
array_push($batches, $current_batch);
$current_batch = [];
}
}
// Add the last batch if it has records and is not equal to 25
if (count($current_batch) > 0 && count($current_batch) != 25) {
array_push($batches, array_values($current_batch));
}
// Handler for the database operations
$completed_requests = 0;
$errors = false;
$batch_count = 0;
foreach ($batches as $batch) {
try {
$batch_count++;
$params = array('RequestItems' => array($tableName => $batch), 'ReturnConsumedCapacity' => 'TOTAL', 'ReturnItemCollectionMetrics' => 'SIZE');
$response = $dynamodb->batchWriteItem($params);
echo "Album $batch_count Added." . "<br>";
echo "<pre>";
// print_r($params);
print_r($response);
echo "</pre>";
}
catch (DynamoDbException $e) {
echo "Unable to add movie:\n";
echo $e->getMessage() . "\n";
// break;
}
}

AngularFire2 query, join or filter with foreign keys from another firebase table

I have this firebase data structure
{
members: {
m1: {
lastName: "smith",
firstName: "john"
},
m2: {
lastName: "abc",
firstName: "mike"
}
},
userFavs: {
u1: {
m1:true
},
u2: {
m2:true
}
}
}
In my service, I have this method:
getMembers(): FirebaseListObservable<any[]> {
return this.af.database.list('/members',{
query: {
orderByChild: 'firstName'
}
});
}
In members page TS file, I have method to do search:
setFilteredItems(){
if (this.searchTerm == null || this.searchTerm == ''){
this.members = this.membersSvc.getMembers()
.map((members) => {return members});
}else{
//return items.filter(item => item.name.toLowerCase().indexOf(args[0].toLowerCase()) !== -1);
this.members = this.membersSvc.getMembers()
.map((members) =>
members.filter(member => member.lastName.toLowerCase().indexOf(this.searchTerm.toLowerCase()) !== -1 || member.firstName.toLowerCase().indexOf(this.searchTerm.toLowerCase()) !== -1));
}
}
The search for members works fine. Now I am adding 2 buttons below the search bar, All and Favorites. A user can add a member to his/her favorites. In search, the app needs to be able to filter the results to member keys that exist in the user's favorites.
How can I add the additional filter of member keys that exists in the userFavs node?
I added the additional filter by getting the array of userFavs keys. So in my user service I have a method:
getUserFavKeys(){
let favKeys = [];
const userKey = this.authService.getActiveUser().uid;
let url = `/userCircles/${userKey}`;
this.af.database.list(url, { preserveSnapshot: true})
.subscribe(itemKeys=>{
itemKeys.forEach(itemKey => {
//console.log(itemKey.key);
favKeys.push(itemKey.key);
});
})
return favKeys;
}
Then in the component ngOnInit method, I initialized the array of keys:
this.favKeys = this.userSvc.getUserFavKeys();
And when the circles filter is selected:
onCirclesSelected(){
this.searching = false;
this.members = this.membersSvc.getMembers()
.map((members) =>
members.filter(member => this.userCircles.indexOf(member.$key) !== -1)
);
if (this.searchTerm == null || this.searchTerm == ''){
//do nothing
}else{
//filter w the search text
this.members = this.members
.map((members) =>
members.filter(member => member.lastName.toLowerCase().indexOf(this.searchTerm.toLowerCase()) !== -1 || member.firstName.toLowerCase().indexOf(this.searchTerm.toLowerCase()) !== -1));
}
}
Hope that helps anyone who needs the same search feature.
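The $key-membership check in onCirclesSelected can also be factored into a small pure helper. This is a sketch using a Set for constant-time lookups; the `$key` field name follows the AngularFire2 list snapshots above, and `favKeys` is assumed to be already loaded:

```javascript
// Keep only the members whose $key appears in the user's favorites list.
function filterByFavorites(members, favKeys) {
  const favs = new Set(favKeys);          // O(1) membership checks
  return members.filter(member => favs.has(member.$key));
}
```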

extending firebase.database.Reference

In the old Firebase SDK I added a helper function to get/set values as follows:
// val() -> get(), resolve with value at ref
// val(value) -> set(value), resolve with value
// val(vals) -> update(vals), resolve with vals
Firebase.prototype.val = function(vals) {
let self=this;
if (!vals) {
return this.once('value').then(
snapshot => {
if (typeof snapshot.val() === 'undefined' || snapshot.val() === null) throw 'INVALID_VALUE';
return snapshot.val();
},
err => {
throw err;
});
}
let singleVal=(vals.constructor != Object); // is singleVal then vals is a single value
if (singleVal ) return this.set(vals); // set single value
if (!singleVal) return this.update(vals).then(() => vals); // update multiple values
};
I could then do for example return ref.child(...).val();
This function does not run in V3.
How can I extend firebase in that way in V3?
thx!
Here's the solution - I find this extension very handy
// val() -> get(), resolve with value at ref, fails with error.code
// val(value) -> set(value), resolve with value, fails with error.code
// val(vals) -> update(vals), resolve with vals, fails with error.code
firebase.database.Reference.prototype.val = function(vals) {
let path=this.toString().substring(firebase.database().ref().toString().length-1);
let valsAsString = (typeof vals==='string' ? vals : JSON.stringify(vals));
if (!vals) {
return this.once('value').then(
snapshot => {
if (typeof snapshot.val() === 'undefined' || snapshot.val() === null) {
console.log('val('+path+') failed (null) !');
throw 'INVALID_VALUE';
}
return snapshot.val(); },
error => {
console.log('val('+path+') failed ! '+error.message+' ('+error.code+')');
throw error.code;
});
}
let singleVal=(vals.constructor != Object); // is singleVal then vals is a single value
if (singleVal ) return this.set(vals).then( // set single value
() => {
return vals;
}, error => {
console.log('val('+path+','+valsAsString+') failed ! '+error.message+' ('+error.code+')');
throw error.code;
}
);
return this.update(vals).then( // update multiple values
() => {
return vals;
}, error => {
console.log('val('+path+','+valsAsString+') failed ! '+error.message+' ('+error.code+')');
throw error.code;
}
);
};

Firebase Queue - Handling Reject/Resolve while Looping

I have a Queue that looks like this
new Queue(queueRef, options, ({post, user, postId}, progress, resolve, reject) => {
rootRef.child(`users/${user.user_id}/followers`).once('value', (snapshot) => {
const followers = toArray(snapshot.val())
for (var i = 0; i < followers.length; i++) {
rootRef.child(`users/${followers[i].user_id}/feed/${postId}`).set(post, (err) => {
if (err) {
reject(err)
} else if (i >= followers.length - 1) {
resolve({post, user, postId})
}
})
}
}, reject)
})
My issue is that I'm only resolving once all the sets have finished, and rejecting if any of them fail. What I'd like to do is pass each iteration of the loop to another Queue, which could then reject/resolve for that specific request rather than for the whole collection.
This looks like it may be an XY problem with a better overall solution, but what you're describing is something like Q.all().
In essence, call a method that performs each operation and returns a promise, then resolve/reject when the entire set is done.
new Queue(queueRef, options, ({post, user, postId}, progress, resolve, reject) => {
  rootRef.child(`users/${user.user_id}/followers`).once('value', (snapshot) => {
    var promiseList = [], p;
    const followers = toArray(snapshot.val())
    for (var i = 0; i < followers.length; i++) {
      p = processNextFollower(followers[i], post, user, postId);
      // p.then(progress);
      promiseList.push(p);
    }
    Q.all(promiseList).then(resolve, reject);
  }, reject)
})

function processNextFollower(follower, post, user, postId) {
  var def = Q.defer();
  rootRef.child(`users/${follower.user_id}/feed/${postId}`).set(post, (err) => {
    if (err) {
      def.reject(err)
    } else {
      def.resolve({post, user, postId})
    }
  })
  return def.promise;
}
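If you'd rather not depend on Q, the same fan-out works with native Promises. In this sketch, `setFeedItem` is a placeholder injected in place of the rootRef `set()` call (Firebase's error-first callback style), so the control flow can be shown without the SDK:

```javascript
// Fan out one write per follower; the returned promise settles only when
// every write has completed, and rejects as soon as any write fails.
function fanOutToFollowers(followers, setFeedItem) {
  return Promise.all(followers.map(follower =>
    new Promise((resolve, reject) => {
      setFeedItem(follower, err => err ? reject(err) : resolve(follower));
    })
  ));
}
```

Hooking per-follower progress reporting in is then a matter of chaining `.then(progress)` onto each inner promise before passing it to Promise.all.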
