What is the best way to INSERT multiple rows in a SQLite table?

My problem is mainly performance related. I have this code running on the main ElectronJS process:
ipcMain.handle('add_product', async (event, args) => {
  return new Promise((resolve, reject) => {
    try {
      if (Array.isArray(args)) {
        args.forEach(prod => {
          const { name, barcode, stock, price, buy_price, image, alert } = prod
          const stmt = db.prepare("INSERT INTO products VALUES (?,?,?,?,?,?,?)")
          stmt.run(name, barcode, stock, alert, price, buy_price, image)
          stmt.finalize()
        })
        resolve({ text: `${args.length} products have been added to the database!` })
      } else {
        // This code executes only when adding a single product
        // It is not relevant to the question
        const { name, barcode, stock, price, buy_price, image, alert } = args
        const stmt = db.prepare("INSERT INTO products VALUES (?,?,?,?,?,?,?)")
        stmt.run(name, barcode, stock, alert, price, buy_price, image)
        stmt.finalize()
        resolve({ text: `Product '${name}' has been saved!` })
      }
    } catch (error) {
      reject(error)
    }
  })
})
It receives an array of objects, each containing a single product's details. The code works and successfully inserts rows into the database. However, when testing it with a substantial data sample (more than 5,000 products), the whole application freezes for a couple of seconds while it saves the rows, before becoming responsive again.
The dev stack is :
ElectronJS
ReactJS (using it for the VIEW)
SQLite
What is the optimal, performance-driven way to make the application work faster?

Okay, so the way I formulated the query meant it would run 5,000 times (once for each product), which significantly slowed the whole application.
I changed the code to:
ipcMain.handle('add_product', async (event, args) => {
  return new Promise((resolve, reject) => {
    try {
      if (Array.isArray(args)) {
        let sql = `INSERT INTO products VALUES `
        args.forEach((prod, i) => {
          const { name, barcode, price, buy_price, stock, alert, image } = prod
          sql += `('${name}','${barcode}','${price}','${buy_price}','${stock}','${alert}','${image}')`
          if (i !== args.length - 1) sql += ','
        })
        db.exec(sql, (error) => {
          if (error) {
            reject(error)
          } else {
            resolve({ text: `${args.length} products have been added to the database!` })
          }
        })
      } else {
        const { name, barcode, price, buy_price, stock, alert, image } = args
        const stmt = db.prepare("INSERT INTO products VALUES (?,?,?,?,?,?,?)")
        stmt.run(name, barcode, stock, alert, price, buy_price, image)
        stmt.finalize()
        resolve({ text: `Product '${name}' has been saved!` })
      }
    } catch (error) {
      reject(error)
    }
  })
})
Now the query runs only once (asynchronously, so it doesn't block the UI) but with all the products, and it's much faster.
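One caveat with this version: interpolating values straight into the SQL string will break on product names containing quotes and is open to SQL injection. A safer variant (a sketch, assuming the same node-sqlite3 `db` handle; `buildBulkInsert` is a hypothetical helper, not from the question) still issues a single multi-row statement but uses `?` placeholders and lets the driver bind the values. Note the two snippets above bind columns in different orders; this sketch follows the second one:

```javascript
// Build one parameterized multi-row INSERT: the SQL contains only
// placeholders, and all values travel separately as bound parameters.
function buildBulkInsert(products) {
  const placeholders = products.map(() => '(?,?,?,?,?,?,?)').join(',');
  const sql = `INSERT INTO products VALUES ${placeholders}`;
  // Flatten each product into the same column order as the statement
  const params = products.flatMap(p => [
    p.name, p.barcode, p.price, p.buy_price, p.stock, p.alert, p.image,
  ]);
  return { sql, params };
}

// Usage (sketch): const { sql, params } = buildBulkInsert(args);
// db.run(sql, params, error => error ? reject(error) : resolve({ ... }));
```

One thing to watch: SQLite caps the number of bound parameters per statement (999 in older builds), so a very large array would need to be inserted in chunks, ideally wrapped in a single transaction.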


Use Firestore Transaction to handle concurrency for lucky draw

Project requirement
A web UI with a spinning wheel. The wheel has 1000 possible prizes; when the user spins the wheel, they win the item the cursor is pointing at. Each prize has a limited quantity; for this example, let's say each prize has 100 units. Thus the total number of prizes available is 100,000: 1000 different items, each with 100 units. Even though spinning the wheel feels like luck is involved, because quantities are limited we will figure out which items are available and pre-determine what the user will get.
What I've done to get a prize from Firestore
I'm using Firestore to store which items are available. Each doc in prizes contains the following:
{
prize: 'PlayStation',
available: true,
}
Thus, I have 1000 docs with this schema. Once an item has been picked, available is set to false. We store each prize unit as its own doc, because some prizes have only 1 unit while others have 50.
Clicking the spin button sends a request to the backend to get one item. This is the Firebase/Firestore logic I currently have:
try {
  const collectionRef = collection(firestore, 'prizes');
  const q = query(
    collectionRef,
    where('available', '==', true), // only get prizes that are available
    limit(100) // added to reduce reads
  );
  const querySnapshot = await getDocs(q);
  // get available prize IDs
  const availableIds = querySnapshot.docs.map((doc) => doc.id);
  if (availableIds.length) {
    // randomly pick one from the list
    const documentId =
      availableIds[Math.floor(Math.random() * availableIds.length)];
    // to ensure that no 2 people pick the same prize, we use a transaction
    let res = await runTransaction(firestore, async (transaction) => {
      const tDocRef = doc(collection(firestore, 'prizes'), documentId);
      const tDoc = await transaction.get(tDocRef);
      const data = tDoc.data();
      if (data.available == true) {
        transaction.update(tDocRef, {
          available: false,
        });
        return { data };
      }
    });
    if (!res) {
      return null; // 2 people may have picked the same prize; frontend auto-retries
    }
    return res; // we got a prize
  } else {
    return -1; // no more prizes; send `-1` to the frontend to tell the user there is no more stock
  }
} catch (e) {
  return null; // 2 people may have picked the same prize; frontend auto-retries
}
How can I improve this?
Do you see any issues I could face that I don't see?
Can this handle 10,000 people clicking the spin button at once?
Can this ensure that no two people get the same prize doc, so that the total number of prizes for each item is correctly distributed?
How can I improve the performance (reducing the chance of getting null, especially when the prizes are getting more limited)?
Thank you for reading, any ideas/thoughts are welcome.
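One way to reduce the nulls the frontend has to retry (my sketch, not from the question): keep the conflict retry on the backend. The helper below is plain JavaScript; the `attempt` callback is assumed to wrap the query-plus-`runTransaction` logic above and return `null` when the randomly picked doc was already claimed.

```javascript
// Generic conflict-retry helper: call `attempt` until it yields a
// non-null result or the attempt budget runs out. For the lucky draw,
// each retry would re-query Firestore for the remaining prizes and
// pick a fresh random doc, so collisions shrink the candidate pool.
async function claimWithRetry(attempt, maxAttempts = 5) {
  for (let i = 0; i < maxAttempts; i++) {
    const result = await attempt();
    if (result !== null) return result;
  }
  return null; // every attempt hit a conflict; let the caller decide
}
```

On the backend this would look like `claimWithRetry(() => pickAndClaimPrize())`, where `pickAndClaimPrize` is a hypothetical function wrapping the existing query and transaction; collisions are then absorbed server-side instead of bouncing each one back to the client.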

Firestore Transactions is not handling race condition

Objective
When the user clicks the purchase button on the web frontend, it sends a POST request to the backend to create a purchase order. First, the backend checks the number of available stocks. If available is greater than 0, it reduces available by 1 and then creates the order.
The setup
The backend (NestJS) queries Firestore for the latest available value and reduces available by 1. For debugging, I return the available value.
let available;
try {
  await runTransaction(firestore, async (transaction) => {
    const sfDocRef = doc(collection(firestore, 'items_available'), documentId);
    const sfDoc = await transaction.get(sfDocRef);
    if (!sfDoc.exists()) {
      throw 'Document does not exist!';
    }
    const data = sfDoc.data();
    available = data.available;
    if (available > 0) {
      transaction.update(sfDocRef, {
        available: available - 1,
      });
    }
  });
} catch (e) {
  console.log('Transaction failed: ', e);
}
return { available };
My stress test setup
The goal is to see every API request return a different available value; that would mean Firestore Transactions are decrementing the value correctly even with multiple requests coming in.
I wrote a simple multi-process program that calls the backend's create-order API; each call reads the available value and returns it, and the program saves the value returned for each request.
The stress test runs at about 10 transactions per second: I have 10 concurrent processes querying the backend, and each process issues 20 http.get requests:
const http = require('http');

function call() {
  http.get('http://localhost:5000/get_item_available', res => {
    let data = [];
    res.on('data', chunk => {
      data.push(chunk);
    });
    res.on('end', () => {
      console.log('Response: ', Buffer.concat(data).toString());
    });
  }).on('error', err => {
    console.log('Error: ', err.message);
  });
}

for (var i = 0; i < 20; i++) {
  call();
}
The problem
Unfortunately, the available values returned by the requests contain repeats: several requests got the same available value instead of each getting a unique one.
What is wrong? Aren't Firestore Transactions meant to handle race conditions? Any suggestions on what I could change so that multiple requests hitting the server each get a new value?
You have a catch clause to handle when the transaction fails, but you still return a value to the caller (return { available }). In that situation you should return an error to the caller instead. Note also that available is captured in an outer variable, so when a transaction fails (or is retried), the caller can still receive whatever value the last attempt happened to read.
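One way to restructure that step (my suggestion, not from the question): compute the decrement as a pure function and return the reserved value from the transaction itself, so a failed or retried transaction can never leak a stale available to the caller.

```javascript
// Decide, from the document's current data, what the transaction
// should write and what value this request actually reserved.
// Throws when the purchase cannot proceed, so failure propagates
// as an error instead of a stale { available }.
function reserveOne(data) {
  if (!data) throw new Error('Document does not exist!');
  if (data.available <= 0) throw new Error('Out of stock');
  return {
    update: { available: data.available - 1 }, // written inside the transaction
    reserved: data.available - 1,              // unique value reported to this caller
  };
}
```

Inside `runTransaction` this would read: `const { update, reserved } = reserveOne(sfDoc.exists() ? sfDoc.data() : null); transaction.update(sfDocRef, update); return reserved;`, with the transaction's rejection surfaced to the HTTP caller as an error response.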

Firestore query "onSnapshot" called at the same time does not work

I created an app with Ionic and Firestore that features live chat, and I'm having a problem with it.
The conversation is loaded with:
refUneConversationMyUserCol.ref.orderBy('date', 'desc').limit(20).get()
On top of this, an onSnapshot listener retrieves the last message sent live:
this.unsubscribeDataUneConversation = refUneConversationMyUserCol.ref.orderBy('date', 'desc').limit(1).onSnapshot(result => {
  console.log(result.docs[0].data());
  if (this.isCalledBySnapshot === false) {
    this.isCalledBySnapshot = true;
  } else if (result.docs[0].data().expediteur !== this.authentificationService.uidUserActif) {
    const data = result.docs[0].data();
    const id = result.docs[0].id;
    this.dataUneConversation.push({ id, ...data } as UneConversation);
  }
});
This works perfectly. However, when two messages are sent at the same time (two different accounts talking to each other), I run into a problem: the onSnapshot is triggered only once and I only receive one message.
To be clear, both messages are written to the database correctly; they just don't both show up in the live session.
Do you have any idea why?
Thank you.
(Here is the whole method)
async getDataUneConversation(idI: string) {
  if (this.loadedDataUneConversation !== idI) {
    /* ANCHOR Msg en direct */
    this.isCalledBySnapshot = false;
    if (this.unsubscribeDataUneConversation) {
      await this.unsubscribeDataUneConversation();
    }
    const refUneConversationMyUserCol = this.afs.collection<User>('users').doc<User>(this.authentificationService.uidUserActif).collection<Conversations>('conversations');
    const result = await refUneConversationMyUserCol.ref.orderBy('date', 'desc').limit(20).get();
    /* ANCHOR Msg en direct */
    this.unsubscribeDataUneConversation = refUneConversationMyUserCol.ref.orderBy('date', 'desc').limit(1).onSnapshot(result => {
      console.log(result.docs[0].data());
      if (this.isCalledBySnapshot === false) {
        this.isCalledBySnapshot = true;
      } else if (result.docs[0].data().expediteur !== this.authentificationService.uidUserActif) {
        const data = result.docs[0].data();
        const id = result.docs[0].id;
        this.dataUneConversation.push({ id, ...data } as UneConversation);
      }
    });
    /* ANCHOR Msg en brut */
    if (result.docs.length < 20) {
      this.infiniteLastUneConversationMax = true;
    } else {
      this.infiniteLastUneConversationMax = false;
    }
    this.infiniteLastUneConversation = result.docs[result.docs.length - 1];
    this.dataUneConversation = result.docs.map(doc => {
      const data = doc.data();
      const id = doc.id;
      return { id, ...data } as UneConversation;
    });
    this.dataUneConversation.reverse();
    this.loadedDataUneConversation = idI;
  }
}
EDIT (working solution):
this.unsubscribeDataUneConversation = refUneConversationMyUserCol.ref.orderBy('date', 'asc')
  .startAfter(this.dataUneConversation[this.dataUneConversation.length - 1].date)
  .onSnapshot(result => {
    result.docs.forEach(element => {
      const data = element.data();
      const id = element.id;
      if (!this.dataUneConversation.some(e => e.id === element.id)) {
        this.dataUneConversation.push({ id, ...data } as UneConversation);
      }
    });
  });
You're limiting live messages to only the one last message. In a chat app you want to listen to all new messages, so the issue is probably your .limit(1) clause.
But if you simply remove it, you'll get the whole conversation, with every message since it started.
My approach would be:
Get the date of the last message from your refUneConversationMyUserCol... conversation loader.
When you call onSnapshot() to get live messages, don't limit to 1 message; instead, start at a date after the date of the last loaded message.
Since you're ordering by date anyway, this is an easy fix. Look into "Adding a cursor to your query".
Basically, you're telling Firestore: give me LIVE new messages, but start at NOW. Even if many messages are posted at the same time, you'll get them all, since you're not limiting to 1.
Feel free to ask if this is not clear enough.
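A side note on the dedup in the asker's EDIT (my sketch, not part of the original answer): scanning the whole array with some() for every incoming doc is O(n) per message. A Set of already-seen ids keeps each append O(1). Plain JavaScript, with `messages` standing in for this.dataUneConversation:

```javascript
// Returns an append function that ignores docs already in `messages`.
// The snapshot listener would call append(doc.id, doc.data()) for
// each incoming document.
function makeAppender(messages) {
  const seen = new Set(messages.map(m => m.id));
  return function append(id, data) {
    if (seen.has(id)) return false; // snapshot re-delivered a doc we already have
    seen.add(id);
    messages.push({ id, ...data });
    return true;
  };
}
```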

Add new function in promise.all runtime in node

I am not sure whether this can be implemented or not.
I am using Node.js with Express.js and a MySQL database.
I have a few records in the MySQL database, and these records are updated continuously.
Suppose I fetch some records from MySQL and start an operation on each record with Promise.all, using the demoFunction function, which returns a promise.
In this function, I check for new records in the MySQL database. If I get new records, I want to push the new records' operations into the current Promise.all queue. Is this possible? If not, how can I achieve this goal with continuous execution?
My code looks like this:
const demoFunction = (arg1, arg2) => {
  checkForNewData();
  return new Promise((resolve, reject) => {
    // Rest of my code is here for this function
    // This function will take around 5 to 10 mins
  });
};

const dataFromDatabase = "Here I'm getting some data into an array of objects from the SQL database";
let allPromises = dataFromDatabase.map((obj) => demoFunction(obj.arg1, obj.arg2));

const checkForNewData = () => {
  const newDataFromDatabase = "Here I'm getting some new data into an array of objects from the SQL database";
  for (let i = 0; i < newDataFromDatabase.length; i++) {
    allPromises.push(demoFunction(newDataFromDatabase[i].arg1, newDataFromDatabase[i].arg2));
  }
};

return Promise.all(allPromises)
  .then(() => {
    // response
  })
  .catch((e) => {
    console.log(e);
  })
In this function, I am trying to check for new records in the MySQL database. If I get new records then I want to push the new records' operations into the current Promise.all queue. Is this possible?
Nope. Promise.all takes a finite, fixed set of promises and waits for all of them to complete.
If not possible then how can I achieve this goal with continuous execution?
Well, a promise is just a value; if you have a promise for something, then execution has already started somewhere else. You can always execute a second .all, but what happens if records were added in the meantime?
It's fine to do:
Promise.all(allPromises).then(() => Promise.all(allPromises)).then(() => {
});
But at that point you're better off just waiting for the checkForNewData call to finish before calling Promise.all, since otherwise you're introducing a race between checkForNewData and the Promise.all.
A promise is a "one time" thing; consider using an async iterator if you want to keep processing results (note: this requires Node 12):
async function* getRecordData() {
  for await (const item of getPromisesOfInitDataFromDatabase()) {
    yield item; // or process it
  }
  while (true) { // or however often you want
    for await (const item of getNewDataFromDatabase()) {
      yield item; // or process it
    }
    await sleep(3000); // or some sleep timeout to not constantly poll
  }
}
Then elsewhere:
(async () => {
  for await (const item of getRecordData()) {
    // items are available here one by one, including new items in the database
  }
})();
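To make the pattern concrete, here is a self-contained version with stubbed data sources (the batch contents, the poll interval, and the fixed number of polls are placeholders; a real version would poll MySQL until told to stop):

```javascript
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

// Yield every item from the initial batch, then keep polling for
// new batches, yielding their items as they arrive.
async function* recordStream(initial, polls) {
  for (const item of initial) yield item;
  for (const batch of polls) {   // stand-in for repeated DB polls
    await sleep(10);             // pause between polls
    for (const item of batch) yield item;
  }
}

// Consume the stream: each record is processed as soon as it is
// yielded, including ones that "appeared" after the initial fetch.
async function collectAll() {
  const seen = [];
  for await (const item of recordStream([1, 2], [[3], [], [4, 5]])) {
    seen.push(item);
  }
  return seen;
}
```

Running `collectAll()` drains the initial batch first, then each polled batch in order, so later-arriving records are handled without ever re-awaiting a fixed Promise.all.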

Meteor+React getMeteorData not updating with correct data

I'm on the latest Meteor and React. I'm using the Mongo $text search feature on the server to filter a query. I'm finding that my local data isn't changing, even though the query is correctly rerun on the server.
My component looks something like this:
TodoLists = React.createClass({
  mixins: [ReactMeteorData],
  getMeteorData() {
    Meteor.subscribe('todoLists', this.props.query);
    const todoLists = TodoLists.find().fetch();
    // print out 1
    console.log(todoLists);
    return {
      todoLists
    };
    // same as:
    // return {
    //   todoLists: TodoLists.find().fetch()
    // };
  },
  ....
});
On the server, I have:
Meteor.publish('todoLists', function(query) {
//if no search query
if (!query) {
return TodoLists.find();
}
//searching...
const todoLists = TodoLists.find({
$text: {
$search: query
}
});
//print out 2
console.log(todoLists.fetch());
return todoLists;
}
});
Interestingly, I found that when this.props.query changes, the publish function on the server does run and prints the filtered results to the console as expected (print out 2). However, when I print the results of TodoLists.find().fetch() on the client (print out 1), I get all of the documents, as if no filtering were being done.
Is there some reason this code isn't working? Why are the results as expected on the server, but not on the client?
