I have this code:
Meteor.startup(function () {
  Deps.autorun(function () {
    var j = Jobs.find().count();
    console.log(j);
  });
});
A couple of questions:
A) The console logs four times, with four different results, each incrementing the count (0, 164, 687, 2228). Why?
B) How do you get Meteor to read the count only once?
I am new to Firebase and I am totally confused about what I should use. Here is my flow.
I have a collection score on firebase and it has values
- start_time
- count
- max_count
Now, when start_time matches the current time, I need to increment count in the database every five seconds until it reaches max_count. This should run in the backend. Here is where I got confused: what would be suitable for this?
There are so many documents about Cloud Tasks and Pub/Sub.
If I call a Firebase function from Pub/Sub to update the count every 5 seconds, then I will be paying for unused compute time on every call.
I don't know much about Cloud Tasks; does it match my requirement? Can anyone please guide me?
Neither Cloud Tasks nor Pub/Sub would be the right solution for this, and I wouldn't recommend using a cron-type service for such a menial task.
Instead, consider moving the increment logic to your client and just storing start_time and max_count in your database. Here's an example:
// Let's set a start_time 10 seconds in the future and pretend this was in the database
const start_time = Math.floor((new Date()).getTime() / 1000) + 10;
// Pretend this came from the database; we only want to iterate 10 times
const max_count = 10;
let prev_count = 0;
document.write("Waiting 10 seconds before starting<br />");
// Let's check once a second until we reach max_count
let interval = setInterval(() => {
  const now = Math.floor((new Date()).getTime() / 1000);
  // If it's not start time yet, exit
  if (now < start_time) return;
  // Determine the count by dividing the elapsed time by 5 seconds
  let count = Math.floor((now - start_time) / 5);
  if (count > prev_count) {
    document.write(`Tick: ${count}<br />`);
  }
  prev_count = count;
  if (count >= max_count) {
    clearInterval(interval);
  }
}, 1000);
If you need the count stored in the database, update the stored count value each time it increments.
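The count derivation above can be isolated in a small pure function, which is also what you would call right before writing the value to the database (the function name is mine, not from the snippet):

```javascript
// Derive the current tick count from the clock instead of accumulating it,
// so a missed interval can never put the stored count out of sync.
// Timestamps are Unix seconds, matching the snippet above.
function currentCount(now, start_time, max_count) {
  if (now < start_time) return 0; // not started yet
  return Math.min(Math.floor((now - start_time) / 5), max_count);
}
```

Writing `currentCount(now, start_time, max_count)` to the database whenever it changes keeps the stored count correct even if the client was briefly suspended.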
I have been trying to learn Firebase Cloud Functions recently, and I have written an HTTP function that takes the itemName, sellerUid, and quantity. Then I have a background trigger (an onWrite) that finds the item price for the provided sellerUid and itemName, computes the total (item price * quantity), and writes it into a document in Firestore.
My question is:
With what I have right now, suppose my client purchases N items. This means I will have:
N reads (from looking up the N items' prices),
2 writes (one initial write for the N items and one for the total amount after computation),
N searches from the cloud function?
I am not exactly sure how cloud functions count toward reads and writes, or how much compute time this needs (though it's all just text, so it should be negligible?).
Would love to hear your thoughts on whether what I have is already good enough, or whether there is a much more efficient way of going about this.
Thanks!
exports.itemAdded = functions.firestore.document('CurrentOrders/{documentId}').onWrite(async (change, context) => {
  const snapshot = change.after.data();
  var total = 0;
  for (const [key, value] of Object.entries(snapshot)) {
    if (value['Item Name'] != undefined) {
      await admin.firestore().collection('Items')
        .doc(key).get().then((dataValue) => {
          const itemData = dataValue.data();
          if (!dataValue.exists) {
            console.log('This is empty');
          } else {
            total += (parseFloat(value['Item Quantity']) * parseFloat(itemData[value['Item Name']]['Item Price']));
          }
        });
      console.log('This is in total: ', total);
    }
  }
  snapshot['Total'] = total;
  console.log('This is snapshot afterwards: ', snapshot);
  return change.after.ref.set(snapshot);
});
With your current approach you will be billed for:
N reads (from the N items' price searching);
1 write that triggers your onWrite function;
1 write that persists the total value;
One better approach I can think of is to compare the lists of values in change.before.data() and change.after.data(), read the current total (0 if this is the first write), and then add only the values that were added in change.after.data() instead of all N values, which would potentially mean being charged for fewer reads.
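A sketch of that diff, assuming each item entry in the order document carries an 'Item Name' field as in the question (the helper name is hypothetical):

```javascript
// Return only the item keys present in the after-snapshot but not in the
// before-snapshot, so the function fetches prices just for the new items.
function newlyAddedItemKeys(before, after) {
  const beforeKeys = new Set(Object.keys(before || {}));
  return Object.keys(after).filter((key) =>
    after[key] && after[key]['Item Name'] !== undefined && !beforeKeys.has(key)
  );
}
```

On the very first write, before is undefined, so every item key is returned and the behavior matches the original function.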
For the actual pricing, if you check the Documentation for Cloud Functions, you will see that only invocation and compute billing apply to your case. However, there is a free tier for both, so if you are using this only to learn and the app does not get much use, you should stay within the free tier with either approach.
Let me know if you need any more information.
I am trying to write a Dart function which fetches players' weights and in the end gives you the average of all players' weights.
I am using the executor package to fetch 3 players' weights at a time, and as soon as a player's weight is fetched, I add it to a list.
The issue is that I can't add await before the for loop, and without await the code after the for loop executes too early.
Is there any way I can pause the program, or return the value only when the executor tasks are complete?
avgWeight(int n) async {
  List playersWeight = [];
  Executor executor = Executor(concurrency: 3);
  for (int i = 0; i < n; i++) {
    // executor.join(withWaiting: true);
    executor.scheduleTask(() async {
      int currentPlayerWeight = await PlayerDetail(i + 1).fetchPlayerWeight();
      print('currentPlayerNo: ${i + 1} currentPlayerWeight: $currentPlayerWeight');
      if (currentPlayerWeight != null) {
        playersWeight.add(currentPlayerWeight);
      }
    });
    await executor.join(withWaiting: true);
    print(playersWeight); // for debugging only; this should be printed only when all tasks are completed
  }
  // playersWeight.reduce((a, b) => a + b) / playersWeight.length
}
The debug print of the list of weights should appear only after all tasks are completed. If n is 4, the list should be printed just once, after all 4 elements are added, but for me it prints with each element.
current output
[200]
[200, 190]
[200, 190, 265]
[200, 190, 265, 255]
needed output:
[200, 190, 265, 255]
I need to return the average weight, but I can't do that because of this issue.
As seen in the comments, the problem was that await executor.join(withWaiting: true); was inside the for loop but should have been placed after it.
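The corrected schedule-then-join flow, sketched in plain JavaScript with Promise.all standing in for the executor's join (the fetcher is a stand-in, not the real PlayerDetail):

```javascript
// Schedule every fetch first, then wait exactly once, after the loop,
// before touching the collected results.
async function avgWeight(fetchWeight, n) {
  const tasks = [];
  for (let i = 0; i < n; i++) {
    tasks.push(fetchWeight(i + 1)); // no await inside the loop
  }
  const weights = await Promise.all(tasks); // the "join", once, outside the loop
  return weights.reduce((a, b) => a + b, 0) / weights.length;
}
```

The key point is the same as in the accepted fix: awaiting inside the loop serializes the tasks and lets code observe the partially filled list; joining once after the loop does not.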
The data to be inserted has just two TEXT columns whose individual lengths don't even exceed 256.
I initially used executeSimpleSQL since I didn't need to get any results.
It worked smoothly for simultaneous inserts of up to 20K records, i.e. no lag or freezing observed in the background.
However, with 0.1 million records I could see horrible freezing during insertion.
So, I tried these two approaches:
Insert in chunks of 500 records - this didn't work well, since even for 20K records it showed visible freezing. I didn't even try with 0.1 million.
So, I decided to go async and used executeAsync along with Bind etc. This also shows visible freezing for just 20K records. This was the whole array being inserted, not in chunks.
var dirs = Cc["@mozilla.org/file/directory_service;1"]
  .getService(Ci.nsIProperties);
var dbFile = dirs.get("ProfD", Ci.nsIFile);
var dbService = Cc["@mozilla.org/storage/service;1"]
  .getService(Ci.mozIStorageService);
dbFile.append('mydatabase.sqlite');
var connectDB = dbService.openDatabase(dbFile);

let insertStatement = connectDB.createStatement(
  'INSERT INTO my_table (my_col_a, my_col_b) VALUES (:myColumnA, :myColumnB)');

var arraybind = insertStatement.newBindingParamsArray();
for (let i = 0; i < my_data_array.length; i++) {
  let params = arraybind.newBindingParams();
  // Individual elements of the array are csv
  my_data_arrayTC = my_data_array[i].split(',');
  params.bindByName("myColumnA", my_data_arrayTC[0]);
  params.bindByName("myColumnB", my_data_arrayTC[1]);
  arraybind.addParams(params);
}
insertStatement.bindParameters(arraybind);

insertStatement.executeAsync({
  handleResult: function (aResult) {
    console.log('Results are out');
  },
  handleError: function (aError) {
    console.log("Error: " + aError.message);
  },
  handleCompletion: function (aReason) {
    if (aReason != Components.interfaces.mozIStorageStatementCallback.REASON_FINISHED)
      console.log("Query canceled or aborted!");
    console.log('We are done inserting');
  }
});

connectDB.asyncClose(function () {
  console.log('[INFO][Write Database] Async - plus domain data');
});
Also, I seem to get the async callbacks only after a long time. Usually executeSimpleSQL is way faster than this. If I use the SQLite Manager Tool extension to open the DB immediately, this is what I get (as expected):
SQLiteManager: Error in opening file mydatabase.sqlite - either the file is encrypted or corrupt
Exception Name: NS_ERROR_STORAGE_BUSY
Exception Message: Component returned failure code: 0x80630001 (NS_ERROR_STORAGE_BUSY) [mozIStorageService.openUnsharedDatabase]
My primary objective is to dump data as big as 0.1 million+ records, and then later perform reads when needed.
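For reference, the chunked-insert variant described in the question comes down to a plain batching helper like this (the Mozilla binding calls themselves are omitted; the 500-row batch size mirrors the question):

```javascript
// Split a flat array of rows into fixed-size batches so each executeAsync
// call binds a bounded params array instead of the whole 0.1M-row set.
function chunkRows(rows, batchSize) {
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}
```

Each batch would then get its own newBindingParamsArray() and executeAsync call, so no single statement holds the entire data set in memory at once.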
In Meteor we have the @index operator to get the index of the current iteration. But I want the total number of iterations, and to print that number on the page, so the top of the page might read the total number of boys in a group.
For example, I might have something like:
Total = {{#each StudentMale}} {{formatMaleCount @index}} {{/each}}
and a registered helper just to add 1 to the number:
Template.registerHelper('formatMaleCount', function (count) {
  return count + 1;
});
and this would print:
Total = 1234567
I'd like to have:
Total = 7
I'm coming up short on how to do this. I tried having the helper push the values into an array, but this wouldn't work since a new array is produced on each iteration.
StudentMale is presumably an array or a cursor, so count it in a new helper.
If it's an array:
arrayLength(array) {
  return array.length;
}
Or if it's a collection, count the cursor directly (count() is cheaper than fetch().length, since it doesn't materialize the documents):
studentMaleLength() {
  return StudentMales.find().count();
}
Then just call your helper, e.g. Total = {{studentMaleLength}}.