Firebase Cloud Functions - limit calls per second per IP

I am planning to create a few API calls based on Firebase Cloud Functions and the Firebase DB.
Every call will access the Firebase DB, get or modify data there, and return a relevant response.
But I am afraid of DDoS attacks or similar massive-access attacks.
Based on my use-case scenario, I am sure that a user does not need more than 25 API calls in one minute.
Is it possible to set such a limit on my Firebase Cloud Functions/DB?
How can I protect my Firebase project from overuse by careless or malicious individuals?

One option is a simple client-side throttle, for example:
// Simple client-side throttle: allow one DB query every 3 seconds.
// Note this only slows down well-behaved clients; it cannot stop abusive traffic on its own.
private mylock: boolean = true;

if (this.mylock === true) {
  // do the DB query here
  this.mylock = false;
  setTimeout(() => {
    this.mylock = true; // arrow function keeps `this` bound to the class instance
  }, 3000);
}
You could also look into Cloudflare to protect against attacks: https://www.cloudflare.com/integrations/google-cloud/
But you might be going over the top worrying about this; it is quite rare for this to actually happen. I would simply put a budget alert in the Billing section of Google Cloud that will flag any unusually large usage.
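If you do want a hard limit enforced server-side, one rough sketch (my own idea, not a built-in Firebase feature) is to keep a per-caller counter in the Realtime Database inside an HTTPS function and reject calls once the 25-per-minute budget from the question is exceeded; the /rateLimits path and the limit are assumptions:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.myApi = functions.https.onRequest(async (req, res) => {
  // Bucket requests per caller per minute (the IP here; an authenticated uid would be more reliable).
  const caller = (req.ip || 'unknown').replace(/[.:]/g, '_'); // Firebase keys cannot contain '.'
  const minute = Math.floor(Date.now() / 60000);
  const counterRef = admin.database().ref(`rateLimits/${caller}/${minute}`);

  // Atomically increment the counter for the current minute.
  const { snapshot } = await counterRef.transaction((count) => (count || 0) + 1);
  if (snapshot.val() > 25) {
    res.status(429).send('Too many requests');
    return;
  }

  // ... do the real DB query and send the actual response here ...
  res.send('ok');
});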

Related

How to delete firestore security rules history

We stumbled over the issue that we hit the quota limit of 2,500 stored Firestore security rulesets. During deployment the CLI asks whether we want to delete the oldest 10 rulesets. Since we want to automate our deployment, reacting to a console prompt is not exactly what we want.
Does anyone know how to mass-delete the complete Firestore security rules history without having to do it manually, one by one, through the Firebase console?
I couldn't find any info on that whatsoever from Google's side...
I'm not sure you can delete all security rulesets at once, but as per the documentation you can avoid the manual work by writing logic that deletes the oldest rulesets.
For example, to delete ALL rulesets deployed for longer than 30 days:
// Assumes `admin` is an initialized firebase-admin app and `allRulesets` is the array of
// ruleset metadata fetched beforehand, e.g. via admin.securityRules().listRulesetMetadata().
const THIRTY_DAYS_IN_MILLIS = 30 * 24 * 60 * 60 * 1000;
const thirtyDays = new Date(Date.now() - THIRTY_DAYS_IN_MILLIS);

const promises = [];
allRulesets.forEach((rs) => {
  if (new Date(rs.createTime) < thirtyDays) {
    promises.push(admin.securityRules().deleteRuleset(rs.name));
  }
});
await Promise.all(promises);
console.log(`Deleted ${promises.length} rulesets.`);
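Since the goal is to automate this, one way to wire the same cleanup into the project is a scheduled Cloud Function. This is a rough sketch, not from the original answer: the function name is made up, scheduled functions require the Blaze plan, and only the first page of ruleset metadata is handled here.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const THIRTY_DAYS_IN_MILLIS = 30 * 24 * 60 * 60 * 1000;

// Hypothetical daily cleanup that deletes rulesets older than 30 days.
exports.cleanupOldRulesets = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async () => {
    const cutoff = new Date(Date.now() - THIRTY_DAYS_IN_MILLIS);
    // Fetches only the first page; loop over nextPageToken if you have more rulesets.
    const { rulesets } = await admin.securityRules().listRulesetMetadata();
    const stale = rulesets.filter((rs) => new Date(rs.createTime) < cutoff);
    await Promise.all(stale.map((rs) => admin.securityRules().deleteRuleset(rs.name)));
    console.log(`Deleted ${stale.length} rulesets.`);
  });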

Import large data (json) into Firebase periodically

We are in the situation that we will have to update large amounts of data (approx. 5 million records) in Firebase periodically. At the moment we have a few JSON files that are around ~1 GB in size.
Existing third-party solutions (here and here) have some reliability issues (they import object by object, or need an open connection) and are quite disconnected from the Google Cloud Platform ecosystem, so I wonder if there is now an "official" way, e.g. using the new Google Cloud Functions, or a combination with App Engine / Google Cloud Storage / Google Cloud Datastore.
I would really like to not deal with authentication, something that Cloud Functions seems to handle well, but I assume the function would time out (?).
With the new Firebase tooling available, how do I:
Have long-running Cloud Functions do the data fetching / inserts? (Does that make sense?)
Get the JSON files into and out of somewhere inside the Google Cloud Platform?
Does it make sense to first throw the large data into Google Cloud Datastore (i.e. too $$$ expensive to store in Firebase), or can the Firebase Realtime Database be reliably treated as a large data store?
I'm finally posting the answer as it aligns with the new Google Cloud Platform tooling of 2017.
The newly introduced Google Cloud Functions have a limited run-time of approximately 9 minutes (540 seconds). However, Cloud Functions are able to create a Node.js read stream from Cloud Storage like so (@google-cloud/storage on npm):
var gcs = require('@google-cloud/storage')({
  // You don't need extra authentication when running the function
  // online in the same project
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Reference an existing bucket.
var bucket = gcs.bucket('json-upload-bucket');
var remoteReadStream = bucket.file('superlarge.json').createReadStream();
Even though it is a remote stream, it is highly efficient. In tests I was able to parse JSON files larger than 3 GB in under 4 minutes, doing simple JSON transformations.
Since we are working with Node.js streams, the JSONStream library (JSONStream on npm) can efficiently transform the data on the fly, handling the data asynchronously just like a large array with event streams (event-stream on npm):
var JSONStream = require('JSONStream');
var es = require('event-stream');

remoteReadStream
  .pipe(JSONStream.parse('objects.*'))
  .pipe(es.map(function (data, callback) {
    console.error(data);
    // Insert data into Firebase here.
    callback(null, data); // Return data if you want to make further transformations.
  }));
At the end of the pipe, return only null in the callback (i.e. callback(null) with no data) so that nothing is buffered up, which would otherwise leak memory and block the whole function.
If you do heavier transformations that require a longer run time, either use a "job db" in Firebase to track how far you have got, do only e.g. 100,000 transformations per invocation and call the function again, or set up an additional function that listens for inserts into a "forimport db" and asynchronously transforms the raw JSON records into your target format and production system, splitting import and computation.
Additionally, you can run Cloud Functions code in a Node.js App Engine app, but not necessarily the other way around.
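To show how the pieces above can fit together, here is a rough sketch (my own, assuming a current firebase-functions SDK) of a function that fires when a JSON file lands in a Cloud Storage bucket and streams it straight into the Realtime Database; the /imports node is a placeholder and the 'objects.*' parse path is carried over from the example above:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const { Storage } = require('@google-cloud/storage');
const JSONStream = require('JSONStream');
const es = require('event-stream');

admin.initializeApp();
const storage = new Storage();

exports.importJson = functions
  .runWith({ timeoutSeconds: 540, memory: '1GB' }) // the 540-second ceiling mentioned above
  .storage.object()
  .onFinalize((object) => {
    const readStream = storage.bucket(object.bucket).file(object.name).createReadStream();
    return new Promise((resolve, reject) => {
      readStream
        .pipe(JSONStream.parse('objects.*'))
        .pipe(es.map((record, callback) => {
          // Push each parsed record into a (hypothetical) /imports node.
          admin.database().ref('imports').push(record)
            .then(() => callback()) // return nothing downstream to keep memory flat
            .catch(callback);
        }))
        .on('end', resolve)
        .on('error', reject);
    });
  });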

Is there a way to get a webhook called on every Firebase Realtime DB transaction?

Basically, I'm using Firebase as my realtime database from my iOS application. However, I'd love to get a "Transaction log" to my server, even if it is not realtime. Is there a way to set this up? Maybe with a webhook?
You can use Cloud Functions for Firebase to write a database trigger that runs some JavaScript (running in a node.js environment) whenever data in your database changes. You can effectively use this to send changes from your Realtime Database to whatever other server you control. You would probably have to implement a webhook or some other endpoint on your server to receive the data.
In order to make outbound network requests in a function like this, you will need to upgrade your project to the Blaze plan if you haven't already.
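For example, a minimal sketch of such a database trigger (assuming a current firebase-functions SDK; the /transactions path and the webhook host are placeholders, not something Firebase provides out of the box):
const functions = require('firebase-functions');
const https = require('https');

// Fires whenever a new child appears under /transactions (hypothetical path).
exports.forwardTransaction = functions.database
  .ref('/transactions/{txId}')
  .onCreate((snapshot, context) => {
    const payload = JSON.stringify({
      id: context.params.txId,
      data: snapshot.val(),
    });
    // POST the change to your own server; as noted above, outbound requests to
    // non-Google endpoints require the Blaze plan.
    return new Promise((resolve, reject) => {
      const req = https.request(
        {
          hostname: 'example.com', // placeholder webhook host
          path: '/firebase-webhook', // placeholder endpoint
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
            'Content-Length': Buffer.byteLength(payload),
          },
        },
        (res) => {
          res.on('data', () => {}); // drain the response
          res.on('end', resolve);
        }
      );
      req.on('error', reject);
      req.write(payload);
      req.end();
    });
  });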
I can't quite understand your question; what are you trying to achieve?
You can "save" an action done in your app with Firebase in at least three ways:
1) Creating a node to log the transactions and using child updates to save a transaction in its own node and, at the same time, in another node like "transactionsLog" (https://firebase.google.com/docs/database/ios/read-and-write#update_specific_fields)
2) If you only need to track events done in your app by users, you can use Firebase Analytics and log an event every time there is a transaction: https://firebase.google.com/docs/analytics/ios/start (get started) and https://firebase.google.com/docs/analytics/ios/start#log_events (to log events)
3) To do more complex actions when a transaction has been done, you can use a Cloud Function for Firebase (https://firebase.google.com/docs/functions/), where you can do everything you want in JavaScript (it is a Node.js environment).
Example of the first method (Swift):
let key = ref.child("transactions").childByAutoId().key
let post = ["uid": userID,
            "author": username,
            "amount": amount,
            "date": date]
let childUpdates = ["/transactions/\(key)": post,
                    "/transactionsLog/\(userID)/\(key)/": true]
ref.updateChildValues(childUpdates)

How do I use Firebase to handle automatic server-side calculations?

Perhaps my question should be restated as: how do I refactor those behaviours into CRUD, which is what Firebase excels at?
I get that CRUD works well. I also see how the Firebase declarative security model allows me to ensure proper security server-side, where it should exist.
Let's say I have a subscription service. Each time a person signs up for a service, they need to automatically have a "due" line item added to their account. In simple terms:
/users/john
/services/goodstuff
So john can sign up for goodstuff, I might let him in for 30 days without paying, but will remind him when 30 days is up, "hey, you need to pay or else you lose your subscription to goodstuff."
With a server back-end, I would POST to, e.g., /services/goodstuff/members and have part of the POST handler add a "you owe" line item to john's account, ensuring that no one can join goodstuff without being marked as owing.
In a Firebase BaaS app, where those server-side logics don't exist, how would I refactor the app to get the same effective behaviour?
Update (March 10, 2017): While the architecture I outline below is still valid and can be used to combine Firebase with any existing infrastructure, Firebase just released Cloud Functions for Firebase, which allows you to run JavaScript functions on Google's servers in response to Firebase events (such as database changes, users signing in and much more).
One potential solution (untested, sorry; but it should be the right idea):
{
  "rules": {
    "users": {
      "$user": {
        /* When they create their user record, they must write a 'due' date that's
         * within the next 30 days. */
        ".write": "!data.exists() && newData.child('due').isNumber() && newData.child('due').val() < now + (30*24*60*60*1000)"
      }
    },
    "services": {
      "$service": {
        /* Must be authenticated and the due date must not have passed. */
        ".read": "auth != null && now < root.child('users/' + auth.uid + '/due').val()"
      }
    }
  }
}
This would require that when somebody logs in for the first time and initializes their users/ entry, they'd have to write a due date in the next 30 days. And then when accessing any service, that due date would be verified to have not passed.
That said, another option is to just spin up a tiny backend service to handle this sort of business logic. Firebase excels at protecting, storing, and synchronizing data. But if you have complicated business logic, you might want to think about spinning up a tiny backend process. Firebase has a REST api as well as Node.JS and JVM clients, so it's really easy to run your own backend code that integrates with Firebase.
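For completeness, here is a rough sketch of the Cloud Functions route mentioned in the update above; the paths and the 30-day grace period are assumptions taken from the question, not a tested implementation:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

// When a user is added under /services/{serviceId}/members, write a matching
// "due" entry on their account so no one joins without being marked as owing.
exports.addDueOnJoin = functions.database
  .ref('/services/{serviceId}/members/{uid}')
  .onCreate((snapshot, context) => {
    const { serviceId, uid } = context.params;
    return admin.database()
      .ref(`/users/${uid}/due/${serviceId}`)
      .set(Date.now() + THIRTY_DAYS_MS);
  });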

Possibility for only currently connected (not authenticated) and admin user to read and write on certain location

Is there any way to write a security rule, or is there any other approach, that would make it possible only for the currently connected (not authenticated) user to write/read a certain location? An admin should also be able to write/read it.
Can a rule be written that disallows users from reading the complete list of entries and lets them read only the entry that matches some identifier passed from the client?
I'm trying to exchange some data between a user and a Node.js application through Firebase, and that data shouldn't be readable or writable by anyone other than that user and/or the admin.
I know that one solution would be for the user to request an auth token from my server and use it to authenticate with Firebase, which would make it possible to write a rule that prevents reads and writes. However, I'm trying to avoid the user connecting to my server, so this solution is not the first option.
This is, in a way, a session-based scenario, which is not available in Firebase, but I have some ideas that could solve this kind of problem until session management is implemented:
maybe letting the admin write into the /.info/ location, which is observed by the client for every change and can be read only by the active connection (if I understood correctly how .info works)
maybe creating a .temp location for that purpose
maybe letting the admin and the connected client have more access to connection information, which would contain some connection-unique id that can be used to create a location with that name and used inside a rule to prevent other users from reading and listening
Thanks
This seems like a classic XY problem (i.e. trying to solve the attempted solution instead of the actual problem).
If I understand your constraints correctly, the underlying issue is that you do not wish to have direct connections to your server. This is currently the model we're using with Firebase, and I can think of two simple patterns to accomplish it.
1) Store the data in a non-guessable path
Create a UUID or GID or, assuming we're not talking bank-level security here, just a plain Firebase ID (firebaseRef.push().name()). Then have the server and client communicate via this path.
This avoids the need for security rules, since the URLs are unguessable (or close enough to it, in the case of the Firebase ID) for normal uses.
Client example:
var fb = new Firebase(MY_INSTANCE_URL+'/connect');
var uniquePath = fb.push();
var myId = uniquePath.name();
// send a message to the server
uniquePath.push('hello world');
From the server, simply monitor connect; each child that appears is a new client:
var fb = new Firebase(MY_INSTANCE_URL+'/connect');
fb.on('child_added', newClientConnected);
function newClientConnected(snapshot) {
  snapshot.ref().on('child_added', function (ss) {
    // when the client sends me a message, log it and then return "goodbye"
    console.log('new message', ss.val());
    ss.ref().set('goodbye');
  });
}
In your security rules:
{
  "rules": {
    // read/write are false by default
    "connect": {
      // contents cannot be listed, no way to find out ids other than guessing
      "$client": {
        ".read": true,
        ".write": true
      }
    }
  }
}
2) Use Firebase authentication
Instead of expending so much effort to avoid authentication, just use a third-party service, like Firebase's built-in auth or Singly (which supports Firebase). This is the best of both worlds, and the model I use for most cases.
Your client can authenticate directly with one of these services, never touching your server, and then authenticate to Firebase with the token, allowing security rules to take effect.
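As a sketch of what that buys you (my own example; the per-user scoping under /connect is an assumption, and auth.uid is the identifier current Firebase rules expose), the rules can then lock each node to exactly one authenticated user, while the admin SDK on your server bypasses rules entirely:
{
  "rules": {
    "connect": {
      "$uid": {
        // only the signed-in user whose uid matches the key can read or write here
        ".read": "auth != null && auth.uid === $uid",
        ".write": "auth != null && auth.uid === $uid"
      }
    }
  }
}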
