Firebase Realtime Database currently gives TRIGGER_PAYLOAD_TOO_LARGE error

Since this morning, our Firebase application has had a problem when writing data to the Realtime Database instance. Even the simplest task, such as adding one key-value pair to an object, triggers
Error: TRIGGER_PAYLOAD_TOO_LARGE: This request would cause a function payload exceeding the maximum size allowed.
It is especially strange since nothing in our code or database has changed for more than 24 hours.
Even something as simple as
Database.ref('environments/' + envkey).child('orders/' + orderkey).ref.set({a:1})
triggers the error.
Apparently the size of the payload is not the problem, but what could be causing this?
Database structure, as requested
environments
  env1
  env2
    orders
      223344
        customer: "Peters"
        country: "NL"
        items
          item1
            code: "a"
            value: "b"
          item2
            code: "x"
            value: "2"

OK, I figured this out. The issue is not related to your write function, but to one of the Cloud Functions that the write action would trigger.
For example, we have a structure like:
/collections/data/abc/items/a
in JSON:
"collections": {
"data": {
"abc": {
"name": "example Col",
"itemCount": 5,
"items": {
"a": {"name": "a"},
"b": {"name": "b"},
"c": {"name": "c"},
"d": {"name": "d"},
"e": {"name": "e"},
}
}
}
}
Any write into an item was failing, no matter what: API, JavaScript, even a basic write in the console.
I decided to look at our cloud functions and found this:
const countItems = (collectionId) => {
  return firebaseAdmin.database().ref(`/collections/data/${collectionId}/items`).once('value')
    .then(snapshot => {
      const items = snapshot.val();
      const filtered = Object.keys(items).filter(key => {
        const item = items[key];
        return (item && !item.trash);
      });
      return firebaseAdmin.database().ref(`/collections/meta/${collectionId}/itemsCount`)
        .set(filtered.length);
    });
};
export const onCollectionItemAdd = functions.database.ref('/collections/data/{collectionId}/items/{itemId}')
  .onCreate((snapshot, context) => {
    const { collectionId } = context.params;
    return countItems(collectionId);
  });
On its own it's nothing, but that trigger reads ALL items, and by default Firebase Cloud Functions sends the entire snapshot to the function even if we don't use it. In fact it sends the before and after values too, so if you (like us) have a TON of items at that path, my guess is that the payload it tries to send to the Cloud Function is simply too big.
I removed the count function from our Cloud Functions and boom, back to normal. I'm not sure of the "correct" way to do the count if we can't have the trigger at all, but I'll update this if we find one; one possibility is sketched below.
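For what it's worth, here is a minimal sketch of keeping the count incrementally instead (my own adaptation, assuming the same /collections/data/{collectionId}/items structure; note it ignores the trash flag from the filter above, so it is not a drop-in replacement). Each event now carries only the single item that changed, and a transaction keeps the counter safe under concurrent writes:

import * as functions from 'firebase-functions';
import firebaseAdmin from 'firebase-admin';

const counterRef = (collectionId) =>
  firebaseAdmin.database().ref(`/collections/meta/${collectionId}/itemsCount`);

// +1 for every new item; the trigger payload is just that one item
export const onItemCreated = functions.database
  .ref('/collections/data/{collectionId}/items/{itemId}')
  .onCreate((snapshot, context) =>
    counterRef(context.params.collectionId).transaction((count) => (count || 0) + 1));

// -1 for every removed item
export const onItemDeleted = functions.database
  .ref('/collections/data/{collectionId}/items/{itemId}')
  .onDelete((snapshot, context) =>
    counterRef(context.params.collectionId).transaction((count) => (count || 0) - 1));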

The TRIGGER_PAYLOAD_TOO_LARGE error is part of a new feature Firebase is rolling out, where our existing RTDB limits are being strictly enforced. The reason for the change is to make sure that we aren't silently dropping any Cloud Functions triggers, since any event exceeding those limits can't be sent to Functions.
You can turn this feature off yourself by making this REST call:
curl -X PUT -d "false" https://<namespace>.firebaseio.com/.settings/strictTriggerValidation/.json?auth=<SECRET>
Where <SECRET> is your DB secret.
Note that if you disable this, the requests that are currently failing may go through, but any Cloud Functions you have that trigger on requests exceeding our limits will fail to run. If you are using database triggers for your functions, I would recommend you restructure your requests so that they stay within the limits, for example by splitting one large write into several smaller ones, as sketched below.
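A hedged sketch of what that restructuring could look like (not an official recommendation; the chunk size of 100 is an arbitrary assumption to tune for your data): instead of one giant set() at a high-level node, write the same children in smaller batches, so each triggered event carries a smaller payload.

// Splits one large write into several smaller update() calls;
// each update fires its own, smaller trigger event.
async function setInChunks(ref, children, chunkSize = 100) {
  const keys = Object.keys(children);
  for (let i = 0; i < keys.length; i += chunkSize) {
    const batch = {};
    for (const key of keys.slice(i, i + chunkSize)) {
      batch[key] = children[key];
    }
    await ref.update(batch);
  }
}

// usage (hypothetical path): await setInChunks(database.ref('environments/env2/orders'), allOrders);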


Firebase: Delete user analytics data - userdeletionRequests:upsert - GDPR

My question is, how can I delete a user's analytics data from Firebase using the Google User Deletion API and its method userDeletionRequests:upsert? This is important for me to fully fulfill GDPR.
I tried searching for this, but didn't find a solution for using it in combination with NodeJS and firebase-cloud-functions.
My biggest problem is how to get the access token. This is what I have for now:
const accessToken = (await admin.credential.applicationDefault().getAccessToken()).access_token;
return ky.post(constants.googleUserDeletionURL, {
  body: JSON.stringify({
    kind: "analytics#userDeletionRequest",
    id: {
      type: "USER_ID",
      userId: uid,
    },
    propertyId: constants.googleUserDeletionPropertyId,
  }),
  headers: {
    "Authorization": `Bearer ${accessToken}`,
  },
}).catch((err) => {
  functions.logger.log(`An Error happened trying to delete user-anayltics ${(err as Error).message}`);
});
But I always get: An Error happened trying to delete user-anayltics Request failed with status code 403 Forbidden
Okay, after some painful and long days (it literally took me >20 hours), I've figured out how to achieve this request. Here is a step-by-step guide:
Step 1 - Needed Dependencies
To send a POST request to Google, we need an HTTP client library. I've chosen ky, so we need to install it first with:
npm install ky
Furthermore, we need to create our OWN OAuth2 token, otherwise the POST request will be denied with error 403. To create our own OAuth token, we need another dependency:
npm install @googleapis/docs
Step 2 - Needed Google Property ID
With your request, Google needs to know which property-id / project you are targeting (where it should later delete the user-analytics data). To get this property-id, we need to log in to GCP via Firebase (not the "real" GCP, the one via Firebase).
For this, go to "console.firebase.google.com" → select your project → Analytics Dashboard → "View more in Google Analytics" (button at the right corner).
Write "property-id" into the search field, then save the value somewhere (we need it later).
Step 3 - Creating Client-Secret
The third step is to create a service account, which we will later add into our functions folder in order to create our oAuthClient (don't worry, you will see what I mean at a later point).
To create your new service.json, log in to Google Cloud Platform via "https://cloud.google.com/" and then follow the pictures:
(The original answer includes five screenshots, FIRST through FIFTH, walking through the service-account creation flow in GCP.)
Step 4 - Download JSON
After we created our "oAuth-Deleter" service account, we need to manually download the needed JSON so we can paste it into our functions folder.
For this, select "oauth-deleter@your-domain.iam.gserviceaccount.com" under "Service Account".
Then click on "Keys" and "Add key", which will automagically download a service JSON (select Key type → JSON → Create).
Step 5 - Paste JSON file into your functions-folder
To loosen up the mood a bit, here is an easy step: paste the downloaded JSON file into your functions folder.
Step 6 - Grant access to our newly created oAuth-Deleter account
Creating the service account and giving it access in the normal GCP is not enough for Google, so we also have to give it access in our Firebase project. For this, go back into your "GCP via Firebase" (see Step 2) → click on Settings → "User Access for Account" → click on the "plus".
Then click on "Add user" and write the email we copied before into the email field (the email from Step 3, picture FOURTH, "Service-Account ID"). In our case, it is "oauth-deleter@your-domain.iam.gserviceaccount.com". Furthermore, it needs admin access:
Step 7 - The code
Now, after these million unnecessary-but-necessary steps, we get to the final part: THE DAMN CODE. I've written this in TypeScript with "compilerOptions" → "module": "esnext", "target": "esnext". But I am sure that you are smart enough to adapt the code after completing this many steps :)
import admin from "firebase-admin";
import functions from "firebase-functions";
import ky from "ky";
import docs from "@googleapis/docs";
import { UserRecord } from "firebase-functions/v1/auth";

export const dbUserOnDeleted = functions
  .auth
  .user()
  .onDelete((user) => doOnDeletedUser(user));

// ----------------------------

export async function doOnDeletedUser(user: UserRecord) {
  try {
    const googleDeletionURL = "https://www.googleapis.com/analytics/v3/userDeletion/userDeletionRequests:upsert";

    // Step 1: Paste propertyID (see Step 2)
    const copiedPropertyID = "12345678";

    // Step 2: Create oAuthClient
    const oAuthClient = new docs.auth.GoogleAuth({
      keyFile: "NAME-OF-THE-FILE-YOU-COPIED-INTO-YOUR-FUNCTIONS-FOLDER",
      scopes: ["https://www.googleapis.com/auth/analytics.user.deletion"],
    });

    // Step 3: Get the uid of the user you want to delete from user-analytics
    const uid = user.uid;

    // Step 4: Generate access token
    // (this is the reason why we needed the 5 steps before this)
    // yup, a single line of code
    const accessToken = await oAuthClient.getAccessToken() as string;

    // Step 5: Make the request to Google and delete the user
    return ky.post(googleDeletionURL, {
      body: JSON.stringify({
        kind: "analytics#userDeletionRequest",
        id: {
          type: "USER_ID",
          userId: uid,
        },
        propertyId: copiedPropertyID,
      }),
      headers: {
        "Authorization": "Bearer " + accessToken,
      },
    });
  } catch (err) {
    functions.logger.error(`Something bad happened, ${(err as Error).message}`);
  }
}
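A side note: the @googleapis/docs package is used here only for the GoogleAuth class it re-exports. A lighter-weight alternative (my assumption, not what the steps above used) is to pull the same class straight from google-auth-library:

import { GoogleAuth } from "google-auth-library";

// Same keyFile and scopes as above; only the import changes.
const oAuthClient = new GoogleAuth({
  keyFile: "NAME-OF-THE-FILE-YOU-COPIED-INTO-YOUR-FUNCTIONS-FOLDER",
  scopes: ["https://www.googleapis.com/auth/analytics.user.deletion"],
});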
Afterthoughts
This was and probably will remain my longest post on Stack Overflow. I have to say that it was a pain in the a** to get this thing working. The amount of work and setup needed simply to delete data from one endpoint is just ridiculous. Google, please fix.

NuxtJS store returning local storage values as undefined

I have a Nuxt application. One of the components, in its mounted lifecycle hook, requests a value from the state store; this value is retrieved from local storage. The values exist in local storage, yet the store returns them as undefined. If I render the values in the UI with {{value}},
they show. So it appears that at the moment the code runs, the value is undefined.
index.js (store):
export const state = () => ({
  token: process.browser ? localStorage.getItem("token") : undefined,
  user_id: process.browser ? localStorage.getItem("user_id") : undefined,
  ...
Component.vue
mounted hook:
I'm using UserService.getFromStorage to get the value directly from localStorage, as otherwise this code block won't run. It's a temporary thing to illustrate the problem.
async mounted() {
  // check again with new code.
  if (UserService.getFromStorage("token")) {
    console.log("user service found a token but what about store?")
    console.log(this.$store.state.token, this.$store.state.user_id);
    const values = await ["token", "user_id"].map(key => {return UserService.getFromStorage(key)});
    console.log({values});
    SocketService.trackSession(this, socket, "connect")
  }
}
BeforeMount hook:
isLoggedIn just checks that the "token" property is set in the store state.
return !!this.$store.state.token
beforeMount () {
  if (this.isLoggedIn) {
    // This runs sometimes??? 80% of the time.
    console.log("IS THIS CLAUSE RUNNING?");
  }
}
video explanation: https://www.loom.com/share/541ed2f77d3f46eeb5c2436f761442f4
OP's app is quite big from the looks of it, so finding the exact reason is rather difficult.
Meanwhile, setting ssr: false fixed the errors (a minimal config sketch follows).
It raised more, but those should probably be asked in another question nonetheless.
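For reference, the workaround lives in nuxt.config.js. A minimal sketch: with ssr: false the app renders client-side only, so the store's state() factory runs exclusively in the browser, where localStorage actually exists.

// nuxt.config.js
export default {
  ssr: false, // client-side rendering only; process.browser is always true
  // ...the rest of your existing config
}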

Firebase emulator always returns Error: 2 UNKNOWN after trying out the firestore background trigger functions?

This is what my Firestore (emulator) data looks like: (screenshot omitted)
I am trying to practice and learn about Cloud Functions with the Firebase emulator; however, I am running into this problem more often than I expected. I hope it is a problem on my end.
I am trying to write a function where, when the user makes an HTTPS request to create an order, a background trigger function returns the total (quantity * price) to the user. The latter part is still WIP at the moment; I am currently just trying to understand and learn more about Cloud Functions.
This is the HTTPS request code I have to add the item, price, and quantity to my Firestore. It works well, as intended.
exports.addCurrentOrder = functions.https.onRequest(async (req, res) => {
  const useruid = req.query.uid;
  const itemName = req.query.itemName;
  const itemPrice = req.query.itemPrice;
  const itemQuantity = req.query.itemQuantity;
  console.log('This is in useruid: ', useruid);
  const data = { [useruid] : {
    'Item Name': itemName,
    'Item Price': itemPrice,
    'Item Quantity': itemQuantity,
  }};
  const writeResult = await admin.firestore().collection('Current Orders').add(data);
  res.json({result: data});
});
This is the part that's giving me all sorts of errors:
exports.getTotal = functions.firestore.document('Current Orders/{documentId}').onCreate((snap, context) => {
  const data = snap.data();
  for (const i in data) {
    console.log('This is in i: ', i);
  }
  return snap.ref.set({'testing': 'testing'}, {merge: true});
});
Whenever I have this, the console will always give me:
functions: Error: 2 UNKNOWN:
    at Object.callErrorFromStatus (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/call.js:30:26)
    at Object.onReceiveStatus (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/client.js:175:52)
    at Object.onReceiveStatus (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:341:141)
    at Object.onReceiveStatus (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:304:181)
    at Http2CallStream.outputStatus (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/call-stream.js:116:74)
    at Http2CallStream.maybeOutputStatus (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/call-stream.js:155:22)
    at Http2CallStream.endCall (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/call-stream.js:141:18)
    at Http2CallStream.handleTrailers (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/call-stream.js:273:14)
    at ClientHttp2Stream.<anonymous> (/Users/user/firecast/functions/node_modules/@grpc/grpc-js/build/src/call-stream.js:322:26)
    at ClientHttp2Stream.emit (events.js:210:5)
Caused by: Error
    at WriteBatch.commit (/Users/user/firecast/functions/node_modules/@google-cloud/firestore/build/src/write-batch.js:415:23)
    at DocumentReference.create (/Users/user/firecast/functions/node_modules/@google-cloud/firestore/build/src/reference.js:283:14)
    at CollectionReference.add (/Users/user/firecast/functions/node_modules/@google-cloud/firestore/build/src/reference.js:2011:28)
    at /Users/user/firecast/functions/index.js:43:76
    at /usr/local/lib/node_modules/firebase-tools/lib/emulator/functionsEmulatorRuntime.js:593:20
    at /usr/local/lib/node_modules/firebase-tools/lib/emulator/functionsEmulatorRuntime.js:568:19
    at Generator.next (<anonymous>)
    at /usr/local/lib/node_modules/firebase-tools/lib/emulator/functionsEmulatorRuntime.js:8:71
    at new Promise (<anonymous>)
    at __awaiter (/usr/local/lib/node_modules/firebase-tools/lib/emulator/functionsEmulatorRuntime.js:4:12)
⚠ Your function was killed because it raised an unhandled error.
Even if I comment out the function that I think is causing the error, I still run into this problem (and when I run the sample function from the official Cloud Functions guide too!)
I know there is definitely something I am doing horribly wrong in the background trigger (and would love for someone to be kind enough to show me how to write such a function / get me started).
What am I doing wrong here? Or is this some sort of emulator bug?
I think I found it, although I think a better solution is possible.
On a completely new system + Firebase, I used the firebase emulators tutorial to create a first onCreate trigger called makeUppercase, and it worked. Then I added your getTotal and it was not working; moreover, it broke the makeUppercase trigger as well!
I started to test some changes. After many tries, I finally removed the space character from the collection name, like this:
exports.getTotal = functions.firestore.document('CurrentOrders/{documentId}')...etc.
Both triggers then started working (on a fresh VM with Windows + Node 12). It's possible that the space works on a real Firestore instance, but in the emulator it seems the space in the collection name breaks the whole index.js. The matching change on the write side is sketched below.
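To be explicit (a sketch based on the question's addCurrentOrder), the write side has to use the same space-free name, otherwise the trigger path above never matches:

// was: .collection('Current Orders')
const writeResult = await admin.firestore().collection('CurrentOrders').add(data);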

Firestore query reads all the documents again (from database) when only one is modified even with persistence enabled

Expected behavior: I assumed that when some field of a document is modified, the query would only return that document, so it would be counted as only 1 read. Say I query a collection with n documents: I would hope it costs n reads for the initial state, and if a document is modified later, only that document would be read again from the database, resulting in n+1 reads in total.
What I see
is n reads initially, and n reads later whenever one document is modified, regardless of whether offline persistence is enabled; resulting in 2n reads, all with "fromCache: false" metadata.
myCollection:
  doc1:
    value: "some string"
  ...
  docn:
    value: "some text"
Later I change the value of one document via the console:
myCollection:
  doc1:
    value: "changed to other value"
  ...
  docn:
    value: "some text"
But I get all n documents again in the snapshot of the query. Even though I can observe the changes using "s.docChanges", the snapshot seems to contain all n documents.
const someCollectionRef = firebase.firestore().collection("collection/document/subcollection");
someCollectionRef.onSnapshot(s => {
  var d = [];
  s.forEach(doc => {
    d.push(doc.data());
    // fromCache is always false, even when one document is
    // modified; it seems that all documents are read from db again
    console.log(`metadata is `, doc.metadata);
  });
  // d seems to have all my documents even when only 1 is modified
});
No difference with offline persistence enabled: I tested the same example with offline persistence enabled and saw no difference! Modifying a document still causes all the documents to be returned with "fromCache: false" metadata, i.e., as if they were read from the database. The only time data is read from the cache is when a query is refreshed and the documents are all identical.
I noticed this behavior and did some research: actually, all is well!
When you are listening to a collection of N documents, you perform N reads for the initial state, then 1 read when a document changes, as expected. This is described nicely in this YouTube video, and in the documentation for the billing aspect.
The trap is the fromCache metadata. When you receive an update for a document change, all the other, unchanged documents have fromCache: false, but they actually come from the cache, as described by a Firebase dev in this SO answer!
The fromCache metadata is true only in the scope of offline persistence, when the initial N reads were done from the local cache. Once a document has received data from the server, it sets fromCache: false and never goes back to fromCache: true. The naming is confusing, but it seems to be working as expected.
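A small sketch to see this yourself (reusing the someCollectionRef from the question above): docChanges() lists only the documents that actually changed in a given snapshot, so after the initial load you should see a single "modified" entry per edit, even though the snapshot itself still contains all n documents.

someCollectionRef.onSnapshot(snapshot => {
  snapshot.docChanges().forEach(change => {
    // "added" for every doc on the initial snapshot, then a single
    // "modified" (or "removed") entry per subsequent change
    console.log(change.type, change.doc.id, change.doc.metadata.fromCache);
  });
});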
And the code for testing:
<html>
<body>
  <script src="/__/firebase/7.22.1/firebase-app.js"></script>
  <script src="/__/firebase/7.23.0/firebase-firestore.js"></script>
  <script src="/__/firebase/init.js"></script>
  <script>
    firebase.firestore().enablePersistence().then(() => {
      const collectionRef = firebase.firestore().collection("/fruits");
      collectionRef.onSnapshot({includeMetadataChanges: true}, snapshot => {
        console.log("=================== snapshot received ===================");
        snapshot.forEach(doc => {
          console.log(`doc ${doc.id}, metadata:`, doc.metadata);
        });
      });
    });
  </script>
</body>
</html>
I modify a document using the Firebase console, and in my browser's console, when the long-polling request terminates, it shows that a single document was transferred:
15
[[14,["noop"]]]370
[[15,[{
"documentChange": {
"document": {
"name": "projects/xxx/databases/(default)/documents/fruits/apricot",
"fields": {
"color": {
"stringValue": "red"
}
},
"createTime": "2020-10-13T08:14:19.649748Z",
"updateTime": "2020-10-13T14:16:09.991161Z"
},
"targetIds": [
2
]
}
}
]]]122
[[16,[{
"targetChange": {
"resumeToken": "CgkI+bft8+Cx7AI=",
"readTime": "2020-10-13T14:16:09.991161Z"
}
}
]]]15
[[17,["noop"]]]

How to delete a large node in firebase

I have a Firebase child node with about 15,000,000 child objects with a total size of about 8 GB of data.
example data structure:
firebase.com/childNode/$pushKey
each $pushKey contains a small flat dictionary:
{a: 1.0, b: 2.0, c: 3.0}
I would like to delete this data as efficiently and easily as possible. How?
What I tried:
My first try was a put request:
PUT firebase.com/childNode.json?auth=FIRE_SECRET
data-raw: null
response: {
"error": "Data requested exceeds the maximum size that can be accessed with a single request. Contact support#firebase.com for help."
}
So that didn't work; let's try a request with a limit:
PUT firebase.com/childNode.json?auth=FIRE_SECRET&orderBy="$key"&limitToFirst=100
data-raw: null
response: {
"error": "Querying related parameters not supported on this request type"
}
No luck so far :( What about writing a script that gets the first X keys and then creates a PATCH request with each value set to null?
GET firebase.com/childNode.json?auth=FIRE_SECRET&shallow=true&orderBy="$key"&limitToLast=100
{
"error" : "Mixing 'shallow' and querying parameters is not supported"
}
It's really not going to be easy, this one? I could drop the shallow requirement, get the keys, and finish the script. I was just hoping there would be an easier / more efficient way.
Another thing I tried was to create a Node script that listens for child_added and then directly tries to remove those children:
ref.authWithCustomToken(AUTH_TOKEN, function(error, authData) {
  if (error) {console.log("Login Failed!", error)}
  if (!error) {console.log("Login Succeeded!", authData)}
  ref.child("childNode").on("child_added", function(snap) {
    console.log(`found: ${snap.key()}`)
    ref.child("childNode").child(snap.key()).remove(function(err) {
      if (!err) {console.log(`deleted: ${snap.key()}`)}
    })
  })
})
This script actually hangs right now, but earlier I did receive something like a maximum stack limit warning from Firebase. I know this is not a Firebase problem, but I don't see any particularly easy way to solve it.
Downloading a shallow tree will download only the keys, so instead of asking the server to order and limit, you can download all keys.
Then you can order and limit client-side, and send delete requests to Firebase in batches.
You can use this script for inspiration: https://gist.github.com/wilhuff/b78e7391396e09f6c614 (a rough sketch of the same loop follows).
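Here is a rough sketch of that batched delete with the Admin SDK (my adaptation, not the gist itself; since shallow=true is REST-only, this pages with limitToFirst instead, and the chunk size is a placeholder to tune):

import admin from 'firebase-admin';

async function deleteLargeNode(path, chunkSize = 1000) {
  const ref = admin.database().ref(path);
  for (;;) {
    // read a small batch of children (keys and their small values)
    const snap = await ref.limitToFirst(chunkSize).once('value');
    if (!snap.exists()) break;
    const updates = {};
    snap.forEach((child) => { updates[child.key] = null; });
    // one multi-path update deletes the whole batch
    await ref.update(updates);
  }
}

// usage: await deleteLargeNode('childNode');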
Use the Firebase CLI tool for this: firebase database:remove /childNode --project <project-id>
In the browser console, this is the fastest way:
database.ref('data').limitToFirst(10000).once('value', snap => {
  var updates = {};
  snap.forEach(snap => {
    updates[snap.key] = null;
  });
  // one multi-path update deletes this batch of up to 10000 children;
  // re-run the snippet until the node is empty
  database.ref('data').update(updates);
});
