Is there, or will there be, a modular Firebase Admin Database API with functions such as update(), get(), set(), ref(), etc.? Or maybe there is a workaround?
Otherwise I would have to write equivalent functions twice, e.g. a craftBuilding() on the server (Admin SDK) and one for the client.
I tried:
import { getDatabase, ref, set, update, child, onChildAdded } from 'firebase-admin/database';
and got the error:
The requested module 'firebase-admin/database' does not provide an export named 'ref'
I expected to be able to use the Firebase RTDB functions the same way as on the client side, so that I don't have to write identical functions twice.
The Admin SDK is not fully modular yet, but it does expose some entry points such as getDatabase() starting with v10:
import { initializeApp } from 'firebase-admin/app';
import { getDatabase } from 'firebase-admin/database';
// Initialize the Admin SDK, then grab the default Database instance
initializeApp();
const db = getDatabase();
// usage: the returned Database still exposes the namespaced API
const ref = db.ref('../../..')
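Until the database part of the Admin SDK goes fully modular, the namespaced Reference methods cover the same operations as the client-side modular functions. A rough mapping, continuing with the db instance from above (the path and values are just placeholders):
const userRef = db.ref('users/1');             // client: ref(db, 'users/1')
await userRef.set({ name: 'Ada' });            // client: set(userRef, value)
await userRef.update({ score: 10 });           // client: update(userRef, values)
const snapshot = await userRef.once('value');  // client: get(userRef)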
I would recommend using the latest version of the SDK and keeping the existing namespaced syntax, but taking advantage of the newer modular-style imports where they exist.
Since the Firebase Admin SDK is namespaced, ref() is available on the database instance once you have initialized the app, for example in a shared main file:
// yourMainFile.js
import admin from 'firebase-admin';
// Initialize the Admin SDK
admin.initializeApp();
export const db = admin.database();
Your own wrapper functions (see below) can then be consumed much like the client-side modular API:
import { ref, query } from '@path/yourAdminFunctions';
const dbReference = ref("your reference here");
const IDSnapFromQuery = query(dbReference, ["ID", "==", 10]);
You could modularize it yourself by creating functions for the operations you require in a separate file:
// yourAdminFunctions.js
import { db } from '@path/yourMainFile';
export function ref(databaseReference) {
  return db.ref(databaseReference);
}
// for query, pass in the db.ref() instance you're using
export function query(dbReference, queryParams) {
  // queryParams describes the filter, e.g. ['ID', '==', 10] in this example.
  // Note: you will have to translate queryParams into the orderByChild()/equalTo()
  // calls yourself; the filter below is hard-coded for this example.
  return new Promise((resolve) => {
    dbReference
      .orderByChild('ID')
      .equalTo(10)
      .once('child_added', (snapshot) => resolve(snapshot));
  });
}
// do the same for update, set etc
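For instance, set and update wrappers following the same pattern could look like this (a minimal sketch; pass in the db.ref() instance the same way as for ref() and query() above):
// thin wrappers over the namespaced Reference API, mirroring the client-side modular calls
export function set(dbReference, value) {
  return dbReference.set(value);
}
export function update(dbReference, values) {
  return dbReference.update(values);
}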
You can read more in the Admin SDK documentation.
I have a multi-layered application hosted entirely on GCP. At the moment we only have the back-end part; the front-end and the API are still to be developed. For the front-end the decision has been made: it will be a React.js app hosted on Firebase Hosting, the authentication method will be Email/Password, and users will be provisioned manually through the Firebase console UI.
As we'd like to have a re-usable middle layer (API), we're in the process of deciding what type of solution to use for it. The main requirement is that only logged-in users can call the API endpoints. Eventually there will also be a native/mobile application, which will also have to make authenticated requests to the API.
My question is: what type of GCP service is advisable here? I want it to be light, scalable and price-optimized. The preferred programming language would be C#, but Node.js would also be acceptable.
Firebase Functions would work well for this authenticated API. With a function, you can simply check for the existence of context.auth.uid before proceeding with the API call.
https://firebase.google.com/docs/functions/callable
You'll want to use the .onCall() method to access this context.auth object.
Here's an example I took from one of my active Firebase projects which uses this concept:
Inside your functions/src folder, create a new function file, doAuthenticatedThing.ts:
/**
* A Firebase Function that can be called from your React Firebase client UI
*/
import * as functions from 'firebase-functions';
import { initializeApp } from 'firebase/app';
import { connectFirestoreEmulator, getFirestore, getDocs, query, where, collection } from 'firebase/firestore';
import firebaseConfig from './firebase-config.json';
let isEmulator = false;
const doAuthenticatedThing = functions
.region('us-west1')
.runWith({
enforceAppCheck: true,
memory: '256MB',
})
.https.onCall(async (_data, context) => {
// disable if you don't use app-check verify (you probably should)
if (context.app == undefined) {
throw new functions.https.HttpsError(
'failed-precondition',
'The function must be called from an App Check verified app.',
);
}
// checks for a firebase authenticated frontend user
if (context.auth == undefined) {
throw new functions.https.HttpsError(
'failed-precondition',
'The user must be authenticated.',
);
}
// establish firestore db for queries
const app = initializeApp(firebaseConfig);
const db = getFirestore(app);
// start the emulator
if (process.env.MODE === 'development' && !isEmulator) {
connectFirestoreEmulator(db, '127.0.0.1', 6060);
isEmulator = true;
}
// obtain the user's firebase auth UID
const uuid = context?.auth?.uid as string;
// do some database stuff
const ref = collection(db, 'collection-name');
// the 'uid' field name is an assumption -- match documents owned by this user
const q = query(ref, where('uid', '==', uuid));
const results = await getDocs(q);
if (results.empty) {
throw new functions.https.HttpsError(
'internal',
'There were no results found!',
);
}
// prepare document data
const data: Array<any> = [];
// gather each matching document's id and data
results.forEach((d) => {
data.push({ id: d.id, data: d.data() });
});
return data;
});
export default doAuthenticatedThing;
Make sure to reference this new Firebase Function in the functions/src/index.ts file.
import doAuthenticatedThingFn from './doAuthenticatedThing';
export const doAuthenticatedThing = doAuthenticatedThingFn;
Create a frontend React Hook so any component can use any function you make. Call it useGetFunction.ts
import { getApp } from 'firebase/app';
import { getFunctions, HttpsCallable, httpsCallable } from '@firebase/functions';
const useGetFunction = (functionName: string): HttpsCallable<unknown, unknown> => {
const app = getApp();
const region = 'us-west1';
const functions = getFunctions(app, region);
return httpsCallable(functions, functionName);
};
export default useGetFunction;
Now you can simply get this function and use it in any React component:
const SomeComponent = () => {
const doAuthenticatedThing = useGetFunction('doAuthenticatedThing');
useEffect(() => {
(async () => {
const results = await doAuthenticatedThing();
})();
}, []);
};
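If the function throws one of the HttpsError cases above, the callable rejects on the client side; here is a minimal sketch of handling that, reusing the hook and function name from the snippets above (the component name is made up):
const SomeComponentWithErrorHandling = () => {
  const doAuthenticatedThing = useGetFunction('doAuthenticatedThing');
  useEffect(() => {
    (async () => {
      try {
        const { data } = await doAuthenticatedThing();
        console.log('documents:', data);
      } catch (err) {
        // err.code / err.message carry the HttpsError details thrown by the function,
        // e.g. when the caller is not authenticated or App Check verification fails
        console.error('callable failed:', err);
      }
    })();
  }, [doAuthenticatedThing]);
};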
I am having some issues connecting my firebase storage with my google action. I need to be able to "download" the json files inside in order to be able to read and pick out what a user may need given data that they provide when they call the action.
Below is the code that I currently have, compiled from the different APIs and other Stack Overflow questions I have found.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const Firestore = require('@google-cloud/firestore');
const firestore = new Firestore();
var storage = require('@google-cloud/storage');
const gcs = storage({projectId: 'aur-healthcare-group'});
const bucket = gcs.bucket('gs://aur-healthcare-group');
admin.storage().bucket().file('aur-healthcare-group/aur_members.json').download(function(errr, contents){
if(!err){
var jsObjext = JSON.parse(contents.toString('utf8'));
}
});
The current error I am receiving is: "code":3, "message":"Function failed on loading user code. This is likely due to a bug in the user code. Error message: Error: please examine your function logs to see the error cause." When I check the logs I only get the above-mentioned message again.
I believe that I am not accessing my firebase storage correctly and have trouble finding a good resource on how to access this correctly. Would somebody be able to give me an example of how to access the storage correctly so I will be able to apply it to my project?
Since you're running in Firebase Functions, you shouldn't need to require the @google-cloud/storage dependency directly. Rather, you can get the correctly authenticated storage component via admin.storage().
Following that, you shouldn't download the file to your function, as you would be better off reading directly into memory via a readStream.
With regards to your existing code error, it may be because you're checking if (!err) when the callback variable is errr.
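For reference, here is that same call with the callback argument named consistently (this assumes admin.initializeApp() was called with a default storageBucket; it is still not the approach I would recommend):
admin.storage().bucket().file('aur-healthcare-group/aur_members.json')
  .download(function(err, contents) {
    if (!err) {
      // contents is a Buffer with the file data
      var jsObject = JSON.parse(contents.toString('utf8'));
    }
  });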
I've done this in the past and here's a code snippet of how I achieved it. It's written in Typescript specifically, but I think you should be able to port it to JS if you're using that directly.
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import { Bucket } from '@google-cloud/storage';
admin.initializeApp();
const db = admin.firestore();
const bucket = admin.storage().bucket('project-id.appspot.com'); // Use your project-id here.
const readFile = async (bucket: Bucket, fileName: string) => {
  const stream = bucket.file(fileName).createReadStream();
  return new Promise((resolve, reject) => {
    let buffer = '';
    stream
      .on('data', (d: Buffer) => {
        buffer += d.toString();
      })
      .on('end', () => {
        resolve(buffer);
      })
      .on('error', (err: Error) => reject(err));
  });
};
// `app` is your conversational Actions app instance (e.g. from @assistant/conversation)
app.handle('my-intent-handler', async (conv) => {
  const contents = await readFile(bucket, 'filename.txt');
  conv.add(`Your content is ${contents}`);
});
exports.fulfillment = functions.https.onRequest(app);
I was trying to figure out how I could optimize cold start times for my firebase functions. After reading this article, I wanted to try it out but I realized that the article specifically targets the base usage of the http onRequest function and doesn't give an example using express.
A similar question popped up here, but there doesn't seem to be a clear answer. I saw that Doug, the author of the article, actually commented on the question and mentioned creating a dynamic import for each route in the app, since onRequest() only allows passing the app as its only argument; but I didn't understand exactly what he meant by that, other than using the base API without the Express app. Ideally I'd be able to use Express so I can have finer control over the API URL paths and use some of the utility Express offers.
Can anyone give me an example of how to use Express with Doug's example? Even if I have to define a new Express app for each route, I'm okay with that. I just don't see how to configure it that way.
EDIT: To be clear, the goal is to optimize cold starts across all function invocations, not just the http routed ones. From my understanding, Doug's example eliminates the imports being preloaded with single routes declared using onRequest, but it doesn't show how that is possible when defining routes through express.
Assuming each router you split out is defined in its own file like so:
// $FUNCTIONS_DIR/routes/some-route-handler.js
import express from "express";
const router = express.Router();
/* ... define routes ... */
export default router;
You could then use this middleware to load each route handler module only when it's needed.
function lazyRouterModule(modulePath) {
  return async (req, res, next) => {
    let router;
    try {
      // dynamically import the router module on first use
      router = (await import(modulePath)).default;
    } catch (err) {
      // error loading module, let next() handle it
      next(err);
      return;
    }
    router(req, res, next);
  };
}
In your sub-function file, you'd use that middleware to create your express app and connect the routes.
// $FUNCTIONS_DIR/fn/my-express.js
import express from "express";
const app = express();
app.use('/api', lazyRouterModule('./routes/api.js'));
app.use('/profiles', lazyRouterModule('./routes/profiles.js'));
export default app;
Then in your main functions file, you'd connect up your subfunction files on-demand:
// $FUNCTIONS_DIR/index.js
import * as functions from 'firebase-functions'
export const myExpress = functions.https
.onRequest(async (request, response) => {
await (await import('./fn/my-express.js')).default(request, response)
});
export const newUserData = functions.firestore.document('/users/{userId}')
.onCreate(async (snap, context) => {
await (await import('./fn/new-user-data.js')).default(snap, context)
});
When lazy-loading modules like this, you will want to lazy-load firebase-admin from a common file so you don't end up calling initializeApp() multiple times.
// $FUNCTIONS_DIR/common/firebase-admin.js
import * as admin from "firebase-admin";
admin.initializeApp();
export = admin;
In any function that wants to use "firebase-admin", you'd import it from here using:
// $FUNCTIONS_DIR/fn/some-function.js OR $FUNCTIONS_DIR/routes/some-route-handler.js
import * as admin from "../common/firebase-admin";
// use admin as normal, it's already initialized
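For instance, a lazily loaded route handler that uses the shared admin module might look like this (a sketch; the profiles collection and route are made-up examples):
// $FUNCTIONS_DIR/routes/profiles.js (hypothetical example)
import express from "express";
import admin from "../common/firebase-admin";
const router = express.Router();
// GET /profiles/:id -- reads a Firestore document; this module (and firebase-admin)
// is only loaded once a request actually hits the /profiles routes
router.get("/:id", async (req, res) => {
  const snap = await admin.firestore().doc(`profiles/${req.params.id}`).get();
  if (!snap.exists) {
    res.status(404).send("Not found");
    return;
  }
  res.json(snap.data());
});
export default router;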
Is there a way to import CSV or JSON to firebase cloud firestore like in firebase realtime database?
General Solution
I've found many takes on a script for uploading a JSON file, but none of them allowed sub-collections. The script below handles any level of nesting and sub-collections, and it also handles the case where a document has its own data and sub-collections. It is based on the assumption that a collection is an object/array of objects (an empty object or array is not treated as a collection).
To run the script, make sure you have npm and node installed, then run it as node <name of the file>. Note that there is no need to deploy it as a Cloud Function.
const admin = require('../functions/node_modules/firebase-admin');
const serviceAccount = require("./service-key.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://<your-database-name>.firebaseio.com"
});
const data = require("./fakedb.json");
/**
 * Data is a collection if:
 * - it sits at an odd depth, and
 * - it is non-empty and contains only objects.
*/
function isCollection(data, path, depth) {
if (
typeof data != 'object' ||
data == null ||
data.length === 0 ||
isEmpty(data)
) {
return false;
}
for (const key in data) {
if (typeof data[key] != 'object' || data[key] == null) {
// If there is at least one non-object item in the data then it cannot be a collection.
return false;
}
}
return true;
}
// Checks if object is empty.
function isEmpty(obj) {
for(const key in obj) {
if(obj.hasOwnProperty(key)) {
return false;
}
}
return true;
}
async function upload(data, path) {
return await admin.firestore()
.doc(path.join('/'))
.set(data)
.then(() => console.log(`Document ${path.join('/')} uploaded.`))
.catch(() => console.error(`Could not write document ${path.join('/')}.`));
}
/**
 * Recursively walks the data: uploads each document's own fields and
 * recurses into anything identified as a (sub-)collection.
 */
async function resolve(data, path = []) {
if (path.length > 0 && path.length % 2 == 0) {
// A document's path length is always even; however, one of its keys can actually be a collection.
// Copy an object.
const documentData = Object.assign({}, data);
for (const key in data) {
// Resolve each collection and remove it from document data.
if (isCollection(data[key], [...path, key])) {
// Remove a collection from the document data.
delete documentData[key];
// Resolve a collection.
resolve(data[key], [...path, key]);
}
}
// If document is empty then it means it only consisted of collections.
if (!isEmpty(documentData)) {
// Upload a document free of collections.
await upload(documentData, path);
}
} else {
// A collection's path length is always odd.
for (const key in data) {
// Resolve each collection.
await resolve(data[key], [...path, key]);
}
}
}
resolve(data);
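As an illustration, the script would turn input like the following (a made-up fakedb.json; all names and values are hypothetical) into a users collection whose alice document carries an orders sub-collection:
{
  "users": {
    "alice": {
      "age": 30,
      "orders": {
        "order-1": { "total": 9.99 }
      }
    },
    "bob": {
      "age": 25
    }
  }
}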
You need a custom script to do that.
I wrote one based on the Firebase Admin SDK, since the firebase library does not allow you to import nested arrays of data.
const admin = require('./node_modules/firebase-admin');
const serviceAccount = require("./service-key.json");
const data = require("./data.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://YOUR_DB.firebaseio.com"
});
data && Object.keys(data).forEach(key => {
const nestedContent = data[key];
if (typeof nestedContent === "object") {
Object.keys(nestedContent).forEach(docTitle => {
admin.firestore()
.collection(key)
.doc(docTitle)
.set(nestedContent[docTitle])
.then((res) => {
console.log("Document successfully written!");
})
.catch((error) => {
console.error("Error writing document: ", error);
});
});
}
});
Update: I wrote an article on this topic - Filling Firestore with data
There is not, you'll need to write your own script at this time.
I used the General Solution provided by Maciej Caputa. Thank you (:
Here are a few hints, assuming you have an Ionic Firebase application with the required Firebase node modules installed in the functions folder inside that solution (a standard Ionic Firebase install). I created an import folder at the same level to hold the script and the data.
Folder Hierarchy
myIonicApp
  functions
    node_modules
      firebase-admin
ImportFolder
  script.js
  FirebaseIonicTest-a1b2c3d4e5.json
  fileToImport.json
Script Parameters
const admin = require('../myIonicApp/functions/node_modules/firebase-admin'); //path to firebase-admin module
const serviceAccount = require("./FirebaseTest-xxxxxxxxxx.json"); //service account key file
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://fir-test-xxxxxx.firebaseio.com" //Your domain from the hosting tab
});
Creating the Service Account Key File
In the Firebase console for your project, next to the Project Overview item, click on the gear icon and select Users and permissions.
At the bottom of the screen, click Advanced permission settings.
This opens another tab for the Google Cloud Platform Console.
On the left, select the Service Accounts item.
Create a key for an existing service account; I simply added a key to the App Engine default service account.
The Create key function will offer to download the key to a JSON file.
JSON Data Structure
To use the script provided, the data structure must be as follows:
{
  "myCollection" : {
    "UniqueKey1" : {
      "field1" : "foo",
      "field2" : "bar"
    },
    "UniqueKey2" : {
      "field1" : "fog",
      "field2" : "buzz"
    }
  }
}
For reference, I wrote a function that helps to import and export data in Firestore:
https://github.com/dalenguyen/firestore-import-export (archived)
Updated 2023:
Use this package instead. Thanks, @Webber:
https://github.com/dalenguyen/firestore-backup-restore
https://gist.github.com/JoeRoddy/1c706b77ca676bfc0983880f6e9aa8c8
This should work for an object of objects (generally how old firebase json is set up). You can add that code to an app that's already configured with Firestore.
Just make sure you have it pointing to the correct JSON file.
Good luck!
This workaround in Python may help some people. First convert the JSON or CSV to a DataFrame using pandas, then convert the DataFrame to a dictionary and upload the dictionary to Firestore.
import firebase_admin as fb
from firebase_admin import firestore
import pandas as pd
cred = fb.credentials.Certificate('my_credentials_certificate.json')
default_app = fb.initialize_app(cred)
db = firestore.client()
df = pd.read_csv('data.csv')
# to_dict(orient='records') returns a list of row dicts;
# Firestore documents must be dicts, so wrap the list in a single field
records = df.to_dict(orient='records')
my_doc_ref = db.collection('my_collection').document('my_document')
my_doc_ref.set({'rows': records})
There could be similar workarounds in javascript and other languages using libraries similar to Pandas.
No, as of now you can't. Firestore structures data in a different format: it uses collections, and each collection holds a series of documents, which are then stored in a JSON-like format. In the future they might make a tool to convert JSON into Firestore. For reference, check this out:
https://cloud.google.com/firestore/docs/concepts/structure-data
EDIT:
You can automate the process to some extent, i.e. write a small piece of software that simply pushes the fields of your CSV or JSON data into your Cloud Firestore DB. I migrated my whole database by making a simple app that retrieved my DB and pushed it to Firestore:
var admin = require('firebase-admin');
var serviceAccount = require('./serviceAccountKey.json');
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: 'https://csvread-d149c.firebaseio.com'
});
const csv = require('csv-parser');
const fs = require('fs');
const firestore = admin.firestore();
// CSV FILE data.csv
// Name,Surname,Age,Gender
// John,Snow,26,M
// Clair,White,33,F
// Fancy,Brown,78,F
fs.createReadStream('data.csv')
.pipe(csv())
.on('data', (row) => {
console.log(row);
if(row) {
firestore.collection('csv').add({
name: row.Name,
surname: row.Surname,
age: row.Age,
sex: row.Gender
});
}
else {
console.log('No data')
}
})
.on('end', () => {
console.log('CSV file successfully processed');
});
There is now a paid GUI tool that does this, Firefoo. I haven't used it, but it seems quite powerful for managing Firestore, including importing CSV files.
This is a small modification that uses the document's 'id' field, if it exists, as its path segment; otherwise it falls back to the for loop's key.
...
...
} else {
  // A collection's path length is always odd.
  for (const key in data) {
    // Resolve each collection, using the document's own 'id' field for the path if it exists.
    if (data[key]['id'] !== undefined)
      await resolve(data[key], [...path, data[key]['id']]);
    else
      await resolve(data[key], [...path, key]);
}
}
}
resolve(data);
1 - Importing only collections
If some of your keys consist only of numbers (for example, array indexes from the JSON), you can take the document name from the item's id field instead, as follows.
...
...
} else {
  // A collection's path length is always odd.
  for (const key in data) {
    // Resolve each collection.
    // If the key is only a number it cannot be used as the document name;
    // in that case the id comes from the document's own 'id' field (or generate one randomly).
    let id = !isNaN(key) ? data[key].id : key;
    await resolve(data[key], [...path, id]);
}
}
}
2 - Import collections and sub-collections
In the same way as in the example above, names inside the sub-collection cannot consist only of numbers.
...
...
for (const key in data) {
// Resolve each collection and remove it from document data.
if (isCollection(data[key], [...path, key])) {
// Remove a collection from the document data.
delete documentData[key];
      // If the key is only a number it cannot be used as the document name;
      // in that case the id comes from the document's own 'id' field (or generate one randomly).
      let id = !isNaN(key) ? data[key].id : key;
      // Resolve a collection.
resolve(data[key], [...path, id]);
}
}
...
...
Note: these are changes to @Maciej Caputa's code.
You can use this library: https://www.npmjs.com/package/csv-to-firestore
Install it with: npm i -g csv-to-firestore
Put this config script inside your ./scripts folder:
module.exports = {
path: "./scripts/palabrasOutput.csv",
firebase: {
credential: "./scripts/serviceAccount.json",
collection: "collectionInFirestore",
},
mapper: (dataFromCSV) => {
return dataFromCSV;
},
};
and run:
csv-to-firestore --config ./scripts/csvToFirestore.js
By the way: generate a new private key (from the Service accounts section of the Firebase console), put that file in the /scripts folder, and rename it to serviceAccount.json.
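The mapper in the config appears to receive the parsed CSV row and return the object that is written to Firestore, so it is a natural place to rename or cast fields. A sketch (the column names here are made up, and the per-row behaviour is an assumption about how the package applies the mapper):
// scripts/csvToFirestore.js -- hypothetical mapper that renames and casts fields
module.exports = {
  path: "./scripts/palabrasOutput.csv",
  firebase: {
    credential: "./scripts/serviceAccount.json",
    collection: "collectionInFirestore",
  },
  mapper: (dataFromCSV) => {
    // assumed CSV headers "word" and "count"; adjust to your file
    return {
      word: dataFromCSV.word,
      count: Number(dataFromCSV.count),
    };
  },
};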