I am currently facing the following situation.
I am sending Firebase messages via HTTP calls to the Google API endpoint:
https://fcm.googleapis.com/v1/projects/projectName/messages:send
This endpoint requires OAuth 2.0 with a valid Bearer token, as discussed in this question:
What Bearer token should I be using for Firebase Cloud Messaging testing?
After following those steps I was able to send Firebase messages via the Google API.
Now I would like to obtain the Bearer token via an HTTP call, without the manual step in the OAuth 2.0 Playground https://developers.google.com/oauthplayground.
I cannot find any documentation on how to "Exchange authorization code for tokens" via a simple HTTP call. I also cannot implement any code, because I want to send the Firebase messages from inside a "Cloud Flow", so I have no way to load an external DLL (such as the Firebase Admin DLL, which implements this functionality).
Any help is appreciated.
The code below is a Postman Pre-Request Script installed on the API collection that contains the routes you are testing. Its purpose is to convert static credentials, such as an email-password combination or a service account key, into an access token to be used with API calls.
Emulate User
To use it for testing on behalf of users, you would add an X-Auth-Token-Type: user header to the request (used and removed by the script below), and you will need to have set up the following environment variables:
firebase_apiKey: The Firebase API key for a web application
firebase_test_user: An email for an account used for testing
firebase_test_password: A password for an account used for testing
Emulate Service Account (Use with caution!)
To use it for testing on behalf of a service account, you would add an X-Auth-Token-Type: admin header to the request (used and removed by the script below), and you will need to have set up the following environment variables:
firebase_privateKey: The value of private_key in a Service Account key. Important: for security, do not set the "initial value" for this variable!
firebase_projectId: The Firebase project ID (read by the script when building the JWT claims)
firebase_scope (optional): A space-delimited list of scopes to authenticate for. Note: if omitted, the default Admin SDK scopes are used
The Pre-Request Script
const { Header, Response, HeaderList } = require('postman-collection');
/**
* Information about the current Firebase user
* @typedef {Object} UserInfo
* @property {String} accessToken - The Firebase ID token for this user
* @property {String | undefined} displayName - Display name of the user, if available
* @property {Number} expiresAt - When this token expires as a unix timestamp
* @property {String | undefined} email - Email associated with the user, if available
* @property {String} refreshToken - Refresh token for this user's ID token
* @property {String} uid - User ID for this user
*/
/**
* Loads a third-party JavaScript module from a CDN (e.g. unpkg, jsDelivr)
* @param {[String, String, String]} moduleTuple - Array containing the module's ID, its source URL and an optional SHA256 signature
* @param {Object | (err: any, exports: any) => any} exportsRefOrCallback - Object reference to use as `exports` for the module or a result handler callback
* @param {(err: any, exports: any) => any} callback - result handler callback
*/
function loadModule(moduleTuple, exportsRefOrCallback, callback = undefined) {
const exports = arguments.length == 2 ? {} : exportsRefOrCallback;
callback = arguments.length == 2 ? exportsRefOrCallback : callback;
const [id, src, signature] = moduleTuple;
if (pm.environment.has("jslibcache_" + id)) {
const script = pm.environment.get("jslibcache_" + id);
if (signature && signature === CryptoJS.SHA256(script).toString()) {
console.log("Using cached copy of " + src);
try {
eval(script);
return callback(null, exports);
} catch {}
}
}
pm.sendRequest(src, (err, response) => {
try {
if (err || response.code !== 200) {
pm.expect.fail('Could not load external library');
}
const script = response.text();
signature && pm.expect(CryptoJS.SHA256(script).toString(), 'External library (' + id + ') has a bad SHA256 signature').to.equal(signature);
pm.environment.set("jslibcache_" + id, script);
eval(script);
callback(null, exports);
} catch (err) {
callback(err, null);
}
});
}
/**
* Signs in a test user using an email and password combination
*
* @param {String} email email of the account to sign in with
* @param {String} password password of the account to sign in with
* @param {(error: any, response: Response) => any} callback request result handler
*/
function signInWithEmailAndPassword(email, password, callback) {
pm.sendRequest({
url: "https://www.googleapis.com/identitytoolkit/v3/relyingparty/verifyPassword?key=" + encodeURIComponent(pm.environment.get("firebase_apiKey")),
body: JSON.stringify({ email, password, "returnSecureToken": true }),
headers: new HeaderList({}, [new Header("application/json", "Content-Type")]),
method: "POST"
}, callback);
}
/**
* Builds an Admin SDK compatible JWT using a Service Account key
*
* Required Environment Variables:
* - `firebase_privateKey` - the private key from inside a service account key JSON file
*
* Environment Variables:
* - `firebase_scope` - scopes used for the access token, space delimited
*
* @param {Boolean | (error: any, idToken: String) => any} callbackOrForceRefresh token result handler or `true` to force using a fresh user token
* @param {(error: any, idToken: String) => any} [callback] token result handler
*/
function getAdminToken(callbackOrForceRefresh, callback) {
let forceRefresh = Boolean(callbackOrForceRefresh);
if (arguments.length === 1) {
callback = callbackOrForceRefresh;
forceRefresh = callbackOrForceRefresh = false;
}
loadModule(
["jsrsasign", "https://unpkg.com/jsrsasign#10.3.0/lib/jsrsasign.js", "39b7a00e9eed7d20b2e60fff0775697ff43160e02e5276868ae8780295598fd3"],
(loadErr, { KJUR }) => {
if (loadErr) return callback(loadErr, null);
const exp = pm.environment.get("currentAdmin.exp");
const nowSecs = Math.floor(Date.now() / 1000);
if (exp && exp > nowSecs && forceRefresh === false) {
return callback(null, pm.environment.get("currentAdmin.jwt"));
}
try {
if (!pm.environment.has('firebase_privateKey')) {
pm.expect.fail('Missing required environment variable "firebase_privateKey".');
}
// use specified scopes, or fallback to Admin SDK defaults
const scope = pm.environment.get('firebase_scope') || 'https://www.googleapis.com/auth/cloud-platform https://www.googleapis.com/auth/firebase.database https://www.googleapis.com/auth/firebase.messaging https://www.googleapis.com/auth/identitytoolkit https://www.googleapis.com/auth/userinfo.email';
const privateKey = String(pm.environment.get('firebase_privateKey')).replace(/\\n/g, "\n"); // replace all escaped newlines
const header = {"alg" : "RS256", "typ" : "JWT"};
const claimSet =
{
"iss": "https://securetoken.google.com/" + pm.environment.get("firebase_projectId"),
"scope": scope,
"aud":"https://accounts.google.com/o/oauth2/auth",
"exp": nowSecs + 3600, // now + 1 hour
"iat": nowSecs
}
const jwt = KJUR.jws.JWS.sign(null, header, claimSet, privateKey);
// comment these lines out to disable caching
pm.environment.set("currentAdmin.jwt", jwt);
pm.environment.set("currentAdmin.exp", claimSet.exp);
callback(null, jwt);
} catch (err) {
callback(err, null);
}
}
);
}
/**
* Builds a User ID Token using an email-password combo
*
* Required Environment Variables:
* - `firebase_apiKey` - the Firebase API key for a web application
* - `firebase_test_user` - an email for a test user
* - `firebase_test_password` - the password for the test user
*
* @param {Boolean | (error: any, idToken: String) => any} callbackOrForceRefresh token result handler or `true` to force using a fresh user token
* @param {(error: any, idToken: String) => any} [callback] token result handler
*/
function getIdToken(callbackOrForceRefresh, callback) {
let forceRefresh = Boolean(callbackOrForceRefresh);
if (arguments.length === 1) {
callback = callbackOrForceRefresh;
forceRefresh = callbackOrForceRefresh = false;
}
if (pm.environment.has("currentUser") && forceRefresh === false) {
/** @type UserInfo */
const currentUser = JSON.parse(pm.environment.get("currentUser"));
if (currentUser.expiresAt > Date.now()) { // has token expired?
return callback(null, currentUser.accessToken);
}
}
try {
if (!pm.environment.has('firebase_apiKey')) {
pm.expect.fail('Missing required environment variable "firebase_apiKey".');
}
if (!pm.environment.has('firebase_test_user')) {
pm.expect.fail('Missing required environment variable "firebase_test_user".');
}
if (!pm.environment.has('firebase_test_password')) {
pm.expect.fail('Missing required environment variable "firebase_test_password".');
}
} catch (err) {
return callback(err, null);
}
signInWithEmailAndPassword(pm.environment.get("firebase_test_user"), pm.environment.get("firebase_test_password"), (err, response) => {
if (err || response.code !== 200) {
pm.expect.fail('Could not sign in user: ' + response.json().error.message);
}
/** @type String */
let accessToken;
try {
const { idToken, refreshToken, email, displayName, localId: uid, expiresIn } = response.json();
accessToken = idToken;
const expiresAt = Date.now() + Number(expiresIn) * 1000; // expiresIn is in seconds
// comment these lines out to disable caching
pm.environment.set("currentUser", JSON.stringify({ accessToken, refreshToken, email, displayName, uid, expiresAt }));
// pm.environment.set("currentUser.accessToken", accessToken);
// pm.environment.set("currentUser.refreshToken", refreshToken);
// pm.environment.set("currentUser.email", email);
// pm.environment.set("currentUser.displayName", displayName);
// pm.environment.set("currentUser.uid", uid);
// pm.environment.set("currentUser.expiresAt", expiresAt);
} catch (err) {
return callback(err, null);
}
callback(null, accessToken);
});
}
const tokenTypeHeader = pm.request.headers.one("X-Auth-Token-Type");
pm.request.removeHeader("X-Auth-Token-Type");
switch (tokenTypeHeader && tokenTypeHeader.value.toLowerCase()) {
case "admin":
getAdminToken(false, (err, token) => {
if (err || !token) pm.expect.fail("failed to get admin SDK token for request: " + err.message);
pm.request.addHeader(new Header("Bearer " + token, "Authorization"));
});
break;
case "user":
getIdToken(false, (err, idToken) => {
if (err || !idToken) pm.expect.fail("failed to get user ID token for request: " + err.message);
pm.request.addHeader(new Header("Bearer " + idToken, "Authorization"));
});
break;
default:
break; // no auth, do nothing
}
You can obtain a valid Bearer token by minting an OAuth access token with your Firebase service account, using the Service Account credentials from your Firebase console. If it is at all possible within your environment, I suggest using the OAuth 2 options that you can find here: https://firebase.google.com/docs/database/rest/auth#authenticate_with_an_access_token
Otherwise, you will have to mint an access token from the credentials yourself; that access token is a valid Bearer token.
It should be noted that this is only available in the following languages:
node.js
python
java
https://firebase.google.com/docs/cloud-messaging/auth-server#use-credentials-to-mint-access-tokens
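If you do need to stay with plain HTTP calls, the flow those libraries perform can be reproduced in two steps: sign a JWT with the service account's private key, then exchange it at Google's OAuth 2.0 token endpoint. The following is only a rough Node.js sketch of that standard service-account (JWT bearer) flow, not code from the linked docs; the jsonwebtoken dependency, the key file path and the function name are assumptions, and the RS256 signing step still has to happen somewhere.

// Minimal sketch (assumptions noted above): mint an OAuth2 access token for the
// FCM v1 API using Google's standard service-account JWT-bearer flow over HTTP.
// Requires Node 18+ (global fetch) and the "jsonwebtoken" npm package.
import { readFileSync } from "fs";
import jwt from "jsonwebtoken";

const serviceAccount = JSON.parse(readFileSync("./service-account.json", "utf8")); // hypothetical key file path

async function getFcmAccessToken(): Promise<string> {
  const nowSecs = Math.floor(Date.now() / 1000);
  // Self-signed JWT: iss = service account email, aud = Google's token endpoint
  const assertion = jwt.sign(
    {
      iss: serviceAccount.client_email,
      scope: "https://www.googleapis.com/auth/firebase.messaging",
      aud: "https://oauth2.googleapis.com/token",
      iat: nowSecs,
      exp: nowSecs + 3600,
    },
    serviceAccount.private_key,
    { algorithm: "RS256" }
  );
  // Exchange the signed JWT for an access token via a plain HTTP POST
  const response = await fetch("https://oauth2.googleapis.com/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "urn:ietf:params:oauth:grant-type:jwt-bearer",
      assertion,
    }),
  });
  const { access_token } = await response.json();
  return access_token; // use as "Authorization: Bearer <access_token>" against fcm.googleapis.com
}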
I am trying to receive notifications in an Expo React Native App.
The notifications will be sent using Azure Notification Hub REST API
I followed the steps below:
Added the Android project in Firebase Console
To get the Server Key I followed - Firebase messaging, where to get Server Key?
Configured the FCM ServerKey in Azure Notification Hub
Added the google-services.json at the root in my React Native App and modified app.json as mentioned in - https://docs.expo.dev/push-notifications/using-fcm/
To register in ANH, we first need the SAS Token - https://learn.microsoft.com/en-us/rest/api/notificationhubs/common-concepts I generated the token with the following code
const Crypto = require('crypto-js');
const resourceURI =
'http://myNotifHubNameSpace.servicebus.windows.net/myNotifHubName ';
const sasKeyName = 'DefaultListenSharedAccessSignature';
const sasKeyValue = 'xxxxxxxxxxxx';
const expiresInMins = 200;
let sasToken;
let location;
let registrationID;
let deviceToken;
function getSASToken(targetUri, sharedKey, ruleId, expiresInMins) {
targetUri = encodeURIComponent(targetUri.toLowerCase()).toLowerCase();
// Set expiration in seconds
var expireOnDate = new Date();
expireOnDate.setMinutes(expireOnDate.getMinutes() + expiresInMins);
var expires =
Date.UTC(
expireOnDate.getUTCFullYear(),
expireOnDate.getUTCMonth(),
expireOnDate.getUTCDate(),
expireOnDate.getUTCHours(),
expireOnDate.getUTCMinutes(),
expireOnDate.getUTCSeconds()
) / 1000;
var tosign = targetUri + '\n' + expires;
// using CryptoJS
//var signature = CryptoJS.HmacSHA256(tosign, sharedKey);
var signature = Crypto.HmacSHA256(tosign, sharedKey);
var base64signature = signature.toString(Crypto.enc.Base64);
//var base64signature = signature.toString(CryptoJS.enc.Base64);
var base64UriEncoded = encodeURIComponent(base64signature);
// construct authorization string
var token =
'SharedAccessSignature sr=' +
targetUri +
'&sig=' +
base64UriEncoded +
'&se=' +
expires +
'&skn=' +
ruleId;
console.log('signature:' + token);
return token;
}
I then called the create registration API - https://learn.microsoft.com/en-us/rest/api/notificationhubs/create-registration-id
The registrationID has to be extracted from the response header of the API Call
I used the following code to generate the ANH registration ID
async function createRegistrationId() {
const endpoint =
'https://xxxxxx.servicebus.windows.net/xxxxxxx/registrationIDs/?api-version=2015-01';
sasToken = getSASToken(resourceURI, sasKeyValue, sasKeyName, expiresInMins);
const headers = {
Authorization: sasToken,
};
const options = {
method: 'POST',
headers: headers,
};
const response = await fetch(endpoint, options);
if (response.status !== 201) {
console.log(
'Unable to create registration ID. Status Code: ' + response.status
);
}
console.log('Response Object : ', response);
for (var pair of response.headers.entries()) {
//console.log(pair[0] + ': ' + pair[1]);
}
location = response.headers.get('Location');
console.log('Location - ' + location);
console.log('Type - ' + response.type);
registrationID = location.substring(
location.lastIndexOf('registrationIDs/') + 'registrationIDs/'.length,
location.lastIndexOf('?api-version=2015-01')
);
console.log('Registration ID - ', registrationID);
return location;
}
Next step was to update this registration ID in ANH with the Native Device Token
I used expo-notifications package and the method getDevicePushTokenAsync() method to get the native device token
async function registerForPushNotificationsAsync() {
let token;
if (Device.isDevice) {
const { status: existingStatus } = await Notifications.getPermissionsAsync();
let finalStatus = existingStatus;
if (existingStatus !== 'granted') {
const {
status
} = await Notifications.requestPermissionsAsync();
finalStatus = status;
}
if (finalStatus !== 'granted') {
alert('Failed to get push token for push notification!');
return;
}
token = (await Notifications.getDevicePushTokenAsync()).data;
console.log(token);
} else {
alert('Must use physical device for Push Notifications');
}
if (Platform.OS === 'android') {
Notifications.setNotificationChannelAsync('default', {
name: 'default',
importance: Notifications.AndroidImportance.MAX,
vibrationPattern: [0, 250, 250, 250],
lightColor: '#FF231F7C',
});
}
return token;
}
The native device token was in the following format on an Android device
c6RI81R7Rn66kWZ0rar3M2:APA91bEcbLXGwEZF-8hu1yGHfXgWBNuxr_4NY_MR8d7HEzeHAJrjoJnjUlneAIiVglCNIGUr11qkP1G4S76bx_H7NItxfQhZa_bgnQjqSlSaY4-oCoarDYWcY-Mz_ulW8rQZFy_SA6_j
I then called the updateRegistrationId API - https://learn.microsoft.com/en-us/rest/api/notificationhubs/create-update-registration
async function updateRegistrationId() {
// If you use 'registrationIDs' (as in the returned location) it gives a 401 error
const endpoint =
'https://xxxxx.servicebus.windows.net/xxxxxxx/registrations/' +
registrationID +
'?api-version=2015-01';
const endpoint1 = location;
const headers = {
Authorization: sasToken,
'Content-Type': 'application/atom+xml;type=entry;charset=utf-8',
};
//Remember to create well-formed XML using back-ticks
//else you may get 400 error
//If you use the tags element it was giving an error
const regDATA = `<entry xmlns="http://www.w3.org/2005/Atom">
<content type="application/xml">
<GcmRegistrationDescription xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/netservices/2010/10/servicebus/connect">
<GcmRegistrationId>${deviceToken}</GcmRegistrationId>
</GcmRegistrationDescription>
</content>
</entry>`;
const options = {
method: 'PUT',
headers: headers,
body: regDATA,
};
const response = await fetch(endpoint, options);
if (response.status !== 201) {
console.log(
'Looks like there was a problem. Status Code: ' + response.status
);
console.log('Response Object : ', response);
//return;
}
}
According to the API documentation I should get a 201 response, but I got a 200 response code. I am not sure if this is the issue.
After this I had the notification handling code to receive the notification, similar to the code in - https://docs.expo.dev/versions/latest/sdk/notifications/
I then tried to send a notification using Test Send from ANH, and it failed with the error:
"The Token obtained from the Token Provider is wrong"
I checked the ANH metrics; the error was categorized as a GCM Authentication error, GCM Result: Mismatch SenderId.
I tried to find documentation on adding the SenderId, but I could not find any way to include the SenderId in the payload of the updateRegistration call (in the XML atom entry).
I tried to use the device token and send directly from the Firebase Console, but I did not receive it either.
I used the Direct Send API of Azure Notification Hub but still did not receive anything.
I suspect there could be some issue in the way I am handling notifications on the client device; I can fix that later, but first I have to resolve the error I am getting in Test Send in Azure NH.
Any help to successfully send using Test Send in ANH, or pointers for next steps, will be much appreciated.
I'm a little bit lost on how Firebase Functions work with authentication.
Suppose I have a function that pulls 100 documents and sets the cache header for 24 hours.
res.set('Cache-Control', 'public, max-age=0, s-maxage=86400'); // 24 * 60 * 60
By default, does that apply to all users, or is it cached per user? There are some instances where the 100 documents are unique to the user, while for other functions the 100 documents are available to any user who is authenticated.
I see in the docs that you can set a __session cookie, which implies it's for individual users' data, but there isn't much documentation on how (or where) to set that. Is it set by default?
My goal is to have a function that requires the user to be authenticated, then returns 100 documents from a non-user-specific collection, i.e. without having to read 100 documents per user. However, I don't think that's feasible because it would need to check whether each user is authorized (not cacheable). So is there a way to just make a publicly available cache?
Any light that can be shared on this is greatly appreciated!
The Cache-Control header is used to instruct a user's browser and any CDN edge server on how to cache the request.
For requests requiring authentication, making use of the CDN is not really possible for this as you should be using Cache-Control: private for these responses (the default for Cloud Functions).
While you could check that your users are authenticated and then redirect them to a publicly cached resource (like https://example.com/api/docs?sig=<somesignature>), that URL would still be accessible to anyone who got hold of the URL/cached data.
Arguably the best approach would be to store your "cached" responses in a single Cloud Firestore document (if it is less than 1MB in size and is JSON-compatible) or store it in Cloud Storage.
The code included below is an example of how you could do this with a Cloud Firestore cache. I've used posts where the authenticated user is the author as an example, but for this specific use case, you would be better off using the Firebase SDK to make such a query (realtime updates, finer control, query API). A similar approach could be applied for "all user" resources.
If attempting to cache HTML or some other non-JSON-friendly format, I would recommend changing the caching layer to Cloud Storage. Instead of storing the post's data in the cache entry, store the path and bucket of the cached file in Storage (like below). Then, if it hasn't expired, get a stream of that file from Storage and pipe it through to the client.
{
data: {
fullPath: `/_serverCache/apiCache/${uid}/posts.html`,
bucket: "myBucket"
},
/* ... */
}
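As a rough illustration of that Cloud Storage variant (the function name is made up, and it assumes the Firebase Admin SDK is initialized and a cache entry shaped like the JSON above):

// Sketch: stream a cached file from Cloud Storage straight to the client.
import admin from "firebase-admin";
import type { Response } from "express";

function sendCachedFile(
  cacheData: { fullPath: string; bucket: string },
  res: Response
): void {
  const file = admin.storage().bucket(cacheData.bucket).file(cacheData.fullPath);
  res.set("Content-Type", "text/html");
  // Pipe the storage read stream into the HTTP response
  file.createReadStream()
    .on("error", (err) => res.status(500).send(err.message))
    .pipe(res);
}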
Common Example Code
import functions from "firebase-functions";
import { HttpsError } from "firebase-functions/lib/providers/https";
import admin from "firebase-admin";
import hash from "object-hash";
admin.initializeApp();
interface AttachmentData {
/** May contain a URL to the resource */
url?: string;
/** May contain Base64 encoded data of resource */
data?: string;
/** Type of this resource */
type: "image" | "video" | "social" | "web";
}
interface PostData {
author: string;
title: string;
content: string;
attachments: Record<string, AttachmentData>;
postId: string;
}
interface CacheEntry<T = admin.firestore.DocumentData> {
/** Time data was cached, as a Cloud Firestore Timestamp object */
cachedAt: admin.firestore.Timestamp;
/** Time data was cached, as a Cloud Firestore Timestamp object */
expiresAt: admin.firestore.Timestamp;
/** The ETag signature of the cached resource */
eTag: string;
/** The cached resource */
data: T;
}
/**
* Returns posts authored by this user as an array, from Firestore
*/
async function getLivePostsForAuthor(uid: string) {
// fetch the data
const posts = await admin.firestore()
.collection('posts')
.where('author', '==', uid)
.limit(100)
.get();
// flatten the results into an array, including the post's document ID in the data
const results: PostData[] = [];
posts.forEach((postDoc) => {
results.push({ postId: postDoc.id, ...postDoc.data() } as PostData);
});
return results;
}
/**
* Returns posts authored by this user as an array, caching the result from Firestore
*/
async function getCachedPostsForAuthor(uid: string) {
// Get the reference to the data's location
const cachedPostsRef = admin.firestore()
.doc(`_server/apiCache/${uid}/posts`) as admin.firestore.DocumentReference<CacheEntry<PostData[]>>;
// Get the cache entry's data
const cachedPostsSnapshot = await cachedPostsRef.get();
if (cachedPostsSnapshot.exists) {
// get the expiresAt property on its own
// this allows us to skip processing the entire document until needed
const expiresAt = cachedPostsSnapshot.get("expiresAt") as CacheEntry["expiresAt"] | undefined;
if (expiresAt !== undefined && expiresAt.toMillis() > Date.now() - 60000) {
// return the entire cache entry as-is
return cachedPostsSnapshot.data()!;
}
}
// if here, the cache entry doesn't exist or has expired
// get the live results from Firestore
const results = await getLivePostsForAuthor(uid);
// etag, cachedAt and expiresAt are used for the HTTP cache-related headers
// only expiresAt is used when determining expiry
const cacheEntry: CacheEntry<PostData[]> = {
data: results,
eTag: hash(results),
cachedAt: admin.firestore.Timestamp.now(),
// set expiry as 1 day from now
expiresAt: admin.firestore.Timestamp.fromMillis(Date.now() + 86400000),
};
// save the cached data and its metadata for future calls
await cachedPostsRef.set(cacheEntry);
// return the cached data
return cacheEntry;
}
HTTPS Request Function
This is the request type you would use for serving Cloud Functions behind Firebase Hosting. Unfortunately the implementation details aren't as straightforward as with a Callable Function (see below), but an official project sample is provided. You will need to insert validateFirebaseIdToken() from that example for this code to work.
import express from "express";
import cookieParserLib from "cookie-parser";
import corsLib from "cors";
interface AuthenticatedRequest extends express.Request {
user: admin.auth.DecodedIdToken
}
const cookieParser = cookieParserLib();
const cors = corsLib({origin: true});
const app = express();
// insert from https://github.com/firebase/functions-samples/blob/2531d6d1bd6b16927acbe3ec54d40369ce7488a6/authorized-https-endpoint/functions/index.js#L26-L69
const validateFirebaseIdToken = /* ... */
app.use(cors);
app.use(cookieParser);
app.use(validateFirebaseIdToken);
app.get('/', async (req, res) => {
// if here, user has already been validated, decoded and attached as req.user
const user = (req as AuthenticatedRequest).user;
try {
const cacheEntry = await getCachedPostsForAuthor(user.uid);
// set caching headers
res
.header("Cache-Control", "private")
.header("ETag", cacheEntry.eTag)
.header("Expires", cacheEntry.expiresAt.toDate().toUTCString());
if (req.header("If-None-Match") === cacheEntry.eTag) {
// cached data is the same, just return empty 304 response
res.status(304).send();
} else {
// send the data back to the client as JSON
res.json(cacheEntry.data);
}
} catch (err) {
if (err instanceof HttpsError) {
throw err;
} else {
throw new HttpsError("unknown", err && err.message, err);
}
}
});
export const getMyPosts = functions.https.onRequest(app);
Callable HTTPS Function
If you are making use of the client SDKs, you can also request the cached data using Callable Functions.
This allows you to export the function like this:
export const getMyPosts = functions.https.onCall(async (data, context) => {
if (!context.auth) {
throw new functions.https.HttpsError(
'failed-precondition',
'The function must be called while authenticated.'
);
}
try {
const cacheEntry = await getCachedPostsForAuthor(context.auth.uid);
return cacheEntry.data;
} catch (err) {
if (err instanceof HttpsError) {
throw err;
} else {
throw new HttpsError("unknown", err && err.message, err);
}
}
});
and call it from the client using:
const getMyPosts = firebase.functions().httpsCallable('getMyPosts');
getMyPosts()
.then((result) => {
const postsArray = result.data; // the callable resolves with its payload on result.data
// do something
})
.catch((error) => {
// handle errors
});
Just want to check: is there any API to add an authorized domain programmatically, instead of adding it manually in the Firebase console?
Also, is there any limit on how many domains can be added as the authorized domains?
JavaScript in Cloud Functions solution
import { google } from "googleapis";
(async () => {
/**
* ! START - Update Firebase allowed domains
*/
// Change this to whatever you want
const URL_TO_ADD = "engineering.acme-corp.net";
// Acquire an auth client, and bind it to all future calls
const auth = new google.auth.GoogleAuth({
scopes: ["https://www.googleapis.com/auth/cloud-platform"],
});
const authClient = await auth.getClient();
google.options({ auth: authClient });
// Get the Identity Toolkit API client
const idToolkit = google.identitytoolkit("v3").relyingparty;
/**
* When calling the methods from the Identity Toolkit API, we are
* overriding the default target URLs and payloads (that interact
* with the v3 endpoint) so we can talk to the v2 endpoint, which is
* what Firebase Console uses.
*/
// Generate the request URL
const projectId = await auth.getProjectId();
const idToolkitConfigUrl = `https://identitytoolkit.googleapis.com/admin/v2/projects/${projectId}/config`;
// Get current config so we can use it when we later update it
const currentConfig = await idToolkit.getProjectConfig(undefined, {
url: idToolkitConfigUrl,
method: "GET",
});
// Update the config based on the values that already exist
await idToolkit.setProjectConfig(undefined, {
url: idToolkitConfigUrl,
method: "PATCH",
params: { updateMask: "authorizedDomains" },
body: JSON.stringify({
authorizedDomains: [
...(currentConfig.data.authorizedDomains || []),
URL_TO_ADD,
],
}),
});
})();
A quick note on other languages
The principles should be the same:
Find a way to interact with Google's Identity Toolkit API (maybe Google offers an SDK for your language)
Get current config
Set new config
If you can't find an SDK, you can also work with raw http requests: https://cloud.google.com/identity-platform/docs/reference/rest/v2/projects/getConfig (it's just a bit trickier to do authentication when doing everything manually)
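For example, a minimal sketch of that raw-HTTP route could look like the following, assuming you already have an OAuth2 access token for a service account with the scopes used in the answers above; the function name, parameters and domain value are placeholders:

// Sketch: add an authorized domain with two raw HTTP calls (Node 18+, global fetch).
async function addAuthorizedDomain(projectId: string, accessToken: string, domain: string) {
  const configUrl = `https://identitytoolkit.googleapis.com/admin/v2/projects/${projectId}/config`;
  const authHeader = { Authorization: `Bearer ${accessToken}` };

  // 1. Read the current config so existing domains are preserved
  const current = await (await fetch(configUrl, { headers: authHeader })).json();

  // 2. Patch only the authorizedDomains field
  await fetch(`${configUrl}?updateMask=authorizedDomains`, {
    method: "PATCH",
    headers: { ...authHeader, "Content-Type": "application/json" },
    body: JSON.stringify({
      authorizedDomains: [...(current.authorizedDomains || []), domain],
    }),
  });
}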
There is no API for this - you must do it through the console. You can also file a feature request with Firebase support if you want.
There doesn't appear to be any documentation stating limits of number of domains. Again, reach out to Firebase support if the documentation is unclear.
Thanks @Jean Costa
Totally working for me.
Here is the C# implementation:
using Google.Apis.Auth.OAuth2;
using Newtonsoft.Json;
var serviceAccountJsonFile = "path to service account json";
var projectId = "your project id";
var authorizedDomains = new
{
authorizedDomains = new string[] {
"localhost",
"******.firebaseapp.com",
"*********.web.app",
"abc.def.com"
}
}; // your desired authorized domains
List<string> scopes = new()
{
"https://www.googleapis.com/auth/identitytoolkit",
"https://www.googleapis.com/auth/firebase",
"https://www.googleapis.com/auth/cloud-platform"
};
var url = "https://identitytoolkit.googleapis.com/admin/v2/projects/" + projectId + "/config";
using var stream = new FileStream(serviceAccountJsonFile, FileMode.Open, FileAccess.Read);
var accessToken = GoogleCredential
.FromStream(stream) // Loads key file
.CreateScoped(scopes) // Gathers scopes requested
.UnderlyingCredential // Gets the credentials
.GetAccessTokenForRequestAsync().Result; // Gets the Access Token
var body = JsonConvert.SerializeObject(authorizedDomains);
using (var client = new HttpClient())
{
var request = new HttpRequestMessage(HttpMethod.Patch, url) {
Content = new StringContent(body,System.Text.Encoding.UTF8)
};
request.Headers.Add("Accept", "application/json");
request.Headers.Add("Authorization", "Bearer " + accessToken);
try
{
var response = client.SendAsync(request).Result;
Console.WriteLine(response.Content.ReadAsStringAsync().Result);
}
catch (HttpRequestException ex)
{
// Failed
}
}
Thanks @Jean Costa and @Yan Naing
Here is my PHP implementation:
use GuzzleHttp\Client as GuzzleClient;
use GuzzleHttp\Exception\TransferException;
use Google\Service\IdentityToolkit;
use Google\Service\IAMCredentials;
$KEY_FILE_LOCATION = storage_path('/app/credentials/service-account-1.json') ;
if (!file_exists($KEY_FILE_LOCATION)) {
throw new Exception(sprintf('file "%s" does not exist', $KEY_FILE_LOCATION));
}
$json= file_get_contents($KEY_FILE_LOCATION);
if (!$config = json_decode($json, true)) {
throw new Exception('invalid json for auth config');
}
$client = new \Google\Client();
$client->setAuthConfig($config );
$client->setScopes([ "https://www.googleapis.com/auth/identitytoolkit",
"https://www.googleapis.com/auth/firebase",
"https://www.googleapis.com/auth/cloud-platform"]);
$service = new IdentityToolkit($client);
// Get the Identity Toolkit API client
$idToolkit = $service->relyingparty;
//Get current config
$current_config= $idToolkit->getProjectConfig();
//Get service account access token
$access_token_req = new IAMCredentials\GenerateAccessTokenRequest();
$access_token_req->setScope( "https://www.googleapis.com/auth/firebase");
$credentials = new IAMCredentials($client);
$access_token = $credentials->projects_serviceAccounts->generateAccessToken("projects/-/serviceAccounts/{$config["client_email"]}" , $access_token_req )->getAccessToken();
// Generate the request URL (https://cloud.google.com/identity-platform/docs/reference/rest/v2/projects/updateConfig)
$idToolkitConfigUrl = "https://identitytoolkit.googleapis.com/admin/v2/projects/{$config["project_id"]}/config";
$authorized_domains = [ 'authorizedDomains' => array_merge( ['twomore.com'],$current_config->authorizedDomains)];
$client = new GuzzleClient( );
$response = null;
try {
$response = $client->request('PATCH', $idToolkitConfigUrl, [
'verify' => Helpers::isProduction() ? true : false ,
'http_errors' => false, // turn off 4xx and 5xx exceptions
'json' => $authorized_domains ,
'headers' => [
"Authorization" => "Bearer " . $access_token ,
"Accept" => "application/json",
]
]);
} catch (TransferException $e) {
throw new Exception( $e->getMessage());
}
$data = json_decode($response->getBody()->getContents(),true);
if($response->getStatusCode()!==200){
throw new Exception($response->getReasonPhrase() . ( isset($data['exception']['message']) ? " - " . $data['exception']['message'] : ""));
}
return response()->json(['data' => [
'authorized_domains' => $data['authorizedDomains']
]]);
This is my client side code:
function signIn(){
var email = document.getElementById("username").value;
var password = document.getElementById("password").value;
// As httpOnly cookies are to be used, do not persist any state client side.
firebase.auth().setPersistence(firebase.auth.Auth.Persistence.NONE);
// When the user signs in with email and password.
firebase.auth().signInWithEmailAndPassword(email, password).then(user => {
// Get the user's ID token as it is needed to exchange for a session cookie.
return firebase.auth().currentUser.getIdToken().then(idToken => {
// Session login endpoint is queried and the session cookie is set.
// CSRF protection should be taken into account.
// ...
var csrfToken = getCookie('_csrf')
return postIdTokenToSessionLogin('/sessionLogin', idToken, csrfToken);
});
}).then(() => {
// A page redirect would suffice as the persistence is set to NONE.
return firebase.auth().signOut();
}).then(() => {
window.location.assign('/profile');
});
}
I'm sending the idToken and csrfToken to generate a sessionId. Using this sessionId, I'm able to assign session cookies.
Here is my server side code:
app.post("/sessionLogin", (req, res) => {
// Get ID token and CSRF token.
var idToken = req.body.idToken.toString();
var csrfToken = req.body.csrfToken.toString();
// Guard against CSRF attacks.
if (!req.cookies || csrfToken !== req.cookies._csrf) {
res.status(401).send('UNAUTHORIZED REQUEST!');
return;
}
// Set session expiration to 5 days.
var expiresIn = 60 * 60 * 24 * 5 * 1000;
// Create the session cookie. This will also verify the ID token in the process.
// The session cookie will have the same claims as the ID token.
// We could also choose to enforce that the ID token auth_time is recent.
firebase.auth().verifyIdToken(idToken).then(function(decodedClaims) {
// In this case, we are enforcing that the user signed in in the last 5 minutes.
if (new Date().getTime() / 1000 - decodedClaims.auth_time < 5 * 60) {
return firebase.auth().createSessionCookie(idToken, {expiresIn:
expiresIn});
}
throw new Error('UNAUTHORIZED REQUEST!');
})
.then(function(sessionCookie) {
// Note httpOnly cookie will not be accessible from javascript.
// secure flag should be set to true in production.
var options = {maxAge: expiresIn, path: "/", httpOnly: false, secure: true
/** to test in localhost */};
res.cookie('session', sessionCookie, options);
res.end(JSON.stringify({status: 'success'}));
})
.catch(function(error) {
res.status(401).send('UNAUTHORIZED REQUEST!');
});
});
app.get("/profile", (req, res) => {
console.log('Cookies: ', req.cookies); //Empty object, 'Cookies: {}'
res.render("profile");
});
app.post("/profile", (req, res) => {
res.send(req.body.name);
console.log('Cookies: ', req.cookies); // Cookies object with csrf and session token
});
Now, this is working fine and I'm able to pass the cookies to the server with every POST request; an unauthenticated user cannot send POST requests. However, I was hoping to authenticate users and serve user-related data. So, how can I use these session cookies to serve routes on GET requests as well? Right now, my client side does not send these cookies on GET requests.
I've followed these Firebase documents and GitHub Repos
Will it be the right approach? If not, I'd appreciate your guidance in the right direction. Thank you in advance.
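(As a possible direction only: the Firebase session-cookie docs referenced above verify the cookie server-side with verifySessionCookie, which can also guard GET routes via a middleware. The sketch below is an assumption-laden illustration reusing the firebase admin instance and cookie-parser setup from the server code above; the middleware name is made up.)

// Sketch: verify the Firebase session cookie in an Express middleware so the same
// cookie set by /sessionLogin also protects GET routes such as /profile above.
function requireSession(req, res, next) {
  const sessionCookie = req.cookies.session || "";
  firebase.auth().verifySessionCookie(sessionCookie, true /* checkRevoked */)
    .then((decodedClaims) => {
      req.user = decodedClaims; // make the verified user available to the handler
      next();
    })
    .catch(() => res.status(401).send("UNAUTHORIZED REQUEST!"));
}

app.get("/profile", requireSession, (req, res) => {
  res.render("profile", { uid: req.user.uid });
});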
I've found a lot of tutorials that have gotten me as far as logging in with OAuth in Meteor / MeteorAngular. Unfortunately, they've never gotten me as far as successfully accessing the Microsoft graph API. I know I need to (somehow) transform my user access token into a bearer token, but I haven't found any guides on how to do that in Meteor -- and the sample Node apps I've found to do similar things don't run out of the box, either. I have some routes that should return the data I need, I just can't quite seem to hit them.
Meteor + Microsoft... is darned near nonexistent, as far as I can tell
lol you aren't wrong. However, as one of the primary engineers behind Sidekick AI (a scheduling app built with Meteor that integrates with Microsoft and Google calendars), I have some very practical examples of using said access_token to work with the Microsoft Graph API in Meteor.
Here are some examples of functions I've written which use the access_token to communicate with the Graph API to work with calendars. Notably, the line that matters is where I'm setting
Authorization: `Bearer ${sources.accessToken}` // Sources just being the MongoDB collection we store that info in
NOTE: if you need more I'd be happy to provide but didn't want to clutter this initially
NOTE: wherever you see "Outlook" it really should say "Microsoft"; that's just a symptom of uninformed early development.
/**
* This function will use the Meteor HTTP package to make a POST request
* to the Microsoft Graph Calendar events endpoint using an access_token to create a new
* Event for the connected Outlook account
*
* @param { String } sourceId The Sources._id we are using tokens from for the API request
*
* @param { Object } queryData Contains key value pairs which will describe the new Calendar
* Event
*
* @returns { Object } The newly created Event Object
*/
export const createOutlookCalendarEvent = async (
sourceId: string,
queryData: {
subject: any
start: {
dateTime: any
timeZone: string
}
end: {
dateTime: any
timeZone: string
}
isOnlineMeeting: any
body: {
content: string
contentType: string
},
attendees: {
emailAddress: {
address: string
name: string
}
}[]
}
): Promise<object> => {
await attemptTokenRefresh(sourceId)
const sources = Sources.findOne({ _id: sourceId })
const options = {
headers: {
Accept: `application/json`,
Authorization: `Bearer ${sources.accessToken}`,
},
data: queryData,
}
return new Promise((resolve, reject) => {
HTTP.post(`${OauthEndpoints.Outlook.events}`, options, (error, response) => {
if (error) {
reject(handleError(`Failed to create a new Outlook Calendar event`, error, { options, sourceId, queryData, sources }))
} else {
resolve(response.data)
}
})
})
}
/**
* This function will use the Meteor HTTP package to make a GET request
* to the Microsoft Graph Calendars endpoint using an access_token obtained from the
* Microsoft common oauth2 v2.0 token endpoint. It retrieves Objects describing the calendars
* under the connected account
*
* @param sourceId The Sources._id we are using tokens from for the API request
*
* @returns Contains Objects describing the user's connected account's calendars
*/
export const getCalendars = async (sourceId: string): Promise<Array<any>> => {
await attemptTokenRefresh(sourceId)
const sources = Sources.findOne({ _id: sourceId })
const options = {
headers: {
Accept: `application/json`,
Authorization: `Bearer ${sources.accessToken}`,
},
}
return new Promise((resolve, reject) => {
HTTP.get(OauthEndpoints.Outlook.calendars, options, (error, response) => {
if (error) {
reject(handleError(`Failed to retrieve the calendars for a user's connected Outlook account`, error, { options, sources, sourceId }))
} else {
resolve(response.data.value)
}
})
})
}
// reference for the afore-mentioned `OauthEndpoints`
/**
* Object holding endpoints for our Oauth integrations with Google,
* Microsoft, and Zoom
*/
export const OauthEndpoints = {
...
Outlook: {
auth: 'https://login.microsoftonline.com/common/oauth2/v2.0/authorize?',
token: 'https://login.microsoftonline.com/common/oauth2/v2.0/token',
calendars: 'https://graph.microsoft.com/v1.0/me/calendars',
calendarView: 'https://graph.microsoft.com/v1.0/me/calendar/calendarView',
events: 'https://graph.microsoft.com/v1.0/me/calendar/events',
messages: 'https://graph.microsoft.com/v1.0/me/messages',
sendMail: 'https://graph.microsoft.com/v1.0/me/sendMail',
},
...
}
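For completeness, here is a rough sketch of how one of these helpers could be exposed to the client through a Meteor method; the method name and authorization check are illustrative assumptions, not code from Sidekick:

// Sketch: call getCalendars from the client via a Meteor method.
import { Meteor } from "meteor/meteor";

Meteor.methods({
  async "outlook.calendars.list"(sourceId: string) {
    if (!Meteor.userId()) {
      throw new Meteor.Error("not-authorized", "You must be signed in.");
    }
    // attemptTokenRefresh + the Graph API call happen inside getCalendars
    return getCalendars(sourceId);
  },
});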