Firebase remote config returns an object

Trying to get Firebase Remote Config working. A remote config parameter "ab_placeholder_value" has been set up in Firebase with a default value of "abvertisement", and published.
React Native code:
remoteConfig().fetchAndActivate()
  .then(fetchedRemotely => {
    if (fetchedRemotely) {
      console.log('Configs were retrieved from the backend and activated.')
    } else {
      console.log(
        'No configs were fetched from the backend, and the local configs were already activated',
      )
    }
  })

remoteConfig().fetch(0)
const val = remoteConfig().getValue('ab_placeholder_value')
console.log('The ab_placeholder_value: ')
console.log(val)

remoteConfig().fetch(1)
const all = remoteConfig().getAll()
console.log('All values: ')
console.log(all)
Output:
LOG The ab_placeholder_value:
LOG {"_source": "default", "_value": "disabled"}
LOG All values:
LOG {"ab_placeholder_value": {"_source": "default", "_value": "disabled"}}
LOG Configs were retrieved from the backend and activated.
It looks like the app is talking to the Firebase server, since the parameter name "ab_placeholder_value" was retrieved. But why is the parameter value an object: {"_source": "default", "_value": "disabled"}?
How can I get the value that was set in Firebase Remote Config? Any help is appreciated!

remoteConfig().getValue() returns a ConfigValue object. In your case you can access the raw value through ._value:
const val = remoteConfig().getValue('ab_placeholder_value')._value;
console.log('The ab_placeholder_value: ')
console.log(val)
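If you'd rather not reach into the underscore-prefixed internals, the ConfigValue object returned by getValue() also exposes typed getters (this assumes the @react-native-firebase/remote-config module); a minimal sketch:
import remoteConfig from '@react-native-firebase/remote-config';

// typed getters avoid depending on private fields like _value
const configValue = remoteConfig().getValue('ab_placeholder_value');
console.log('value:', configValue.asString());    // the string value once remote values are active
console.log('source:', configValue.getSource());  // 'remote', 'default' or 'static'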
I initially run everything through a function that builds a remoteConfigValues object, and then access the values stored in that object throughout the code, to make things a bit more readable:
var remoteConfigAllObj = {};
var remoteConfigValues = {};

remoteConfig().fetchAndActivate()
  .then(() => {
    remoteConfigAllObj = remoteConfig().getAll();
    for (const key in remoteConfigAllObj) {
      remoteConfigValues[key] = remoteConfigAllObj[key]._value;
    }
  })
The values from remote config then become available through dot notation on the created remoteConfigValues object.
console.log(remoteConfigValues.ab_placeholder_value);
NOTE – In my case I load the remoteConfig before app initialization at startup. The remoteConfigValues object will therefore contain whatever is activated at that time, so it may be a mix of values from static, remote, and default variables. If you want to know where the values came from originally, you'll find that information under remoteConfigAllObj[key]._source for each key in the remoteConfigAllObj that you gathered in the beginning.
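As an illustrative sketch, the same loop can also record where each value came from, using the fields already shown above:
const remoteConfigValues = {};
const remoteConfigSources = {};
const allValues = remoteConfig().getAll();

for (const key in allValues) {
  remoteConfigValues[key] = allValues[key]._value;    // the raw value
  remoteConfigSources[key] = allValues[key]._source;  // 'remote', 'default' or 'static'
}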

Related

NuxtJS store returning local storage values as undefined

I have a Nuxt application. One of the components, in its mounted lifecycle hook, requests a value from the state store; this value is retrieved from local storage. The values exist in local storage, however the store returns them as undefined. If I render the values in the UI with {{value}}
they show. So it appears that at the moment the code runs, the value is undefined.
index.js (store):
export const state = () => ({
  token: process.browser ? localStorage.getItem("token") : undefined,
  user_id: process.browser ? localStorage.getItem("user_id") : undefined,
  ...
Component.vue
mounted hook:
I'm using UserService.getFromStorage to get the value directly from localStorage, as otherwise this code block won't run. It's a temporary thing to illustrate the problem.
async mounted() {
  // check again with new code.
  if (UserService.getFromStorage("token")) {
    console.log("user service found a token but what about store?")
    console.log(this.$store.state.token, this.$store.state.user_id);
    const values = await ["token", "user_id"].map(key => { return UserService.getFromStorage(key) });
    console.log({ values });
    SocketService.trackSession(this, socket, "connect")
  }
}
BeforeMount hook:
isLoggedIn just checks that the "token" property is set in the store state.
return !!this.$store.state.token
beforeMount () {
  if (this.isLoggedIn) {
    // This runs sometimes??? 80% of the time.
    console.log("IS THIS CLAUSE RUNNING?");
  }
}
video explanation: https://www.loom.com/share/541ed2f77d3f46eeb5c2436f761442f4
OP's app is quite big from the looks of it, so finding the exact reason is rather difficult.
Meanwhile, setting ssr: false fixed the errors.
It raised more, but those should probably be asked in another question nonetheless.
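For reference, a minimal sketch of what that change looks like in nuxt.config.js (assuming Nuxt 2.13+, where the ssr option replaced mode: 'spa'):
// nuxt.config.js
export default {
  ssr: false, // client-side rendering only, so localStorage is available whenever the store initializes
  // ...rest of the existing config
}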

How to pass a parameter from the Jupyter backend to a frontend extension

I currently have a value that is stored as an environment variable in the environment where a Jupyter server is running. I would like to somehow pass that value to a frontend extension. It does not have to read the environment variable in real time; I am fine with just using the value of the variable at startup. Is there a canonical way to pass parameters to a frontend extension on startup? I would appreciate examples of both setting the parameter from the backend and accessing it from the frontend.
[update]
I have posted a solution that works for nbextensions, but I can't seem to find the equivalent pattern for labextensions (TypeScript); any help there would be much appreciated.
I was able to do this by adding the following code to my jupyter_notebook_config.py:
from notebook.services.config import ConfigManager

cm = ConfigManager()
# 'value' holds whatever was read from the environment at server startup
cm.update('notebook', {'variable_being_set': value})
Then I had the parameters defined in my extension's main.js:
// define default values for config parameters
var params = {
    variable_being_set: 'default'
};

// to be called once config is loaded, this updates default config vals
// with the ones specified by the server's config file
var update_params = function() {
    var config = Jupyter.notebook.config;
    for (var key in params) {
        if (config.data.hasOwnProperty(key)) {
            params[key] = config.data[key];
        }
    }
};
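For completeness, here is a sketch of how update_params is typically triggered once the config section has been fetched (load_ipython_extension is the standard nbextension entry point; the define/requirejs module wiring is omitted):
// called by Jupyter when the nbextension is loaded
function load_ipython_extension() {
    // config.loaded is a promise that resolves once the 'notebook' config section has been fetched
    return Jupyter.notebook.config.loaded
        .then(update_params)
        .then(function() {
            console.log('variable_being_set =', params.variable_being_set);
        });
}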
I also have the parameters declared in my main.yaml:
Parameters:
- name: variable_being_set
  description: ...
  input_type: text
  default: default_value
This took some trial and error to find out because there is very little documentation on the ConfigManager class and none of it has an end-to-end example.

How to use Sign-In User ID to send push notifications

I have some users signed into my actions-on-google app via Google Sign-In ( https://developers.google.com/actions/identity/google-sign-in )
I want to send push notifications to one of those users.
To get push notifications working with actions in the first place, I tried this sample: https://github.com/actions-on-google/dialogflow-updates-nodejs/blob/master/functions/index.js but I can only get it to work without this commit: https://github.com/actions-on-google/dialogflow-updates-nodejs/commit/c655062047b49e372da37af32376bd06d837fc7f#diff-1e53ef2f51bd446c876676ba83d7c888
It works fine, but I think const userID = conv.user.id; returns the deprecated Anonymous User ID. The commit suggests using const userID = conv.arguments.get('UPDATES_USER_ID'); which returns undefined.
I use this nodejs code to send the push notifications.
const request = require('request');
const { JWT } = require('google-auth-library');
const serviceAccount = require('./service-account.json');

let jwtClient = new JWT(
  serviceAccount.client_email, null, serviceAccount.private_key,
  ['https://www.googleapis.com/auth/actions.fulfillment.conversation'],
  null
);

jwtClient.authorize((authErr, tokens) => {
  let notification = {
    userNotification: {
      title: process.argv[2],
    },
    target: {
      userId: USERID,
      intent: 'tell_latest_status',
      // Expects a IETF BCP-47 language code (i.e. en-US)
      locale: 'en-US'
    },
  };
  request.post('https://actions.googleapis.com/v2/conversations:send', {
    'auth': {
      'bearer': tokens.access_token,
    },
    'json': true,
    'body': {
      'customPushMessage': notification, 'isInSandbox': true
    },
  }, (reqErr, httpResponse, body) => {
    console.log(httpResponse.statusCode + ': ' + httpResponse.statusMessage);
  });
});
I simply can't get this to work with the const userID = conv.arguments.get('UPDATES_USER_ID'); version because, as I said, it returns undefined.
When I use conv.user.profile.payload.sub, as suggested here: https://developers.google.com/actions/identity/user-info , the AoG API returns "SendToConversation response: Invalid user id for target."
Is there any way to make this work with Google Sign-In?
Has anyone made this work? I mean with the UPDATES_USER_ID field?
I already created an issue on the samples repo: https://github.com/actions-on-google/dialogflow-updates-nodejs/issues/15 but I was sent here.
Thanks!
While researching why I sometimes got undefined, I found an answer on this question that solved my issue:
I've found a solution for this problem. conv.arguments.get('UPDATES_USER_ID') only returns the id on the first attempt, so while building your action you must save it at that point. If you didn't store or save it, you can reset your profile and try again, and you will be able to get it.
You can reset your user profile for the action here.
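In practice that means grabbing the id the first time it is delivered and persisting it immediately, for example in the intent that completes the push notification permission flow. A rough sketch with the actions-on-google library (the intent name and storage key below are placeholders, not from the sample):
app.intent('finish_push_setup', (conv) => {
  // UPDATES_USER_ID is only populated the first time the permission is granted
  const userId = conv.arguments.get('UPDATES_USER_ID');
  if (userId) {
    // persist it somewhere durable; userStorage is used here for brevity
    conv.user.storage.updatesUserId = userId;
    conv.close('Ok, I will notify you.');
  } else {
    conv.close('Ok, I will not notify you.');
  }
});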

How do I delete user analytics data from Firebase using userDeletionRequests:upsert?

Problem Description
My Android app collects data via Google Analytics for Firebase. For privacy reasons, users must be able to wipe their data off the Firebase servers, should they choose to do so.
The app requests a deletion by forwarding its Firebase APP_INSTANCE_ID to my own server. This server has been prepared in advance with credentials, from my personal Google account (via oauth2), for managing the Firebase project. The server authenticates with www.googleapis.com, and, using the supplied APP_INSTANCE_ID, invokes the upsert.
As noted by the documentation, the generic Google Analytics API is appropriate for this task.
After some initial trouble (because I didn't have the correct auth scope, and the Analytics API wasn't properly enabled), googleapis.com now returns HTTP 200 for each upsert request. (As an aside, even if you supply a bogus APP_INSTANCE_ID, it returns 200.)
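For reference, the scope required for userDeletionRequests:upsert is https://www.googleapis.com/auth/analytics.user.deletion. A rough sketch of acquiring a token with google-auth-library's JWT client and a service-account key (my server actually uses OAuth2 credentials from a personal account, so treat this only as an illustration of the scope):
const { JWT } = require('google-auth-library');
const serviceAccount = require('./service-account.json');

const jwtClient = new JWT(
  serviceAccount.client_email, null, serviceAccount.private_key,
  ['https://www.googleapis.com/auth/analytics.user.deletion'], // scope needed for the user-deletion API
  null
);

jwtClient.authorize((err, tokens) => {
  if (err) throw err;
  // tokens.access_token is then sent as the Bearer token on the upsert request
});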
Here is a sample response from the upsert, which shows nothing amiss:
{ kind: 'analytics#userDeletionRequest',
id:
{ type: 'APP_INSTANCE_ID',
userId: (REDACTED 32-char hexadecimal string) },
firebaseProjectId: (REDACTED),
deletionRequestTime: '2018-08-28T12:46:30.874Z' }
I know the firebaseProjectId is correct, because if I alter it, I get an error. I have verified that the APP_INSTANCE_ID is correct, and stable up until the moment it is reset with resetAnalyticsData().
Test Procedure
To test the deletions, I populated Firebase with several custom events, using the procedure below (Nexus 5X emulator, no Google Play, no Google accounts configured, but that shouldn't make any difference):
Install the app
Fire off some custom events (FirebaseAnalytics.logEvent)
Observe those events appear on the Firebase console
(About a minute later:) Make the upsert call, observe HTTP 200, and note the "deletionRequestTime"
Immediately call FirebaseAnalytics.resetAnalyticsData (to clear any event data cached on the device)
Uninstall the app
Rinse & repeat 7 or 8 times
However, even 24 hours later, 100% of the Firebase events are still present in the events table. No discernible state change has taken place on the Firebase server as a result of the upserts.
Question
So, what am I doing wrong? how do I successfully delete user data from Google Analytics for Firebase?
EDIT
Here's the code I'm using to make a request (from node.js):
const request = require( 'request' );
...
_deletePersonalData( data )
{
    return new Promise( (resolve, reject) => {
        request.post({
            url: 'https://www.googleapis.com/analytics/v3/userDeletion/userDeletionRequests:upsert',
            body: {
                kind: 'analytics#userDeletionRequest',
                id: {
                    type: 'APP_INSTANCE_ID',
                    userId: data.firebaseAppInstanceId
                },
                firebaseProjectId: (REDACTED)
            },
            headers: {
                Authorization: 'Bearer ' + iap.getCurAccessToken()
            },
            json: true
        }, (err, res, body) => {
            console.log( 'user-deletion POST complete' );
            console.log( 'Error ' + err );
            console.log( 'Body ', body );
            if( err )
            {
                reject( err );
                return;
            }
            if( body.error )
            {
                reject( new Error( 'The Google service returned an error: ' + body.error.message + ' (' + body.error.code + ')' ) );
                return;
            }
            resolve({ deletionRequestTime: body.deletionRequestTime });
        });
    });
}
Here's a sample request body:
{
    kind: 'analytics#userDeletionRequest',
    id: {
        type: 'APP_INSTANCE_ID',
        userId: (REDACTED 32-char hexadecimal string)
    },
    firebaseProjectId: (REDACTED)
}
And here's the console output for that same request (same userId and everything):
user-deletion POST complete
Error: null
Body: { kind: 'analytics#userDeletionRequest',
id:
{ type: 'APP_INSTANCE_ID',
userId: (REDACTED 32-char hexadecimal string) },
firebaseProjectId: (REDACTED),
deletionRequestTime: '2018-08-29T17:32:06.949Z' }
Firebase support just got back to me, and I quote:
Upsert method deletes any individual user data we have logged, but aggregate metrics are not recomputed. This means that you might not see any changes in the events tab in your Analytics console.
So, basically my mistake was expecting the events to disappear from the console.
This, of course, raises the question of how one determines that the API is actually working... but maybe the HTTP 200 is enough.

alexa skill local could not write to dynamodb

I am writing a Node.js skill using ask-sdk and using alexa-skill-local to test the endpoint. I need to persist data to DynamoDB in one of the handlers, but I keep getting a "missing region" error. Please find my code below:
'use strict';

// use 'ask-sdk' if standard SDK module is installed
const Alexa = require('ask-sdk');
const { launchRequestHandler, HelpIntentHandler, CancelAndStopIntentHandler, SessionEndedRequestHandler } = require('./commonHandlers');

const ErrorHandler = {
  canHandle() {
    return true;
  },
  handle(handlerInput, error) {
    return handlerInput.responseBuilder
      .speak('Sorry, I can\'t understand the command. Please say again.')
      .reprompt('Sorry, I can\'t understand the command. Please say again.')
      .getResponse();
  },
};

////////////////////////////////
// Code for the handlers here //
////////////////////////////////

exports.handler = Alexa.SkillBuilders
  .standard()
  .addRequestHandlers(
    launchRequestHandler,
    HelpIntentHandler,
    CancelAndStopIntentHandler,
    SessionEndedRequestHandler,
    ErrorHandler
  )
  .withTableName('devtable')
  .withDynamoDbClient()
  .lambda();
And in one of the handlers I am trying to get the persisted attributes like below:
handlerInput.attributesManager.getPersistentAttributes().then((data) => {
  console.log('--- the attributes are ----', data)
})
But I keep getting the following error:
(node:12528) UnhandledPromiseRejectionWarning: AskSdk.DynamoDbPersistenceAdapter Error: Could not read item (amzn1.ask.account.AHJECJ7DTOPSTT25R36BZKKET4TKTCGZ7HJWEJEBWTX6YYTLG5SJVLZH5QH257NFKHXLIG7KREDKWO4D4N36IT6GUHT3PNJ4QPOUE4FHT2OYNXHO6Z77FUGHH3EVAH3I2KG6OAFLV2HSO3VMDQTKNX4OVWBWUGJ7NP3F6JHRLWKF2F6BTWND7GSF7OVQM25YBH5H723VO123ABC) from table (EucerinSkinCareDev): Missing region in config
at Object.createAskSdkError (E:\projects\nodejs-alexa-sdk-v2-eucerin-skincare-dev\node_modules\ask-sdk-dynamodb-persistence-adapter\dist\utils\AskSdkUtils.js:22:17)
at DynamoDbPersistenceAdapter.<anonymous> (E:\projects\nodejs-alexa-sdk-v2-eucerin-skincare-dev\node_modules\ask-sdk-dynamodb-persistence-adapter\dist\attributes\persistence\DynamoDbPersistenceAdapter.js:121:45)
Can we read and write attributes from DynamoDB using alexa-skill-local? Do we need a different setup to achieve this?
Thanks
I know that this is a really old topic, but I had the same problem a few days ago, and I'm going to explain how I got it to work.
You have to download DynamoDB Local and follow the instructions from here.
Once you have configured your local DynamoDB and checked that it is working, you have to pass it in your code to the DynamoDbPersistenceAdapter constructor.
Your code should look similar to:
var awsSdk = require('aws-sdk');

var myDynamoDB = new awsSdk.DynamoDB({
  endpoint: 'http://localhost:8000', // If you change the default url, change it here
  accessKeyId: <your-access-key-id>,
  secretAccessKey: <your-secret-access-key>,
  region: <your-region>,
  apiVersion: 'latest'
});

const { DynamoDbPersistenceAdapter } = require('ask-sdk-dynamodb-persistence-adapter');

return new DynamoDbPersistenceAdapter({
  tableName: tableName || 'my-table-name',
  createTable: true,
  dynamoDBClient: myDynamoDB
});
Where <your-access-key-id>, <your-secret-access-key> and <your-region> are defined in your AWS config and credentials files.
The next step is to launch your server with the alexa-skill-local command as always.
Hope this will be helpful! =)
Presumably you have an AWS config profile that your skill is using when running locally.
You need to edit the AWS config file (e.g. ~/.aws/config) and set the default region (i.e. us-east-1) there. The region should match the region where your table exists.
Alternatively, if you want to be able to run completely isolated, you may need to write some conditional logic and swap the DynamoDB client for one targeting an instance of DynamoDB Local running on your machine.
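As a rough sketch of that idea (the LOCAL_DYNAMODB environment variable, region, and endpoint below are placeholders), you can build the client conditionally and hand it to the standard skill builder, which accepts a custom DynamoDB client:
const awsSdk = require('aws-sdk');

// use DynamoDB Local when running under alexa-skill-local, the real service otherwise
const dynamoDbClient = process.env.LOCAL_DYNAMODB
  ? new awsSdk.DynamoDB({ endpoint: 'http://localhost:8000', region: 'us-east-1' })
  : new awsSdk.DynamoDB({ region: 'us-east-1' });

exports.handler = Alexa.SkillBuilders
  .standard()
  .addRequestHandlers(/* ...handlers as before... */)
  .withTableName('devtable')
  .withDynamoDbClient(dynamoDbClient)
  .lambda();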
