I've been putting together a mechanism to sync activity data collected by the MS Band with our backend via the cloud API, and I have all the boilerplate set up for the OAuth flows. The intent is to periodically run this data through our backend processes to categorise periods of meaningful walk-based activity.
I've been experimenting with the data available, and as far as I can tell we cannot get access to the raw step data (or step data at a fine-grained level)? We have successfully been able to request summary info by hour/day; however, this is not fit for our purpose.
What I'd like is to access step data in the form [startTimeStamp, endTimeStamp, stepsTaken, ...], where each record represents a continuous period of movement by the wearer.
We could also work with data summarised by minute, as this would give enough context for our use case.
Is this possible via the cloud API? Or are there any plans to implement the period "Minute" on the summary API endpoint?
https://api.microsofthealth.net/v1/me/Summaries/Minute?startTime=2015-12-09T14%3A00%3A00.369Z
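For context, here's roughly how we're requesting the hourly summaries today - a minimal Python sketch using the requests library, assuming a valid access token from the OAuth flow above (the summaries/stepsTaken field names are what we've observed in our experiments, so treat them as assumptions):

    import requests

    ACCESS_TOKEN = "<oauth-access-token>"  # obtained via the Microsoft Health OAuth flow

    # Hourly works today; Minute (as in the URL above) is what we'd like.
    url = "https://api.microsofthealth.net/v1/me/Summaries/Hourly"
    params = {"startTime": "2015-12-09T14:00:00.369Z"}
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

    resp = requests.get(url, params=params, headers=headers)
    resp.raise_for_status()
    for summary in resp.json().get("summaries", []):
        print(summary.get("startTime"), summary.get("stepsTaken"))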
If this isn't possible, perhaps there is another way to make this data available (via HealthKit on iOS or Fit on Android)?
As a complete alternative, perhaps it might be possible to get the accumulated step data detail from the band via Bluetooth, in a similar fashion to the native MS Health app?
We already use the SDK to stream realtime Heart Rate data during user cardio sessions, but there appears to be no way to extract the historical step info from the band directly.
Thanks!
The Band itself monitors and logs steps over time. When syncing, that log is transferred to the cloud via the Microsoft Health app. The app then pulls the "steps for the day" from the Health service.
These logs are not exposed to apps via the SDK. The only way to calculate steps per custom short period yourself is to have your app sample the step counter in the background frequently enough to do the calculation.
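To illustrate the calculation that sampling enables - here in plain Python, though on-device you would feed it readings from the Band SDK's pedometer sensor, which reports a cumulative total (the sample data below is made up):

    from datetime import datetime, timedelta

    def steps_per_interval(samples):
        """Turn (timestamp, cumulative_step_count) samples into
        (interval_start, interval_end, steps_taken) records."""
        records = []
        for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
            delta = c1 - c0
            if delta < 0:      # counter reset, e.g. after a device reboot
                delta = c1
            if delta > 0:      # keep only intervals with movement
                records.append((t0, t1, delta))
        return records

    # Example: one sample per minute from a hypothetical background service
    base = datetime(2015, 12, 9, 14, 0)
    counts = [1000, 1000, 1042, 1110, 1110]
    samples = [(base + timedelta(minutes=i), c) for i, c in enumerate(counts)]
    print(steps_per_interval(samples))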
Frontend: Android
Core requirements: I would like to write my own code and have it executed on the server. I want the whole backend to be automated (no admin creating tables in a database and inserting records). I still want to benefit from some basic BaaS functions, like sending notifications to users and server maintenance, to speed up MVP development.
Description of MVP functionality - survey app for entrepreneurs:
1. An entrepreneur adds the survey and information about it (questions, possible answers). It is sent to the server and saved. There are different variants of surveys (single choice, multi-choice, open questions, etc.), so a specific document has to be created automatically by the backend code. Likewise, the creation of a document for responses has to be handled by the backend, and the same goes for the document holding the final results of the survey research.
2. The respondent receives a notification about an available survey. The mobile app retrieves information about the survey from the server and the respondent completes the survey.
3. The application sends the respondent's responses to the server, and the server saves the information.
4. X respondents perform steps 2 and 3.
5. When the survey is completed (the number of respondents set by the entrepreneur is reached), the server processes the data collected from all respondents and saves the results of the research in the appropriate document.
6. The entrepreneur receives a notification about the completed research. The application downloads the results from the server.
Additional requirements:
The server has to be able to serve many entrepreneurs and respondents at the same time without problems such as data corruption.
No admin needed for creating tables or inserting records - Backend is 100% automated.
Certainly!
You could use the admin side of the application to upload the questions, corresponding answers, response limit, and completion flag (step 1).
You could then retrieve that data in the user side of the client app from Firebase Firestore, have the users complete the surveys, and upload the answers back to Firestore (steps 2 and 3).
Step 5 could be achieved with Firebase Cloud Functions: a function can listen for writes to a survey's responses and, once the response count reaches the limit, mark the survey as complete. Step 6 could be handled in the same function by sending a notification via FCM to your admin client.
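A minimal sketch of such a function, using the Python SDK for Cloud Functions for Firebase (Node.js would work equally well). The collection layout and the field names - response_count, response_limit, owner_device_token - are assumptions:

    from firebase_admin import firestore, initialize_app, messaging
    from firebase_functions import firestore_fn
    from google.cloud.firestore import Increment

    initialize_app()

    @firestore_fn.on_document_created(
        document="surveys/{survey_id}/responses/{response_id}")
    def on_response_created(
            event: firestore_fn.Event[firestore_fn.DocumentSnapshot]) -> None:
        db = firestore.client()
        survey_ref = db.collection("surveys").document(event.params["survey_id"])

        # Count this response on the parent survey document.
        survey_ref.update({"response_count": Increment(1)})

        survey = survey_ref.get().to_dict()
        if (survey["response_count"] >= survey["response_limit"]
                and not survey.get("completed")):
            survey_ref.update({"completed": True})  # step 5: mark as complete
            # Step 6: notify the entrepreneur's admin client via FCM.
            messaging.send(messaging.Message(
                token=survey["owner_device_token"],  # stored when the admin registers
                notification=messaging.Notification(
                    title="Survey complete",
                    body="Your survey reached its response limit.",
                ),
            ))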
Beyond that sketch this answer doesn't go into code specifics - I just wanted to let you know that this is most certainly possible with Firebase :)
I would certainly recommend creating an admin client app in addition to the user client app, rather than placing them in the same app!
I am building a web app with the following stack:
UI - React
Backend framework - NestJS
Infrastructure - Google Firestore document DB, services deployed on Heroku
I need to calculate finance portfolio metrics on a daily basis for all users and display them when the user logs in. I am in a bit of a dilemma about which approach to take, and I have several ideas, so I hope you can give me some guidance.
Scheduled microservice
I can build and schedule a microservice in Python (the finance framework is in Python) that will run every day and calculate the needed metrics for the users and update the database. Seems straightforward but it might consume a lot of compute resources, especially when the user base grows large.
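A rough sketch of what that daily job might look like, assuming a users collection with a positions subcollection, and with a stand-in metric in place of the real finance calculations:

    from google.cloud import firestore

    db = firestore.Client()

    def portfolio_value(positions) -> float:
        # Stand-in metric: the real finance framework would go here.
        return sum(p["quantity"] * p["last_price"] for p in positions)

    def run_daily_metrics() -> None:
        for user in db.collection("users").stream():
            positions = [p.to_dict() for p in
                         user.reference.collection("positions").stream()]
            db.collection("metrics").document(user.id).set({
                "portfolio_value": portfolio_value(positions),
                "computed_at": firestore.SERVER_TIMESTAMP,
            })

    if __name__ == "__main__":
        run_daily_metrics()  # invoked once a day by the scheduler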
Cloud Functions
Firestore integrates with Google Cloud Functions, which can trigger on specific events. I can leverage that and run the calculation only when the data is requested - that way I calculate the information on demand. The downside is that if the data has not been requested for a long time, I will have to calculate the metrics for a larger period of time, and this might take a while.
P.S. Just saw that there are also scheduled cloud functions - a possible implementation might check whether the data has already been calculated today (the user has logged in at least once) and, if not, calculate it.
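That scheduled variant could look roughly like this (Python SDK for Cloud Functions for Firebase; the metrics collection, the computed_at field, and the recalculate hook are assumptions):

    import datetime

    from firebase_admin import firestore, initialize_app
    from firebase_functions import scheduler_fn

    initialize_app()

    @scheduler_fn.on_schedule(schedule="every day 06:00")
    def refresh_metrics(event: scheduler_fn.ScheduledEvent) -> None:
        db = firestore.client()
        today = datetime.datetime.now(datetime.timezone.utc).date()
        for doc in db.collection("metrics").stream():
            computed_at = doc.to_dict().get("computed_at")
            if computed_at is None or computed_at.date() < today:
                recalculate(doc.id)

    def recalculate(user_id: str) -> None:
        ...  # hypothetical hook: e.g. an HTTP call to the Python metrics service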
I will be happy to discuss any other options that might be available.
I am using Firebase as my authentication and database platform in my React Native-Expo app. I have not yet decided whether I will use the Realtime Database or Firestore.
I need to perform statistical analysis on daily data gathered from my users, which is stored in the database. For example, users type in their daily intake of protein; from it I would like to calculate their weekly average and expected monthly average, provide suggestions of types of food if protein intake is too low, etc.
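Concretely, the kind of calculation I mean (a toy Python illustration; the numbers and the target intake are made up):

    from statistics import mean

    daily_protein_g = [92, 110, 87, 101, 95, 120, 90]  # one entry per day
    TARGET_G_PER_DAY = 100                             # assumed target intake

    weekly_avg = mean(daily_protein_g[-7:])            # average of the last 7 days
    projected_month = weekly_avg * 30                  # naive expected monthly total

    print(f"weekly average: {weekly_avg:.1f} g/day, "
          f"projected month: {projected_month:.0f} g")
    if weekly_avg < TARGET_G_PER_DAY:
        print("suggestion: add protein-dense foods, e.g. eggs, lentils, chicken")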
What would be the best approach in order to achieve the result wanted in my specific situation?
I am really unfamiliar with this and stepping into uncharted territory regarding how I can accomplish it. I have read that Firebase Analytics generates various basic analytics regarding usage of the app, the number of crash-free users, etc. But can it perform analytics on custom events? Can I create a custom event for Firebase Analytics to keep track of a certain node in my database and output analytics from that? And if yes, does it work with React Native-Expo, or do I need to detach from Expo? In addition, I have read that Firebase Analytics can be combined with Google BigQuery. Would this be an alternative for my case?
Are there any other ways of performing such data analysis on my data stored in the Firebase database? For example, exporting the data and using Python and scikit-learn?
Whatever opinion or advice you may have, I would be grateful if you could share it!
You're not alone - many people building web apps on GCP have this question, and there is no single answer.
I'm not too familiar with Firebase Analytics, but I can answer the question for Firestore and for your custom analytics (e.g. weekly average protein consumption).
The first thing to point out is that Firestore, unlike some other NoSQL databases, is storage only. You can't perform aggregations in real time the way you can with MongoDB, so the calculations have to be done somewhere else.
The best practice recommended by GCP in this case is indeed to do a regular export of your Firestore data into BQ (BigQuery) and run the analytical calculations there. You could also, when a user inputs some data, send it to Pub/Sub and use one of GCP Dataflow's streaming templates to stream the data into BQ, and have everything in near real time.
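The Pub/Sub half of that pipeline is a one-liner per event; a hedged sketch, where the project ID, topic name, and payload shape are all assumptions (a Dataflow template such as Pub/Sub-to-BigQuery would consume the topic):

    import json
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "user-intake-events")

    def publish_intake(user_id: str, protein_g: float) -> None:
        payload = json.dumps({"user_id": user_id, "protein_g": protein_g})
        future = publisher.publish(topic_path, data=payload.encode("utf-8"))
        future.result()  # block until Pub/Sub has accepted the message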
Here's the issue with that, however: while this solution gives you real time and is very scalable, it gets expensive fast, and if you're more used to Python than SQL for analytics, it can be a steep learning curve. Here's an alternative I've used for smaller web apps, which scales well to <100k users and costs <$20 a month at GCP's current pricing:
Write a Python script that grabs the data from Firestore (using the Firestore Python SDK), generates the analytics you need, and writes the results back to a Firestore collection (there's a sketch after this list)
Create an endpoint for that function using Flask or Django
Deploy that server application on Cloud Run, preventing unauthenticated invocations (you'll only be calling it from within GCP) - see this article, steps 1 and 2 only. You can also deploy the Python script(s) to GCP's Vertex AI or hosted Jupyter notebooks if you're more comfortable with that
Use Cloud Scheduler to call that function every x minutes - see these docs for authentication
Have your React app query the "analytics results" collection to get the results
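Putting steps 1 and 2 together, a minimal sketch - the collection names (users, intake, analytics_results) and the protein_g field are assumptions, and Cloud Scheduler would call POST /recompute:

    from statistics import mean

    from flask import Flask
    from google.cloud import firestore

    app = Flask(__name__)
    db = firestore.Client()

    @app.post("/recompute")
    def recompute():
        for user in db.collection("users").stream():
            entries = (user.reference.collection("intake")
                       .order_by("date", direction=firestore.Query.DESCENDING)
                       .limit(7).stream())
            grams = [e.to_dict()["protein_g"] for e in entries]
            db.collection("analytics_results").document(user.id).set({
                "weekly_avg_protein_g": mean(grams) if grams else None,
                "updated_at": firestore.SERVER_TIMESTAMP,
            })
        return {"status": "ok"}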
My solution is a FlutterWeb-based dashboard that displays relevant data in (near) realtime, like the regular Flutter iOS/Android app, plus some aggregated data.
The aggregated data is compiled using a few Node.js-based triggers on the database that do the analytic lifting, and hence it is also near realtime. If you study the pricing you will learn that function invocations are pretty cheap, unless of course you happen to make a 'desphew' :)
I came up with a great solution.
I used the Firebase BigQuery export extension. Then I used Cube.js (deployed on GCP - Cloud Run with Docker) on top of BigQuery.
Cube.js just makes everything so easy. You don't need to write queries manually; it tries to optimize them for you. On top of that, it uses caching, so you won't get big bills on GCP. I think this is the best solution I was able to find, and it is infinitely scalable and totally real-time.
Also, if you are a small startup, then it is mostly free with GCP - free limits on Cloud Run and BigQuery.
Note: I am not affiliated with Cube.js in any way.
I would like to use Firebase for analytics in an iOS and Android app. My users are most of the time in remote areas with poor or no network. I would like to optimize battery life, so I don't want Firebase to create web requests all the time. Is there a way to dispatch data only on command?
I would like to have the same behaviour as Google Analytics with analytics.setLocalDispatchPeriod(0); and send data only when the user is connected to Wi-Fi, for example.
The SDK already tries to minimize uploads, batching them to roughly one per hour to avoid draining the battery. If there is any problem with the network, it won't retry immediately but hours later, backing off further if needed. It also has several methods to optimize data latency, so it wouldn't be good to freely control the scheduling system.
I have a realtime platform where users stay on pages for a long duration. I found that after 5 minutes (more or less) GA realtime stops showing them, so I created a timer that sends a pageview every 4 minutes; this way all users remain "connected" to GA.
I wonder if this is a good approach, or whether it may produce inaccurate data in the reports later.
Has anyone experienced this?
Your terminology seems a little off - users do not become "disconnected" from Google Analytics. The difference between realtime reports and data from the Reporting API is that the former shows only a subset of ad-hoc computed dimensions and metrics, whereas the Reporting API shows, after some processing latency, the full set of metrics and dimensions, including things that require more processing time, like session- and user-scoped data.
Other than that, your approach is fine. There is a limit on the number of API calls you are allowed to make - the documentation has an example of how to calculate your calls to stay within the limits, and Google suggests implementing some sort of server-side caching if you need a lot of realtime dashboards.
But this is not going to affect the data quality of reports in any way. The Realtime API is a read-only API; the worst thing that can happen is that you exceed your quota and get blocked for the rest of the day. So there is no way this would create "inaccurate data in the reports later".