Firebase Reporting Options

All my app's data is stored in Firebase. I'd like to build some reports with my data that aren't necessarily accessible through the web/app front-end. I don't see any good options for this in the Console. Has anyone found a good reporting solution for Firebase? I am looking for something like Crystal Reports or just an easy way to render Firebase data based on a query.
Thanks,
Rima.

I found an issue with the solution above. Firebase stores its data as one big JSON document, which cannot be consumed directly by tools such as BigQuery: BigQuery expects newline-delimited JSON (JSONL), so a raw dump produces an error. It beats me why Google doesn't provide an elegant integration between two of its own products, but I believe they have something planned.
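In the meantime, the conversion is a few lines of scripting. Here is a minimal sketch, assuming a Realtime Database export named export.json with a top-level users node keyed by push ID (all names are hypothetical):

```python
import json

# Flatten a Firebase Realtime Database export into newline-delimited JSON
# (JSONL) so BigQuery can ingest it. Assumes a top-level "users" node with
# one object per push key -- adjust for your own tree.
with open("export.json") as f:           # hypothetical export file name
    data = json.load(f)

with open("users.jsonl", "w") as out:
    for key, record in data.get("users", {}).items():
        record["_key"] = key             # preserve the Firebase key
        out.write(json.dumps(record) + "\n")
```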

Firebase does not have any "built-in" reporting tools other than the defined querying APIs.
If your database is small, you can dump the JSON from the Firebase Console and run your analysis on it manually.
If your database is large, you can upgrade to the Flame or Blaze plans and sign up for daily private backups. This will create a JSON dump in the background without affecting your database performance and store it in the cloud. You could then use tools to grab that dump and perform advanced reporting on it.
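Once the daily backups are landing in a bucket, grabbing the latest dump is straightforward with the Cloud Storage client library. A minimal sketch, with hypothetical bucket and object names:

```python
from google.cloud import storage

# Download a private-backup dump from Cloud Storage for local analysis.
# The bucket and object names are placeholders -- check your backup
# settings for the real ones.
client = storage.Client()
bucket = client.bucket("my-app-backups")                 # hypothetical
blob = bucket.blob("backups/2019-01-01/export.json.gz")  # hypothetical
blob.download_to_filename("export.json.gz")
```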

1) Via BigQuery
Official Docs:
https://cloud.google.com/bigquery/docs/loading-data-cloud-firestore
Codelab Walkthrough:
https://codelabs.developers.google.com/codelabs/modern-data-pipeline-firestore-bigquery-dataflow-templates/index.html?index=..%2F..next17
Connect whatever BI tool you want to BigQuery. Google Data Studio is free, as is Metabase. Almost every enterprise BI tool has a BigQuery connector.
From https://www.reddit.com/r/Firebase/comments/arps42/reportingbi_tools_and_firestore/
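Per the official docs linked above, a Firestore export (created with gcloud firestore export) loads into BigQuery with the DATASTORE_BACKUP source format. A sketch using the google-cloud-bigquery client; the project, dataset, and bucket paths are placeholders:

```python
from google.cloud import bigquery

# Load a Firestore export into a BigQuery table. Firestore exports reuse
# the DATASTORE_BACKUP source format.
client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.DATASTORE_BACKUP,
)
uri = ("gs://my-export-bucket/2019-01-01T00:00:00_12345/"
       "all_namespaces/kind_users/all_namespaces_kind_users.export_metadata")
load_job = client.load_table_from_uri(uri, "my_dataset.users",
                                      job_config=job_config)
load_job.result()  # block until the load job finishes
```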
2) Via "Custom Data Sources"
Cloud Firestore (and probably the Realtime Database) has a RESTful API. Many popular reporting tools support "custom", "RESTful", "AJAX", and/or "HTTP" sources.
Check whether your favorite reporting tools do, and search the web accordingly.
I can see that Stimulsoft seems to support custom/RESTful sources. A Power BI data connector also seems to provide a lot of latitude - https://github.com/Microsoft/DataConnectors
Of course, this means you need to create several data sources, and they probably won't be as optimised as a built-in source type. For example, the report engine probably won't know how to translate front-end UI filters into custom-source query filters. Perhaps some platforms let you write your own adapters.
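For a feel of what a custom RESTful source has to do, here is a minimal sketch against the Cloud Firestore REST API; the project ID, collection name, and access token are placeholders:

```python
import requests

# List the documents of one collection via the Cloud Firestore REST API,
# the kind of call a "custom"/"HTTP" reporting source would make.
project = "my-project"
url = (f"https://firestore.googleapis.com/v1/projects/{project}"
       "/databases/(default)/documents/orders")
resp = requests.get(url, headers={"Authorization": "Bearer <ACCESS_TOKEN>"})
resp.raise_for_status()
for doc in resp.json().get("documents", []):
    print(doc["name"], doc["fields"])
```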

Firebase Cloud Firestore "Request/response" documentation

Most of "bigger" project I was working with was using REST API for Frontend->Backend communication. I was using Firebase Cloud Firestore for some small (one-day/hackathon) projects. Now I'm thinking about using Firestore for some bigger project but I'm not sure if this will work.
For "standard", REST api project I had Swagger documentation, where each developer could see list of all endpoints with request/response data structures. How does it work with Firestore? Can I create similar documentation for developers to check data structure, so they will know what can they add and what should they read? Or maybe there is another way?
I'm thinking, maybe there is no tool for this kind of documentation because frontend data structures are defining database structure? But what if I am connecting database from two or more platform (ex. web, mobile and cloud functions)? How can I synchronize knowledge about data structures between all the developers?
I was looking for some answers but couldn't find anything useful expect advice to manually maintain some documentation. How does it work in your projects? Is there some automation? Manually written documentation? Or no documentation - everything "in code"?
I understand your concerns, but unfortunately no tool exists for Cloud Firestore that generates database-structure documentation the way Swagger does for a REST API.
I believe you could do it programmatically, though; see the sketch below.
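One programmatic approach: walk the live database with a client SDK and emit a field-type summary as a starting point for hand-maintained docs. A rough sketch with the Python client (google-cloud-firestore); purely illustrative, and the sampling depth of five documents is arbitrary:

```python
from google.cloud import firestore

# Sample a few documents from each top-level collection and print the
# field names and Python types seen -- a crude, generated "schema" page.
client = firestore.Client()
for collection in client.collections():
    print(f"## {collection.id}")
    for doc in collection.limit(5).stream():
        for field, value in doc.to_dict().items():
            print(f"  {field}: {type(value).__name__}")
```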
From "Generating Swagger Docs in Firebase Cloud Functions project":
I'm using Express and Node.js in my Firebase Functions implementations, and for me, Swagger doc generation can be implemented via the following libraries:
https://github.com/scottie1984/swagger-ui-express
https://github.com/Surnet/swagger-jsdoc
You can find other libraries at:
https://swagger.io/tools/open-source/open-source-integrations
In addition to the responses there, the following service gives access to Firestore metadata (click the Explorer tab) and looks promising for your use case, though not necessarily more so than the links above: https://aapi.io/api-directory/Google_CloudFirestore_GoogleCloudFirestoreAPI_v1beta1

Automatically Upload Job Offers to LinkedIn

I've read a lot about integrating LinkedIn into a website, but is it possible to upload job offers from external software directly to LinkedIn?
I'm working with SAP-based recruiting management software. One of the functions it provides is creating job offers, but until now these stay internal. I want to implement a function that lets the user automatically upload a job offer to LinkedIn.
Does LinkedIn provide some kind of support for this? XING, for example, offers documentation for a connection via API or XML. Does something similar exist for LinkedIn?
It looks like this isn't available through the normal APIs. LinkedIn has a specific section under its "Talent Solutions" offering that may be of interest; it appears to be a paid program:
https://developer.linkedin.com/partner-programs/talent

How to migrate data from Parse.com to Firebase

I have two production apps that currently use Parse.com. I have no plans to run Parse Server, and I want to switch to Firebase. Is there a way to migrate my database from Parse.com to Firebase?
There are differences between Parse and Firebase that make a straight migration harder than you would hope.
Parse exposes a relational-style data model, whereas Firebase stores all its data as one JSON tree, so a "copy and paste" job isn't going to work here.
On top of that the way that the two platforms organise user authentication is completely different.
So unfortunately no easy solution here.
Firebase has an import-JSON option, so if you get your data out of Parse.com as JSON, it can be imported.
However, the structure Parse uses to create relationships between data is (probably) going to be different than Firebase, so it's going to take some planning and coding to make the transition.
Once we had a plan, we found it easiest to just craft an importer app that would take the Parse.com data structure and massage it into a Firebase format that worked for our app.
In some cases we had to start from scratch as the thought process is different from Parse (objects) to Firebase.
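As a sketch of that importer idea: a Parse.com class export is a JSON file with a top-level results array, which can be re-keyed by objectId into the JSON-tree shape Firebase imports. The file name and target node below are hypothetical:

```python
import json

# Re-key a Parse.com class export by objectId, producing the JSON tree
# that Firebase's import expects. "Post.json" / "posts" are placeholders.
with open("Post.json") as f:
    results = json.load(f)["results"]

firebase_tree = {"posts": {row.pop("objectId"): row for row in results}}

with open("firebase-import.json", "w") as out:
    json.dump(firebase_tree, out, indent=2)
```

Any Parse pointer or relation columns still need restructuring by hand, which is where the planning comes in.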
The Firebase API is completely different from the Parse one. That means you have to learn another API, other SDKs, etc., and rewrite your frontend code.
There is no easy path to migrate from Parse to Firebase.
Moreover, I think it is not a good decision. The Parse Server community is growing, and it is becoming even better than the original Parse. In a short time, Parse Server will become the best framework for backend and API development.
My recommendation is to migrate to a Parse hosting provider. With this kind of solution you keep the same Parse APIs and features, and you won't have to learn another technology or rewrite any frontend code.
You can find some options in parse server repository:
https://github.com/ParsePlatform/parse-server#parse-server-sample-application
For full disclosure: I am a co-founder of https://www.back4app.com, which was the first Parse Server migration provider.

Best way to export and import all Apigee Edge objects related to an org?

Are there scripts for exporting and importing all Apigee Edge objects, such as developers, users, apps, caches, key value maps, etc?
To clarify, non-runtime objects are the priority rather than the runtime data contained within them. E.g., the current contents of the caches are not as critical as having the cache objects themselves available.
I have released a tool that can be used to retrieve Apigee organization settings. This tool has been in use internally at Apigee for some time, but this is the first time it has been released to the public. It uses the Apigee management API to pull configuration data, and the data to be pulled is configurable. The data is stored in a hierarchical directory structure, which can be archived, explored, or used to compare organizations. It can be used with both the Apigee Edge cloud and on-prem offerings.
A few caveats:
This tool does not retrieve all data from an organization. For example, it does not retrieve API proxies. Use the Apigee management UI or management API to retrieve API proxies.
The tool is composed of a few bash scripts. It has been successfully run on Linux and Mac OS X.
The tool does not write data back into the organization, although the files it retrieves can often be POSTed back to the organization using the management API.
This tool is released as-is. It is not officially supported by Apigee.
Find the tool at the api-platform samples site (https://github.com/apigee/api-platform-samples) in the tools/org-snapshot directory.
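For a sense of what the scripts do under the hood, here is a hand-rolled sketch of one such management API call; the org name and credentials are placeholders:

```python
import requests

# Fetch one type of configuration object from the Apigee Edge management
# API and save it -- roughly what the snapshot scripts do per object type.
BASE = "https://api.enterprise.apigee.com/v1"
org = "my-org"
auth = ("admin@example.com", "secret")   # management API credentials

resp = requests.get(f"{BASE}/organizations/{org}/developers", auth=auth)
resp.raise_for_status()
with open("developers.json", "w") as out:
    out.write(resp.text)
```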
There is work planned to provide a tool that will export/import provisioning data (such as apps, developers, products). Other aspects of an org's configuration require access to the production Cassandra database, which cannot be given out publicly. We have a tool for in-house use that we are currently hardening. If the consumer tool (when it is available) doesn't provide all of the backup support you need, you will need to log a support ticket to have the in-house tool run.
There are scripts for importing a set of objects (developers, apps, API products) that work with the sample proxies that you can find on GitHub:
https://github.com/apigee/api-platform-samples/tree/master/setup
For Perl programmers: see also Apigee::Edge on CPAN

Data from Google Analytics

Google Analytics does not have an API we can use to get at our data. Is there an efficient way to programmatically fetch the data Google collects, without also logging it locally?
Edit:
I would prefer a Python or PHP solution but anything will work.
Google just announced that they're making a data export API available for Google Analytics. It sounds like exactly what you're looking for.
Per their announcement, the feature is currently in private beta, but I figure it'll be rolled out to all accounts in the coming weeks or months. Depending on your needs, you may just want to wait instead of building a short-term hackish solution.
If you're interested, I presume that the functionality's being rolled out first to members of the Google Analytics Trusted Tester program.
Also, I forgot about this: I never completely implemented it for a client because the deal fell through, but you can customize the dashboard to include the sections of Google Analytics your report needs and have it emailed on a schedule. If the reports don't need to be too detailed and Google already aggregates the data the way you need it, this might work for you.
The Google Analytics API is now open to everyone, and it looks like it contains the full data set.
Well, it depends on what you want to do with the data. If you only want to process part of it, then I don't think it is difficult.
Here are a couple of explanatory hits from a basic web search, one from Google and one from a third party:
http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=55561
http://blogoscoped.com/archive/2008-01-17-n73.html
There is a completely programmatic way to access the data using greqo (PHP), but the Analytics class is in beta. Check it out here.
If beta is not acceptable, you can use a mixture of emailed XML reports and Yahoo Pipes to get what you need.
Basic method:
1) Obtain the tracking data in a usable format: schedule Google Analytics to email the report as an XML file on a regular basis.
2) Make the XML file accessible online: by emailing the attachment to a Google Group, the file is automatically given a public URL.
3) Work out the URL of the most recent report: since Google Groups provides RSS/Atom feeds for all messages, we can easily find the URL of the most recent message and therefore work out the URL of the XML report.
4) Prepare the data for use: manipulate the XML and massage it into a handy JSON format that we can use on our blog, all of which can be done with Yahoo Pipes.
Taken from here.
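Yahoo Pipes is long gone, but the same pipeline fits in a short script. A rough Python sketch, assuming the feedparser library is installed; the Google Groups feed URL and the attachment-URL convention are hypothetical and vary by group:

```python
import json
import xml.etree.ElementTree as ET

import feedparser
import requests

# Find the newest message in the group's Atom feed, fetch the XML report
# it carries, and massage it into JSON. Inspect your group's feed to adapt
# the URLs below.
feed = feedparser.parse(
    "https://groups.google.com/group/my-reports/feed/atom_v1_0_msgs.xml")
latest = feed.entries[0].link                 # URL of the newest message
report_url = latest + "/attachment.xml"       # hypothetical attachment URL

root = ET.fromstring(requests.get(report_url).text)
rows = [{child.tag: child.text for child in record} for record in root]
print(json.dumps(rows, indent=2))
```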
I implemented a solution where we scheduled the Analytics report to be emailed to a Gmail account each day, and I pulled the report on demand via POP3. It's pretty easy and works fast. I've heard Epic1 will do this for you as well; I'm researching that now.
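A minimal sketch of that POP3 approach using Python's standard library; the address and app password are placeholders, and POP3 access must be enabled on the Gmail account:

```python
import email
import poplib

# Pull the newest message from a dedicated Gmail inbox over POP3 and save
# its first attachment (the scheduled Analytics report).
conn = poplib.POP3_SSL("pop.gmail.com", 995)
conn.user("reports@example.com")
conn.pass_("app-password")

newest = len(conn.list()[1])                  # highest message number
_, lines, _ = conn.retr(newest)
msg = email.message_from_bytes(b"\r\n".join(lines))

for part in msg.walk():
    if part.get_filename():                   # first attachment wins
        with open(part.get_filename(), "wb") as out:
            out.write(part.get_payload(decode=True))
        break
conn.quit()
```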
If you're using Python, pandas is also very helpful: it ships an interface on top of the Google Analytics API. It's pretty simple to get up and running, and it integrates with the rest of pandas, so you get aggregation, time-series features, and the other data-analysis tools for free.
Instructions on how to authenticate, with examples: http://blog.yhathq.com/posts/pandas-google-analytics.html
more examples: http://quantabee.wordpress.com/2012/12/17/google-analytics-pandas/
I've also posted a few queries to get you started:
https://github.com/sk8asd123/ga_pandas
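For reference, that interface (pandas.io.ga, since removed from pandas) returned query results directly as a DataFrame, roughly as in the links above; the account and profile IDs here are placeholders, and OAuth secrets must be set up per the first link:

```python
from pandas.io import ga  # removed in later pandas releases

# Query Google Analytics straight into a DataFrame, then use ordinary
# pandas operations for aggregation and time-series work.
df = ga.read_ga(
    metrics=["visits", "pageviews"],
    dimensions=["date"],
    start_date="2014-01-01",
    end_date="2014-01-31",
    account_id="12345678",    # hypothetical
    profile_id="87654321",    # hypothetical
)
print(df.head())
```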
It's been a while since I had to deal with this, but Google Analytics has an XML output type, so you can parse that to get the data into your own system. However, I believe there is no way to fetch the XML file programmatically, so someone still has to go in, generate the file, and feed it to your app.
Good question though, I'd love to see if there is a 100% automated solution.
We just released a product, Megalytic, that makes it very easy to create custom reports using data from the Google Analytics API. You can email these reports to others without sharing your Google Analytics account, create links to reports, download them as PDF, and so on.
