Is there a way to back up my App Services / Usergrid data - apigee

App Services is a great place to store data, but now that I have a lot of critical info in there, I've realized there isn't a way to create a backup or roll back to an earlier state (in case I did something stupid like -X DELETE /users).
Any way to back up this data either online or offline?

Apart from using the API to fetch records x by x and storing them locally, there is no solution at the moment. The team is planning an S3 integration (export data to S3), but no completion date has been defined for it yet.

Looks like the only way is to query the data using e.g. cURL and save the results to a local file. I don't believe there is a way to export natively.
http://apigee.com/docs/app-services/content/working-queries
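As a stopgap, here is a minimal sketch of that "fetch records x by x and save locally" approach in Python, assuming the usual App Services REST layout with a cursor field for paging; the org, app, collection and token values are placeholders:
import json
import requests

BASE = "https://api.usergrid.com/my-org/my-app"   # placeholder org/app
COLLECTION = "users"
TOKEN = "YOUR_ACCESS_TOKEN"

entities, cursor = [], None
while True:
    params = {"limit": 100, "access_token": TOKEN}
    if cursor:
        params["cursor"] = cursor
    page = requests.get(f"{BASE}/{COLLECTION}", params=params).json()
    entities.extend(page.get("entities", []))
    cursor = page.get("cursor")
    if not cursor:                 # no cursor in the response means the last page
        break

with open(f"{COLLECTION}-backup.json", "w") as f:
    json.dump(entities, f, indent=2)
Run it once per collection and you have a crude offline snapshot you can re-POST later if needed.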

From the 2014/2015 Usergrid versions onward it is possible to make exports and imports using the "Usergrid tools".
This page explains how to install them:
https://github.com/apache/incubator-usergrid/tree/master/stack/tools
Basically, once you run
$ java -jar usergrid-tools.jar export
your data will be exported as JSON files in an export directory.
There are several export and import tools available; the best way to see them is to visit this page:
https://github.com/apache/incubator-usergrid/tree/6d962b7fe1cd5b47896ca16c0d0b9a297df45a54/stack/tools/src/main/java/org/apache/usergrid/tools

Related

Connect Airflow to Google Data Fusion

I'd like to write a Python script which manages my Google Data Fusion pipelines and instances (creates new ones, deletes them, starts them, etc.). For that purpose I use Airflow installed as a library. I've read some tutorials and documentation, but I still can't make the script connect to the Data Fusion instance. I've tried the following connection string:
export AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT='google-cloud-platform://?extra__google_cloud_platform__key_path=%2Fkeys%2Fkey.json&extra__google_cloud_platform__scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform&extra__google_cloud_platform__project=airflow&extra__google_cloud_platform__num_retries=5'
with my JSON key file and project ID, but it still doesn't work. Can you give me an example of creating that connection?
You can find an example python script here:
https://airflow.readthedocs.io/en/latest/_modules/airflow/providers/google/cloud/example_dags/example_datafusion.html
This page provides a breakdown for each Data Fusion Operator if you would like to learn more about them:
https://airflow.readthedocs.io/en/latest/howto/operator/gcp/datafusion.html
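For completeness, here is a minimal sketch of a DAG that starts an existing pipeline, assuming the Google provider package is installed; the location, instance name and pipeline name are placeholders, and the connection is the one defined by the environment variable in the question:
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.datafusion import (
    CloudDataFusionStartPipelineOperator,
)

# Assumes AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT is exported in the environment,
# as in the connection string above, so the "google_cloud_default" connection
# picks up /keys/key.json and the "airflow" project.
with DAG(
    dag_id="datafusion_example",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
) as dag:
    # Placeholder instance/pipeline/location values -- replace with your own.
    start_pipeline = CloudDataFusionStartPipelineOperator(
        task_id="start_pipeline",
        location="europe-west1",
        instance_name="my-instance",
        pipeline_name="my-pipeline",
        gcp_conn_id="google_cloud_default",
    )
If the connection is set up correctly, triggering this DAG should start the named pipeline on your Data Fusion instance.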

Is there a way to get a data GUI view in Meteor.js?

Is there a way to get a data GUI view in Meteor.js?
If yes, please enlighten me, as I am new to Meteor. I just need to know the best way to access the MongoDB database in Meteor...
Thanks!!!
Use either:
1. Z Mongo Admin, a Meteor package similar to Django admin. This is probably the closest to what you're looking for: http://www.youtube.com/watch?v=ixJyB8Z-tU8&list=UU3fBiJrFFMhKlsWM46AsAYw
2. One of the many Mongo GUIs; I use both Robomongo (http://robomongo.org/) and MongoHub (https://github.com/bububa/MongoHub-Mac) on OS X.
I've found that a little package called Mongol gives you a quick and easy way to look at your data right from within your web app during development.
https://github.com/msavin/Mongol
Robomongo provides a good UI for MongoDB (Download link). These are the steps I followed to use it with Meteor:
1. Run Meteor, cd-ing into your project folder (it usually runs at http://localhost:3000/).
2. Download and install Robomongo from the download link.
3. Create a new connection in the Robomongo UI.
4. Connect to port 3001 instead of the default 27017, and you can see the Mongo database and its contents.
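If you'd rather poke at that same database from a script instead of a GUI, here is a small sketch with pymongo, assuming the default development setup where Meteor's bundled MongoDB listens on port 3001 and the database is named "meteor":
from pymongo import MongoClient

# Meteor's dev database: port 3001, database name "meteor" (default setup).
client = MongoClient("mongodb://localhost:3001/")
db = client["meteor"]

print(db.list_collection_names())           # every collection in your app
for doc in db["users"].find().limit(5):     # peek at a few user documents
    print(doc)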
There's no GUI provided. You can access the database from the command line by running meteor mongo while your server is running.
Appreciate the mention for Mongol!
I wanted to say that there is now one more option for viewing and interacting with data in your Meteor app called Meteor Candy. It's a little less raw than viewing JSON, which makes it far more palatable for non-developers. And of course, there is still an option to view raw JSON :)

How to migrate data & settings from one firebase to another?

Are there any tools to aid in data migration from dev to staging to prod? If not, are there plans to build them?
I know you can Export JSON and Import JSON from Forge, but that doesn't include authorization and security settings.
All of our data is available through a REST API, so you could easily write a script to do this yourself. You can export the data by setting format=export (this includes all of the priority data in the response):
curl https://myapp.firebaseIO.com/.json?format=export&auth=YOUR_FIREBASE_SECRET
As for exporting the security rules, you can access them here:
curl https://myapp.firebaseIO.com/.settings/rules/.json?auth=YOUR_FIREBASE_SECRET
You can then write them back to the new Firebase using PUT.
The various Auth settings can't easily be automatically transferred (such as the Authorized Origins), but they probably shouldn't be as they'll differ between staging and production.
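For reference, the same copy can be scripted in Python with requests instead of curl; this is only a sketch under the same legacy database-secret auth as the curl commands above, and the target URL and both secrets are placeholders:
import requests

SOURCE = "https://myapp.firebaseIO.com"
TARGET = "https://myapp-staging.firebaseIO.com"   # hypothetical target Firebase
SOURCE_SECRET = "YOUR_FIREBASE_SECRET"
TARGET_SECRET = "YOUR_OTHER_FIREBASE_SECRET"

# Export the data (format=export keeps priority metadata) and the security rules.
data = requests.get(f"{SOURCE}/.json",
                    params={"format": "export", "auth": SOURCE_SECRET}).json()
rules = requests.get(f"{SOURCE}/.settings/rules/.json",
                     params={"auth": SOURCE_SECRET}).text

# Write both back to the new Firebase with PUT.
requests.put(f"{TARGET}/.json",
             params={"auth": TARGET_SECRET}, json=data).raise_for_status()
requests.put(f"{TARGET}/.settings/rules/.json",
             params={"auth": TARGET_SECRET}, data=rules).raise_for_status()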
What Andrew said above is mostly correct; however, this can be a pain with large Firebases.
There is an import project at https://github.com/firebase/firebase-import that will help import large firebases by breaking up the put requests.
Also, something to note: you will need to use quotes around the curl URL, otherwise the & will background the process. So what Andrew gave above will instead work as:
curl -o outputfile.json "https://myapp.firebaseIO.com/.json?format=export&auth=YOUR_FIREBASE_SECRET"
Then you can use the import module I linked with that json file.
Good Luck!
If you want an option that doesn't require cURL, and you have the firebase-tools project installed, you can run this:
firebase database:get --export -o backup.json /
Note that this should be run from a working directory configured as a Firebase project. The advantage of this option is it will use the Auth you've set up for that project, so you don't need to hard-code auth keys into command lines (for the security-conscious) and it doesn't rely on the deprecated auth-key pattern.
Command-line Fu: Another cool technique if you want separate files for each top-level key is calling:
for i in `firebase database:get --shallow / | jq -r 'keys[]'`; do
echo "Downloading $i..."
firebase database:get --export -o $i.json /$i
done
You will need the "jq" tool installed for this to work. Exporting each collection separately can be really useful if you later want to restore or work with just a portion of your data.
Firebase is working on a new service, "S3 Customer Backups", that will copy a .gz-compressed backup of your entire Firebase nightly into an S3 bucket you give them. I'm evaluating the beta of this service right now, but if it is something you need, I recommend asking support about it.
Our Firebase got too large for the curl operation to complete, and this new solution will enable us to manage our dev environments. So if you have a large Firebase, set up the S3 Customer Backups, then use firebase-import to shove the data into your dev/staging Firebases. Victory!
I just created a Ruby gem for cloning Firebase Remote Config data from an existing project to a new project.

Gather data from drupal and export to CSV on schedule

We have a Drupal site and we wish to export data from it in the form of several CSV files. I'm aware of the Views module add-ons that make this a very simple process on demand, but what we're looking for is a way to automate the process through cron.
Most likely, we'll end up writing either a standalone PHP file we can then hit from cron, or a custom module.
I first wanted to check whether there is already a module or set of modules out there that will do what we're looking for. How would you accomplish this?
The end result is that these CSV files will reside on the server for other services to pick up and import into their own systems, or be distributed with rsync or something similar.
Best practices suggestions would also be appreciated!
If you want to do it with cron:
1. Set up Views that output CSV data.
2. Add wget <path to your CSV view>, or the path to a script which does everything you need (see the sketch below), to your crontab.
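If wget isn't enough (for example you need authentication or post-processing), the script cron calls could be as small as this Python sketch; the view URL and output path are hypothetical:
#!/usr/bin/env python3
import urllib.request

VIEW_URL = "https://example.com/export/orders.csv"    # your Views CSV path (placeholder)
OUT_FILE = "/var/exports/orders.csv"                   # where other services pick it up

with urllib.request.urlopen(VIEW_URL) as resp, open(OUT_FILE, "wb") as out:
    out.write(resp.read())
A crontab entry such as 0 * * * * /usr/bin/python3 /path/to/export_views.py would then regenerate the file every hour.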

need help in choosing the right tool

I have a client who has set up a testing environment in some AI language. It basically runs some predefined test cases and stores the results as log files (comma-separated txt files). My job is to identify and suggest a reporting system, and I have two options in mind:
1. Import the logs into MSSQL and use its reporting (SSRS), or
2. Import the logs into MySQL and use PHP to develop custom reporting.
I am thinking that option 2 is better. The reason is that the logs are inconsistent and contain unexpected stray characters that databases normally don't accept, so I can write some PHP scripts to clean them up before loading them into the database.
If this were your problem, what would you suggest?
It depends on how fancy you need to be. If the data is in CSV files, you could even go as simple as loading it into Excel (or their favorite spreadsheet tool) and using spreadsheet macros to analyze it.
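If option 2 (clean first, then load) wins, the cleanup step can be very small. Here is a sketch of the idea in Python rather than the PHP the asker plans to use, with hypothetical file names:
import csv

def clean_field(value):
    # Keep printable characters only so the database import doesn't choke.
    return "".join(ch for ch in value if ch.isprintable()).strip()

with open("raw_log.txt", newline="", errors="replace") as src, \
     open("clean_log.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    for row in csv.reader(src):
        writer.writerow(clean_field(field) for field in row)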
