cloud functions python to access Datastore - google-cloud-datastore

I am looking for a tutorial or document on how to access Datastore from Cloud Functions (Python).
However, it seems there is only a tutorial for Node.js:
https://github.com/GoogleCloudPlatform/nodejs-docs-samples/tree/master/functions/datastore
Can anybody help me out?
Thanks

There is no special setup needed to access Datastore from Cloud Functions in Python.
You just need to add google-cloud-datastore to requirements.txt and use the Datastore client as usual.
requirements.txt
# Function dependencies, for example:
# package>=version
google-cloud-datastore==1.8.0
main.py
from google.cloud import datastore

datastore_client = datastore.Client()

def foo(request):
    """Responds to any HTTP request.
    Args:
        request (flask.Request): HTTP request object.
    Returns:
        The response text or any set of values...
    """
    query = datastore_client.query(kind="<KindName>")
    data = query.fetch()
    for e in data:
        print(e)
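Writing works the same way with the same client. A minimal sketch (the kind name "Task" and the properties are illustrative placeholders, not from the original answer):

# Hedged sketch: store an entity with the client created above.
key = datastore_client.key("Task")  # "Task" is a placeholder kind name
entity = datastore.Entity(key=key)
entity.update({"description": "hello", "done": False})
datastore_client.put(entity)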
Read more:
Python Client for Google Cloud Datastore
Setting Up Authentication for Server to Server Production Applications

Related

Kubernetes Client API from Google Cloud Functions (Firebase) Token Refresh

I want to start Kubernetes jobs on a GKE cluster from a Google Cloud Function (Firebase).
I'm using the Kubernetes Node.js client: https://github.com/kubernetes-client/javascript
I've created a Kubernetes config file using `kubectl config view --flatten -o json`
and loaded it:
const k8s = require('@kubernetes/client-node');
const kc = new k8s.KubeConfig();
kc.loadFromString(config);
This works perfectly locally, but the problem is that when running on Cloud Functions the token can't be refreshed, so calls fail after a while.
My k8s config file contains:
"user": {
"auth-provider": {
"name": "gcp",
"config": {
"access-token": "redacted-secret-token",
"cmd-args": "config config-helper --format=json",
"cmd-path": "/usr/lib/google-cloud-sdk/bin/gcloud",
"expiry": "2022-10-20T16:25:25Z",
"expiry-key": "{.credential.token_expiry}",
"token-key": "{.credential.access_token}"
}
}
I'm guessing the command path points to the gcloud SDK, which is used to get a new token when the current one expires. This works locally, but on Cloud Functions it doesn't, as there is no /usr/lib/google-cloud-sdk/bin/gcloud.
Is there a better way to authenticate or a way to access the gcloud binary from cloud functions?
I have a similar mechanism (using Cloud Functions to authenticate to Kubernetes Engine), albeit written in Go.
This approach uses Google's Kubernetes Engine API to get the cluster's credentials and construct the KUBECONFIG using the values returned. This is equivalent to:
gcloud container clusters get-credentials ...
APIs Explorer has a Node.js example for the above method. The example uses Google's API Client Library for Node.js for Kubernetes Engine (also see here).
There's also a Google Cloud Client Library for Node.js for Kubernetes Engine, and this includes getCluster, which (I assume) is equivalent. Confusingly, there's getServerConfig too, and it's unclear from the API docs what the difference between these methods is.
Here's a link to the gist containing my Go code. It constructs a Kubernetes Config object that can then be used by the Kubernetes API to authenticate you to a cluster.
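For reference, a minimal sketch of the same approach in Python (the gist above is Go; the project, location, and cluster names are placeholders). It fetches the cluster's endpoint and CA certificate via the Kubernetes Engine API and uses Application Default Credentials, so the token stays refreshable instead of expiring like the baked-in access-token above:

import base64
import tempfile

import google.auth
import google.auth.transport.requests
from google.cloud import container_v1
from kubernetes import client

def gke_core_v1(project_id, location, cluster_id):
    # Programmatic equivalent of `gcloud container clusters get-credentials`.
    gke = container_v1.ClusterManagerClient()
    name = f"projects/{project_id}/locations/{location}/clusters/{cluster_id}"
    cluster = gke.get_cluster(name=name)

    # Application Default Credentials give a refreshable bearer token.
    creds, _ = google.auth.default()
    creds.refresh(google.auth.transport.requests.Request())

    # The Kubernetes client expects the CA certificate as a file on disk.
    ca = tempfile.NamedTemporaryFile(delete=False, suffix=".crt")
    ca.write(base64.b64decode(cluster.master_auth.cluster_ca_certificate))
    ca.close()

    conf = client.Configuration()
    conf.host = f"https://{cluster.endpoint}"
    conf.ssl_ca_cert = ca.name
    conf.api_key["authorization"] = f"Bearer {creds.token}"
    return client.CoreV1Api(client.ApiClient(conf))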

Can I use GCP Client API in Airflow tasks directly?

I'm using Airflow (GCP Composer) now.
I know it has a GCS hook and I can download GCS files with it.
But I'd like to read a file partially.
Can I use the following Python logic with PythonOperator in a DAG?
from google.cloud import storage

def my_func():
    client = storage.Client()
    bucket = client.get_bucket("mybucket")
    blob = bucket.get_blob("myfile")
    data = blob.download_as_bytes(end=100)
    return data
In an Airflow task, is a direct client API call that doesn't go through a hook forbidden?
You can, but a more Airflow-y way to handle functionality that is missing from the hook is to extend the hook:
from airflow.providers.google.cloud.hooks.gcs import GCSHook

class MyGCSHook(GCSHook):
    def download_bytes(
        self,
        bucket_name: str,
        object_name: str,
        end: int,
    ) -> bytes:
        client = self.get_conn()
        bucket = client.bucket(bucket_name)
        blob = bucket.blob(blob_name=object_name)
        return blob.download_as_bytes(end=end)
Then you can use the hook function in PythonOperator or in a custom operator.
Note that GCSHook has a download function, as you mention.
What you may have missed is that if you don't provide a filename it will download as bytes (see the source code). It doesn't let you configure the end parameter as you expect, but that should be an easy fix to PR to Airflow if you are looking to contribute to the open source project.
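For illustration, a sketch of wiring the custom hook into a DAG (the DAG id, the module name my_hooks, and the bucket/object values are hypothetical):

import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator

from my_hooks import MyGCSHook  # hypothetical module holding the class above

def read_first_100_bytes():
    hook = MyGCSHook()
    return hook.download_bytes(bucket_name="mybucket", object_name="myfile", end=100)

with DAG(
    dag_id="gcs_partial_read",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule_interval=None,
) as dag:
    read_head = PythonOperator(
        task_id="read_head",
        python_callable=read_first_100_bytes,
    )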

How to sync Firebase Database with Google Sheets?

I am working on an Ionic 3 project with TypeScript to integrate Firebase into my app.
The code below is what I used to integrate Firebase with the Ionic project:
constructor(angFire: AngularFireDatabase){
}
books: FirebaseListObservable<any>;
To send data from my app to Firebase I used the push method, and to update entries I used update($key). Now I have all the data in the Firebase backend.
Now, how can I sync the Firebase database with Google Sheets so that each and every entry added to the Firebase backend gets updated in Sheets? I used a third-party service, Zapier, for this integration, but it would be nice to learn how to do this sync on my own.
Upon searching, there are many tutorials on getting data from Google Sheets into Firebase, but I didn't come across any tutorials for the other way around.
I followed the tutorial below, but it doesn't cover spreadsheets:
https://sites.google.com/site/scriptsexamples/new-connectors-to-google-services/firebase
Any help would be greatly appreciated!
I looked into importing Firebase right into Google Scripts, either through the JavaScript SDK or the REST API. Both have requirements/steps that Google Scripts cannot satisfy or that are extremely difficult to satisfy.
There is no foreseeable method of loading the JavaScript SDK inside a Google Script, because almost every method requires a DOM, which you don't have with a Google Sheet.
The REST API requires GoogleCredentials which, at a short glance, appear very difficult to get inside Google Scripts as well.
So, the other option is to interact with Firebase in a true server-side environment. This would be a lot of code, but here are the steps that I would take:
1) Set up a Pyrebase project so you can interact with your Firebase project via Python:
import pyrebase

config = {
    "apiKey": "apiKey",
    "authDomain": "projectId.firebaseapp.com",
    "databaseURL": "https://databaseName.firebaseio.com",
    "storageBucket": "projectId.appspot.com",
    "serviceAccount": "path/to/serviceAccountCredentials.json"
}

firebase = pyrebase.initialize_app(config)
...
db = firebase.database()
all_users = db.child("users").get()
2) Set up a Google Sheets project as a class that can interact with your Google Sheet:
from __future__ import print_function
import httplib2
import os
import time  # used by write() below

from apiclient import discovery
from oauth2client import client
from oauth2client import tools
from oauth2client.file import Storage

try:
    import argparse
    flags = argparse.ArgumentParser(parents=[tools.argparser]).parse_args()
except ImportError:
    flags = None

# If modifying these scopes, delete your previously saved credentials
# at ~/.credentials/sheets.googleapis.com-python-quickstart.json
# (the read/write scope is needed here, not readonly, since write() updates values)
SCOPES = 'https://www.googleapis.com/auth/spreadsheets'
CLIENT_SECRET_FILE = 'client_secret.json'
APPLICATION_NAME = 'Google Sheets API Python Quickstart'

class GoogleSheets:
    ...
    # The rest of the functions from that link can go here
    ...
    def write(self, sheet, sheet_name, row, col):
        """
        Write data to the specified Google Sheet.
        """
        if sheet is None or sheet == "":
            print("Sheet not specified.")
            return

        # Timestamp used as the cell value.
        day = time.strftime("%m/%d/%Y")
        clock = time.strftime("%H:%M:%S")
        datetime = day + " - " + clock
        values = [[datetime]]

        spreadsheetId = sheet
        rangeName = sheet_name + "!" + str(row) + ":" + str(col)
        body = {
            'values': values
        }

        credentials = self.get_credentials()
        http = credentials.authorize(httplib2.Http())
        discoveryUrl = ('https://sheets.googleapis.com/$discovery/rest?'
                        'version=v4')
        service = discovery.build('sheets', 'v4', http=http,
                                  discoveryServiceUrl=discoveryUrl)
        result = service.spreadsheets().values().update(
            spreadsheetId=spreadsheetId, range=rangeName,
            valueInputOption="RAW", body=body).execute()
3) Call the Google Sheets class somewhere inside your Pyrebase project:
from GoogleSheets import GoogleSheets
...
g = GoogleSheets()
g.write(<project-id>, <sheet-name>, <row>, <col>)
...
4) Set up a cron job to run the python script every so often
# every 2 minutes
*/2 * * * * /root/my_projects/file_example.py
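For illustration, the script the cron job invokes might glue steps 1-3 together roughly like this (a sketch under the assumptions above; "users" and the write() arguments are placeholders):

import pyrebase
from GoogleSheets import GoogleSheets

# Same config dict as in step 1 (values are placeholders).
config = {
    "apiKey": "apiKey",
    "authDomain": "projectId.firebaseapp.com",
    "databaseURL": "https://databaseName.firebaseio.com",
    "storageBucket": "projectId.appspot.com",
    "serviceAccount": "path/to/serviceAccountCredentials.json"
}

firebase = pyrebase.initialize_app(config)
db = firebase.database()
sheets = GoogleSheets()

# One write per Firebase record; write() is the method from step 2.
for record in db.child("users").get().each():
    sheets.write("<spreadsheet-id>", "<sheet-name>", "<row>", "<col>")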
You will need some basic server (Heroku, DigitalOcean) to run this.
This is not exhaustive, because there is a lot of code to be written, but you could get the basics done. Makes me want to make a package now.
You can go for Zapier, a third-party service through which you can easily integrate your Firebase and Google spreadsheets and vice versa. It also has some support for Google Docs and other features.
https://zapier.com/zapbook/firebase/google-sheets/
Firebase can't be used as a trigger in Zapier, only as an action, so you can't send data from it to Google Sheets.

Can Firebase RemoteConfig be accessed from cloud functions

I'm using Firebase as a simple game server and have some settings that are relevant for both the client and the backend, so I would like to keep them in Remote Config for consistency. But I'm not sure if I can access Remote Config from my cloud functions in a simple way (I don't consider going through the REST interface a "simple" way).
As far as I can tell there is no mention of it in the docs, so I guess it's not possible, but does anyone know for sure?
firebaser here
There is a public REST API that allows you to read and set Firebase Remote Config conditions. This API requires full administrative access to the Firebase project, so it must only be used in a trusted environment (such as your development machine, a server you control, or Cloud Functions).
There is no public API to get Firebase Remote Config settings from a client environment at the moment. Sorry I don't have better news.
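For example, reading the current template through that REST API from Python might look roughly like this (a sketch; the project id is a placeholder, and google-auth's Application Default Credentials are assumed to be available, as they are on Cloud Functions):

import google.auth
from google.auth.transport.requests import AuthorizedSession

PROJECT_ID = "your-project-id"  # placeholder

creds, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/firebase.remoteconfig"]
)
session = AuthorizedSession(creds)  # handles token refresh automatically

resp = session.get(
    f"https://firebaseremoteconfig.googleapis.com/v1/projects/{PROJECT_ID}/remoteConfig"
)
template = resp.json()
print(template.get("parameters", {}))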
This is probably only included in newer versions of firebase-admin (version 8 or 9 and above, if I'm not mistaken).
// We first need to import the remoteConfig function.
import { remoteConfig } from "firebase-admin";

// Then in your cloud function we use it to fetch our remote config values.
const remoteConfigTemplate = await remoteConfig().getTemplate().catch(e => {
  // Your error handling if fetching fails...
});

// Next it is just a matter of extracting the values, which is kinda convoluted.
// Let's say you want to extract the `game_version` field from remote config:
const gameVersion = remoteConfigTemplate.parameters.game_version.defaultValue.value;
So parameters is always followed by the name of the field that you defined in the Firebase console's Remote Config, in this example game_version.
It's a mouthful (or typeful) but that's how you get it.
Also note that if a value is stored as a JSON string, you will need to parse it before use, commonly: JSON.parse(gameVersion).
A similar process is outlined in the Firebase docs.

Google Datastore API from Datalab

I am working with Google's Datalab service, on a Google-managed Compute Engine instance (default), and I would like to call my Google Datastore API. The documentation points to using the from google.appengine.ext import db library.
But when I execute this in a Datalab code block I get ImportError: No module named appengine.ext.
I realize that this likely means the App Engine SDK is not installed on the Datalab Compute Engine instance. My question is: how can I then access my Datastore namespace from my Datalab notebook?
It seems that I was better off using the gcloud package. Seeing as I updated the gcloud package before they were able to update the documentation, this is an example of the code I used:
from gcloud import datastore
from gcloud.datastore.key import Key
from gcloud.datastore.entity import Entity
import datetime

client = datastore.Client('project_id', 'namespace')

# Create and save an entity.
key = client.key('kind_name')
entity = datastore.Entity(key=key)
entity['datetime'] = datetime.datetime.now()
entity['some_other_column'] = 1
client.put(entity)  # persist the entity; without this nothing is written

# Query it back.
query = datastore.Query(client, kind='kind_name')
for result in query.fetch():
    print(result)
