OpenStack keystoneclient get user by name

I know it is possible to get a user by id, but all I have available is the name. Does anyone know how to get a user by name using the keystone client v2.0?
from keystoneclient.v2_0 import client

keystone = client.Client(username=USER,
                         password=PASS,
                         tenant_name=TENANT_NAME,
                         auth_url=KEYSTONE_URL)
user = keystone.users.get(USER_ID)
I need something like the following:
keystone.users.getByName(USER_NAME)

Figured out a way to do this from keystoneclient. Sort of.
Example:
#!/usr/bin/env python
from keystoneclient.v2_0 import client
from keystoneclient import utils

keystone = client.Client(username='admin',
                         password='stack',
                         tenant_name='demo',
                         auth_url='http://192.168.122.236:5000/v2.0/')

def do_user_get(kc, args):
    """Display user details."""
    user = utils.find_resource(kc.users, args)
    utils.print_dict(user._info)

do_user_get(keystone, 'demo')
This makes use of keystoneclient.utils in addition to client.users.
There are some extra parsing functions in utils you might want to check out.
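Roughly speaking, utils.find_resource tries the argument as an ID first and then falls back to matching by name. If you'd rather stay within the client API, the base manager's find() and list() methods can do the same thing; a minimal sketch, assuming the authenticated keystone client from above:

# Option 1: managers inherit find() from keystoneclient's base Manager,
# which filters the user listing client-side by attribute.
user = keystone.users.find(name='demo')

# Option 2: the same thing spelled out with list().
matches = [u for u in keystone.users.list() if u.name == 'demo']
user = matches[0] if matches else None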

Related

How to test HTTP endpoints in nameko 2.14.1

I am a beginner with the Nameko framework. I have a very simple service with one GET and one POST endpoint, which works as expected when I run it locally.
I am trying to create a test case for my Nameko service, and I can't seem to find documentation that clearly explains how to go about it.
from nameko.web.handlers import http
import json

class SampleService:
    name = "sample_service"

    @http("GET", "/health")
    def health(self, request):
        return 200, json.dumps({'status': "healthy"})

    @http("POST", "/create")
    def create_user(self, request):
        data = request.get_json(force=True)
        print(data)
        return 200, json.dumps({'status': 'created'})
The best reference I had for testing this was https://github.com/nameko/nameko-examples/tree/master/gateway/test/interface, and I am not entirely sure whether that code is up to date and easily replicated.
Any help on this would be much appreciated.
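One approach, sketched here rather than guaranteed current: nameko ships a pytest plugin whose container_factory, web_config and web_session fixtures let you run the service in-process and fire real HTTP requests at it. The module name sample_service below is an assumption about where the class above lives.

# test_sample_service.py -- run with pytest; the fixtures come from
# nameko's bundled pytest plugin.
from sample_service import SampleService  # assumed module name

def test_health(container_factory, web_config, web_session):
    container = container_factory(SampleService, web_config)
    container.start()
    response = web_session.get('/health')
    assert response.status_code == 200
    assert response.json() == {'status': 'healthy'}

def test_create_user(container_factory, web_config, web_session):
    container = container_factory(SampleService, web_config)
    container.start()
    response = web_session.post('/create', json={'name': 'alice'})
    assert response.status_code == 200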

Can I share an object between Telegram commands of a bot?

I want to create an object when the user presses /start in a Telegram bot, and then share this object among all the commands of the bot. Is this possible? As far as I understand, there's only one thread of your bot running on your server. However, I see that there is a context in the command functions. Can I pass this object as a kind of context? For example:
'''
This is a class I created to store data from the user and configure the texts
I'll display depending on the user's language, but maybe I'll also fill it with
info about something the user will buy in the bot.
'''
import configuration

from telegram import Update, ForceReply
from telegram.ext import Updater, CommandHandler, MessageHandler, Filters, CallbackContext

# Commands of the bot
def start(update: Update, context: CallbackContext) -> None:
    """Send a message when the command /start is issued."""
    s = configuration.conf(update)  # Create the object I'm talking about
    update.message.reply_markdown_v2(s.text[s.lang_active],
                                     reply_markup=ForceReply(selective=True),
                                     )

def check(update: Update, context: CallbackContext) -> None:
    """Send a message when the command /check is issued."""
    s = configuration.conf(update)  # I want to avoid this!
    update.message.reply_markdown_v2(s.text[s.lang_active],
                                     reply_markup=ForceReply(selective=True),
                                     )

... REST OF THE BOT
python-telegram-bot already comes with a built-in mechanism for storing data. You can do something like

try:
    s = context.user_data['config']
except KeyError:
    s = configuration.conf(update)
    context.user_data['config'] = s
This doesn't have to be repeated in every callback. You can e.g.
use a TypeHandler in a low group to create the config if needed; then none of the handlers in higher groups need to worry about it (see the sketch below)
use a custom implementation of CallbackContext that adds a property context.user_config
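A minimal sketch of the TypeHandler approach, assuming python-telegram-bot v13 (which the imports in the question suggest) and the configuration module from the question:

from telegram import Update
from telegram.ext import Updater, CommandHandler, TypeHandler, CallbackContext
import configuration

def ensure_config(update: Update, context: CallbackContext) -> None:
    # Registered in group -1, so it runs before the command handlers in group 0.
    if 'config' not in context.user_data:
        context.user_data['config'] = configuration.conf(update)

updater = Updater("TOKEN")
dispatcher = updater.dispatcher
dispatcher.add_handler(TypeHandler(Update, ensure_config), group=-1)
dispatcher.add_handler(CommandHandler("start", start))  # start/check as defined in the question
dispatcher.add_handler(CommandHandler("check", check))

After that, every callback can simply read s = context.user_data['config'].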
Disclaimer: I'm currently the maintainer of python-telegram-bot.

Python passlib generate one time secret code

What is the easiest way to generate a one-time password (an SMS secret code of N symbols) with passlib?
How I'm creating it now:
from secrets import randbelow as secrets_randbelow

def create_secret_code() -> str:  # TODO use OTP
    # config and print_on_stage come from the surrounding application
    secret_code = "".join([str(secrets_randbelow(exclusive_upper_bound=10)) for _ in range(config.SECRET_CODE_LEN)])
    print_on_stage(secret_code=secret_code)
    return secret_code
Obviously, it still needs to check that a generated code is not already in use (for example, via Redis).
I also already have a passlib object in my code for hashing and verifying passwords:
from passlib.context import CryptContext
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
I found the TOTP class, but can't figure out how to just generate an SMS secret code of N symbols.
P.S. I added a fastapi tag because I'm using FastAPI, and passlib is the standard cryptography tool for it (see the docs).
You can initialize the TOTP class with the number of digits you want for the token, like this:
TOTP(digits=10)
Here's a complete example, using your config.SECRET_CODE_LEN:
from passlib.totp import TOTP
otp = TOTP('s3jdvb7qd2r7jpxx', digits=config.SECRET_CODE_LEN)
token = otp.generate()
print(token.token)
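To later verify a code the user sends back, the same TOTP object can check it; a sketch (passlib raises a passlib.exc.TokenError subclass on failure):

from passlib.totp import TOTP
from passlib.exc import TokenError

otp = TOTP('s3jdvb7qd2r7jpxx', digits=6)
token = otp.generate().token  # e.g. '492039'

try:
    match = otp.match(token)  # raises TokenError if wrong or expired
    print("valid token for counter", match.counter)
except TokenError:
    print("invalid or expired token")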

How to make an Airflow DAG read from a Google spreadsheet using a stored connection

I'm trying to build Airflow DAGs that read data from (or write data to) some Google spreadsheets.
Among the connections in Airflow I've saved a connection of type "Google Cloud Platform", which includes project_id, scopes and, under "Keyfile JSON", a dictionary with the keys
"type", "project_id", "private_key_id", "private_key", "client_email", "client_id",
"auth_uri", "token_uri", "auth_provider_x509_cert_url", "client_x509_cert_url"
I can connect to the Google Spread Sheet using
cred_dict = ... same as what I saved in Keyfile JSON ...
creds = ServiceAccountCredentials.from_json_keyfile_dict(cred_dict, scope)
client = gspread.authorize(creds)
sheet = client.open(myfile).worksheet(mysheet)  # works!
But I would prefer to not write explicitly the key in the code and, instead, import it from Airflow connections.
I'd like to know if there is a solution along the lines of
from airflow.hooks.some_hook import get_the_keyfile
conn_id = my_saved_gcp_connection
cred_dict = get_the_keyfile(gcp_conn_id=conn_id)
creds = ServiceAccountCredentials.from_json_keyfile_dict(cred_dict, scope)
client = gspread.authorize(creds)
sheet = client.open(myfile).worksheet(mysheet)
I see there are several hooks for GCP connections (https://airflow.apache.org/howto/connection/gcp.html), but my limited knowledge leaves me unsure which one to use and which function (if any) extracts the keyfile from the saved connection.
Any suggestion would be greatly welcomed :)
Below is the code I'm using to connect to gspread sheets from Airflow using a stored connection.
import json

import gspread
from oauth2client.service_account import ServiceAccountCredentials
from airflow.contrib.hooks.gcp_api_base_hook import GoogleCloudBaseHook

def get_cred_dict(conn_id='my_google_connection'):
    gcp_hook = GoogleCloudBaseHook(gcp_conn_id=conn_id)
    return json.loads(gcp_hook._get_field('keyfile_dict'))

def get_client(conn_id='my_google_connection'):
    cred_dict = get_cred_dict(conn_id)
    creds = ServiceAccountCredentials.from_json_keyfile_dict(cred_dict, scope)  # scope as in the question
    client = gspread.authorize(creds)
    return client

def get_sheet(doc_name, sheet_name):
    client = get_client()
    sheet = client.open(doc_name).worksheet(sheet_name)
    return sheet
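With those helpers in place, a task can then read a sheet like this (document and worksheet names are placeholders):

sheet = get_sheet('my_doc', 'my_sheet')
rows = sheet.get_all_values()  # list of lists, one per sheet row
print(rows)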
With Airflow 2.5.1 (in 2023), the following code works too.
from airflow.providers.google.common.hooks.base_google import GoogleBaseHook
import gspread

# Create a hook object.
# When using the google_cloud_default connection you can use hook = GoogleBaseHook(),
# or, for a delegate, use: GoogleBaseHook(delegate_to='foo@bar.com')
hook = GoogleBaseHook(gcp_conn_id='my_google_cloud_conn_id')

# Get the credentials
credentials = hook.get_credentials()

# Optional: set the delegate email if needed later.
# You need a domain-wide delegation service account to use this.
credentials = credentials.with_subject('foo@bar.com')

# Use the credentials to authenticate the gspread client
gc = gspread.Client(auth=credentials)

# Create a spreadsheet
gc.create('Yabadabadoooooooo')  # optionally pass folder_id=
gc.list_spreadsheet_files()
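Once authenticated, reading works like any other gspread client; a short sketch with placeholder names:

sh = gc.open('my_doc')          # open a spreadsheet by title
ws = sh.worksheet('Sheet1')     # select a worksheet by name
rows = ws.get_all_values()      # list of lists, one per row
records = ws.get_all_records()  # list of dicts keyed by the header row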
Resources:
gspread Client documentation
GoogleBaseHook documentation

Airflow Custom Metrics and/or Result Object with custom fields

While running PySpark SQL pipelines via Airflow, I am interested in getting out some business stats, like:
source read count
target write count
sizes of DFs during processing
error records count
One idea is to push them directly to the metrics backend, so they get automatically consumed by monitoring tools like Prometheus. Another idea is to obtain these values via some DAG result object, but I wasn't able to find anything about that in the docs.
Please post at least some pseudocode if you have a solution.
I would look to reuse Airflow's statistics and monitoring support in the airflow.stats.Stats class. Maybe something like this:

import logging

from airflow.stats import Stats

PYSPARK_LOG_PREFIX = "airflow_pyspark"

def your_python_operator(**context):
    [...]

    try:
        Stats.incr(f"{PYSPARK_LOG_PREFIX}_read_count", src_read_count)
        Stats.incr(f"{PYSPARK_LOG_PREFIX}_write_count", tgt_write_count)
        # So on and so forth
    except Exception:
        logging.exception("Caught exception during statistics logging")

    [...]
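For these counters to actually reach Prometheus, Airflow's StatsD integration has to be enabled (metrics are emitted over StatsD and typically scraped via a statsd_exporter). A sketch of the relevant airflow.cfg settings for Airflow 2.x; the values are illustrative:

[metrics]
statsd_on = True
statsd_host = localhost
statsd_port = 8125
statsd_prefix = airflow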
