Falcon sqlite3 connection - sqlite

I am trying to create my first REST API. I heard that Falcon is good and easy for beginners. I read the official docs, but there is nothing about how to connect to a database.
I have seen the Flask docs as well, where everything is well documented, for example:
def get_db():
    """Opens a new database connection if there is none yet for the
    current application context.
    """
    if not hasattr(g, 'sqlite_db'):
        g.sqlite_db = connect_db()
    return g.sqlite_db
Is there any way to connect SQLite with Falcon?

You can use any ORM to ease database connection and data handling. A simple SQLite connection using SQLAlchemy looks like this:

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()
engine = create_engine('sqlite:///dbname.db', echo=True)
Session = sessionmaker()  # sessionmaker returns a session factory
Session.configure(bind=engine)
session = Session()

This creates a database connection, and the session can be used to perform database operations.
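If you would rather not pull in an ORM at all, Falcon also works fine with the stdlib sqlite3 module. Below is a minimal sketch of a Falcon-style resource (the things table and the resource name are made up for illustration):

```python
import sqlite3


class ThingsResource:
    """A Falcon-style resource backed by the stdlib sqlite3 module.

    Falcon does not ship a database layer, so the resource opens a
    connection per request, which suits SQLite's usage model.
    """

    def __init__(self, db_path):
        self.db_path = db_path

    def on_get(self, req, resp):
        conn = sqlite3.connect(self.db_path)
        try:
            rows = conn.execute('SELECT id, name FROM things').fetchall()
        finally:
            conn.close()
        resp.media = [{'id': r[0], 'name': r[1]} for r in rows]
```

In a real app you would register it with app.add_route('/things', ThingsResource('dbname.db')).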

Related

Importing and Accessing a sqlite Created Database IN FLASK without using SQL_ALCHEMY

I am generally new to programming, with a large capacity for self-learning. After taking Harvard's CS50 program I have found myself unable to use a database in Flask (using Python).
I have created the database with sqlite and have it saved in my work environment, with a copy in exterior folders.
I would not like to use Flask-SQLAlchemy, as I am 100% comfortable with SQL and would not like to start defamiliarizing myself with its basic usage.
I am using Visual Studio and have Flask properly installed and already in use to define working routes.
The database is named trial.db and it can be assumed to have been properly set up.
import os

from cs50 import SQL
from flask import Flask, flash, redirect, render_template, request, session, jsonify
from flask_session import Session
from tempfile import mkdtemp
from werkzeug.exceptions import default_exceptions, HTTPException, InternalServerError
from werkzeug.security import check_password_hash, generate_password_hash

from helpers import apology, login_required, lookup, usd

app = Flask(__name__)
app.config["TEMPLATES_AUTO_RELOAD"] = True

@app.after_request
def after_request(response):
    response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
    response.headers["Expires"] = 0
    response.headers["Pragma"] = "no-cache"
    return response

app.jinja_env.filters["usd"] = usd
app.config["SESSION_FILE_DIR"] = mkdtemp()
app.config["SESSION_PERMANENT"] = False
app.config["SESSION_TYPE"] = "filesystem"
Session(app)

db = SQL("sqlite:///finance.db")
The above code is what I am used to calling, based on CS50's library, which is excessively generous.
SQL, as above, is utilized like so:

cs50.SQL(url)

Parameters:
    url – a str that indicates database dialect and connection arguments
Returns:
    a cs50.SQL object that represents a connection to a database

Example usage:

db = cs50.SQL("sqlite:///file.db")  # For SQLite, file.db must exist
Please help, and thank you.
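For what it's worth, the slice of cs50.SQL used above can be approximated with the stdlib sqlite3 module alone. A minimal sketch for SELECT statements (query_db is a made-up helper name, not part of any library):

```python
import sqlite3


def query_db(db_path, sql, params=()):
    """Run a SELECT against a SQLite file and return rows as dicts,
    roughly what cs50.SQL's execute() returns for SELECT statements."""
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row  # lets us address columns by name
    try:
        return [dict(row) for row in conn.execute(sql, params).fetchall()]
    finally:
        conn.close()
```

For example: query_db('trial.db', 'SELECT * FROM users WHERE id = ?', (1,)).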

Create a dynamic database connection in Airflow DAG

I am using Apache Airflow 2.2.3 and I know we can create connections via Admin → Connections. But I am trying to find a way to create a connection using dynamic DB server details.
My DB host, user, and password details come through the DAGRun input config, and I need to read and write data to the DB.
You can read connection details from the DAGRun config:

# Say we gave input {"username": "foo", "password": "bar"}
from airflow.models.connection import Connection

def mytask(**context):
    username = context["dag_run"].conf["username"]
    password = context["dag_run"].conf["password"]
    connection = Connection(login=username, password=password)
However, all operators in Airflow that require a connection take a conn_id argument: a string identifying a connection in the metastore, an environment variable, or a secrets backend. At the moment it is not possible to provide a Connection object directly.
Therefore, if you implement your own Python functions (used with the PythonOperator or @task decorator) or implement your own operators, you can create a Connection object and perform whatever logic you need with it. Reusing any other existing Airflow operator this way is not possible.
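As a sketch of that "do it yourself inside a PythonOperator" route, you could assemble a database URI directly from the conf dict and hand it to whatever client library you use. The field names and the helper below are assumptions about the DAGRun input, not an Airflow API:

```python
def uri_from_conf(conf):
    """Build a SQLAlchemy-style database URI from DAGRun conf values.

    Hypothetical helper: assumes the DAG is triggered with a conf dict
    containing username, password, host, database, and optionally port.
    """
    return "postgresql://{user}:{pwd}@{host}:{port}/{db}".format(
        user=conf["username"],
        pwd=conf["password"],
        host=conf["host"],
        port=conf.get("port", 5432),
        db=conf["database"],
    )
```

Inside mytask above you would call it as uri_from_conf(context["dag_run"].conf).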

aiohttp nested applications and passing app data

I am trying out aiohttp (to test against Flask, and just to learn it) and am having an issue with passing data via the Application. The examples say that I can set a key value in the app in order to pass static info (e.g., a database connection). But, somehow this information is getting lost and I suspect it is in the nested applications, though not sure.
app.py:
import asyncio
import logging

from aiohttp import web

from data import data_handler
from data import setup_web_app as data_setup_web_app

logging.basicConfig()
log = logging.getLogger('data')
log.setLevel(logging.DEBUG)

async def my_web_app():
    loop = asyncio.get_event_loop()
    app = web.Application(loop=loop)
    app['test'] = 'here'
    data_setup_web_app(web, app)
    return app
data.py:
import logging

from aiohttp import web

logging.basicConfig()
log = logging.getLogger('data')
log.setLevel(logging.DEBUG)

def setup_web_app(web, app):
    data = web.Application()
    data.add_routes([web.get('/{name}', data_handler, name='data')])
    app.add_subapp('/data/', data)

async def data_handler(request):
    name = request.match_info['name']
    log.debug('test data is {}'.format(request.app['test']))
    return web.json_response({'handler': name})
And I am using gunicorn to run it: gunicorn app:my_web_app --bind localhost:8080 --worker-class aiohttp.worker.GunicornWebWorker --workers=2
But when I go to http://127.0.0.1:8080/data/asdf in the browser, I get a KeyError: 'test' from the debug statement in data.py.
I suspect the app data is not being passed through correctly to the nested application, but I am not sure.
Currently, keys from the main app are not visible from a subapp, and vice versa.
Please read the issue for more details.
I'd like to support a kind of chained map for this, but the feature is not implemented yet.
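A common workaround, given that limitation, is to copy the shared keys into the subapp when it is created. A sketch (share_keys is a made-up helper; it relies only on the dict-style item access that aiohttp Application objects support):

```python
def share_keys(parent, child, keys):
    """Copy selected keys from a parent app into a subapp.

    Both arguments may be aiohttp Application instances, which support
    dict-style item access, or plain dicts.
    """
    for key in keys:
        child[key] = parent[key]
    return child
```

In setup_web_app above you would call share_keys(app, data, ['test']) before app.add_subapp('/data/', data), after which request.app['test'] resolves inside the subapp's handlers.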

Where does my sqlite database go?

This might be a dumb question, but I'm kind of confused as to how SQLAlchemy works with the actual database used by my Flask application. I have a Python file, models.py, that defines a SQLAlchemy database schema, and then I have this part of my code that creates the database for it:
if __name__ == '__main__':
    from datetime import timedelta
    from sqlalchemy import create_engine
    from sqlalchemy.orm import sessionmaker

    engine = create_engine('sqlite://', echo=True)
    Base.metadata.create_all(engine)
    Session = sessionmaker(bind=engine)
    session = Session()

    # Add a sample user
    user = User(name='Philip House', password="test")
    session.add(user)
    session.commit()
I run that file and it works fine, but now I'm confused as to what happens with the database: how can I access it in another application? I've also heard that it might just be in memory, and if that is the case, how do I make it a permanent database file I can use with my application?
Also, this is how I refer to my sqlite database in my application's config file:

PWD = os.path.abspath(os.curdir)
DEBUG = True
SQLALCHEMY_DATABASE_URI = 'sqlite:///{}/arkaios.db'.format(PWD)

I don't know if that might be of any help. Thanks!
Here are the docs for connecting to SQLite with SQLAlchemy.
As you guessed, you are in fact creating a SQLite database in memory when you use sqlite:// as your connection string. If you were to use 'sqlite:///{}/arkaios.db'.format(PWD) you would create a new database file in your current directory. If that is what you intend, so that you can access the database from other applications, then you should import your connection string from your configuration file and use that instead of sqlite://.
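The in-memory versus on-disk distinction is easy to see with the stdlib sqlite3 module directly (a sketch; the temp directory stands in for your project directory):

```python
import os
import sqlite3
import tempfile

# In-memory: analogous to SQLAlchemy's 'sqlite://'. The database
# disappears as soon as the connection is closed.
mem = sqlite3.connect(':memory:')
mem.execute('CREATE TABLE t (x INTEGER)')
mem.close()  # table t is gone now

# File-backed: analogous to 'sqlite:///<path>/arkaios.db'. The database
# persists on disk, and other processes can open it later.
path = os.path.join(tempfile.mkdtemp(), 'arkaios.db')
conn = sqlite3.connect(path)
conn.execute('CREATE TABLE t (x INTEGER)')
conn.commit()
conn.close()
```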

sqlalchemy not loading sqlite db

My app is running on OpenShift and I am not able to load the database. This is my code:
from flask import Flask
from sqlalchemy import Column, Integer, String, create_engine, ForeignKey, Time
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from classes import Team, Match, Channel, Country, Mapping
import json

app = Flask(__name__)
engine = create_engine('sqlite:///../data/euro2012tvguide.sqlite')
Session = sessionmaker(bind=engine)
session = Session()
In the data folder I have the file euro2012tvguide.sqlite, which is the SQLite db.
In fact the problem was with the path; it should have been like this:

engine = create_engine('sqlite://' + os.path.join(os.environ["OPENSHIFT_DATA_DIR"], 'euro2012tvguide.sqlite'))

I obtained much help from the OpenShift forum; here is the link: https://openshift.redhat.com/community/forums/openshift/sqlalchemy-not-loading-sqlite-db
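For reference, SQLAlchemy's SQLite URLs put three slashes before the path, so an absolute path ends up with four slashes in total. A sketch of building such a URL (the temp directory stands in for OPENSHIFT_DATA_DIR):

```python
import os
import tempfile

data_dir = tempfile.mkdtemp()  # stands in for os.environ["OPENSHIFT_DATA_DIR"]
db_path = os.path.join(data_dir, 'euro2012tvguide.sqlite')

# 'sqlite:///' + an absolute path yields the four-slash absolute form,
# e.g. sqlite:////var/lib/.../euro2012tvguide.sqlite
db_url = 'sqlite:///' + db_path
```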
