FastAPI + SQLAlchemy + Pydantic → how to read returned data into a schema

I'm trying to use FastAPI, SQLAlchemy and Pydantic.
In the request body I have a schema with an optional list field named files (files: list[schemas.ImageBase]).
I need to read all the submitted data one by one, but it won't let me loop over the returned list.
The same thing happens when I get a query result back, for example:
def get_setting(svalue: int, s_name: str):
    db = SessionLocal()
    query = db.query(models.Setting)\
        .filter(
            models.Setting.svalue == svalue,
            models.Setting.appuser == s_name
        ).all()
    return query
which is then called in the endpoint:
async def get_settings(svalue: int, name: str):
    values = crud.get_setting(svalue=svalue, s_name=name)
    return {"settings": values}
But I can't loop over values with a for loop.
Why? Do I have to configure something, or am I using the query or Pydantic incorrectly?
I expect to get back a list or dictionary and be able to read the data.
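For reference, the usual pattern is to let the CRUD function return a plain Python list of models.Setting rows (which can be iterated with a normal for loop) and declare a Pydantic schema with orm_mode (Pydantic v1) as the endpoint's response_model, so FastAPI serializes each ORM object. A minimal sketch, assuming a hypothetical SettingOut schema whose fields mirror models.Setting; the names here are illustrative, not taken from the question:

from typing import List
from fastapi import FastAPI
from pydantic import BaseModel
import crud  # the question's crud module (assumed)

app = FastAPI()

# Hypothetical response schema; the field names are assumptions.
class SettingOut(BaseModel):
    svalue: int
    appuser: str

    class Config:
        orm_mode = True  # lets Pydantic read attributes from SQLAlchemy objects

@app.get("/settings", response_model=List[SettingOut])
async def get_settings(svalue: int, name: str):
    values = crud.get_setting(svalue=svalue, s_name=name)  # plain list of models.Setting
    for v in values:  # looping over the list works here
        print(v.svalue, v.appuser)
    return values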

Related

FastAPI SQLModel MultipleResultsFound: Multiple rows were found when exactly one was required

This is my delete function.
def delete_session(self, session_id: int, db):
    with Session(engine) as session:
        statement = select(db).where(db.session == session_id)
        results = session.exec(statement)
        sess = results.one()
        print("sess: ", sess)
        if not sess:
            raise HTTPException(status_code=404, detail="Session not found")
        session.delete(sess)
        session.commit()
        return {"Session Deleted": True}
I want to delete all records where session_id matches.
But it's throwing the following error:
MultipleResultsFound: Multiple rows were found when exactly one was required
How can I delete multiple rows at once?
I tried using
sess = results.all()
but it says
sqlalchemy.orm.exc.UnmappedInstanceError: Class 'builtins.list' is not mapped
Thanks
Currently you are trying to delete several records, but session.delete() only takes a single object, not a list.
You are using results.one(), probably thinking that you can isolate your results and return only one. However, as the documentation explains, if the statement passed to one() matches multiple entries it throws a MultipleResultsFound exception, hence your error.
Indeed, your statement returns a list, so multiple values.
In order to delete all your elements, you should not use one(); simply iterate over your results with a for loop and delete them one by one, as follows:
def delete_session(self, session_id: int, db):
    with Session(engine) as session:
        statement = select(db).where(db.session == session_id)
        results = session.exec(statement).all()
        for sess in results:
            session.delete(sess)
        session.commit()
        return {"Session Deleted": True}

Passing additional arguments to _normalize_coerce methods in Cerberus

I have some code (see below); it's by no means final, but it is the best way I've seen or come up with so far for validating multiple date formats in a somewhat performant way.
I'm wondering if there is a way to pass an additional argument to this kind of function (_normalize_coerce). It would be nice if the date format string could be defined in the schema, something like:
{
    "a_date": {
        "type": "datetime",
        "coerce": "to_datetime",
        "coerce_args": "%m/%d/%Y %H:%M"
    }
}
versus making a code change in the function to support an additional date format. I've looked through the docs and not found anything striking. There's a fairly good chance I'm looking at this all wrong, but I figured asking the experts was the best approach. I think defining it within the schema is the cleanest solution to the problem, but I'm all eyes and ears for facts, thoughts and opinions.
Some context:
Performance is essential, as this could be running against millions of rows in AWS Lambdas (and Cerbie (my nickname for Cerberus) isn't exactly a spring chicken :P ).
None of the schemas will be native Python dicts, as they're all defined in JSON/YAML, so everything needs to be string friendly.
I'm not using the built-in coercion because the Python types cannot be parsed from strings.
I don't need the datetime object, so regex is a possibility, just less explicit and less futureproof.
If this is all wrong and I'm grossly incompetent, please be gentle (づ。◕‿‿◕。)づ
from datetime import datetime
from typing import Union

def _normalize_coerce_to_datetime(self, value: Union[str, datetime, None]) -> Union[datetime, str, None]:
    '''
    Casts valid datetime strings to the datetime python type.
    :param value: (str, datetime, None): python datetime, datetime string
    :return: datetime, string, None. python datetime,
             invalid datetime string or None if the value is empty or None
    '''
    datetime_formats = ['%m/%d/%Y %H:%M']
    if isinstance(value, datetime):
        return value
    if value and not value.isspace():
        for fmt in datetime_formats:
            try:
                return datetime.strptime(value, fmt)
            except ValueError:
                date_time = value
        return date_time
    else:
        return None
I have attempted to do this myself and have not found a way to pass additional arguments to a custom normalize_coerce rule. If you want to extend the Cerberus library with custom validators, you can include arguments and then access them through the constraints in the custom validator. Below is an example I used for a conditional-to-default coercer; since I needed to specify the condition, the value to check against and the value to return, I couldn't find a way to do it with normalize_coerce, so I applied it inside a validate rule and edited self.document, as seen in the code.
Schema:
{
    "columns": {
        "Customer ID": {
            "type": "number",
            "conditional_to_default": {
                "condition": "greater_than",
                "value_to_check_against": 100,
                "value_to_return": 22
            }
        }
    }
}
import operator
import cerberus.errors

def _validate_conditional_to_default(self, constraint, field, value):
    """
    Test the values and transform if conditions are met.
    :param constraint: Dictionary with the args needed for the conditional check.
    :param field: Field name.
    :param value: Field value.
    :return: the new document value if applicable, or keep the existing document value if not
    """
    value_to_check_against = constraint["value_to_check_against"]
    value_to_return = constraint["value_to_return"]
    rule_name = 'conditional_to_default'
    condition_mapping_dict = {"greater_than": operator.gt, "less_than": operator.lt,
                              "equal_to": operator.eq,
                              "less_than_or_equal_to": operator.le,
                              "greater_than_or_equal_to": operator.ge}
    if constraint["condition"] in condition_mapping_dict:
        if condition_mapping_dict[constraint["condition"]](value, value_to_check_against):
            self.document[field] = value_to_return
            return self.document
        else:
            return self.document
    if constraint["condition"] not in condition_mapping_dict:
        custom_errors_list = []
        custom_error = cerberus.errors.ValidationError(
            document_path=(field,), schema_path=(field, rule_name),
            code=0x03, rule=rule_name,
            constraint="Condition must be one of: {condition_vals}".format(
                condition_vals=list(condition_mapping_dict.keys())),
            value=value, info=())
        custom_errors_list.append(custom_error)
        self._error(custom_errors_list)
        return self.document
This is probably not the ideal way to do it, but I hope the above gives you some inspiration and gets you a bit further. Equally, I'm following this question to see if anyone else has found a way to pass arguments to the _normalize_coerce function.
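For what it's worth, the same self.document trick can be used to get the date format string into the schema: define a custom validation rule whose constraint is the format itself and write the parsed datetime back into the document. A rough, untested sketch; the rule name to_datetime_with_format and the error message are made up for illustration:

from datetime import datetime
import cerberus

class MyValidator(cerberus.Validator):
    def _validate_to_datetime_with_format(self, constraint, field, value):
        """
        Coerce a datetime string using the format given in the schema.
        The rule's arguments are validated against this schema:
        {'type': 'string'}
        """
        if isinstance(value, datetime):
            return
        if value and not value.isspace():
            try:
                # write the parsed value back into the document, as in the answer above
                self.document[field] = datetime.strptime(value, constraint)
            except ValueError:
                self._error(field, "does not match format " + constraint)

schema = {"a_date": {"to_datetime_with_format": "%m/%d/%Y %H:%M"}}
v = MyValidator(schema)
print(v.validate({"a_date": "12/31/2023 23:59"}), v.document)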

from django.db.models.Value not including quotes when converting to query string

I am using the django.db.models.Value expression within a QuerySet annotation in Django 3.2.8. When passing a string value ("~" in my example below), the conversion to a query string fails to add the surrounding quotes ('), which makes the query fail.
from django.db import models
from django.db.models import F, Value
from django.db.models.functions import Concat

class PurchaseOrder(models.Model):
    id = models.AutoField(primary_key=True)
    customer_name = models.CharField(max_length=50)
    date = models.DateField()

qs = PurchaseOrder.objects.annotate(
    concat=Concat(
        F("customer_name"),
        Value("~"),
        F("date"),
        output_field=models.CharField(),
    )
).values("concat")
>>> print(str(qs.query))
SELECT CONCAT("purchaseorder"."customer_name", CONCAT(~, "purchaseorder"."date")) AS "concat" FROM "purchaseorder"
As can be seen from the result above, the ~ character is missing the two ' it should be wrapped in: CONCAT('~', "purchaseorder"."date").
Is this the expected functionality of expression Value or a bug that should be reported?
I am inclined to think it is a bug, because of the following:
I initially solved the problem above by writing it as Value("'~'"), with the two ' inside my string. However, when running the query against a sqlite3 database during unit testing I got an error. I realised that SQLite has a different syntax than Postgres (my production and local dev database) for the Concat function:
Postgres: CONCAT([args])
Sqlite: arg1 || arg2 ...
In the sqlite case, Django also wraps every argument in the concatenation with Coalesce:
COALESCE(arg1, '') || COALESCE(arg2, '')
The query that was resulting from this in sqlite looked like the one below:
... COALESCE("purchaseorder"."customer_name", ) || COALESCE('~', ) || COALESCE("purchaseorder"."date",)
(Note that I passed ' inside Value("'~'") this time)
The above query will give an error as the second argument inside COALESCE must not be empty ('' is an acceptable input).
If the above problem is due to a bug, what would be the best workaround to make the sqlite query work?
It seems to be a bug:
Inside django.db.models.functions.text.ConcatPair, method as_sqlite calls method coalesce:
class ConcatPair(Func):
    ...
    def as_sqlite(self, compiler, connection, **extra_context):
        coalesced = self.coalesce()
        return super(ConcatPair, coalesced).as_sql(
            compiler, connection, template='%(expressions)s', arg_joiner=' || ',
            **extra_context
        )
    ...
    def coalesce(self):
        # null on either side results in null for expression, wrap with coalesce
        c = self.copy()
        c.set_source_expressions([
            Coalesce(expression, Value('')) for expression in c.get_source_expressions()
        ])
        return c
This is the mechanism that wraps the arguments of the concatenation in the COALESCE function, and as can be seen, it uses Value(''). This means that the Value expression is meant to be used directly with a string, since the expression above is expected to produce COALESCE([expression], '') in SQL.
I have not yet figured out what the exact problem is within Value, but below is a workaround for the COALESCE problem inside the concatenation. Simply override the coalesce method of ConcatPair, adding ' inside Value:
from django.db.models import Value
from django.db.models.functions import ConcatPair, Coalesce

def _coalesce(self):
    c = self.copy()
    c.set_source_expressions(
        [Coalesce(expression, Value("''")) for expression in c.get_source_expressions()]
    )
    return c

ConcatPair.coalesce = _coalesce
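If it helps, a rough usage sketch of the combined workaround, applying the patch once at import time and still passing the separator as Value("'~'") as described above; on SQLite the COALESCE defaults should then be quoted:

from django.db import models
from django.db.models import F, Value
from django.db.models.functions import Concat, ConcatPair

ConcatPair.coalesce = _coalesce  # the patched method defined above

qs = PurchaseOrder.objects.annotate(
    concat=Concat(
        F("customer_name"),
        Value("'~'"),  # quotes inside the string, as in the original workaround
        F("date"),
        output_field=models.CharField(),
    )
).values("concat")
print(str(qs.query))  # separator and COALESCE defaults should now be quoted on sqlite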

How to get back one row's data in rusqlite?

rustc 1.38.0 (625451e37 2019-09-23)
rusqlite 0.20.0
I'm writing a program where I need to get back the id from the last insertion that sqlite just created.
db.execute("insert into short_names (short_name) values (?1)",params![short]).expect("db insert fail");
let id = db.execute("SELECT id FROM short_names WHERE short_name = '?1';",params![&short]).query(NO_PARAMS).expect("get record id fail");
let receiver = db.prepare("SELECT id FROM short_names WHERE short_name = "+short+";").expect("");
let id = receiver.query(NO_PARAMS).expect("");
println!("{:?}",id);
What I should be getting back is the id value sqlite automatically assigned with AUTOINCREMENT.
I'm getting this compiler Error:
error[E0599]: no method named `query` found for type `std::result::Result<usize, rusqlite::Error>` in the current scope
--> src/main.rs:91:100
|
91 | let id = db.execute("SELECT id FROM short_names WHERE short_name = '?1';",params![&short]).query(NO_PARAMS).expect("get record id fail");
| ^^^^^
error[E0369]: binary operation `+` cannot be applied to type `&str`
--> src/main.rs:94:83
|
94 | let receiver = db.prepare("SELECT id FROM short_names WHERE short_name = "+short+";").expect("");
| ------------------------------------------------^----- std::string::String
| | |
| | `+` cannot be used to concatenate a `&str` with a `String`
| &str
help: `to_owned()` can be used to create an owned `String` from a string reference. String concatenation appends the string on the right to the string on the left and may require reallocation. This requires ownership of the string on the left
|
94 | let receiver = db.prepare("SELECT id FROM short_names WHERE short_name = ".to_owned()+&short+";").expect("");
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^
error[E0277]: `rusqlite::Rows<'_>` doesn't implement `std::fmt::Debug`
--> src/main.rs:96:25
|
96 | println!("{:?}",id);
| ^^ `rusqlite::Rows<'_>` cannot be formatted using `{:?}` because it doesn't implement `std::fmt::Debug`
|
= help: the trait `std::fmt::Debug` is not implemented for `rusqlite::Rows<'_>`
= note: required by `std::fmt::Debug::fmt`
Line 94: I understand that Rust's String is not the right type for the execute call, but I'm not sure what to do instead.
I suspect what needs to happen is that the short_names table needs to be pulled from the database, and then the id that matches the short I'm working with has to be read from the Rust representation of the table. I've been going off this example as a jumping-off point, but it has outlived its usefulness. The program I'm writing calls another program and then babysits it while that other program runs. To reduce overhead I'm trying not to use OOP in this program.
How should I structure my request to the database to get back the id I need?
Okay. First off, we are going to use a struct because, unlike in Java, it is literally equivalent to not using one in this case, except that you gain the ability to keep things tidy.
You're trying to emulate Connection::last_insert_rowid(), which isn't a terribly smart thing to do, particularly if you are not in a transaction. We're also going to clear this up for you in a nice and neat fashion:
use rusqlite::{Connection};

pub struct ShortName {
    pub id: i64,
    pub name: String
}

pub fn insert_shortname(db: &Connection, name: &str) -> Result<ShortName, rusqlite::Error> {
    let mut rtn = ShortName {
        id: 0,
        name: name.to_string()
    };
    db.execute("insert into short_names (short_name) values (?)", &[name])?;
    rtn.id = db.last_insert_rowid();
    Ok(rtn)
}
You can convince yourself that it works with this test:
#[test]
fn it_works() {
    let conn = Connection::open_in_memory().expect("Could not test: DB not created");
    let input: Vec<bool> = vec![];
    conn.execute("CREATE TABLE short_names (id INTEGER PRIMARY KEY AUTOINCREMENT, short_name TEXT NOT NULL)", input).expect("Creation failure");
    let output = insert_shortname(&conn, "Fred").expect("Insert failure");
    assert_eq!(output.id, 1);
}
In rusqlite, execute does not return rows. To get a value back from a SQLite query you need to use prepare and one of the query variants. While much of Rust lets you leave the type up to the compiler, for rusqlite you need to give the receiving variable a type.
There is currently no way in rusqlite to take a single row straight out of a query. The Rows type is not an iterator, so you step through it with a while loop driven by the result of rows.next(); if the query yields only one row, the loop runs once and then exits because no further rows are returned.
You can use query_named to parameterize the SQL query you're sending. The named_params!{} macro lets you use a String to pass information into the command.
use rusqlite::*;

fn main() {
    let short = "lookup".to_string(); // example of a string you might use
    let mut id: i64 = 0;
    { // open a scope for the db work
        let db = Connection::open("YourDB.db").expect("db conn fail");
        let mut receiver = db
            .prepare("SELECT * FROM short_names WHERE short_name = :short;")
            .expect("receiver failed");
        let mut rows = receiver
            .query_named(named_params!{ ":short": short })
            .expect("rows failed");
        while let Some(row) = rows.next().expect("while row failed") {
            id = row.get(0).expect("get row failed");
        }
    } // close the db work
    println!("{}", id);
}
In the above example, we open a scope with {} around the database work; the connection is dropped (and the database closed) automatically when it goes out of scope. Notice that we create the db connection and do all our work with the database solely inside the {}. This lets us skip closing the db with an explicit call; the compiler infers it from the scope. The variables short and id, created in the scope of main(), are still available inside the db scope and in the rest of main(). Because id is given an initial value and then assigned again inside the while loop, it has to be declared mut; if the query returns exactly one row, the loop assigns it exactly once. Otherwise, if the database does not behave as expected, this will result in an error.

Can I insert into a map by key in F#?

I'm messing around a bit with F# and I'm not quite sure if I'm doing this correctly. In C# this could be done with an IDictionary or something similar.
type School() =
    member val Roster = Map.empty with get, set

    member this.add(grade: int, studentName: string) =
        match this.Roster.ContainsKey(grade) with
        | true -> // Can I do something like this.Roster.[grade].Insert([studentName])?
        | false -> this.Roster <- this.Roster.Add(grade, [studentName])
Is there a way to insert into the map if it contains a specified key or am I just using the wrong collection in this case?
The F# Map type is a mapping from keys to values, just like an ordinary .NET Dictionary, except that it is immutable.
If I understand your aim correctly, you're trying to keep a list of students for each grade. The type in that case is a map from integers to lists of names, i.e. Map<int, string list>.
The Add operation on the map actually either adds or replaces an element, so that's the operation you want in the false case. In the true case, you need to get the current list, add the new student and then replace the existing entry. One way to do this is to write something like:
type School() =
    member val Roster = Map.empty with get, set

    member this.Add(grade: int, studentName: string) =
        // Try to get the current list of students for a given 'grade'
        let studentsOpt = this.Roster.TryFind(grade)
        // If the result was 'None', then use empty list as the default
        let students = defaultArg studentsOpt []
        // Create a new list with the new student at the front
        let newStudents = studentName::students
        // Create & save map with new/replaced mapping for 'grade'
        this.Roster <- this.Roster.Add(grade, newStudents)
This is not thread-safe (calling Add concurrently might not update the map properly). However, you can access school.Roster at any time and iterate over it (or share references to it) safely, because it is an immutable structure. If you do not care about that, then using a standard Dictionary would be perfectly fine too; it depends on your actual use case.
