How to pass a parameter in a Marshmallow schema - flask-restful

Using Python, Flask and marshmallow, if I have a schema:
class ParentSchema(Schema):
    id = fields.Int(dump_only=True)
    children = fields.Nested('ChildSchema', dump_only=True)
and a class Parent which has a method:
class Parent():
    def getChildren(self, params):
        pass
How do I get Marshmallow to pass the necessary parameters to Parent.getChildren when serialising the object and then populate ParentSchema.children with the results?

The solution is to add a get_attribute method to the schema class and to assign the parameters to the context attribute of the ParentSchema instance. This overrides the default behaviour Marshmallow uses to extract attributes from the object during serialisation.
class ParentSchema(Schema):
    id = fields.Int(dump_only=True)
    children = fields.Nested('ChildSchema', dump_only=True)

    def get_attribute(self, key, obj, default):
        if key == 'children':
            return obj.getChildren(self.context['params'])
        else:
            return getattr(obj, key, default)
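For completeness, a minimal usage sketch, assuming marshmallow 2.x (which matches the get_attribute signature above); ChildSchema's field and the parameter name are illustrative, not from the question:

from marshmallow import Schema, fields

class ChildSchema(Schema):
    name = fields.Str(dump_only=True)  # illustrative field

# ParentSchema defined as above; Parent.getChildren should return child objects
parent = Parent()
schema = ParentSchema(context={'params': {'min_age': 5}})
result = schema.dump(parent)  # for 'children', get_attribute runs
                              # parent.getChildren({'min_age': 5})

In marshmallow 2.x, dump returns a MarshalResult, so the serialised dict is result.data; note that in marshmallow 3 the argument order of get_attribute changes to (obj, attr, default).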

Related

How could I annotate both property and params for a constructor value?

I defined an annotation class for serializing/deserializing a model.
If I define PROPERTY as Target and write
class MyClass(@PropertyName("a_name") val name: String)
I can access the annotation through my class property but not through my class constructor params.
Vice versa if I define no Target, or both PROPERTY and VALUE_PARAMETER.
To access the annotation on both the property and the value parameter, I would have to annotate like this:
@param:PropertyName("a_name")
@property:PropertyName("a_name")
But I guess nobody would like this solution.
As a workaround I could target just the value params, then get both the constructor params and the properties, match every constructor param with a property by name, and access the annotation from the param and the value from the property.
It works, but I think I could find a better solution.
Any hints?

Lombok Annotations with DynamoDB annotations

I have a DAO like:
@Getter
@Setter
@DynamoDBTable(tableName="tableName")
public class DAO {
    @DynamoDBHashKey
    private String field1;

    @DynamoDBIndexHashKey(globalSecondaryIndexName="index_name")
    @DynamoDBRangeKey
    private String field2;
}
The problem is that when I try to use the DAO to make a load call via DynamoDBMapper, with field1 as the hash key, it throws a DynamoDBException saying:
Null key found for public DAO.getField2()
but the table actually has a value for field2.
Question: is this because of the Lombok annotations instead of manual mutator code, and can we in general use Lombok and the DynamoDB annotations together?
Here is a little more of an explanation and a TL;DR
You are calling the load method, which is mapped to the GetItem call. The DynamoDBMapper is trying to map that request based on your annotations. Your class has the @DynamoDBRangeKey annotation, and the GetItem call needs the full primary key to get the item, which means that the mapper will build out the primary key for the object.
Since Lombok generates your code before runtime, it does not affect the annotations you have already placed. Also, since your annotations are on the fields rather than on the getters, the mapper calls the generated Lombok getter. When it tries to serialize the request, however, that getter returns null because you have only set the hash key.
TL;DR: load() translates to GetItem API which requires both the hashKey and the rangeKey since both annotations are present on your class.

Cloud Endpoints - Retrieving a single entity from datastore (by a property other than the helper methods provided by EndpointsModel)

This question completely follows on from a related question I had asked (and was answered) here: Error when trying to retrieve a single entity
As I understand it, retrieving a single entity from the datastore using a property other than the helpers already provided (e.g. 'id') requires turning a simple data property into an EndpointsAliasProperty? If so, how would I go about doing that? Or can we only use 'id' (the helper provided by EndpointsModel) and none of the properties that we define ourselves (in this case 'title')?
The distinction between the custom EndpointsAliasProperty fields and the data properties you defined is how they are used. They are all used to create a protorpc message, and that message is then converted into an EndpointsModel with your custom data in it. THIS is where the magic happens.
Breaking it down into steps:
1. You specify your data
from google.appengine.ext import ndb
from endpoints_proto_datastore.ndb import EndpointsModel

class MyModel(EndpointsModel):
    my_attr = ndb.StringProperty()
2. You pick your fields for your method
class MyApi(...):

    @MyModel.method(request_fields=('id', 'my_attr'), ...)
    def my_method(self, my_model_entity):
        ...
3. A protorpc message class is defined from your fields
>>> request_message_class = MyModel.ProtoModel(fields=('id', 'my_attr'))
>>> request_message_class
<class '.MyModelProto_id_my_attr'>
>>> for field in request_message_class.all_fields():
... print field.name, ':', field.variant
...
id : INT64
my_attr : STRING
This happens every time a request is handled by a method decorated with @MyModel.method.
4. A request comes in to your application and a message is created
Using the protorpc message class, a message instance is parsed from the JSON which gets passed along to your Endpoints SPI (which is created by endpoints.api_server).
When the request comes in to your protorpc.remote.Service it is decoded:
>>> from protorpc import remote
>>> protocols = remote.Protocols.get_default()
>>> json_protocol = protocols.lookup_by_content_type('application/json')
>>> request_message = json_protocol.decode_message(
... request_message_class,
... '{"id": 123, "my_attr": "some-string"}'
... )
>>> request_message
<MyModelProto_id_my_attr
id: 123
my_attr: u'some-string'>
5. The protorpc message is cast into a datastore model
entity = MyModel.FromMessage(request_message)
THIS is the step you really care about. The FromMessage class method (also provided as part of EndpointsModel) loops through all the fields
for field in sorted(request_message_class.all_fields(),
                    key=lambda field: field.number):
and for each field with a value set, turns the value into something to be added to the entity and separates based on whether the property is an EndpointsAliasProperty or not:
if isinstance(value_property, EndpointsAliasProperty):
    alias_args.append((local_name, to_add))
else:
    entity_kwargs[local_name] = to_add
After completing this loop, we have an ordered list alias_args of (name, value) pairs for the alias properties and a dictionary entity_kwargs of the data attributes parsed from the message.
Using these, first a simple entity is created
entity = MyModel(**entity_kwargs)
and then each of the alias property values are set in order:
for name, value in alias_args:
    setattr(entity, name, value)
The extended behavior happens in setattr(entity, name, value). Since EndpointsAliasProperty is a subclass of property, it is a descriptor and it has a setter which can perform some custom behavior beyond simply setting a value.
For example, the id property is defined with:
@EndpointsAliasProperty(setter=IdSet, property_type=messages.IntegerField)
def id(self):
and the setter performs operations beyond simply setting data:
def IdSet(self, value):
    self.UpdateFromKey(ndb.Key(self.__class__, value))
This particular method attempts to retrieve the entity stored in the datastore using the id and patch in any values from the datastore that were not included in the entity parsed from the request.
If you wanted to do this for a field like my_attr, you would need to construct a custom query which could retrieve the item with that unique my_attr value (or fail if not exactly one such entity exists).
This is problematic and you'd be better off using a unique field like the key or ID used to store the entity in the datastore.
The keys with ancestors sample gives a great example of creating your own custom properties.
If you REALLY insist on using my_attr to retrieve an entity, you could do so using a different property name (since my_attr is already used for the data property) such as fromMyAttr:
class MyModel(EndpointsModel):

    def MyAttrSet(self, value):
        ...

    @EndpointsAliasProperty(setter=MyAttrSet)
    def fromMyAttr(self):
        ...
Here, the MyAttrSet instance method would form the query:
def MyAttrSet(self, value):
    query = MyModel.query(MyModel.my_attr == value)
    results = query.fetch(2)
reject results that aren't unique for my_attr:
    if len(results) == 0:
        raise endpoints.NotFoundException('Not found.')
    if len(results) == 2:
        raise endpoints.BadRequestException('Colliding results.')
and copy over the values from the already stored entity if we do find a unique one:
    matching_entity = results[0]
    self._CopyFromEntity(matching_entity)
    self._from_datastore = True
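Assembled in one place, the alias property would look roughly like this (a sketch built from the fragments above; _CopyFromEntity and _from_datastore are the same internal EndpointsModel hooks the id setter relies on, and the getter body is an assumption):

import endpoints
from google.appengine.ext import ndb
from endpoints_proto_datastore.ndb import EndpointsAliasProperty, EndpointsModel

class MyModel(EndpointsModel):
    my_attr = ndb.StringProperty()

    def MyAttrSet(self, value):
        # Fetch at most two entities: enough to detect a collision cheaply.
        results = MyModel.query(MyModel.my_attr == value).fetch(2)
        if len(results) == 0:
            raise endpoints.NotFoundException('Not found.')
        if len(results) == 2:
            raise endpoints.BadRequestException('Colliding results.')
        # Exactly one match: patch this entity with the stored values.
        self._CopyFromEntity(results[0])
        self._from_datastore = True

    @EndpointsAliasProperty(setter=MyAttrSet)
    def fromMyAttr(self):
        return self.my_attr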

Pylons, SQlite and autoincrementing fields

Hey!
Just started working with Pylons in conjunction with SQLAlchemy, and my model looks something like this:
from sqlalchemy import Column
from sqlalchemy.types import Integer, String
from helloworld.model.meta import Base

class Person(Base):
    __tablename__ = "person"

    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    email = Column(String(100))

    def __init__(self, name='', email=''):
        self.name = name
        self.email = email

    def __repr__(self):
        return "<Person('%s')>" % self.name
To avoid sqlite reusing ids that might have been deleted, I want to add AUTOINCREMENT to the column "id". I've looked through the documentation for sqlalchemy and saw that the sqlite_autoincrement flag can be issued.
An example where this attribute is given can be found here.
sqlite_autoincrement seems, though, to be issued when creating the table itself, and I just wondered how it can be supplied when using a declarative style of model such as mine.
Try including a __table_args__ attribute with the arguments you would pass to Table constructors in the traditional (non-declarative) data definition style, e.g.:
class Person(Base):
    __tablename__ = "person"
    __table_args__ = {'sqlite_autoincrement': True}
If you have to include several arguments, use this form instead (dict has to be last):
__table_args__ = (
    UniqueConstraint('foo'),
    # ...
    {'sqlite_autoincrement': True},
)
From the Table configuration section of the Declarative SQLAlchemy documentation:
Table arguments other than the name, metadata, and mapped Column arguments are specified using the __table_args__ class attribute. This attribute accommodates both positional as well as keyword arguments that are normally sent to the Table constructor.
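Putting it together with the model from the question, a minimal sketch (unchanged apart from the added __table_args__ line):

from sqlalchemy import Column
from sqlalchemy.types import Integer, String
from helloworld.model.meta import Base

class Person(Base):
    __tablename__ = "person"
    # Emits AUTOINCREMENT on the INTEGER PRIMARY KEY in the generated
    # CREATE TABLE, so SQLite will not reuse deleted ids.
    __table_args__ = {'sqlite_autoincrement': True}

    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    email = Column(String(100))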

Scala: How to know if a class is an enumeration; isInstanceOf[Enumeration] doesn't work

I'm writing a serializer in Scala that saves an object (or Model) to the database (for App Engine), and I need to treat some fields as special cases. For example, if the field is of type Array[Byte], I save it as a blob. I need to treat Enumerations as special cases too, but I can't find out how to tell whether a type is an enumeration.
For example:
object UserType extends Enumeration {
  val Anonym, Registered, Admin, Super = Value
}

var value = UserType.Admin
value.isInstanceOf[Enumeration] // this returns false
Nor can I do value.isInstanceOf[Enumeration.Value], since Value is private... anyway, I think that would return false too.
Any idea?
Thanks!
value.isInstanceOf[Enumeration#Value]
You could figure this out using these methods:
scala> value.getClass
res102: java.lang.Class[_] = class scala.Enumeration$Val
scala> value.getClass.getSuperclass
res103: java.lang.Class[_ >: ?0] = class scala.Enumeration$Value
scala> value.getClass.getSuperclass.getSuperclass
res104: java.lang.Class[_ >: ?0] = class java.lang.Object