Hey!
Just started working with Pylons in conjunction with SQLAlchemy, and my model looks something like this:
from sqlalchemy import Column
from sqlalchemy.types import Integer, String
from helloworld.model.meta import Base
class Person(Base):
    __tablename__ = "person"

    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    email = Column(String(100))

    def __init__(self, name='', email=''):
        self.name = name
        self.email = email

    def __repr__(self):
        return "<Person('%s')>" % self.name
To avoid sqlite reusing ids that might have been deleted, I want to add AUTOINCREMENT to the "id" column. I've looked through the SQLAlchemy documentation and saw that sqlite_autoincrement can be passed for this.
An example where this attribute is given can be found here.
sqlite_autoincrement seems, though, to be supplied when creating the table itself, and I just wondered how it can be supplied when using a declarative model such as mine.
Try including a __table_args__ attribute with the arguments you would pass to the Table constructor in the traditional (non-declarative) data definition style, e.g.:
class Person(Base):
    __tablename__ = "person"
    __table_args__ = {'sqlite_autoincrement': True}
If you have to include several arguments, use this form instead (the dict has to be last):
__table_args__ = (
    UniqueConstraint('foo'),
    # ...
    {'sqlite_autoincrement': True},
)
From the Table configuration section of the Declarative SQLAlchemy documentation:
Table arguments other than the name, metadata, and mapped Column arguments are specified using the __table_args__ class attribute. This attribute accommodates both positional as well as keyword arguments that are normally sent to the Table constructor.
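Putting it together for the model above, a minimal sketch (the in-memory engine URL is an assumption for illustration):
from sqlalchemy import create_engine, Column
from sqlalchemy.types import Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Person(Base):
    __tablename__ = "person"
    __table_args__ = {'sqlite_autoincrement': True}

    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    email = Column(String(100))

engine = create_engine('sqlite:///:memory:', echo=True)
# The emitted CREATE TABLE should now declare the primary key
# with AUTOINCREMENT, so deleted ids are not reused.
Base.metadata.create_all(engine)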
Using Python, Flask and marshmallow, if I have a schema:
class ParentSchema(Schema):
    id = fields.Int(dump_only=True)
    children = fields.Nested('ChildSchema', dump_only=True)
and a class Parent which has a method:
class Parent():
    def getChildren(self, params):
        pass
How do I get Marshmallow to pass the necessary parameters to Parent.getChildren when serialising the object and then populate ParentSchema.children with the results?
So the solution is to add a get_attribute method to the schema class and to assign the parameters to the context attribute of the ParentSchema instance. This overrides the default behaviour Marshmallow uses to extract attributes from the object during serialisation.
class ParentSchema(Schema):
    id = fields.Int(dump_only=True)
    children = fields.Nested('ChildSchema', dump_only=True)

    def get_attribute(self, key, obj, default):
        if key == 'children':
            return obj.getChildren(self.context['params'])
        else:
            return getattr(obj, key, default)
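For illustration, a minimal usage sketch (the params dict is hypothetical, and this assumes the marshmallow 2.x get_attribute signature used above):
parent = Parent()
schema = ParentSchema()
schema.context['params'] = {'max_age': 10}  # hypothetical parameters
result = schema.dump(parent)  # 'children' is routed through parent.getChildren(...)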
I have a collection of entities with both parent keys and string ids. Sometimes I need to change the string id (update the entity with a new id). From this question (Modify a Google App Engine entity id?), it looks like I need to create a new entity and delete the old one.
Of course, I want to preserve all of the properties in the old entity when creating the new one, but there doesn't seem to be a clone method for NDB entities.
Is this the best way to change an entity's id, while preserving the parent?
# clone the old_entity (including its parent) as new_entity;
# parent lives on the key, and keyword args must precede **kwargs in Python 2
new_entity = MyModel(id=new_id, parent=old_entity.key.parent(),
                     **old_entity.to_dict())
And then, I should be able to do this to replace the old entity with the new one:
new_entity.put() # save the new entity
old_entity.key.delete() # delete the old entity
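Since the clone keeps the same parent, both entities live in the same entity group, so the swap can be made atomic; a sketch, assuming ndb's transaction decorator:
from google.appengine.ext import ndb

@ndb.transactional
def change_entity_id(old_key, new_id):
    # both keys share the same ancestor root, so a
    # single-group transaction suffices
    old_entity = old_key.get()
    new_entity = MyModel(id=new_id, parent=old_key.parent(),
                         **old_entity.to_dict())
    new_entity.put()
    old_key.delete()
    return new_entity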
def clone_entity(e, **extra_args):
    klass = e.__class__
    # copy every property except computed ones, keyed by attribute name
    props = dict((v._code_name, v.__get__(e, klass))
                 for v in klass._properties.itervalues()
                 if type(v) is not ndb.ComputedProperty)
    props.update(extra_args)
    return klass(**props)
Example:
b = clone_entity(a, id='new_id_here')
@sanch's answer works fine in most cases, but for some reason it will not copy attributes of type ndb.PickleProperty.
This modification will work for all attributes, including PickleProperty, and will also accept an optional new_class parameter to make a clone of another class:
def clone_entity(e, **extra_args):
    """Clone an ndb entity and return the clone.

    Special extra_args may be used to:
    - request the cloned entity to be of a different class (and yet have
      attributes from the original entity)
    - define a specific parent, id or namespace for the cloned entity

    :param e: The ndb entity to be cloned.
    :param extra_args: May include the special args 'parent', 'id' and
        'namespace', which will be used when initializing the new entity;
        other extra_args may set values for specific attributes.
    :return: The cloned entity.
    """
    if 'new_class' in extra_args:
        klass = extra_args.pop('new_class')
    else:
        klass = e.__class__
    props = dict((v._code_name, v.__get__(e, klass))
                 for v in klass._properties.itervalues()
                 if type(v) is not ndb.ComputedProperty)
    init_args = dict()
    for arg in ['parent', 'id', 'namespace']:
        if arg in extra_args:
            init_args[arg] = extra_args.pop(arg)
    clone = klass(**init_args)
    props.update(**extra_args)
    clone.populate(**props)
    return clone
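Example usage (the ArchivedModel class and status attribute are hypothetical, just to show the new_class and attribute overrides):
# plain clone with a new id, as before
b = clone_entity(a, id='new_id_here')

# clone into a different class, under the same parent, overriding one attribute
c = clone_entity(a, new_class=ArchivedModel, parent=a.key.parent(),
                 status='archived')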
This question completely follows on from a related question I had asked (and was answered) here: Error when trying to retrieve a single entity
As I understand it, retrieving a single entity from the datastore using a property other than the helper methods already provided (e.g. 'id') requires turning a simple data property into an EndpointsAliasProperty? If yes, how would I go about doing that? Or is it that we can only use 'id' (helper methods provided by EndpointsModel) and we cannot use any of the properties that we define (in this case 'title')?
The distinction between a custom EndpointsAliasProperty and one of the data properties you defined is how they are used. They are all used to create a protorpc message, and that message is then converted into an EndpointsModel with your custom data in it. THIS is where the magic happens.
Breaking it down into steps:
1. You specify your data
from google.appengine.ext import ndb
from endpoints_proto_datastore.ndb import EndpointsModel
class MyModel(EndpointsModel):
    my_attr = ndb.StringProperty()
2. You pick your fields for your method
class MyApi(...):

    @MyModel.method(request_fields=('id', 'my_attr'), ...)
    def my_method(self, my_model_entity):
        ...
3. A protorpc message class is defined from your fields
>>> request_message_class = MyModel.ProtoModel(fields=('id', 'my_attr'))
>>> request_message_class
<class '.MyModelProto_id_my_attr'>
>>> for field in request_message_class.all_fields():
... print field.name, ':', field.variant
...
id : INT64
my_attr : STRING
This happens every time a request is handled by a method decorated with @MyModel.method.
4. A request comes in your application and a message is created
Using the protorpc message class, a message instance is parsed from the JSON which gets passed along to your Endpoints SPI (which is created by endpoints.api_server).
When the request comes in to your protorpc.remote.Service it is decoded:
>>> from protorpc import remote
>>> protocols = remote.Protocols.get_default()
>>> json_protocol = protocols.lookup_by_content_type('application/json')
>>> request_message = json_protocol.decode_message(
... request_message_class,
... '{"id": 123, "my_attr": "some-string"}'
... )
>>> request_message
<MyModelProto_id_my_attr
id: 123
my_attr: u'some-string'>
5. The protorpc message is cast into a datastore model
entity = MyModel.FromMessage(request_message)
THIS is the step you really care about. The FromMessage class method (also provided as part of EndpointsModel) loops through all the fields:
for field in sorted(request_message_class.all_fields(),
                    key=lambda field: field.number):
and for each field with a value set, turns the value into something to be added to the entity and separates based on whether the property is an EndpointsAliasProperty or not:
if isinstance(value_property, EndpointsAliasProperty):
    alias_args.append((local_name, to_add))
else:
    entity_kwargs[local_name] = to_add
After completing this loop, we have an ordered list alias_args of all key, value pairs and a dictionary entity_kwargs of the data attributes parsed from the message.
Using these, first a simple entity is created
entity = MyModel(**entity_kwargs)
and then each of the alias property values are set in order:
for name, value in alias_args:
    setattr(entity, name, value)
The extended behavior happens in setattr(entity, name, value). Since EndpointsAliasProperty is a subclass of property, it is a descriptor and it has a setter which can perform some custom behavior beyond simply setting a value.
For example, the id property is defined with:
@EndpointsAliasProperty(setter=IdSet, property_type=messages.IntegerField)
def id(self):
and the setter performs operations beyond simply setting data:
def IdSet(self, value):
    self.UpdateFromKey(ndb.Key(self.__class__, value))
This particular method attempts to retrieve the entity stored in the datastore using the id and patch in any values from the datastore that were not included in the entity parsed from the request.
If you wanted to do this for a field like my_attr, you would need to construct a custom query which could retrieve the item with that unique my_attr value (or fail if not exactly one such entity exists).
This is problematic and you'd be better off using a unique field like the key or ID used to store the entity in the datastore.
The keys with ancestors sample gives a great example of creating your own custom properties.
If you REALLY insist on using my_attr to retrieve an entity, you could do so using a different property name (since my_attr is already used for the data property) such as fromMyAttr:
class MyModel(EndpointsModel):

    def MyAttrSet(self, value):
        ...

    @EndpointsAliasProperty(setter=MyAttrSet)
    def fromMyAttr(self):
        ...
Here, the MyAttrSet instance method would form the query:
def MyAttrSet(self, value):
    query = MyModel.query(MyModel.my_attr == value)
    results = query.fetch(2)
reject results that aren't unique for my_attr:
if len(results) == 0:
    raise endpoints.NotFoundException('Not found.')
if len(results) == 2:
    raise endpoints.BadRequestException('Colliding results.')
and copy over the values for the already stored entity if we do find a unique one:
matching_entity = results[0]
self._CopyFromEntity(matching_entity)
self._from_datastore = True
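Putting those fragments together, a sketch of the full model (the imports and the trivial getter body are assumptions based on how the library is normally used):
import endpoints
from google.appengine.ext import ndb
from protorpc import messages
from endpoints_proto_datastore.ndb import EndpointsModel, EndpointsAliasProperty

class MyModel(EndpointsModel):
    my_attr = ndb.StringProperty()

    def MyAttrSet(self, value):
        # look up the entity by my_attr; fail unless exactly one matches
        query = MyModel.query(MyModel.my_attr == value)
        results = query.fetch(2)
        if len(results) == 0:
            raise endpoints.NotFoundException('Not found.')
        if len(results) == 2:
            raise endpoints.BadRequestException('Colliding results.')
        matching_entity = results[0]
        self._CopyFromEntity(matching_entity)
        self._from_datastore = True

    @EndpointsAliasProperty(setter=MyAttrSet, property_type=messages.StringField)
    def fromMyAttr(self):
        return self.my_attr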
I have three tables for different payment types, and want to create a table that holds payments made using all three. I'm not sure if I'm going about this the right way, but I was going to create a foreign key column in the table for each of the three, and then write a constraint such that exactly one of those columns has to be not null.
Is this the right way to go about this?
How do you go about writing this constraint?
Is there any way to do this from within SQLAlchemy on sqlite? (code for declarative classes would be much appreciated)
Have a single foreign key column, and a separate type column so you know which table to look in.
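For illustration, a minimal sketch of that design in declarative SQLAlchemy (the table and column names are assumptions; note that a single column can't be a true FOREIGN KEY to three different tables, so the type column tells you where to join):
class Payment(Base):
    __tablename__ = 'payment'

    id = Column(Integer, primary_key=True)
    # which of the three detail tables this payment refers to
    detail_type = Column(String(20), nullable=False)  # e.g. 'paypal', 'card', 'code'
    # id of the row in the table named by detail_type
    detail_id = Column(Integer, nullable=False)
If you do stick with three nullable FK columns instead, the "exactly one not null" rule can be written as a CheckConstraint in __table_args__, but the single type column keeps queries simpler.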
OK, I've got it. Is this the best way to do it? I created a generic id table as follows:
class PaymentDetails(Base):
    __tablename__ = 'payment_details'

    id = Column(Integer, primary_key=True)
    type = Column(PaymentType.db_type(), nullable=False)
where PaymentType uses the declarative enum recipe, and then subclassed this for the various payment methods:
@concrete
@do_unique_index('paypal_unique_details', 'env', 'main_email', 'sub_email')
class Paypal(Base):
    __tablename__ = 'paypal_details'
    id = Column(ForeignKey('payment_details.id'), primary_key=True)
    # The rest of the implementation

@concrete
@do_unique_index('credit_card_unique_details', 'env', 'card_number')
class CreditCard(Base):
    __tablename__ = 'card_details'
    id = Column(ForeignKey('payment_details.id'), primary_key=True)
    # The rest of the implementation

@concrete
@do_unique_index('time_code_unique_details', 'env', 'code')
class TimeCodes(Base):
    __tablename__ = 'code_details'
    id = Column(ForeignKey('payment_details.id'), primary_key=True)
    # The rest of the implementation
(Where concrete and do_unique_index set the relevant __mapper_args__ and __table_args__). I then set the description field of the PaymentType enum values to be each of these classes, so that to look up a payment I can query for a PaymentDetails object, then get an id and a type from that, say id and Paypal, to perform a second query for the Paypal with that id.
My code for adding sets of details is fairly simple: in a single transaction, it adds the next logical id to the PaymentDetails table with the type of the payment details we are trying to create, and then adds an entry to that detail table with the details I want to enter. I can then add methods to these ORM classes to handle the different ways we would handle buying, selling, and refunding for each method, so that they can be treated identically.
You then need to switch on FK constraints as van mentioned; I did so by adding the FK listener to the helper class I use for DB access.
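For reference, the usual SQLAlchemy recipe for enabling SQLite foreign key enforcement is a connect-event listener, sketched here:
from sqlalchemy import event
from sqlalchemy.engine import Engine

@event.listens_for(Engine, "connect")
def set_sqlite_fk_pragma(dbapi_connection, connection_record):
    # SQLite ships with foreign key enforcement off by default;
    # turn it on for every new connection
    cursor = dbapi_connection.cursor()
    cursor.execute("PRAGMA foreign_keys=ON")
    cursor.close()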
I'm writing a serializer in Scala that saves an object (or Model) to the database (for App Engine), and I need to treat some fields as special cases. For example, if the field is of type Array[Byte], I save it as a blob. I need to treat Enumerations as special cases too, but I can't find out how to know if a type is an enumeration.
For example:
object UserType extends Enumeration {
  val Anonym, Registered, Admin, Super = Value
}
var value = UserType.Admin
value.isInstanceOf[Enumeration] // this returns false
Nor can I do value.isInstanceOf[Enumeration.Value], since Value is private... anyway, I think that would return false too.
Any idea?
Thanks!
value.isInstanceOf[Enumeration#Value]  // type projection: matches values of any Enumeration
You could figure this out using these methods:
scala> value.getClass
res102: java.lang.Class[_] = class scala.Enumeration$Val
scala> value.getClass.getSuperclass
res103: java.lang.Class[_ >: ?0] = class scala.Enumeration$Value
scala> value.getClass.getSuperclass.getSuperclass
res104: java.lang.Class[_ >: ?0] = class java.lang.Object
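So at runtime every enumeration value is an instance of the inner class scala.Enumeration$Value, which you can test from Scala source with the type projection Enumeration#Value; a short sketch:
object UserType extends Enumeration {
  val Anonym, Registered, Admin, Super = Value
}

val value: Any = UserType.Admin

// the type projection Enumeration#Value matches values of any Enumeration
println(value.isInstanceOf[Enumeration#Value])  // true

// in a serializer, a pattern match keeps the special case in one place
value match {
  case v: Enumeration#Value => println("enum value: " + v + " (id=" + v.id + ")")
  case other                => println("not an enum: " + other)
}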