How to update a single property of multiple entities in a specific kind of Datastore? - google-cloud-datastore

I want to update one property of every entity in one particular kind of my Datastore. In traditional SQL, we would do something like this:
update <tablename> set <property> = <value>; {where clause is optional}
Now, how can I do the same thing for Datastore using Go code?

In Datastore you can't perform an update like that without retrieving the entities. You have to pull all entities in that kind, update the property on each, and re-upsert the now updated entities (preferably in a batch).
Go Datastore Queries: https://cloud.google.com/datastore/docs/concepts/queries#datastore-datastore-basic-query-go
Go Update Entities: https://cloud.google.com/datastore/docs/concepts/entities#datastore-datastore-update-go
Go Batch Upsert: https://cloud.google.com/datastore/docs/concepts/entities#datastore-datastore-batch-upsert-go
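A minimal sketch of that read-modify-write flow, assuming the cloud.google.com/go/datastore client; the project ID, the "Task" kind, and the Status field below are placeholders for your own names:

package main

import (
    "context"
    "log"

    "cloud.google.com/go/datastore"
)

// Task is a placeholder entity model; swap in your own kind and fields.
type Task struct {
    Name   string
    Status string
}

func main() {
    ctx := context.Background()
    client, err := datastore.NewClient(ctx, "my-project-id") // your project ID
    if err != nil {
        log.Fatal(err)
    }
    defer client.Close()

    // 1. Pull every entity of the kind (add a filter if you need a "where clause").
    var tasks []Task
    keys, err := client.GetAll(ctx, datastore.NewQuery("Task"), &tasks)
    if err != nil {
        log.Fatal(err)
    }

    // 2. Update the property in memory.
    for i := range tasks {
        tasks[i].Status = "done"
    }

    // 3. Re-upsert the modified entities in one batch. For large kinds, split the
    // keys and entities into chunks, since a single commit is limited to 500 mutations.
    if _, err := client.PutMulti(ctx, keys, tasks); err != nil {
        log.Fatal(err)
    }
}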

Related

DynamoDB sub item filter using .Net Core API

First of all, I have a table structure like this:
Users:{
    UserId
    Name
    Email
    SubTable1:[{
        Column-111
        Column-112
    },
    {
        Column-121
        Column-122
    }]
    SubTable2:[{
        Column-211
        Column-212
    },
    {
        Column-221
        Column-222
    }]
}
As I am new to DynamoDB, I have a couple of questions:
1. Can I create a structure like this?
2. Can we set a primary key for the subtables?
3. Luckily, I found a DynamoDB helper class to do some operations on my DB:
https://www.gopiportal.in/2018/12/aws-dynamodb-helper-class-c-and-net-core.html
But I don't know how to fetch only a particular subtable.
4. Can we fetch only specific columns from my main table? I also need suggestions for the subtables.
Note: I am using .NET Core (C#) to communicate with DynamoDB.
Can I create a structure like this?
Yes.
Can we set a primary key for the subtables?
No, the hash key can be set on top-level scalar attributes only (String, Number, etc.).
Luckily, I found a DynamoDB helper class to do some operations on my DB:
https://www.gopiportal.in/2018/12/aws-dynamodb-helper-class-c-and-net-core.html
But I don't know how to fetch only a particular subtable.
When you say subtables, I assume you are referring to the Array attributes in the sample table above. To fetch data from a DynamoDB table with the Query API, you need the hash key. If you don't have the hash key, you can use the Scan API, which scans the entire table and is a costly operation.
A GSI (Global Secondary Index) can be created to avoid the scan operation. However, it can be created on scalar attributes only; a GSI can't be created on an Array attribute.
The other option is to redesign the table to match your query access pattern.
Can we fetch only specific columns from my main table? I also need suggestions for the subtables.
Yes, you can fetch specific columns using a ProjectionExpression. This way you get only the required attributes in the result set.
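The question is about .NET Core, but the call shape is similar across SDKs; here is a rough sketch of a ProjectionExpression using the AWS SDK for Go (v1), assuming the Users table above with UserId as the hash key (the attribute names come from the question, the key value is made up):

package main

import (
    "fmt"
    "log"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/dynamodb"
)

func main() {
    svc := dynamodb.New(session.Must(session.NewSession()))

    // Fetch one item but project only the Name attribute and the SubTable1 list.
    out, err := svc.GetItem(&dynamodb.GetItemInput{
        TableName: aws.String("Users"),
        Key: map[string]*dynamodb.AttributeValue{
            "UserId": {S: aws.String("user-123")}, // hypothetical key value
        },
        ProjectionExpression: aws.String("#n, SubTable1"),
        // "Name" is a DynamoDB reserved word, so it has to be aliased.
        ExpressionAttributeNames: map[string]*string{
            "#n": aws.String("Name"),
        },
    })
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(out.Item)
}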

How to modify the attributes in a DynamoDB GSI?

I have a DynamoDB GSI with only certain fields in the projection (attributes). I would like to add a new field to this list of attributes. Is it possible to do this without deleting the GSI and recreating it? I did not find an option to do that in the DynamoDB console or in the update_table CLI.
According to the UpdateTable API, it is only possible to create and delete a GSI, not modify one. So to change the attributes of an existing GSI, the old GSI needs to be deleted and recreated.
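As a rough illustration of that delete-then-recreate flow, here is a sketch with the AWS SDK for Go (v1); the table, index, and attribute names are hypothetical, and since UpdateTable accepts only one GSI create or delete per call, two calls are needed (wait for the deletion to finish, e.g. via DescribeTable, before recreating):

package main

import (
    "log"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/dynamodb"
)

func main() {
    svc := dynamodb.New(session.Must(session.NewSession()))

    // 1. Delete the existing index.
    _, err := svc.UpdateTable(&dynamodb.UpdateTableInput{
        TableName: aws.String("Orders"),
        GlobalSecondaryIndexUpdates: []*dynamodb.GlobalSecondaryIndexUpdate{
            {Delete: &dynamodb.DeleteGlobalSecondaryIndexAction{
                IndexName: aws.String("ByCustomer"),
            }},
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    // ...wait until DescribeTable no longer lists the index, then:

    // 2. Recreate it with the extra attribute in the projection.
    _, err = svc.UpdateTable(&dynamodb.UpdateTableInput{
        TableName: aws.String("Orders"),
        AttributeDefinitions: []*dynamodb.AttributeDefinition{
            {AttributeName: aws.String("CustomerId"), AttributeType: aws.String("S")},
        },
        GlobalSecondaryIndexUpdates: []*dynamodb.GlobalSecondaryIndexUpdate{
            {Create: &dynamodb.CreateGlobalSecondaryIndexAction{
                IndexName: aws.String("ByCustomer"),
                KeySchema: []*dynamodb.KeySchemaElement{
                    {AttributeName: aws.String("CustomerId"), KeyType: aws.String("HASH")},
                },
                Projection: &dynamodb.Projection{
                    ProjectionType:   aws.String("INCLUDE"),
                    NonKeyAttributes: []*string{aws.String("Status"), aws.String("NewField")},
                },
                // Add ProvisionedThroughput here if the table uses provisioned capacity.
            }},
        },
    })
    if err != nil {
        log.Fatal(err)
    }
}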

How to retrieve an entity using a property from datastore

Is it possible to retrieve an entity from the GAE datastore using a property and not the key?
I can see that I can retrieve entities by key using the syntax below:
quote = mgr.getObjectById(Students.class, id);
Is there an alternative that enables us to use a property instead of key?
Or please suggest any other ways to achieve the requirement.
Thanks,
Karthick.
Of course this is possible. Think of an entity's key as being like the primary key of an SQL row (but please, don't stretch the analogy too far - the point is that it's a primary key; the implementations of these two data storage systems are very different, and it causes people trouble when they don't keep this in mind).
You should look either here (JDO) to read about JDO queries or here (JPA) to read about JPA queries, depending on what kind of mgr your post refers to. For JDO, you would do something like this:
// begin building a new query on the Cat-kind entities (given a properly annotated
// entity model class "Cat" somewhere in your code)
Query q = pm.newQuery(Cat.class);
// set filter on species property to == param
q.setFilter("species == speciesParam");
// set ordering for query results by age property descending
q.setOrdering("age desc");
// declare the parameters for this query (format is "<Type> <name>")
// as referenced above in filter statement
q.declareParameters("String speciesParam");
// run the query
List<Cat> results = (List<Cat>) q.execute("siamese");
For JPA, you would use JPQL strings to run your queries.

Symfony2 and Doctrine: how to make an atomic operation?

Imagine this scenario:
I have an array of IDs for some entities that have to be deleted from the database (i.e. a couple of external keys that identify a record in a third table) and an array of IDs for some entities that have to be updated/inserted (based on some criteria that, at this moment, don't matter).
What can I do to delete those entities?
Load them from the DB (repository way)
Call delete() on the obtained objects
Call flush() on my entity manager
In that scenario I can make all my operations atomic, as I can update/insert other records before calling flush().
But why do I have to load some records from the DB just to delete them? So I wrote my own DQL query (in the repo) and call it.
The problem is that if I call that function in my repo, the operation is executed immediately, and so my "atomicity" can't be guaranteed.
So, how can I get around this obstacle while following the second "delete option"?
By using flush() you're making Doctrine start a transaction implicitly. It is also possible to use transactions explicitly, and that approach should solve your problem.

How to implement a cross-database foreign key constraint?

Let's say I have two schemas: HR and Orders.
[HR].Employees        [Orders].Entries
--------------        ----------------
Id_Employee  ---->    Employee
Fullname              Id_Entry
Birthday              Description
                      Amount
As you can see, what I'd want is to be able to establish a cross-database foreign key, but when I try this using a database link, I get:
-- From [Orders]
ALTER TABLE Entries
ADD CONSTRAINT FK_Entries_Employees FOREIGN KEY (Employee)
REFERENCES Employees#HR;
COMMIT;
ORA-02021: DDL operations are not allowed on a remote database
Is there a way around this? It's a legacy database, so I can't change the existing schema.
For the NHibernate crowd: I would then use this relation to map NHibernate's domain objects.
One option would be to create a materialized view of Employees on [Orders] and then use that as the parent for the foreign key.
Of course, that has some drawbacks. In particular:
- you won't be able to do a complete refresh of the materialized view without disabling the foreign key, so it will have to be fast-refreshed.
- keys entered into EMPLOYEES won't be available to ENTRIES until the materialized view refreshes. If that's critical, you may want to set it to refresh on commit.
Other alternatives are to handle the key enforcement yourself through a trigger or a post-cleanup process, or to convince the DBAs that these schemas can reside in the same database instance.
As far as I know, constraints and referential integrity are only supported within a single database.
If you need to cross database boundaries, you'll have to be creative. Maybe write triggers that check for data in the other database, or enforce these constraints at the application level. But then you may run into the problem of the transaction scope being limited to a single database.
