Cannot insert into column "Id" error while trying to insert entity that has a nested parent entity - mikro-orm

I have 2 entities with a one-to-many relationship in my MikroORM (v3.6.15) model, connected to PostgreSQL (pg v8.3.0):
Uygulama (Parent)
@Entity({ tableName: "Uygulamalar", collection: "Uygulamalar" })
export class Uygulama {
  @PrimaryKey({ fieldName: "Id" })
  id!: number;

  @Property({ fieldName: "Adi" })
  adi!: string;

  @Property({ fieldName: "Kod" })
  kod!: string;

  @Property({ fieldName: "UygulamaSahibi" })
  uygulamaSahibi!: string;

  @Property({ fieldName: "createdAt" })
  createdAt = new Date();

  @Property({ fieldName: "updatedAt", onUpdate: () => new Date() })
  updatedAt = new Date();

  @OneToMany({ entity: () => Modul, mappedBy: "rootUygulama", cascade: [] })
  moduller = new Collection<Modul>(this);
}
Modul (Child)
export class Modul {
  @PrimaryKey({ fieldName: "Id" })
  id!: number;

  @Property({ fieldName: "Adi" })
  adi!: string;

  @Property({ fieldName: "Kod" })
  kod!: string;

  @Property({ fieldName: "createdAt" })
  createdAt = new Date();

  @Property({ fieldName: "updatedAt", onUpdate: () => new Date() })
  updatedAt = new Date();

  @ManyToOne({ entity: () => Uygulama, joinColumn: "UygulamaId", cascade: [] })
  rootUygulama!: Uygulama;

  @OneToMany({ entity: () => Ekran, mappedBy: "rootModul", orphanRemoval: true })
  ekranlar = new Collection<Ekran>(this);
}
I have a REST endpoint (Express.js) that creates a Modul object from the posted HTTP request body:
router.post("/", async (req, res) => {
  const modul = DI.modulRepository.create(req.body);
  await DI.modulRepository.persistAndFlush(modul);
  res.send(modul);
});
When I try to post the JSON object below to create a new Modul (the rootUygulama object is already in the database):
{
  "adi": "Deneme Modülü 3",
  "kod": "DM1",
  "rootUygulama": {
    "id": 66,
    "adi": "Deneme Uygulaması",
    "kod": "DU",
    "uygulamaSahibi": "xxxxxx",
    "createdAt": "2020-07-24T21:18:47.874Z",
    "updatedAt": "2020-07-24T21:18:47.874Z",
    "moduller": []
  }
}
I get this error:
[query] insert into "Uygulamalar" ("Adi", "Id", "Kod", "UygulamaSahibi", "createdAt", "updatedAt") values ('Deneme Uygulaması', 66, 'DU', 'szengin', '2020-07-25 00:18:47.874', '2020-07-25 00:18:47.874') returning "Id" [took 6 ms]
node_modules/mikro-orm/dist/utils/Logger.js:22
[query] rollback
node_modules/mikro-orm/dist/utils/Logger.js:22
(node:14344) UnhandledPromiseRejectionWarning: error: insert into "Uygulamalar" ("Adi", "Id", "Kod", "UygulamaSahibi", "createdAt", "updatedAt") values ($1, $2, $3, $4, $5, $6) returning "Id" - cannot insert into column "Id"
at Parser.parseErrorMessage (d:\NODEJS\BildirimYonetimi\backend\node_modules\pg-protocol\dist\parser.js:278:15)
at Parser.handlePacket (d:\NODEJS\BildirimYonetimi\backend\node_modules\pg-protocol\dist\parser.js:126:29)
at Parser.parse (d:\NODEJS\BildirimYonetimi\backend\node_modules\pg-protocol\dist\parser.js:39:38)
at Socket.<anonymous> (d:\NODEJS\BildirimYonetimi\backend\node_modules\pg-protocol\dist\index.js:8:42)
at Socket.emit (events.js:311:20)
at Socket.EventEmitter.emit (domain.js:482:12)
at addChunk (_stream_readable.js:294:12)
at readableAddChunk (_stream_readable.js:275:11)
at Socket.Readable.push (_stream_readable.js:209:10)
at TCP.onStreamRead (internal/stream_base_commons.js:186:23)
<node_internals>/internal/process/warning.js:32
(node:14344) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 2)
<node_internals>/internal/process/warning.js:32
(node:14344) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
When I send the JSON below, the object is created and inserted into the database successfully:
{
  "adi": "Deneme Modülü 3",
  "kod": "DM1",
  "rootUygulama": 66
}
Even if I set an empty cascade array on the relationship attribute, the repository still tries to insert the parent object and fails.
Am I missing something in configuration?
Edit:
For my Typescript client how will I define rootUygulama property?
export interface Modul {
id: number;
adi: string;
kod: string;
createdAt: Date;
updatedAt: Date;
rootUygulama: Uygulama;
ekranlar: Array<Ekran>;
}
Should it be like
rootUygulama: Uygulama | number;

This is not about cascading; the behaviour is correct. When you pass just the PK, it is considered an existing entity; if you pass an object, it is considered a new entity. It's not about the PK being set or not. If you want that behaviour, you need to program it yourself.
MikroORM works based on change set tracking, so only managed objects (entities loaded from database) can produce update queries. If you want to fire update query for the rootUygulama object, then use em.nativeUpdate(). If you want to still send the payload with it as object, but you care just about the PK, you could also explicitly merge that entity to the EM (that way it becomes managed, just like if you have loaded it from the db).
const modul = DI.modulRepository.create(req.body);

// if we see a PK, we merge, so the entity won't be considered a new object
if (modul.rootUygulama.id) {
  DI.em.merge(modul.rootUygulama);
  // here we could also fire `em.nativeUpdate(Uygulama, modul.rootUygulama);`
  // to issue an update query, but then you should use `em.transactional()`
  // so the update query runs inside the same TX as the flush
}

await DI.modulRepository.persistAndFlush(modul);
res.send(modul);
Btw, I would strongly suggest not disabling cascading unless you truly understand what you are doing, as the defaults are to cascade merge and persist, which you usually want/need.
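As for the edit about the client-side typing: modelling the property as a union and normalizing it before use is one reasonable option. A minimal sketch in plain TypeScript (`UygulamaRef` and `toUygulamaId` are hypothetical names for illustration, not part of the ORM's API):

```typescript
// The client may send either the full parent object or just its PK.
interface UygulamaRef {
  id: number;
  adi?: string;
}

type UygulamaInput = UygulamaRef | number;

// Normalize to the PK so the server can decide whether to merge/load.
function toUygulamaId(input: UygulamaInput): number {
  return typeof input === "number" ? input : input.id;
}
```

With this shape, the client can send `"rootUygulama": 66` or the full object, and the server code can always reduce it to the PK before persisting.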

Related

Error when executing CosmosDB stored procedure

I am learning to write CosmosDB stored procedures following the information below:
Stored procedure docs
What I am trying to do is loop through a number of documents returned by a query and find the one that is the most exact match.
The flow is as follows:
1. Check StartDate and EndDate to make sure it is a valid document
1.5. Check that the input VariantNo is included in the Variants array of the document
2. Check if a user is included in the Users array of the document OR if "ALL" is specified as a string in the array
3. Check if a store is included in the Stores array OR if "ALL" is specified as a string in the Stores array
The document looks as follows
{
  "id": "12345",
  "brand": "XXX",
  "PromotionName": "Test Promo 1",
  "PromotionType": "Deal",
  "PromotionSticker": "Sticker 1",
  "StartDate": "2020-05-14T00:00:00.1212122Z",
  "EndDate": "2020-05-30T00:00:00.1212122Z",
  "Variants": [
    "0628462008001",
    "0628462008002",
    "0644324003002"
  ],
  "Stores": [
    "SE0623"
  ],
  "Users": [
    "ALL"
  ],
  "DiscountInPercent": "30",
  "RedPriceStores": null,
  "CreatedDate": "20200515",
  "CreatedBy": "SLAPI Promotions API ClientId: 123",
  "UpdatedDate": null,
  "UpdatedBy": null,
  "Consumer": "YYYYY_V2",
  "_rid": "HwVmAIFaOoEBAAAAAAAAAA==",
  "_self": "dbs/HwVmAA==/colls/HwVmAIFaOoE=/docs/HwVmAIFaOoEBAAAAAAAAAA==/",
  "_etag": "\"11005859-0000-0c00-0000-5ebe0f7e0000\"",
  "_attachments": "attachments/",
  "_ts": 1589514110
}
The beginnings of my stored procedure look like this, based on the template in CosmosDB:
// SAMPLE STORED PROCEDURE
function getFinalPromotionPrice(item, store, user) {
  var collection = getContext().getCollection();

  // Query documents and take the 1st item.
  var isAccepted = collection.queryDocuments(
    collection.getSelfLink(),
    'SELECT * FROM c WHERE c.StartDate <= (SELECT VALUE GetCurrentDateTime()) AND c.EndDate >= (SELECT VALUE GetCurrentDateTime())',
    function (err, feed, options) {
      if (err) throw err;

      // Check the feed and if empty, set the body to 'no docs found',
      // else take the 1st element from the feed
      if (!feed || !feed.length) {
        var response = getContext().getResponse();
        response.setBody('no docs found');
      } else {
        var response = getContext().getResponse();
        var body = { prefix: prefix, feed: feed[0] };
        response.setBody(JSON.stringify(body));
      }
    });

  if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
but I am getting this error when executing the stored procedure:
{"code":400,"body":{"code":"BadRequest","message":"Message: {\"Errors\":[\"Encountered exception while executing function. Exception = ReferenceError: 'prefix' is not defined\r\nStack trace: ReferenceError: 'prefix' is not defined\n at Anonymous function (script.js:20:13)\n at A
As the error message indicates, the problem is prefix:
Exception = ReferenceError: 'prefix' is not defined
In the line below you use prefix as a value, but you have not declared prefix anywhere in the code.
var body = { prefix: prefix, feed: feed[0] };
Change the above line to this if you don't need prefix in your SP body:
var body = { feed: feed[0] };

How to delete entities from one-to-many collection with only composite primary keys in MariaDB

@Entity()
export class Job {
  @PrimaryKey({ type: BigIntType })
  id: string;

  @OneToMany(() => JobExperienceLevel, jobExperienceLevel => jobExperienceLevel.job,
    { cascade: [Cascade.ALL], orphanRemoval: true })
  experienceLevels = new Collection<JobExperienceLevel>(this);
}

@Entity()
export class JobExperienceLevel {
  @PrimaryKey()
  @Enum({ items: () => JobExperienceLevelType })
  experienceLevel: JobExperienceLevelType;

  @ManyToOne({ nullable: false, primary: true, joinColumn: 'job_id' })
  job: Job;
}

export enum JobExperienceLevelType {
  ENTRY_LEVEL = 'ENTRY_LEVEL',
  JUNIOR = 'JUNIOR',
  REGULAR = 'REGULAR',
  SENIOR = 'SENIOR'
}
After calling experienceLevels.removeAll() on some Job entity, it generates the following query:
delete from `job_experience_level` where `experience_level` = 'SENIOR' and `job_id` is null
The database table job_experience_level has a composite primary key (experience_level, job_id).
I have checked that before calling the removeAll method there is one entity ('SENIOR') in the collection.
I am using an EntityRepository with persistAndFlush on the Job entity.
The problem is that this query is wrong: it should populate the correct job_id.
I also tried removing the @PrimaryKey() from the experienceLevel property, but then there is no delete query in the transaction at all.
As discussed in the comments, there was a bug with orphan removal and composite keys. Upgrade to v3.6.7 to fix it.
https://github.com/mikro-orm/mikro-orm/blob/master/CHANGELOG.md#367-2020-04-16
Here is a testcase to make sure it really works:
https://github.com/mikro-orm/mikro-orm/commit/94c71c89a648e03fad38a93cced7fa92bbfd7ff7#diff-595473b980e4d4384f667f60dbddde2aR1

Mikro-orm order by ST_Distance_Sphere using MySQL driver

With MySQL, I am trying to order by ST_Distance_Sphere using QueryBuilder.
I have an entity:
import { Entity, PrimaryKey, Property } from "mikro-orm";

@Entity({ tableName: "studio" })
export default class StudioEntity {
  @PrimaryKey()
  public id!: number;

  @Property()
  public name!: string;

  @Property({ columnType: "point srid 4326" })
  public geometry!: object;
}
And I am trying:
export default class StudioStore {
  private studioRepository: EntityRepository<StudioEntity>;

  public constructor(ormClient: OrmClient) {
    this.studioRepository = ormClient.em.getRepository(StudioEntity);
  }

  public async findPage(first: number): Promise<StudioEntity[]> {
    const query = this.studioRepository.createQueryBuilder().select("*");
    query.addSelect(
      "ST_Distance_Sphere(`e0`.`geometry`, ST_GeomFromText('POINT(28.612849 77.229883)', 4326)) as distance",
    );
    query.orderBy({ distance: "ASC" });
    return query.limit(first).getResult();
  }
}
But I get an ORM error:
Trying to query by not existing property StudioEntity.distance
So, I try to add a property to the entity:
@Property({ persist: false })
public distance?: number;
But now I get a MySQL error:
Unknown column 'e0.distance' in 'order clause'
This is the generated SQL query:
[query] select `e0`.*, ST_Distance_Sphere(`e0`.`geometry`, ST_GeomFromText('POINT(28.612849 77.229883)', 4326)) as distance from `studio` as `e0` order by `e0`.`distance` asc limit 5 [took 4 ms]
You will need to fall back to Knex, as the QB currently supports only defined property fields in order by. You will also need to define that virtual distance property, as you already did, so the value can be mapped to the entity.
https://mikro-orm.io/docs/query-builder/#using-knexjs
const query = this.studioRepository.createQueryBuilder().select("*");
query.addSelect("ST_Distance_Sphere(`e0`.`geometry`, ST_GeomFromText('POINT(28.612849 77.229883)', 4326)) as distance");
query.limit(first);
const knex = query.getKnexQuery();
knex.orderBy('distance', 'asc');
const res = await this.em.getConnection().execute(knex);
const entities = res.map(a => this.em.map(StudioEntity, a));
Not very nice, I must say; I totally forgot that it is possible to order by computed fields. I will try to address this in v4. I think it could even work like your second approach: the QB could simply check whether the property is virtual (has persist: false) and then not prefix it.
Edit: as of 3.6.6, the approach with persist: false works out of the box.

Do CosmosDB Mongo API compound unique indexes require each field to be unique?

I'm trying to set up a collection of versioned documents in which I insert a new document with the same id and a timestamp whenever there's an edit operation. I use a unique compound index for this on the id and timestamp fields. CosmosDB is giving me MongoError: E11000 duplicate key error whenever I try to insert a document with a different id but an identical timestamp to another document. The MongoDB documentation says that I should be able to do this:
https://docs.mongodb.com/v3.4/core/index-unique/#unique-compound-index
You can also enforce a unique constraint on compound indexes. If you use the unique constraint on a compound index, then MongoDB will enforce uniqueness on the combination of the index key values.
I tried using a non-unique index but the Resource Manager template failed, saying that non-unique compound indexes are not supported. I'm using the node.js native driver v3.2.4. I also tried to use Azure Portal to insert documents but received the same error. This makes me believe it's not a problem between CosmosDB and the node.js driver.
Here's a small example to demonstrate the problem. I'm running it with Node v10.15.3.
const { MongoClient } = require('mongodb');

const mongoUrl = process.env.COSMOSDB_CONNECTION_STRING;
const collectionName = 'indextest';
const client = new MongoClient(mongoUrl, { useNewUrlParser: true });
let connection;

const testIndex = async () => {
  const now = Date.now();
  connection = await client.connect();
  const db = connection.db('master');
  await db.collection(collectionName).drop();
  const collection = await db.createCollection(collectionName);
  await collection.createIndex({ id: 1, ts: -1 }, { unique: true });
  await collection.insertOne({ id: 1, ts: now, title: 'My first document' });
  await collection.insertOne({ id: 2, ts: now, title: 'My other document' });
};

(async () => {
  try {
    await testIndex();
    console.log('It works');
  } catch (err) {
    console.error(err);
  } finally {
    await connection.close();
  }
})();
I would expect the two insert operations to succeed and the program to exit with It works. Instead I get this error:
{ MongoError: E11000 duplicate key error collection: master.indextest Failed _id or unique key constraint
at Function.create (/home/node/node_modules/mongodb-core/lib/error.js:43:12)
at toError (/home/node/node_modules/mongodb/lib/utils.js:149:22)
at coll.s.topology.insert (/home/node/node_modules/mongodb/lib/operations/collection_ops.js:859:39)
at handler (/home/node/node_modules/mongodb-core/lib/topologies/replset.js:1155:22)
at /home/node/node_modules/mongodb-core/lib/connection/pool.js:397:18
at process._tickCallback (internal/process/next_tick.js:61:11)
driver: true,
name: 'MongoError',
index: 0,
code: 11000,
errmsg:
'E11000 duplicate key error collection: master.indextest Failed _id or unique key constraint',
[Symbol(mongoErrorContextSymbol)]: {} }
Is this expected behavior or a bug in CosmosDB's MongoDB API?
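For comparison, the semantics the MongoDB docs promise (uniqueness on the combination of key values, not on each field) can be sketched with a plain in-memory composite key. This is an illustration of the documented behaviour only, not of what CosmosDB does above:

```typescript
// In-memory stand-in for a compound unique index on (id, ts):
// an insert is rejected only when the *pair* already exists.
const seen = new Set<string>();

function tryInsert(doc: { id: number; ts: number }): boolean {
  const key = `${doc.id}:${doc.ts}`;
  if (seen.has(key)) {
    return false; // duplicate (id, ts) pair
  }
  seen.add(key);
  return true;
}
```

Under these semantics, two documents with the same ts but different id values should both insert successfully, which is exactly what the repro above expects.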

MDG ValidatedMethod with Aldeed Autoform: "_id is not allowed by the schema" error

I'm getting the error "_id is not allowed by the schema" when trying to use an autoform to update a collection via a ValidatedMethod.
As far as I can see from this example and the official docs, there is no expectation for my schema to include the _id field, and I wouldn't expect to be updating the id from an update statement, so I have no idea why this error is happening.
If I switch from using the validated method to writing directly to the collection (with a schema attached to the collection that doesn't include the id), everything works as expected, so I'm assuming the issue is with the validate in my ValidatedMethod.
Any idea what I'm doing wrong?
Template: customer-edit.html
<template name="updateCustomerEdit">
  {{> quickForm
    collection="CustomerCompaniesGlobal"
    doc=someDoc
    id="updateCustomerEdit"
    type="method-update"
    meteormethod="CustomerCompanies.methods.update"
    singleMethodArgument=true
  }}
</template>
Template 'code behind': customer-edit.js
Template.updateCustomerEdit.helpers({
  someDoc() {
    const customerId = () => FlowRouter.getParam('_id');
    const instance = Template.instance();
    instance.subscribe('CustomerCompany.get', customerId());
    const company = CustomerCompanies.findOne({ _id: customerId() });
    return company;
  }
});
Update Validated Method:
// The update method
update = new ValidatedMethod({
  // register the name
  name: 'CustomerCompanies.methods.update',
  // register a method for validation, what's going on here?
  validate: new SimpleSchema({}).validator(),
  // the actual database updating part; validate has already been run at this point
  run(newCustomer) {
    console.log("method: update");
    return CustomerCompanies.update(newCustomer);
  }
});
Schema:
Schemas = {};

Schemas.CustomerCompaniesSchema = new SimpleSchema({
  name: {
    type: String,
    max: 100,
    optional: false
  },
  email: {
    type: String,
    max: 100,
    regEx: SimpleSchema.RegEx.Email,
    optional: true
  },
  postcode: {
    type: String,
    max: 10,
    optional: true
  },
  createdAt: {
    type: Date,
    optional: false
  }
});
Collection:
class customerCompanyCollection extends Mongo.Collection {};

// Make it available to the rest of the app
CustomerCompanies = new customerCompanyCollection("Companies");
CustomerCompaniesGlobal = CustomerCompanies;

// Deny all client-side updates since we will be using methods to manage this collection
CustomerCompanies.deny({
  insert() { return true; },
  update() { return true; },
  remove() { return true; }
});

// Define the expected Schema for data going into and coming out of the database
//CustomerCompanies.schema = Schemas.CustomerCompaniesSchema
// Bolt that schema onto the collection
CustomerCompanies.attachSchema(Schemas.CustomerCompaniesSchema);
I finally got to the bottom of this. The issue is that AutoForm passes in a composite object containing the id of the record to be changed and a modifier ($set) with the data, rather than just the data itself. The structure of that object is along the lines of:
{
  _id: '5TTbSkfzawwuHGLhy',
  modifier: {
    '$set': {
      name: 'Smiths Fabrication Ltd',
      email: 'info@smithsfab.com',
      postcode: 'OX10 4RT',
      createdAt: Wed Jan 27 2016 00:00:00 GMT+0000 (GMT Standard Time)
    }
  }
}
Once I figured that out, I changed my update method to this and everything then worked as expected:
// Autoform-specific update method that knows how to unpack the single
// object we get from autoform.
update = new ValidatedMethod({
  // register the name
  name: 'CustomerCompanies.methods.updateAutoForm',
  // register a method for validation.
  validate(autoformArgs) {
    console.log(autoformArgs);
    // Need to tell the schema that we are passing in a mongo modifier
    // rather than just the data.
    Schemas.CustomerCompaniesSchema.validate(autoformArgs.modifier, { modifier: true });
  },
  // the actual database updating part;
  // validate has already been run at this point
  run(autoformArgs) {
    return CustomerCompanies.update(autoformArgs._id, autoformArgs.modifier);
  }
});
Excellent. Your post helped me out when I was struggling to find any other information on the topic.
To build on your answer, if for some reason you want to get the form data as a single block you can use the following in AutoForm.
type="method" meteormethod="myValidatedMethodName"
Your validated method then might look something like this:
export const myValidatedMethodName = new ValidatedMethod({
  name: 'Users.methods.create',
  validate(insertDoc) {
    Schemas.NewUser.validate(insertDoc);
  },
  run(insertDoc) {
    return Collections.Users.createUser(insertDoc);
  }
});
NB: The Schema.validate() method then requires an Object, not the modifier as before.
I'm not sure whether either approach has a clear advantage in general.
The type="method-update" is obviously the way to go for updating documents, because you get the modifier. The type="method" seems to be the best way to go for creating a new document, and it would likely also be the best option in most cases where you're not intending to create a document from the form data.
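The difference between the two form types boils down to the payload shape. A sketch of the two shapes and a helper that extracts the plain document either way (the type names and `extractDoc` are hypothetical illustrations, not part of AutoForm's API):

```typescript
// type="method" passes the form document itself, while type="method-update"
// passes { _id, modifier: { $set: doc } } (the structure shown earlier).
type InsertPayload = Record<string, unknown>;
type UpdatePayload = { _id: string; modifier: { $set: Record<string, unknown> } };

// Pull the plain document out of either payload shape.
function extractDoc(payload: InsertPayload | UpdatePayload): Record<string, unknown> {
  if ("modifier" in payload && typeof payload.modifier === "object" && payload.modifier !== null) {
    return (payload as UpdatePayload).modifier.$set;
  }
  return payload;
}
```

A helper like this lets a single validated method accept either form type, at the cost of hiding which shape the client actually sent.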
