I am trying to run a simple update query, but I get an error when the map key I remove and the map key I set resolve to the same path. Is there a technical reason this is disallowed, or is there some kind of best practice that I am violating by trying to do this?
Error:
ValidationException: Invalid UpdateExpression: Two document paths overlap with each other;
must remove or rewrite one of these paths; path one: [questions, What is xx?], path two: [questions, What is xx?]
Query object:
{
TableName: 'notesTable',
Key: { topic: 'My tooic' },
ExpressionAttributeNames: { '#qq': 'What is xx?', '#updq': 'What is xx?' },
ExpressionAttributeValues: { ':updans': 'new answer' },
UpdateExpression: 'REMOVE questions.#qq SET questions.#updq = :updans'
}
There are multiple ways to deal with the scenario where the same key needs to be updated. Instead of removing and re-setting the same key, we can simply SET the key, which replaces the value anyway.
So the simple way is to send a different UpdateExpression in each case:
const qq = "What is xx2?";   // key to remove
const updq = "What is xx?";  // key to set
let expressionAttributeNames;
let UpdateExpression;
if (qq === updq) {
  // same key: a single SET replaces the value, no REMOVE needed
  expressionAttributeNames = { "#updq": updq };
  UpdateExpression = "SET questions.#updq = :updans";
} else {
  expressionAttributeNames = { "#qq": qq, "#updq": updq };
  UpdateExpression = "REMOVE questions.#qq SET questions.#updq = :updans";
}
docClient.update(
{
TableName: "test",
Key: {
id: "My tooic",
},
ExpressionAttributeNames: expressionAttributeNames,
ExpressionAttributeValues: { ":updans": "new answer1" },
UpdateExpression: UpdateExpression,
},
function (error, result) {
console.log("error", error, "result", result);
}
);
For example, I have two tables:
table1 {Id, Name, Description}
table2 {Id, Table1Id, Name, Amount}
With Bookshelf.js, when I use withRelated, something like:
new table1({Id: 1})
.fetchAll({
withRelated: ['Childs']})
.then(function(rows) {
callback(null, rows);
});
I expected my result to be something like:
{results: [{Id: '', Name: '', Description: '', Childs: [{Id: '', Name: '', Amount: 123}]}]}
I don't want to get Table1Id in the Childs list. How can I specify which columns appear in my output?
UPDATE
My models:
table1 = bookshelf.Model.extend({
tableName: 'table1',
Childs: function() {
return this.hasMany(table2, 'Table1Id');
}
});
table2 = bookshelf.Model.extend({
tableName: 'table2',
Parent: function() {
return this.belongsTo(table1);
}
});
If I don't select Table1Id:
new table1({Id: 1})
  .fetchAll({
    withRelated: [{
      Childs: function(qb) {
        qb.select('Id', 'Name', 'Amount');
      }
    }]
  })
  .then(function(rows) {
    callback(null, rows);
  });
then Childs[] comes back empty. It should be:
new table1({Id: 1})
  .fetchAll({
    withRelated: [{
      Childs: function(qb) {
        qb.select('Id', 'Table1Id', 'Name', 'Amount');
      }
    }]
  })
  .then(function(rows) {
    callback(null, rows);
  });
Well, here's the thing: this can be solved pretty easily, but you NEED to select the primary Id of the table in question, otherwise Bookshelf won't know how to tie the data together. The idea is that you get the query builder from Knex.js and use the select method (http://knexjs.org/#Builder-select).
Here's the solution for your case:
new table1({
Id: 1
})
.fetchAll({
withRelated: [{
'Childs': function(qb) {
//always select the primary Id of the table, otherwise there will be no relations between the tables
qb.select('Id', 'Name', 'Amount'); //Table1Id is omitted!
}
}]
})
.then(function(rows) {
callback(null, rows);
});
Let me know if this solves your problem.
In your bookshelf.js file, add the visibility plugin as below:
bookshelf.plugin('visibility');
In your table2 model, hide the unwanted field(s) as below:
table2 = bookshelf.Model.extend({
tableName: 'table2',
hidden: ['Table1Id'],
Parent: function() {
return this.belongsTo(table1);
}
});
You can learn more about the visibility plugin here:
https://github.com/tgriesser/bookshelf/wiki/Plugin:-Visibility
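For reference, a minimal bookshelf.js file with the plugin registered might look like the sketch below (the knex connection settings are placeholders, not taken from the question):
// bookshelf.js (sketch; the connection settings are placeholders)
var knex = require('knex')({
  client: 'mysql',
  connection: {
    host: '127.0.0.1',
    user: 'user',
    password: 'secret',
    database: 'mydb'
  }
});
var bookshelf = require('bookshelf')(knex);
// register the visibility plugin so the hidden/visible model options take effect
bookshelf.plugin('visibility');
module.exports = bookshelf;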
How do you "upsert" a property on a DynamoDB row? E.g. SET address.state = "MA" for some item, when address does not yet exist?
I feel like I'm having a chicken-and-egg problem because DynamoDB doesn't let you define a sloppy schema in advance.
If address DID already exist on that item, of type M (for Map), the internet tells me I could issue an UpdateExpression like:
SET #address.#state = :value
with #address, #state, and :value appropriately mapped to address, state, and MA, respectively.
But if the address property does not already exist, this gives an error:
ValidationException: The document path provided in the update expression is invalid for update
So.. it appears I either need to:
Figure out a way to "upsert" address.state (e.g., SET address = {}; SET address.state = 'MA' in a single command)
or
Issue three (!!!) round trips: try it; SET address = {} on failure; then try it again.
If the latter.... how do I set a blank map?!?
Ugh.. I like Dynamo, but unless I'm missing something obvious this is a bit crazy..
You can do it with two round trips: the first conditionally sets an empty map for address if it doesn't already exist (failing harmlessly with a ConditionalCheckFailedException if it does), and the second sets the state:
db.update({
UpdateExpression: 'SET #a = :value',
ConditionExpression: 'attribute_not_exists(#a)',
ExpressionAttributeValues: {
":value": {},
},
ExpressionAttributeNames: {
'#a': 'address'
}
}, ...);
Then:
db.update({
UpdateExpression: 'SET #a.#b = :v',
ExpressionAttributeNames: {
'#a': 'address',
'#b': 'state'
},
ExpressionAttributeValues: {
':v': 'whatever'
}
}, ...);
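A minimal sketch of the two calls combined, assuming the AWS SDK v2 DocumentClient (docClient) and a hypothetical table myTable with hash key id; the first call's ConditionalCheckFailedException is swallowed because it just means the map already existed:
async function upsertAddressState(id, state) {
  const baseParams = {
    TableName: 'myTable', // hypothetical table name
    Key: { id }           // hypothetical key schema
  };
  try {
    // round trip 1: create an empty address map only if it is missing
    await docClient.update({
      ...baseParams,
      UpdateExpression: 'SET #a = :empty',
      ConditionExpression: 'attribute_not_exists(#a)',
      ExpressionAttributeNames: { '#a': 'address' },
      ExpressionAttributeValues: { ':empty': {} }
    }).promise();
  } catch (err) {
    // the map already exists, which is fine; rethrow anything else
    if (err.code !== 'ConditionalCheckFailedException') throw err;
  }
  // round trip 2: the map now exists, so the nested SET succeeds
  await docClient.update({
    ...baseParams,
    UpdateExpression: 'SET #a.#s = :v',
    ExpressionAttributeNames: { '#a': 'address', '#s': 'state' },
    ExpressionAttributeValues: { ':v': state }
  }).promise();
}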
You cannot set nested attributes if the parent document does not exist. Since address does not exist you cannot set the attribute province inside it. You can achieve your goal if you set address to an empty map when you create the item. Then, you can use the following parameters to condition an update on an attribute address.province not existing yet.
var params = {
TableName: 'Image',
Key: {
Id: 'dynamodb.png'
},
UpdateExpression: 'SET address.province = :ma',
ConditionExpression: 'attribute_not_exists(address.province)',
ExpressionAttributeValues: {
':ma': 'MA'
},
ReturnValues: 'ALL_NEW'
};
docClient.update(params, function(err, data) {
if (err) ppJson(err); // an error occurred
else ppJson(data); // successful response
});
By the way, I had to replace state with province as state is a reserved word.
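If you would rather keep the attribute named state, you can alias the reserved word with ExpressionAttributeNames instead of renaming it; a sketch of the same update under that assumption:
var params = {
  TableName: 'Image',
  Key: {
    Id: 'dynamodb.png'
  },
  // #state is an alias, so the reserved word never appears literally in the expression
  UpdateExpression: 'SET address.#state = :ma',
  ConditionExpression: 'attribute_not_exists(address.#state)',
  ExpressionAttributeNames: {
    '#state': 'state'
  },
  ExpressionAttributeValues: {
    ':ma': 'MA'
  },
  ReturnValues: 'ALL_NEW'
};
docClient.update(params, function(err, data) {
  if (err) ppJson(err); // an error occurred
  else ppJson(data); // successful response
});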
Another totally different method is to simply create the address node when creating the parent document in the first place. For example assuming you have a hash key of id, you might do:
db.put({
Item: {
id: 42,
address: {}
}
}, ...);
This will allow you to simply set the address.state value as the address map already exists:
db.update({
UpdateExpression: 'SET #a.#b = :v',
ExpressionAttributeNames: {
'#a': 'address',
'#b': 'state'
},
ExpressionAttributeValues: {
':v': 'whatever'
}
}, ...);
Here is some Kotlin code to do this recursively, regardless of how deep the path goes. It sets the existence of the parent paths as a condition; if the condition check fails, it recursively creates those paths first. It has to be in the library's package so it can access the package-private fields/classes.
package com.amazonaws.services.dynamodbv2.xspec
import com.amazonaws.services.dynamodbv2.document.Table
import com.amazonaws.services.dynamodbv2.model.ConditionalCheckFailedException
import com.amazonaws.services.dynamodbv2.xspec.ExpressionSpecBuilder.attribute_exists
fun Table.updateItemByPaths(hashKeyName: String, hashKeyValue: Any, updateActions: List<UpdateAction>) {
    // collect the parent document path of every attribute being updated
    val parentPaths = updateActions.map { it.pathOperand.path.parent() }
            .filter { it.isNotEmpty() }
            .toSet() // to remove duplicates
    try {
        val builder = ExpressionSpecBuilder()
        updateActions.forEach { builder.addUpdate(it) }
        if (parentPaths.isNotEmpty()) {
            // start from an always-true condition, then require every parent path to exist
            var condition: Condition = ComparatorCondition("=", LiteralOperand(true), LiteralOperand(true))
            parentPaths.forEach { condition = condition.and(attribute_exists<Any>(it)) }
            builder.withCondition(condition)
        }
        this.updateItem(hashKeyName, hashKeyValue, builder.buildForUpdate())
    } catch (e: ConditionalCheckFailedException) {
        // a parent path is missing: create the missing parents as empty maps, then retry the original update
        this.updateItemByPaths(hashKeyName, hashKeyValue, parentPaths.map { M(it).set(mapOf<String, Any>()) })
        this.updateItemByPaths(hashKeyName, hashKeyValue, updateActions)
    }
}
private fun String.parent() = this.substringBeforeLast('.', "")
Here is a helper function I wrote in TypeScript that handles a single level of nesting using a recursive approach.
I refer to the top-level attribute as a column.
//usage
await setKeyInColumn('customerA', 'address', 'state', "MA")
// Updates a map value to hold a new key value pair. It will create a top-level address if it doesn't exist.
static async setKeyInColumn(primaryKeyValue: string, colName: string, key: string, value: any, _doNotCreateColumn?:boolean) {
const obj = {};
obj[key] = value; // creates a nested value like {address:value}
// Some conditions depending on whether the column already exists or not
const ConditionExpression = _doNotCreateColumn ? undefined:`attribute_not_exists(${colName})`
const AttributeValue = _doNotCreateColumn? value : obj;
const UpdateExpression = _doNotCreateColumn? `SET ${colName}.${key} = :keyval `: `SET ${colName} = :keyval ` ;
try{
const updateParams = {
TableName: TABLE_NAME,
Key: {key:primaryKeyValue},
UpdateExpression,
ExpressionAttributeValues: {
":keyval": AttributeValue
},
ConditionExpression,
ReturnValues: "ALL_NEW",
}
const resp = await docClient.update(updateParams).promise()
if (resp && resp.Attributes) {
  return resp.Attributes[colName];
}
}catch(ex){
//if the column already exists, then rerun and do not create it
if(ex.code === 'ConditionalCheckFailedException'){
return this.setKeyInColumn(primaryKeyValue,colName,key, value, true)
}
console.log("Failed to Update Column in DynamoDB")
console.log(ex);
return undefined
}
}
I've got quite a similar situation. I can think of only one way to do this in a single query/atomically:
Extract the map values to top-level attributes.
Example
Given I have this post item in DynamoDB:
{
"PK": "123",
"SK": "post",
"title": "Hello World!"
}
And I want to later add an analytics entry to the same partition:
{
"PK": "123",
"SK": "analytics#december",
"views": {
// <day of month>: <views>
"1": "12",
"2": "457463",
// etc
}
}
As in your case, it's not possible to increment/decrement the per-day views counters in a single query if the analytics item or the views map might not exist (it could be a later feature, or you may not want to put empty items).
Proposed solution:
{
"PK": "123",
"SK": "analytics#december",
// <day of month>: <views>
"1": "12", // or "day1" if "1" seems too generic
"2": "457463",
// etc
}
Then you could do something like this (increment +1 example):
{
  UpdateExpression: "SET #day = if_not_exists(#day, :zero) + :incr",
  ExpressionAttributeNames: {
    '#day': "1"
  },
  ExpressionAttributeValues: {
    ':zero': 0,
    ':incr': 1
  }
}
If the day attribute doesn't exist, if_not_exists gives it the default value 0.
If the item doesn't exist in the database, the update API adds a new one.
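For completeness, a sketch of the full call for that expression, assuming the DocumentClient and a hypothetical analytics table keyed by PK/SK as in the example above:
docClient.update({
  TableName: 'analytics', // assumed table name
  Key: { PK: '123', SK: 'analytics#december' },
  // create the counter at 0 if it is missing, then add 1
  UpdateExpression: 'SET #day = if_not_exists(#day, :zero) + :incr',
  ExpressionAttributeNames: { '#day': '1' },
  ExpressionAttributeValues: { ':zero': 0, ':incr': 1 },
  ReturnValues: 'UPDATED_NEW'
}, function(err, data) {
  if (err) console.log(err);
  else console.log(data);
});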
Given I have 3 types of collections and a dynamic value, how would I specify which collection to search based on that dynamic value?
E.g,
array = [
{id: 'one', type: 'profile'},
{id: 'something', type: 'post'},
{id: 'askjdaksj', type: 'comment'}
]
How would I isolate the type and turn it into a collection? Basically turning type into Collection.find
array[0].type.find({_id: id});
=> Profiles.find({_id: id});
Is this possible?
Here's a complete working example:
Posts = new Mongo.Collection('posts');
Comments = new Mongo.Collection('comments');
var capitalize = function(string) {
return string.charAt(0).toUpperCase() + string.slice(1);
};
var nameToCollection = function(name) {
// pluralize and capitalize name, then find it on the global object
// 'post' -> global['Posts'] (server)
// 'post' -> window['Posts'] (client)
return this[capitalize(name) + 's'];
};
Meteor.startup(function() {
// ensure all old documents are removed
Posts.remove({});
Comments.remove({});
// insert some new documents
var pid = Posts.insert({text: 'I am a post'});
var cid = Comments.insert({text: 'I am a comment'});
var items = [
{id: pid, type: 'post'},
{id: cid, type: 'comment'}
];
_.each(items, function(item) {
// find the collection object based on the type (name)
var collection = nameToCollection(item.type);
// retrieve the document from the dynamically found collection
var doc = collection.findOne(item.id);
console.log(doc);
});
});
Recommended reading: collections by reference.
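If you would rather not resolve collections through globals, an explicit lookup map gives the same dynamic behaviour; a small sketch (the map has to list every collection you want to be resolvable):
// explicit name -> collection map instead of a global lookup
var collectionsByType = {
  post: Posts,
  comment: Comments
};
function findByTypeAndId(type, id) {
  // resolve the collection from the item's type, then fetch the document
  return collectionsByType[type].findOne(id);
}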
I am able to add a row to the Kendo DataSource at runtime from a .js file, but I can't see it in the form (UI). I followed the steps below:
var vgrid = $("#grdEntitys").data("kendoGrid");
var datasource = vgrid.dataSource;
var newRecord = { No: "8164", ModellNo: "147", ID: "Test01", Name: "TEST" };
datasource.insert(newRecord);
Then it throws the error "TypeError: Cannot read property 'AttributeValue' of undefined".
If I look at the console log, I can see the incremented row count as well as the newly inserted record, but there is no change in the UI (the grid).
Could anyone please let me know how to add a row on the client side?
Thanks in advance
For insert you have to specify the index (Insert):
var dataItem = dataSource.insert(0, { name: "John Doe" });
Alternatively you could use add, where you don't have to specify the index:
<script>
var dataSource = new kendo.data.DataSource({
  data: [
    { name: "Jane Doe", age: 30 }
  ]
});
dataSource.add({ name: "John Doe", age: 33 });
</script>
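If the DataSource above is bound to a grid, the newly added row shows up automatically; a minimal sketch, assuming a placeholder #grid element:
<script>
// bind the DataSource above to a grid; rows added with dataSource.add() appear automatically
$("#grid").kendoGrid({
  dataSource: dataSource,
  columns: [
    { field: "name" },
    { field: "age" }
  ]
});
</script>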
You can use this script in your event to add an item to your grid.
var dataSource = $("#CustomerPackageChannelKendoGridAdd").data("kendoGrid").dataSource;
// Get value from another field
var _JV_ACCOUNT_ID = $('#JV_ACCOUNT_ID').val();
var _JV_ACCOUNT_NAME = $('#JV_ACCOUNT_NAME').val();
var _JV_ACCOUNT_CODE = $('#JV_ACCOUNT_CODE').val();
var _JV_NOTES = $('#JV_NOTES').val();
var _JV_DATE = $('#JV_DATE').val();
// assumed: the amounts are read the same way as the other fields
var _JV_DEBIT_AMOUNT = $('#JV_DEBIT_AMOUNT').val();
var _JV_CREDIT_AMOUNT = $('#JV_CREDIT_AMOUNT').val();
var type = $('#JV_Transaction_TYPE').val();
// You can add a condition here if required (CheckExistingData, gridDataAdd and currentId are defined elsewhere on the page)
if (CheckExistingData(gridDataAdd, _JV_ACCOUNT_ID) == false) {
currentId += 1;
dataSource.add(
{
id: currentId,
JV_ACCOUNT_ID: _JV_ACCOUNT_ID,
JV_ACCOUNT_NAME: _JV_ACCOUNT_NAME
, JV_ACCOUNT_CODE: _JV_ACCOUNT_CODE
, JV_NOTES: _JV_NOTES
, JV_DATE: _JV_DATE
, JV_DEBIT_AMOUNT: _JV_DEBIT_AMOUNT
, JV_CREDIT_AMOUNT: _JV_CREDIT_AMOUNT
});
}
For more you can also see this: