I have a DynamoDB table with only a hash key. I am trying to insert a record using the dynamoDbMapper.save method. Although there is no mismatch in the key, I am receiving the following error:
The provided key element does not match the schema (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException
The table has only a hash key (id) and no sort key. There is also a global secondary index.
My DynamoDB data class is:
public class DynamoDbData {
    @DynamoDBHashKey(attributeName = "id")
    private String id;
    @DynamoDBIndexRangeKey(globalSecondaryIndexName = "my-index", attributeName = "myfield")
    private String myField;
    @DynamoDBAttribute(attributeName = "title")
    @DynamoDBIndexHashKey(globalSecondaryIndexName = "my-index", attributeName = "title")
    private String title;
}
and I am trying to save the object using dynamoDBMapper:
dynamoDBMapper.save(dynamoDbData);
Usually this error means there is a mismatch in the key. But here I have only a hash key in DynamoDB, and the object I am trying to save also has only a hash key with the same attribute name. What is wrong here? Why am I getting this error?
Do I need to use separate objects that have only the hash key or only the index?
Note: I can successfully save in tests using DynamoDBLocal.
Output of describe-table:
{
"Table": {
"AttributeDefinitions": [
{
"AttributeName": "id",
"AttributeType": "S"
},
{
"AttributeName": "myfield",
"AttributeType": "S"
},
{
"AttributeName": "title",
"AttributeType": "S"
}
],
"TableName": "myTable",
"KeySchema": [
{
"AttributeName": "id",
"KeyType": "HASH"
}
],
"TableStatus": "ACTIVE",
"CreationDateTime": "2022-11-15T14:44:24.068000+01:00",
"ProvisionedThroughput": {
"NumberOfDecreasesToday": 0,
"ReadCapacityUnits": 0,
"WriteCapacityUnits": 0
},
"TableSizeBytes": 8413,
"GlobalSecondaryIndexes": [
{
"IndexName": "my-index",
"KeySchema": [
{
"AttributeName": "title",
"KeyType": "HASH"
},
{
"AttributeName": "myfield",
"KeyType": "RANGE"
}
],
"Projection": {
"ProjectionType": "KEYS_ONLY"
},
"IndexStatus": "ACTIVE",
"ProvisionedThroughput": {
"NumberOfDecreasesToday": 0,
"ReadCapacityUnits": 0,
"WriteCapacityUnits": 0
}
}
]
}
}
Bean definition for the mapper:
@Bean
DynamoDBMapper dynamoDbMapper(AmazonDynamoDB amazonDynamoDB) {
    DynamoDBMapperConfig config = DynamoDBMapperConfig.builder()
            .withPaginationLoadingStrategy(DynamoDBMapperConfig.PaginationLoadingStrategy.EAGER_LOADING)
            .withConsistentReads(DynamoDBMapperConfig.ConsistentReads.CONSISTENT)
            .withTableNameOverride(DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(properties.getTablePrefix()))
            .build();
    return new DynamoDBMapper(amazonDynamoDB, config);
}
This is down to how Java and the DynamoDB mapper infer variable/attribute names. I've noticed this while using Lombok annotations for the getter/setter.
Although you have set your hash key annotation to @DynamoDBHashKey(attributeName = "id"), when using the mapper it will see id as Id, which means it is not matching your key element.
If you are using Lombok, then try using lowercase for id:
@Data
@DynamoDBTable(tableName = "Stack01")
public static class DynamoDbData {
    @DynamoDBHashKey(attributeName = "id")
    private String id;
    @DynamoDBAttribute(attributeName = "title")
    @DynamoDBIndexHashKey(globalSecondaryIndexName = "my-index", attributeName = "title")
    private String title;
    @DynamoDBIndexRangeKey(globalSecondaryIndexName = "my-index", attributeName = "age")
    private String age;
}
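With the lowercase field in place, saving works as usual. A minimal sketch, assuming the mapper bean from the question and made-up attribute values:
DynamoDbData data = new DynamoDbData();
data.setId("some-id");       // written as the "id" hash key attribute
data.setTitle("some title"); // GSI hash key "title"
data.setAge("42");           // GSI range key "age"
dynamoDBMapper.save(data);   // issues a PutItem keyed on "id"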
I want to translate my JSON values from one language to other languages, but with Azure Cognitive Services I can't achieve what I expect.
Plain text and HTML-based text translate fine through Azure Translator, but I don't know how to implement this for JSON values.
Check Azure Cognitive Services in the Azure portal. Create a Cognitive Services resource using a valid subscription and resource group, with a valid storage policy and public access.
Input JSON:
{
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "res"
}
]
}
Output:
{
"Consolidaciones": [
{
"authLevel": "anónimo",
"type": "httpTrigger",
"direction": "in",
"nombre": "req",
"métodos": [
"obtener",
"post"
]
},
{
"tipo": "http",
"dirección": "fuera",
"nombre": "res"
}
]
}
View Request:
[{
"text": "{
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "res"
}
]
}"
}]
View Response:
[{
"detectedLanguage": {
"language": "en",
"score": "1"
},
"translations": [{
"text": "{
"Consolidaciones": [
{
"authLevel": "anónimo",
"type": "httpTrigger",
"direction": "in",
"nombre": "req",
"métodos": [
"obtener",
"post"
]
},
{
"tipo": "http",
"dirección": "fuera",
"nombre": "res"
}
]
}",
"to": "es"
}]
}]
Create source and destination containers in the storage account (in the form of folders), upload the JSON file to be translated into the source container, and fill in the keys and URLs mentioned above in the code block below.
Code block in C#:
using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Text;

class Program
{
    static readonly string route = "/batches";
    private static readonly string endpoint = "https://<NAME-OF-YOUR-RESOURCE>.cognitiveservices.azure.com/translator/text/batch/v1.0";
    private static readonly string key = "<YOUR-KEY>";

    static readonly string json = ("{\"inputs\": [{\"source\": {\"sourceUrl\": \"https://YOUR-SOURCE-URL-WITH-READ-LIST-ACCESS-SAS\",\"storageSource\": \"AzureBlob\",\"language\": \"en\"}, \"targets\": [{\"targetUrl\": \"https://YOUR-TARGET-URL-WITH-WRITE-LIST-ACCESS-SAS\",\"storageSource\": \"AzureBlob\",\"category\": \"general\",\"language\": \"es\"}]}]}");

    static async Task Main(string[] args)
    {
        using HttpClient client = new HttpClient();
        using HttpRequestMessage request = new HttpRequestMessage();

        // Build and send the batch translation request.
        StringContent content = new StringContent(json, Encoding.UTF8, "application/json");
        request.Method = HttpMethod.Post;
        request.RequestUri = new Uri(endpoint + route);
        request.Headers.Add("Ocp-Apim-Subscription-Key", key);
        request.Content = content;

        HttpResponseMessage response = await client.SendAsync(request);
        string result = await response.Content.ReadAsStringAsync();
        if (response.IsSuccessStatusCode)
        {
            Console.WriteLine($"Status code: {response.StatusCode}");
            Console.WriteLine();
            Console.WriteLine("Response Headers:");
            Console.WriteLine(response.Headers);
        }
        else
        {
            Console.WriteLine($"Error: {result}");
        }
    }
}
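If the submit succeeds, the service responds with a job status URL that you can poll with the same key. A minimal sketch continuing at the end of Main above, assuming the status URL comes back in the Operation-Location response header (as documented for the batch Document Translation API; add using System.Linq; for First()):
// Poll the batch job status returned by the create request.
if (response.Headers.TryGetValues("Operation-Location", out var statusUrls))
{
    using HttpRequestMessage statusRequest = new HttpRequestMessage(HttpMethod.Get, statusUrls.First());
    statusRequest.Headers.Add("Ocp-Apim-Subscription-Key", key);
    HttpResponseMessage statusResponse = await client.SendAsync(statusRequest);
    Console.WriteLine(await statusResponse.Content.ReadAsStringAsync());
}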
Here's my Dynamoose schema for the seller table:
const schema = new dynamoose.Schema({
    PK: {
        type: String, // put email.toLowerCase() + #main/business/delivery/ehailing here
        hashKey: true,
    },
    SK: {
        type: String,
        rangeKey: true,
        "index": { // to automatically display shops when the user is at that location
            "name": "SKIndex",
            "global": true,
            "rangeKey": "location"
        }
    },
    "location": String,
}, {
    "saveUnknown": true,
    "timestamps": true
});
As you can see above, I created a GSI named SKIndex with SK as the hash key and location as the range key. So I tried to perform the query below:
var SKIndex_search = "some value"
var locality = "some value too"
var filter = new dynamoose.Condition().where("SKIndex").eq(SKIndex_search).filter("location").beginsWith(locality);
var getResult = await Seller.query(filter).exec()
but it will always return the error "InvalidParameter: Index can't be found for query."
==============
When running this query Seller.query(SKIndex_search).using("SKIndex").filter("location").beginsWith(locality).exec()
It will display the error message ValidationException: Query condition missed key schema element
Full error log:
aws:dynamodb:describeTable:response - {
"Table": {
"AttributeDefinitions": [
{
"AttributeName": "PK",
"AttributeType": "S"
},
{
"AttributeName": "SK",
"AttributeType": "S"
},
{
"AttributeName": "location",
"AttributeType": "S"
}
],
"TableName": "earthlings_seller",
"KeySchema": [
{
"AttributeName": "PK",
"KeyType": "HASH"
},
{
"AttributeName": "SK",
"KeyType": "RANGE"
}
],
"TableStatus": "ACTIVE",
"CreationDateTime": "2021-06-26T20:50:13.233Z",
"ProvisionedThroughput": {
"LastIncreaseDateTime": "1970-01-01T00:00:00.000Z",
"LastDecreaseDateTime": "1970-01-01T00:00:00.000Z",
"NumberOfDecreasesToday": 0,
"ReadCapacityUnits": 1,
"WriteCapacityUnits": 1
},
"TableSizeBytes": 312,
"ItemCount": 1,
"TableArn": "arn:aws:dynamodb:ddblocal:000000000000:table/earthlings_seller",
"GlobalSecondaryIndexes": [
{
"IndexName": "SKIndex",
"KeySchema": [
{
"AttributeName": "SK",
"KeyType": "HASH"
},
{
"AttributeName": "location",
"KeyType": "RANGE"
}
],
"Projection": {
"ProjectionType": "ALL"
},
"IndexStatus": "ACTIVE",
"ProvisionedThroughput": {
"ReadCapacityUnits": 1,
"WriteCapacityUnits": 1
},
"IndexSizeBytes": 312,
"ItemCount": 1,
"IndexArn": "arn:aws:dynamodb:ddblocal:000000000000:table/earthlings_seller/index/SKIndex"
}
]
}
}
aws:dynamodb:query:request - {
"ExpressionAttributeNames": {
"#qra": "location"
},
"ExpressionAttributeValues": {
":qrv": {
"S": "nilai"
}
},
"TableName": "earthlings_seller",
"IndexName": "SKIndex",
"KeyConditionExpression": "begins_with (#qra, :qrv)"
}
As described in the Dynamoose documentation, where takes in a key attribute. This key represents an attribute name (SK), not an index name (SKIndex).
Changing your code to the following should work.
new dynamoose.Condition().where("SK").eq(SKIndex_search).filter("location").beginsWith(locality);
You can also use the using function to manually set a specific index to run your query on. However, this is optional. Dynamoose will look through your indexes and pick the one that best matches the query you are making.
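For example, a sketch combining both (the condition keys on the SK attribute, and using pins the query to the SKIndex GSI; variables as in the question):
var filter = new dynamoose.Condition().where("SK").eq(SKIndex_search).filter("location").beginsWith(locality);
var getResult = await Seller.query(filter).using("SKIndex").exec();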
We are running a .NET Core 2.2 service using ServiceStack 5.7, and need to throttle it. So we want to put it behind an Azure API Management gateway (APIM) - the service runs in an Azure App Service.
We have enabled the OpenAPI feature using
self.Plugins.Add(new OpenApiFeature());
When we export our OpenAPI definition, we get the following:
"paths": {
...
"/api/search": {
"post": {
"tags": [
"api"
],
"operationId": "SearchRequestsearch_Post",
"consumes": [
"application/x-www-form-urlencoded"
],
"produces": [
"application/json"
],
"parameters": [
{
"name": "Filters",
"in": "formData",
"type": "array",
"items": {
"$ref": "#/definitions/FilterDto"
},
"collectionFormat": "multi",
"required": false
}
],
"responses": {
"200": {
"description": "Success",
"schema": {
"$ref": "#/definitions/SearchResponse"
}
}
},
"deprecated": false,
"security": [
{
"Bearer": []
}
]
},
"parameters": [
{
"$ref": "#/parameters/Accept"
}
]
}
}
...
"definitions": {
"FilterDto": {
"title": "FilterDto",
"properties": {
"Field": {
"description": "The field to filter on",
"type": "string",
"enum": [
"None",
"DestinationName",
"DocumentId"
]
},
"Values": {
type": "array",
"items": {
"type": "string"
}
},
"Type": {
"type": "string",
"enum": [
"Equals",
"NotEquals",
"RangeNumeric",
"RangeDate"
]
}
},
"description": "FilterDto",
"type": "object"
}
...
}
The problem is that a parameter containing an array of a complex type (defined in #/definitions/FilterDto) is not supported. In the Azure portal it fails with:
Parsing error(s): JSON is valid against no schemas from 'oneOf'. Path 'paths['/api/search'].post.parameters[1]', line 1, position 666.
Parsing error(s): The input OpenAPI file is not valid for the OpenAPI specification https://github.com/OAI/OpenAPI-Specification/blob/master/versions/2.0.md (schema https://github.com/OAI/OpenAPI-Specification/blob/master/schemas/v2.0/schema.json).
In C# (ServiceStack) we have defined the following:
public class SearchRequest : SearchRequestBase, IReturn<SearchResponse>
{
public SearchRequest()
{
Filters = new List<FilterDto>();
}
[ApiMember(Name = "Filters"]
public List<FilterDto> Filters { get; set; }
}
public class FilterDto
{
[ApiMember(Name = "Field"]
[ApiAllowableValues("Field", typeof(FilterAndFacetField))]
public FilterAndFacetField Field { get; set; }
[ApiMember(Name = "Values")]
public List<string> Values { get; set; }
[ApiMember(Name = "Type")]
[ApiAllowableValues("Type", typeof(FilterType))]
public FilterType Type { get; set; }
public FilterDto()
{
Values = new List<string>();
}
}
Has anyone successfully managed to import an OpenAPI definition that uses an array of $ref in the parameters from ServiceStack into API Management?
I have JSON that looks something like this:
{
"key": "Target",
"value": {
"__type": "Entity:http://schemas.microsoft.com/xrm/2011/Contracts",
"Attributes": [
{
"key": "prioritycode",
"value": {
"__type": "OptionSetValue:http://schemas.microsoft.com/xrm/2011/Contracts",
"Value": 1
}
},
{
"key": "completeinternalreview",
"value": false
},
{
"key": "stepname",
"value": "10-Lead"
},
{
"key": "createdby",
"value": {
"__type": "EntityReference:http://schemas.microsoft.com/xrm/2011/Contracts",
"Id": "ca2ead0c-8786-e511-80f9-3863bb347b18",
"KeyAttributes": [],
"LogicalName": "systemuser",
"Name": null,
"RowVersion": null
}
}
]
}
}
How do I get the token for a key/value pair by searching for the value of the key?
E.g. I want to get the key/value pair whose key is 'completeinternalreview'.
Assuming you have a C# class like this to represent that attributes object from your JSON:
public class MyValue
{
[JsonProperty("Attributes")]
public List<KeyValuePair<string, object>> Attributes { get; set; }
}
you can simply deserialize the string:
var result = JsonConvert.DeserializeObject<KeyValuePair<string, MyValue>>(jsonString);
and then find the correct key-value pair with:
var kvp = result.Value.Attributes.Find(a => a.Key == "completeinternalreview");
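If you'd rather not define classes, a LINQ-to-JSON sketch with Json.NET does the same lookup directly on the parsed document (assuming the JSON shown in the question; add using System.Linq; and using Newtonsoft.Json.Linq;):
var root = JObject.Parse(jsonString);
// Find the attribute object whose "key" property matches the name we want.
var attribute = root["value"]["Attributes"]
    .FirstOrDefault(a => (string)a["key"] == "completeinternalreview");
var value = attribute?["value"]; // false for the sample JSON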
I have downloaded the local version of Amazon DynamoDB. I am trying to create a table using the shell. When I run the code from the shell, it gives me an error:
"message":"The security token included in the request is invalid."
"code":"UnrecognizedClientException"
"time":"2017-04-27T12:50:35.880Z"
"statusCode":400
"retryable":false
The create code is:
var dynamodb = new AWS.DynamoDB();
var params = {
"AttributeDefinitions": [
{
"AttributeName": "UserId",
"AttributeType": "N"
},
{
"AttributeName": "FirstName",
"AttributeType": "S"
},
{
"AttributeName": "LastName",
"AttributeType": "S"
},
{
"AttributeName": "CellPhoneNumber",
"AttributeType": "N"
}
],
"TableName": "Users",
"KeySchema": [
{
"AttributeName": "UserId",
"KeyType": "HASH"
},
{
"AttributeName": "CellPhoneNumber",
"KeyType": "RANGE"
}
],
"LocalSecondaryIndexes": [
{
"IndexName": "UserIndex",
"KeySchema": [
{
"AttributeName": "UserId",
"KeyType": "HASH"
},
{
"AttributeName": "CellPhoneNumber",
"KeyType": "RANGE"
}
],
"Projection": {
"ProjectionType": "KEYS_ONLY"
}
}
],
"ProvisionedThroughput": {
"ReadCapacityUnits": 5,
"WriteCapacityUnits": 5
}
}
dynamodb.createTable(params, function(err, data) {
if (err) ppJson(err); // an error occurred
else ppJson(data); // successful response
});
How do I create a table in local DynamoDB? Do I need to create a database first? I am asking because I have always worked with SQL, and this is the first time I am using NoSQL.
No need to create a database. You just need to create the table.
Use the configuration below for local DynamoDB. The endpoint URL is important. The other attributes are dummy values (i.e. they can be anything).
var creds = new AWS.Credentials('akid', 'secret', 'session');
AWS.config.update({
region: "us-west-2",
endpoint: "http://localhost:8000",
credentials : creds
});
Also, there is no need to define all the attributes when creating the table. Only the key attributes need to be defined; otherwise, you will get an error.
Full code to create the table (should be executed at http://localhost:8000/shell/):
var dynamodb = new AWS.DynamoDB({
region: 'us-east-1',
endpoint: "http://localhost:8000"
});
var params = {
"AttributeDefinitions": [
{
"AttributeName": "UserId",
"AttributeType": "N"
},
{
"AttributeName": "CellPhoneNumber",
"AttributeType": "N"
}
],
"TableName": "PBUsers",
"KeySchema": [
{
"AttributeName": "UserId",
"KeyType": "HASH"
},
{
"AttributeName": "CellPhoneNumber",
"KeyType": "RANGE"
}
],
"LocalSecondaryIndexes": [
{
"IndexName": "UserIndex",
"KeySchema": [
{
"AttributeName": "UserId",
"KeyType": "HASH"
},
{
"AttributeName": "CellPhoneNumber",
"KeyType": "RANGE"
}
],
"Projection": {
"ProjectionType": "KEYS_ONLY"
}
}
],
"ProvisionedThroughput": {
"ReadCapacityUnits": 5,
"WriteCapacityUnits": 5
}
}
dynamodb.createTable(params, function(err, data) {
if (err) {
if (err.code === "ResourceInUseException" && err.message === "Cannot create preexisting table") {
console.log("message ====>" + err.message);
} else {
console.error("Unable to create table. Error JSON:", JSON.stringify(err, null, 2));
}
} else {
console.log("Created table. Table description JSON:", JSON.stringify(data, null, 2));
}
});
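The same call also works for a simpler table with only a hash key: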
var params = {
TableName: 'student',
KeySchema: [
{
AttributeName: 'sid',
KeyType: 'HASH',
},
],
AttributeDefinitions: [
{
AttributeName: 'sid',
AttributeType: 'N',
},
],
ProvisionedThroughput: {
ReadCapacityUnits: 10,
WriteCapacityUnits: 10,
},
};
dynamodb.createTable(params, function(err, data) {
if (err) ppJson(err); // an error occurred
else ppJson(data); // successful response
});
You also need to install the aws-amplify CLI locally before you can create a local DynamoDB table:
npm install -g @aws-amplify/cli
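Once the table is created, you can verify it against the local endpoint (a sketch assuming the AWS CLI is installed and configured with the same dummy credentials):
aws dynamodb list-tables --endpoint-url http://localhost:8000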