AWS SDK Dynamodb query operation on secondary index with encrypted data - amazon-dynamodb

I am trying to use DynamoDBMapper to query data using a GSI.
HashMap<String, AttributeValue> eav = new HashMap<>();
eav.put(":v1", new AttributeValue().withS(employee.getDepartment()));
eav.put(":v2", new AttributeValue().withS(employee.getContactId()));
DynamoDBQueryExpression<Employee> queryExpression =
new DynamoDBQueryExpression()
.withIndexName("DepartmentContactId-index")
.withKeyConditionExpression("Department = :v1 and contactId = :v2")
.withExpressionAttributeValues(eav)
.withConsistentRead(false);
List<Employee> items =
dynamoDBMapper.query(Employee.class, queryExpression);
I am getting a bad signature exception.
PS: one of the fields (columns) in the Employee table in DynamoDB is encrypted using AWS KMS. I have configured the same KMS key in the DynamoDB mapper but am still getting the same issue. Any pointers?
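For reference, the mapper-with-encryptor setup described above is typically wired roughly as in the sketch below (a sketch only, assuming the DynamoDB Encryption Client for Java; the key ARN and variable names are illustrative, not the exact code in use):
// Sketch only: a DynamoDBMapper backed by an AttributeEncryptor using a KMS key.
AmazonDynamoDB ddb = AmazonDynamoDBClientBuilder.standard().build();
AWSKMS kms = AWSKMSClientBuilder.standard().build();
DirectKmsMaterialProvider provider =
        new DirectKmsMaterialProvider(kms, "arn:aws:kms:region:account:key/EXAMPLE");
DynamoDBMapper dynamoDBMapper = new DynamoDBMapper(
        ddb, DynamoDBMapperConfig.DEFAULT,
        new AttributeEncryptor(DynamoDBEncryptor.getInstance(provider)));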
Mapper class -->
package com.test.model;
import com.amazonaws.services.dynamodbv2.datamodeling.*;
import com.amazonaws.services.dynamodbv2.datamodeling.encryption.DoNotEncrypt;
import static com.test.util.Constants.*;
@DynamoDBTable(tableName = "Employee")
public class Employee {
private String id;
private String department;
private String contactId;
private RulesData rulesData;
// Partition Key
@DynamoDBHashKey(attributeName = ID)
@DynamoDBAutoGeneratedKey
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
@DoNotEncrypt
@DynamoDBRangeKey(attributeName = DEPARTMENT)
public String getDepartment() {
return department;
}
public void setDepartment(String department) {
this.department = department;
}
@DoNotEncrypt
@DynamoDBAttribute(attributeName = CONTACT_ID)
public String getContactId() {
return contactId;
}
public void setContactId(String contactId) {
this.contactId = contactId;
}
@DynamoDBAttribute(attributeName = DATA)
public RulesData getRulesData() {
return rulesData;
}
public void setRulesData(RulesData rulesData) {
this.rulesData = rulesData;
}
}

If you set the projection type of the global secondary index (GSI) to anything other than ALL, the signature attribute will not be projected into the GSI.
Thus, if your query on the GSI only needs unencrypted fields, use a separate DynamoDBMapper configured without an AttributeEncryptor.
If you need encrypted fields too, set the projection type of the GSI to ALL.
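For the unencrypted-fields case, a minimal sketch of a plain mapper without the encryptor (the client construction and the plainMapper name are illustrative; queryExpression is the expression from the question):
// A mapper without an AttributeEncryptor performs no signature verification,
// so it can read a GSI that projects only unencrypted attributes.
AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard().build();
DynamoDBMapper plainMapper = new DynamoDBMapper(client);
List<Employee> items = plainMapper.query(Employee.class, queryExpression);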

Related

How to query DynamoDB using ONLY Partition Key [Java]?

I am new to DynamoDB and wanted to know how we can query a table in DynamoDB using ONLY the partition key in Java.
I have a table called "ervive-pdi-data-invalid-qa" and its schema is:
partition key is "SubmissionId"
Sort key is "Id".
City (Attribute)
Errors (Attribute)
The table looks like this: [table screenshot]
I want to retrieve the sort key value and the remaining attribute data by using only the partition key, with the new (software.amazon.awssdk) version of the AWS SDK DynamoDB classes.
Is this possible? If so, can anyone post an answer?
I have tried this:
DynamoDbClient ddb =
DynamoDbClient.builder().region(Region.US_EAST_1).build();
DynamoDbEnhancedClient enhancedClient =
DynamoDbEnhancedClient.builder()
.dynamoDbClient(ddb)
.build();
//Define table
DynamoDbTable<ErvivePdiDataInvalidQa> table =
enhancedClient.table("ervive-pdi-data-invalid-qa",
TableSchema.fromBean(ErvivePdiDataInvalidQa.class));
Key key = Key.builder().partitionValue(2023).build();
ErvivePdiDataInvalidQa result = table.getItem(r->r.key(key));
System.out.println("The record id is "+result.getId());
The ErvivePdiDataInvalidQa table class is shown further below.
This code returns "The provided key element does not match the schema (Service: DynamoDb, Status Code: 400, Request ID: PE1MKPMQ9MLT51OLJQVDCURQGBVV4KQNSO5AEMVJF66Q9ASUAAJG, Extended Request ID: null)"
The query you need is documented in one of the examples of the AWS DynamoDB Query API for Java.
AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard()
.withRegion(Regions.US_WEST_2).build();
DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("ervive-pdi-data-invalid-qa");
QuerySpec spec = new QuerySpec()
.withKeyConditionExpression("SubmissionId = :v_id")
.withValueMap(new ValueMap()
.withInt(":v_id", 2146));
ItemCollection<QueryOutcome> items = table.query(spec);
Iterator<Item> iterator = items.iterator();
Item item = null;
while (iterator.hasNext()) {
item = iterator.next();
System.out.println(item.toJSONPretty());
}
Note that a single Query operation can retrieve a maximum of 1 MB of data; see the documentation.
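Since the question asks for the software.amazon.awssdk (v2) classes, the equivalent query with the enhanced client might look like the sketch below (a sketch only, assuming the ErvivePdiDataInvalidQa bean posted further down; the partition value is illustrative):
// Query by partition key only; no sort key condition is required.
DynamoDbClient ddb = DynamoDbClient.builder().region(Region.US_EAST_1).build();
DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
        .dynamoDbClient(ddb)
        .build();
DynamoDbTable<ErvivePdiDataInvalidQa> table = enhancedClient.table(
        "ervive-pdi-data-invalid-qa", TableSchema.fromBean(ErvivePdiDataInvalidQa.class));

QueryConditional byPartition = QueryConditional
        .keyEqualTo(Key.builder().partitionValue(2146).build());

table.query(byPartition).items()
        .forEach(rec -> System.out.println("The record id is " + rec.getId()));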
I have been working with Padma on this issue. We first tried A. Khan's code but could not get past authentication with v1. Instead we got "WARNING: Your profile name includes a 'profile ' prefix. This is considered part of the profile name in the Java SDK, so you will need to include this prefix in your profile name when you reference this profile from your Java code."
Ultimately it could not get the credentials. Our credentials assume IAM roles in the .aws/config-i2 file. This works fine in v2 but not v1.
So then we tried v2 of the SDK and had no problems connecting, but we get NULL returned when trying to fetch all records from the table.
In all of the attempts below using v2 of the SDK, the table data returns NULL.
We created this table class
package data;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSortKey;
@DynamoDbBean
public class ErvivePdiDataInvalidQa {
private int submissionId;
private String id;
private String address1;
private String city;
private String dateOfBirth;
private String errors;
private String firstName;
private String firstNameNormalized;
private String gender;
private String lastName;
private String lastNameNormalized;
private String middleNameInitial;
private String postalCode;
private String rowNumber;
private String state;
private String submissionType;
@DynamoDbPartitionKey
public int getSubmissionId() {
return submissionId;
}
public void setSubmissionId(int submissionId) {
this.submissionId = submissionId;
}
@DynamoDbSortKey
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getAddress1() {
return address1;
}
public void setAddress1(String address1) {
this.address1 = address1;
}
public String getCity() {
return city;
}
public void setCity(String city) {
this.city = city;
}
public String getDateOfBirth() {
return dateOfBirth;
}
public void setDateOfBirth(String dateOfBirth) {
this.dateOfBirth = dateOfBirth;
}
public String getErrors() {
return errors;
}
public void setErrors(String errors) {
this.errors = errors;
}
public String getFirstName() {
return firstName;
}
public void setFirstName(String firstName) {
this.firstName = firstName;
}
public String getFirstNameNormalized() {
return firstNameNormalized;
}
public void setFirstNameNormalized(String firstNameNormalized) {
this.firstNameNormalized = firstNameNormalized;
}
public String getGender() {
return gender;
}
public void setGender(String gender) {
this.gender = gender;
}
public String getLastName() {
return lastName;
}
public void setLastName(String lastName) {
this.lastName = lastName;
}
public String getLastNameNormalized() {
return lastNameNormalized;
}
public void setLastNameNormalized(String lastNameNormalized) {
this.lastNameNormalized = lastNameNormalized;
}
public String getMiddleNameInitial() {
return middleNameInitial;
}
public void setMiddleNameInitial(String middleNameInitial) {
this.middleNameInitial = middleNameInitial;
}
public String getPostalCode() {
return postalCode;
}
public void setPostalCode(String postalCode) {
this.postalCode = postalCode;
}
public String getRowNumber() {
return rowNumber;
}
public void setRowNumber(String rowNumber) {
this.rowNumber = rowNumber;
}
public String getState() {
return state;
}
public void setState(String state) {
this.state = state;
}
public String getSubmissionType() {
return submissionType;
}
public void setSubmissionType(String submissionType) {
this.submissionType = submissionType;
}
}
DynamoDB code to get all records
//Connection
DynamoDbClient ddb = DynamoDbClient.builder().build();
DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
.dynamoDbClient(ddb)
.build();
//Define table
DynamoDbTable<ErvivePdiDataInvalidQa> table = enhancedClient.table("ervive-pdi-data-invalid-qa", TableSchema.fromBean(ErvivePdiDataInvalidQa.class));
//Get All Items from table - RETURNING NULL
Iterator<ErvivePdiDataInvalidQa> results = table.scan().items().iterator();
while (results.hasNext()) {
ErvivePdiDataInvalidQa rec = results.next();
System.out.println("The record id is "+rec.getId());
}
Also tried:
DynamoDB code to filter by SubmissionID
AttributeValue attr = AttributeValue.builder()
.n("1175")
.build();
// Build the attribute value used to filter on SubmissionId
Map<String, AttributeValue> myMap = new HashMap<>();
myMap.put(":val1", attr);
Map<String, String> myExMap = new HashMap<>();
myExMap.put("#sid", "SubmissionId");
// Set the Expression so only items with the given SubmissionId are returned
Expression expression = Expression.builder()
.expressionValues(myMap)
.expressionNames(myExMap)
.expression("#sid = :val1")
.build();
ScanEnhancedRequest enhancedRequest = ScanEnhancedRequest.builder()
.filterExpression(expression)
.limit(15)
.build();
// Scan the table and write out the Id value of each matching record
Iterator<ErvivePdiDataInvalidQa> results = table.scan(enhancedRequest).items().iterator();
while (results.hasNext()) {
ErvivePdiDataInvalidQa record = results.next();
System.out.println("The record id is " + record.getId());
}

Query with DynamoDB Secondary Index AWS SDK 2 Java exception creating DynamoDbIndex object

I'm having trouble running a query against a secondary index, getting an exception:
Ex getting dynamodb scan: java.lang.IllegalArgumentException: Attempt to execute an operation that requires a secondary index without defining the index attributes in the table metadata. Index name: category-timestamp-index
Can someone guide me on how I'm doing this wrong?
My table is idIT_RSS_Sources and I've created an index category-timestamp-index.
A screenshot of the index is attached.
My code is:
DynamoDbEnhancedClient enhancedClient = getEnhancedDBClient(region);
// Create a DynamoDbTable object
logger.debug("getting RSS Source category-timestamp-index");
//this throws the exception
DynamoDbIndex<RSS_Source> catIndex =
enhancedClient.table("idIT_RSS_Sources",
TableSchema.fromBean(RSS_Source.class))
.index("category-timestamp-index");
logger.debug("building query attributes");
AttributeValue att = AttributeValue.builder()
.s(theCategory)
.build();
Map<String, AttributeValue> expressionValues = new HashMap<>();
expressionValues.put(":value", att);
Expression expression = Expression.builder()
.expression("category = :value")
.expressionValues(expressionValues)
.build();
// Create a QueryConditional object that's used in the query operation
QueryConditional queryConditional = QueryConditional
.keyEqualTo(Key.builder().partitionValue(theCategory)
.build());
logger.debug("calling catIndex.query in getRSS...ForCategory");
Iterator<Page<RSS_Source>> dbFeedResults = (Iterator<Page<RSS_Source>>) catIndex.query(
QueryEnhancedRequest.builder()
.queryConditional(queryConditional)
.build());
Solved: I was not using the proper annotation in my model class:
@DynamoDbSecondaryPartitionKey(indexNames = { "category-index" })
public String getCategory() { return category; }
public void setCategory(String category) { this.category = category; }
Assume you have a model named Issues.
package com.example.dynamodb;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSecondaryPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSortKey;
@DynamoDbBean
public class Issues {
private String issueId;
private String title;
private String createDate;
private String description;
private String dueDate;
private String status;
private String priority;
private String lastUpdateDate;
@DynamoDbPartitionKey
public String getId() {
return this.issueId;
}
public void setId(String id) {
this.issueId = id;
}
@DynamoDbSortKey
public String getTitle() {
return this.title;
}
public void setTitle(String title) {
this.title = title;
}
public void setLastUpdateDate(String lastUpdateDate) {
this.lastUpdateDate = lastUpdateDate;
}
public String getLastUpdateDate() {
return this.lastUpdateDate;
}
public void setPriority(String priority) {
this.priority = priority;
}
public String getPriority() {
return this.priority;
}
public void setStatus(String status) {
this.status = status;
}
public String getStatus() {
return this.status;
}
public void setDueDate(String dueDate) {
this.dueDate = dueDate;
}
@DynamoDbSecondaryPartitionKey(indexNames = { "dueDateIndex" })
public String getDueDate() {
return this.dueDate;
}
public String getDate() {
return this.createDate;
}
public void setDate(String date) {
this.createDate = date;
}
public String getDescription() {
return this.description;
}
public void setDescription(String description) {
this.description = description;
}
}
Notice the annotation on getDueDate.
@DynamoDbSecondaryPartitionKey(indexNames = { "dueDateIndex" })
public String getDueDate() {
return this.dueDate;
}
This is because the Issues table has a secondary index named dueDateIndex.
To query on this secondary index, you can use this code that uses the Amazon DynamoDB Java API V2:
public static void queryIndex(DynamoDbClient ddb, String tableName, String indexName) {
try {
// Create a DynamoDbEnhancedClient and use the DynamoDbClient object
DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
.dynamoDbClient(ddb)
.build();
//Create a DynamoDbTable object based on Issues
DynamoDbTable<Issues> table = enhancedClient.table("Issues", TableSchema.fromBean(Issues.class));
String dateVal = "2013-11-19";
DynamoDbIndex<Issues> secIndex =
enhancedClient.table("Issues",
TableSchema.fromBean(Issues.class))
.index("dueDateIndex");
AttributeValue attVal = AttributeValue.builder()
.s(dateVal)
.build();
// Create a QueryConditional object that's used in the query operation
QueryConditional queryConditional = QueryConditional
.keyEqualTo(Key.builder().partitionValue(attVal)
.build());
// Get items in the Issues table
SdkIterable<Page<Issues>> results = secIndex.query(
QueryEnhancedRequest.builder()
.queryConditional(queryConditional)
.build());
// Print the title of every issue on every returned page
results.forEach(page -> {
for (Issues issue : page.items()) {
System.out.println("The issue title is " + issue.getTitle());
}
});
} catch (DynamoDbException e) {
System.err.println(e.getMessage());
System.exit(1);
}
}
For what it's worth, if your Global Secondary Index has a sort key, you must annotate that field in the DynamoDB bean with:
@DynamoDbSecondarySortKey(indexNames = { "<indexName>" })
public String getFieldName() {
return fieldName;
}
My working code is below ("sortKey-index" is the name of the GSI in DynamoDB):
List<Flow> flows = new ArrayList<>();
DynamoDbIndex<Flow> flowBySortKey = table().index("sortKey-index");
// Create a QueryConditional object that's used in the query operation
QueryConditional queryConditional = QueryConditional
.keyEqualTo(Key.builder()
.partitionValue(sortKey)
.build());
SdkIterable<Page<Flow>> dbFeedResults = flowBySortKey.query(
QueryEnhancedRequest.builder()
.queryConditional(queryConditional)
.build());
dbFeedResults.forEach(flowPage -> {
flows.addAll(flowPage.items());
});

Unable to get insert, update and delete to work with spring-data-jpa

I am trying to create a web service that performs basic CRUD operations, written using Spring Boot 2. The select operation works fine; however, the insert, delete and update operations have no effect, as their queries are not being generated and executed.
I have looked through different examples but am unable to figure out the issue. The major concern for me is that no query is even being triggered for the insert, delete or update operations.
Student Entity
@Entity
@Table(name = "student")
@JsonIgnoreProperties({"hibernateLazyInitializer", "handler"})
public class Student {
@Id
@NotNull
@Column(name = "id")
private int id;
@NotNull
@Column(name = "name")
private String name;
@Column(name = "course")
private String course;
public Student(int id, String name, String course) {
this.id = id;
this.name = name;
this.course = course;
}
public Student(){}
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getCourse() {
return course;
}
public void setCourse(String course) {
this.course = course;
}
@Override
public String toString() {
return "Student{ id=" + id + ", name='" + name + '\'' + ", course='" + course + '\'' + '}';
}
}
StudentDaoImpl
@Repository
@Transactional
public class StudentDaoImpl implements StudentDao {
@Autowired
private EntityManagerFactory entityManagerFactory;
@Override
public List<Student> fetchAllStudents() {
Session session = entityManagerFactory.unwrap(SessionFactory.class).openSession();
CriteriaBuilder cb = session.getCriteriaBuilder();
CriteriaQuery<Student> cq = cb.createQuery(Student.class);
Root<Student> root = cq.from(Student.class);
CriteriaQuery<Student> all = cq.select(root);
List<Student> solution = session.createQuery(all).getResultList();
session.close();
return solution;
}
@Override
public Student deleteStudent(Integer id) {
Session session = entityManagerFactory.unwrap(SessionFactory.class).openSession();
Student student = session.load(Student.class, id);
if (student != null){
session.delete(student);
session.close();
}
return student;
}
@Override
public Student fetchForId(Integer id){
Session session = entityManagerFactory.unwrap(SessionFactory.class).openSession();
Student student = session.load(Student.class, id);
session.close();
return student;
}
@Override
public Student insertStudent(Student student) {
Session session = entityManagerFactory.unwrap(SessionFactory.class).openSession();
session.save(student);
session.close();
return student;
}
@Override
public Student updateStudent(Student student) {
Session session = entityManagerFactory.unwrap(SessionFactory.class).openSession();
Student studentCheck = session.load(Student.class, student.getId());
if (studentCheck != null) {
session.saveOrUpdate(student);
session.close();
}
return student;
}
}
application.properties
spring.datasource.url=jdbc:mysql://localhost:3306/test
spring.datasource.username=root
spring.datasource.password=
spring.jpa.database = MYSQL
spring.jpa.show-sql = true
spring.jpa.hibernate.ddl-auto = update
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl
spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.MySQL5Dialect
Edit
Replacing EntityManagerFactory with EntityManager (plus the @PersistenceContext annotation) worked for me. However, I still haven't figured out why persistence worked with EntityManager.
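A minimal sketch of that EntityManager-based approach, assuming a container-managed persistence context (this illustrates the edit above; it is not the exact code used):
import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

@Repository
@Transactional
public class StudentDaoImpl implements StudentDao {

    // Container-managed EntityManager; operations join the surrounding transaction
    @PersistenceContext
    private EntityManager entityManager;

    @Override
    public List<Student> fetchAllStudents() {
        return entityManager
                .createQuery("select s from Student s", Student.class)
                .getResultList();
    }

    @Override
    public Student fetchForId(Integer id) {
        return entityManager.find(Student.class, id);
    }

    @Override
    public Student insertStudent(Student student) {
        entityManager.persist(student); // flushed when the transaction commits
        return student;
    }

    @Override
    public Student updateStudent(Student student) {
        return entityManager.merge(student);
    }

    @Override
    public Student deleteStudent(Integer id) {
        Student student = entityManager.find(Student.class, id);
        if (student != null) {
            entityManager.remove(student);
        }
        return student;
    }
}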
If sticking to the entity-based approach is not strictly important, you can also do it with a NativeQuery and its executeUpdate API:
String query = "insert into student values(1,?)";
em.createNativeQuery(query)
.setParameter(1, "Tom")
.executeUpdate();
I would suggest using this repository:
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
@Repository
public interface StudentRepository extends JpaRepository<Student, Integer> {
}
You will probably have to change the id of Student from int to Integer.
This repository already provides the methods for retrieving, updating, creating and deleting.
Let's say you want to use this repository in a service; you can do that like this (a sketch of possible method bodies follows the skeleton):
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Service
@Transactional(rollbackFor = Exception.class)
public class StudentService {
@Autowired
private StudentRepository studentRepository;
......
}
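For example, the "......" body could contain thin delegating methods built on the standard JpaRepository operations (a sketch; the method names simply mirror the original DAO):
public List<Student> fetchAllStudents() {
    return studentRepository.findAll();
}

public Student insertStudent(Student student) {
    return studentRepository.save(student); // save() inserts or updates
}

public void deleteStudent(Integer id) {
    studentRepository.deleteById(id);
}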

passing in sql.oracle.ARRAY SimpleJdbcCall to StoredProcedure

I am using JBoss 7.1.Final as the application server and Oracle as the database. We are using Spring Framework 3.x and Java 6. I am trying to pass in an array of Strings and convert it inside the stored procedure to an array of varchars. I haven't found a good example for this yet. Please provide a pointer to any documentation or previous forum post if you can; I have searched and not found one that seems to apply.
The stored proc is defined as:
CREATE OR REPLACE PROCEDURE GET_TEST_CONTENTS
(IN_RR_ARRAY IN RR_ARRAY,
IN_ORDER_STATE IN VARCHAR2,
OUT_FLAG OUT VARCHAR2,
OUT_RETURN_CODE OUT VARCHAR2,
OUT_RETURN_DESC OUT VARCHAR2,
OUT_RETURN_TYPE OUT VARCHAR2,
OUT_RETURN_VAL OUT NUMBER
)
Type RR_ARRAY is defined as:
create or replace
type RR_ARRAY as table of varchar2(15);
Within my Java code I have:
jdbcTemplate = new JdbcTemplate(dataSource);
jdbcTemplate.setResultsMapCaseInsensitive(true);
this.getTestContents = new SimpleJdbcCall(jdbcTemplate)
.withCatalogName("STAR")
.withoutProcedureColumnMetaDataAccess()
.withProcedureName("GET_TEST_CONTENTS")
.declareParameters(
new SqlParameter("IN_RR_ARRAY", OracleTypes.ARRAY,
"RR_ARRAY"),
new SqlParameter("IN_ORDER_STATE", OracleTypes.VARCHAR), new SqlOutParameter("OUT_FLAG",
OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_VAL", OracleTypes.INTEGER),
new SqlOutParameter("OUT_RETURN_CODE", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_DESC", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_TYPE", OracleTypes.VARCHAR));
//I get a different error here so creating new connection for testing
//conn = jdbcTemplate.getDataSource().getConnection();
Class.forName("oracle.jdbc.driver.OracleDriver").newInstance();
conn = DriverManager.getConnection(jdbcURL, user, passwd);
ArrayDescriptor desc = new ArrayDescriptor("STAR.RR_ARRAY", conn);
ARRAY arr = new ARRAY(desc, conn, testArray); // testArray is just
// String[] with 2 values
Map<String, Object> hm = new HashMap<String, Object>();
hm.put("IN_RR_ARRAY", arr);
hm.put("IN_ORDER_STATE", stateCode);
hm.put("OUT_FLAG", Types.VARCHAR);
hm.put("OUT_RETURN_CODE", Types.VARCHAR);
hm.put("OUT_RETURN_DESC", Types.VARCHAR);
hm.put("OUT_RETURN_TYPE", Types.VARCHAR);
SqlParameterSource in = new MapSqlParameterSource().addValues(hm);
Map out = getTestContents .execute(in);
The stack trace returned is:
11:24:43,691 ERROR [com.test.repository.TestContentsDao] (http-localhost-127.0.0.1-8080-1) Error while calling GET_TEST_CONTENTS Stored procedure: org.springframework.jdbc.UncategorizedSQLException: CallableStatementCallback; uncategorized SQLException for SQL [{call STAR.GET_TEST_CONTENTS(?, ?, ?, ?, ?, ?, ?)}]; SQL state [99999]; error code [17059]; Fail to convert to internal representation: oracle.sql.ARRAY@2a081f8f; nested exception is java.sql.SQLException: Fail to convert to internal representation: oracle.sql.ARRAY@2a081f8f
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:83) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:969) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.call(JdbcTemplate.java:1003) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.simple.AbstractJdbcCall.executeCallInternal(AbstractJdbcCall.java:388) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.simple.AbstractJdbcCall.doExecute(AbstractJdbcCall.java:351) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.simple.SimpleJdbcCall.execute(SimpleJdbcCall.java:181) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at com.test.repository.TestContentsDao.isGood(TestContentsDao.java:147) [classes:]
Any advice or pointers to examples or docs would be appreciated.
I found the fix for this. Now I use this List of Strings:
List<String> ndcList;
I changed the array parameter type from OracleTypes.ARRAY to java.sql.Types.ARRAY and specified the schema prefix on the array name. I also changed the code and created a few new convenience methods at the bottom.
I needed the wrapped connection and had to add this dependency to my pom:
<dependency>
<groupId>jboss</groupId>
<artifactId>jboss-common-jdbc-wrapper</artifactId>
<version>3.2.3</version>
</dependency>
---- method code starts here------------
jdbcTemplate = new JdbcTemplate(dataSource);
jdbcTemplate.setResultsMapCaseInsensitive(true);
this.getTestContents = new SimpleJdbcCall(jdbcTemplate)
.withCatalogName("STAR")
.withoutProcedureColumnMetaDataAccess()
.withProcedureName("GET_TEST_CONTENTS")
.declareParameters(
new SqlParameter("IN_RR_ARRAY", java.sql.types.ARRAY, "STAR.RR_ARRAY"),
new SqlParameter("IN_ORDER_STATE", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_FLAG", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_VAL", OracleTypes.INTEGER),
new SqlOutParameter("OUT_RETURN_CODE", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_DESC", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_TYPE", OracleTypes.VARCHAR));
Map<String, Object> hm = new HashMap<String, Object>();
hm.put("IN_RR_ARRAY", new ScriptArray(ndcList));
hm.put("IN_ORDER_STATE", stateCode);
hm.put("OUT_FLAG", Types.VARCHAR);
hm.put("OUT_RETURN_CODE", Types.VARCHAR);
hm.put("OUT_RETURN_DESC", Types.VARCHAR);
hm.put("OUT_RETURN_TYPE", Types.VARCHAR);
SqlParameterSource in = new MapSqlParameterSource().addValues(hm);
Map out = getTestContents.execute(in);
---- method code ends here------------
public class ScriptArray extends AbstractSqlTypeValue {
private List<String> values;
public ScriptArray(List<String> values) {
this.values = values;
}
public Object createTypeValue(Connection con, int sqlType,
String typeName) throws SQLException {
oracle.jdbc.OracleConnection wrappedConnection = con
.unwrap(oracle.jdbc.OracleConnection.class);
con = wrappedConnection;
ArrayDescriptor desc = new ArrayDescriptor(typeName, con);
return new ARRAY(desc, con,
(String[]) values.toArray(new String[values.size()]));
}
}
I had been fighting a similar issue for a day. This article helped me.
Here's a backup of the code, in case the page becomes unavailable:
-- custom type
create or replace TYPE "MY_TYPE"
as object(name varchar(255),
value varchar(255))
-- array of MY_TYPE
create or replace
TYPE "MY_ARRAY"
as table of MY_TYPE
-- echo like SP, doesn't do too much
create or replace
procedure foo(
i_array in MY_ARRAY,
o_array out MY_ARRAY)
as
begin
o_array := MY_ARRAY();
for i in 1 .. i_array.count loop
o_array.extend;
o_array(i) := MY_TYPE(i_array(i).name, i_array(i).value);
end loop;
end;
Java code:
public class FooStoredProcedure {
private static final String SP_NAME = "FOO";
private static final String MY_ARRAY = "MY_ARRAY";
private static final String MY_TYPE = "MY_TYPE";
private static final String I_ARRAY = "i_array";
private static final String O_ARRAY = "o_array";
private final StoredProcedure storedProcedure;
public FooStoredProcedure(DataSource dataSource) {
storedProcedure = new StoredProcedure(dataSource, SP_NAME) {
{
declareParameter(new SqlParameter(I_ARRAY, Types.ARRAY, MY_ARRAY));
declareParameter(new SqlOutParameter(O_ARRAY, Types.ARRAY, MY_ARRAY, new SqlReturnType() {
@Override
public Object getTypeValue(CallableStatement cs, int paramIndex,
int sqlType, String typeName) throws SQLException {
Connection connection = cs.getConnection();
Map<String, Class<?>> typeMap = connection.getTypeMap();
typeMap.put(MY_TYPE, MyType.class);
return cs.getObject(paramIndex);
}
}));
compile();
}
};
}
/**
* @return array of {@link MyType} objects or <code>null</code>
*/
public MyType[] execute(final MyType[] values) {
Map<String, Object> params = new HashMap<String, Object>();
params.put(I_ARRAY, new AbstractSqlTypeValue() {
@Override
protected Object createTypeValue(Connection con, int sqlType, String typeName) throws SQLException {
ArrayDescriptor descriptor = new ArrayDescriptor(typeName, con);
return new ARRAY(descriptor, con, values);
}
});
Map<?, ?> result = storedProcedure.execute(params);
if ((!result.containsKey(O_ARRAY) || result.get(O_ARRAY) == null)) {
return null;
}
try {
Object[] resultArray = (Object[]) ((ARRAY) result.get(O_ARRAY)).getArray();
return Arrays.copyOf(resultArray, resultArray.length, MyType[].class);
} catch (SQLException e) {
throw new DataRetrievalFailureException("Unable to retrieve array", e);
}
}
public static class MyType implements SQLData {
private String name;
private String value;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
@Override
public String getSQLTypeName() throws SQLException {
return MY_TYPE;
}
@Override
public void readSQL(SQLInput stream, String typeName) throws SQLException {
name = stream.readString();
value = stream.readString();
}
@Override
public void writeSQL(SQLOutput stream) throws SQLException {
stream.writeString(name);
stream.writeString(value);
}
@Override
public String toString() {
return ToStringBuilder.reflectionToString(this, ToStringStyle.SHORT_PREFIX_STYLE);
}
}
}
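A hedged usage sketch of the class above (the input values and the dataSource reference are illustrative):
// Call the echo-like FOO procedure with one element and read the echoed array back.
MyType input = new MyType();
input.setName("someName");
input.setValue("someValue");

FooStoredProcedure foo = new FooStoredProcedure(dataSource);
MyType[] echoed = foo.execute(new MyType[] { input });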
I searched the internet and it was very difficult to get this working with the solutions many people have provided. Here is a working code example. In pom.xml, add this dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-oracle</artifactId>
<version>2.0.0.M1</version>
</dependency>
Oracle sample code:
create table employee (EMPNO number(12) not null, FNAME varchar2(60), LNAME varchar2(60), EMAIL varchar2(120));
CREATE SEQUENCE empno_seq START WITH 1 INCREMENT BY 1 NOCACHE NOCYCLE;
CREATE OR REPLACE TYPE employee_type
AS OBJECT (EMPNO number(12), FNAME varchar2(60), LNAME varchar2(60), EMAIL varchar2(120));
/
CREATE OR REPLACE TYPE employee_table_type AS TABLE OF employee_type;
/
create or replace PROCEDURE SAVE_EMPLOYEES(p_emp_insert_array in employee_table_type) AS
BEGIN
FORALL i IN p_emp_insert_array.first .. p_emp_insert_array.last
insert into employee(
empno,
FNAME,
LNAME,
EMAIL)
values (
empno_seq.nextval,
p_emp_insert_array(i).FNAME,
p_emp_insert_array(i).LNAME,
p_emp_insert_array(i).EMAIL
);
END SAVE_EMPLOYEES;
/
import com.abc.employeepoc.domain.Employee;
import org.springframework.data.jdbc.support.oracle.StructMapper;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Struct;
/**
*
* @author rsharma
*/
public class EmployeeStructMapper implements StructMapper<Employee> {
@Override
public Struct toStruct(Employee emp, Connection conn, String oracleTypeName) throws SQLException {
Object[] attributes = {
emp.getEmpno(),
emp.getFirstName(),
emp.getLastName(),
emp.getEmailAddress()
};
return conn.createStruct(oracleTypeName, attributes);
}
@Override
public Employee fromStruct(Struct struct) throws SQLException {
Employee emp= new Employee();
Object[] attributes = struct.getAttributes();
emp.setEmpno(((Number) attributes[0]).longValue());
emp.setFirstName(String.valueOf(attributes[1]));
emp.setLastName(String.valueOf(attributes[2]));
emp.setEmailAddress(String.valueOf(attributes[3]));
return emp;
}
}
SqlStructArrayValue in Spring has an issue with the OracleConnection cast, so I created my own class similar to it.
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Struct;
import oracle.jdbc.OracleConnection;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.jdbc.support.oracle.SqlStructArrayValue;
import org.springframework.data.jdbc.support.oracle.StructMapper;
/**
*
* @author rsharma
*/
public class OracleSqlStructArrayValue<T> extends SqlStructArrayValue<T> {
private T[] values;
/**
* The object that will do the mapping *
*/
private StructMapper<T> mapper;
/**
* The type name of the STRUCT *
*/
private String structTypeName;
/**
* The type name of the ARRAY *
*/
private String arrayTypeName;
public OracleSqlStructArrayValue(T[] values, StructMapper<T> mapper, String structTypeName) {
super(values, mapper, structTypeName);
this.values = values;
this.mapper = mapper;
this.structTypeName = structTypeName;
}
public OracleSqlStructArrayValue(T[] values, StructMapper<T> mapper, String structTypeName, String arrayTypeName) {
super(values, mapper, structTypeName, arrayTypeName);
this.values = values;
this.mapper = mapper;
this.structTypeName = structTypeName;
this.arrayTypeName = arrayTypeName;
}
@Override
protected Object createTypeValue(Connection conn, int sqlType, String typeName) throws SQLException {
if (typeName == null && arrayTypeName == null) {
throw new InvalidDataAccessApiUsageException(
"The typeName for the array is null in this context. Consider setting the arrayTypeName.");
}
Struct[] structValues = new Struct[values.length];
for (int i = 0; i < values.length; i++) {
structValues[i] = mapper.toStruct(values[i], conn, structTypeName);
}
OracleConnection oracleConn = (OracleConnection) conn;
return oracleConn.createOracleArray(typeName != null ? typeName : arrayTypeName, structValues);
}
}
Now, in your DAO class, do the following:
public class EmployeeDAO {
private static final Logger logger = LoggerFactory.getLogger(EmployeeDAO.class);
@Autowired
private DataSource dataSource;
private JdbcTemplate jdbcTemplate;
private SimpleJdbcCall saveEmployeesArrayCall;
@PostConstruct
private void postConstruct() {
jdbcTemplate = new JdbcTemplate(dataSource);
this.saveEmployeesArrayCall =
new SimpleJdbcCall(dataSource).withProcedureName(SQLConstants.SAVE_EMPLOYEES_STORE_PROC)
.withoutProcedureColumnMetaDataAccess()
.declareParameters(new SqlParameter("p_emp_insert_array", Types.ARRAY, SQLConstants.EMPLOYEE_OBJ_TABLE_TYPE));
}
public void saveEmployees(List<Employee> employees) {
Map<String, Object> in = new HashMap<>();
in.put("p_emp_insert_array", new OracleSqlStructArrayValue<>(employees.toArray(new Employee[0]), new EmployeeStructMapper(), SQLConstants.EMPLOYEE_OBJ_TYPE));
saveEmployeesArrayCall.execute(in);
}
}
import io.swagger.annotations.ApiModelProperty;
import java.util.Objects;
import org.springframework.data.annotation.Id;
/**
*
* @author rsharma
*/
public class Employee implements java.io.Serializable{
@Id
@ApiModelProperty(notes = "The database generated Employee Number")
private Long empno;
@ApiModelProperty(notes = "First Name of the Employee", required = true)
private String firstName;
@ApiModelProperty(notes = "Last Name of the Employee")
private String lastName;
private String emailAddress;
public Employee() {
super();
}
public Employee(Long empno, String emailAddress, String firstName, String lastName) {
this.empno = empno;
this.emailAddress = emailAddress;
this.firstName = firstName;
this.lastName = lastName;
}
public Long getEmpno() {
return empno;
}
public void setEmpno(Long empno) {
this.empno = empno;
}
public String getFirstName() {
return firstName;
}
public void setFirstName(String firstName) {
this.firstName = firstName;
}
public String getLastName() {
return lastName;
}
public void setLastName(String lastName) {
this.lastName = lastName;
}
public String getEmailAddress() {
return emailAddress;
}
public void setEmailAddress(String emailAddress) {
this.emailAddress = emailAddress;
}
@Override
public String toString() {
return "Employee{" + "empno=" + empno + ", firstName=" + firstName + ", lastName=" + lastName + ", emailAddress=" + emailAddress + '}';
}
@Override
public int hashCode() {
int hash = 7;
hash = 59 * hash + Objects.hashCode(this.empno);
return hash;
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null) {
return false;
}
if (getClass() != obj.getClass()) {
return false;
}
final Employee other = (Employee) obj;
if (!Objects.equals(this.empno, other.empno)) {
return false;
}
return true;
}
}

How can I retrieve/store a result set to an ArrayList?

How do I use JdbcTemplate.query()/queryForList() to run a query with named parameters and store the result set in a List of User objects?
User Class:
public class User {
String name = null;
String id = null;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
}
Query:
SELECT name, id FROM USERS where email=:email
I'm looking for something like:
ArrayList<User> userList = jdbcTemplate.query(sql_query,
...some_mapper..., etc);
It seems the answer to this question is not available in one place on the Internet. Here's what I found out:
To collect the result set into a List, we can use the NamedParameterJdbcTemplate.query() method:
NamedParameterJdbcTemplate jdbcTemplate;
ArrayList<User> usersSearchResult = (ArrayList<User>) jdbcTemplate.query(
USER_LIST_TP_query,
namedParameters,
new RowMapperResultSetExtractor<User>(new UserRowMapper(), 20));
We also have to define a custom RowMapper (wrapped by the RowMapperResultSetExtractor above) so that each row of the result set can be converted to the type User.
private class UserRowMapper implements RowMapper<User> {
public User mapRow(ResultSet rs, int rowNum) throws SQLException {
User user = new User();
user.setId(rs.getString("ID"));
user.setName(rs.getString("NAME"));
return user;
}
}
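The namedParameters argument used above can be built with a MapSqlParameterSource. A minimal sketch, here using the simpler RowMapper overload of query() (the email value is illustrative):
SqlParameterSource namedParameters = new MapSqlParameterSource()
        .addValue("email", "user@example.com");

List<User> usersSearchResult = jdbcTemplate.query(
        "SELECT name, id FROM USERS where email=:email",
        namedParameters,
        new UserRowMapper());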
