Passing in an oracle.sql.ARRAY with SimpleJdbcCall to a StoredProcedure - JdbcTemplate

I am using JBoss 7.1.Final as the application server and Oracle as the database. We are using Spring Framework 3.x and Java 6. I am trying to pass in an array of Strings and convert it inside the stored procedure to an array of VARCHAR2s. I haven't found a good example for this yet. Please provide a pointer if you can to any documentation or previous forum post; I have searched and not found one that seems to apply.
The stored proc is defined as:
CREATE OR REPLACE PROCEDURE GET_TEST_CONTENTS
(IN_RR_ARRAY IN RR_ARRAY,
IN_ORDER_STATE IN VARCHAR2,
OUT_FLAG OUT VARCHAR2,
OUT_RETURN_CODE OUT VARCHAR2,
OUT_RETURN_DESC OUT VARCHAR2,
OUT_RETURN_TYPE OUT VARCHAR2,
OUT_RETURN_VAL OUT NUMBER
)
Type RR_ARRAY is defined as:
create or replace
type RR_ARRAY as table of varchar2(15);
Within my java code I have:
jdbcTemplate = new JdbcTemplate(dataSource);
jdbcTemplate.setResultsMapCaseInsensitive(true);
this.getTestContents = new SimpleJdbcCall(jdbcTemplate)
.withCatalogName("STAR")
.withoutProcedureColumnMetaDataAccess()
.withProcedureName("GET_TEST_CONTENTS")
.declareParameters(
new SqlParameter("IN_RR_ARRAY", OracleTypes.ARRAY,
"RR_ARRAY"),
new SqlParameter("IN_ORDER_STATE", OracleTypes.VARCHAR), new SqlOutParameter("OUT_FLAG",
OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_VAL", OracleTypes.INTEGER),
new SqlOutParameter("OUT_RETURN_CODE", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_DESC", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_TYPE", OracleTypes.VARCHAR));
//I get a different error here so creating new connection for testing
//conn = jdbcTemplate.getDataSource().getConnection();
Class.forName("oracle.jdbc.driver.OracleDriver").newInstance();
conn = DriverManager.getConnection(jdbcURL, user, passwd);
ArrayDescriptor desc = new ArrayDescriptor("STAR.RR_ARRAY", conn);
ARRAY arr = new ARRAY(desc, conn, testArray); // testArray is just
// String[] with 2 values
Map<String, Object> hm = new HashMap<String, Object>();
hm.put("IN_RR_ARRAY", arr);
hm.put("IN_ORDER_STATE", stateCode);
hm.put("OUT_FLAG", Types.VARCHAR);
hm.put("OUT_RETURN_CODE", Types.VARCHAR);
hm.put("OUT_RETURN_DESC", Types.VARCHAR);
hm.put("OUT_RETURN_TYPE", Types.VARCHAR);
SqlParameterSource in = new MapSqlParameterSource().addValues(hm);
Map out = getTestContents.execute(in);
The stack trace returned is:
11:24:43,691 ERROR [com.test.repository.TestContentsDao] (http-localhost-127.0.0.1-8080-1) Error while calling GET_TEST_CONTENTS Stored procedure: org.springframework.jdbc.UncategorizedSQLException: CallableStatementCallback; uncategorized SQLException for SQL [{call STAR.GET_TEST_CONTENTS(?, ?, ?, ?, ?, ?, ?)}]; SQL state [99999]; error code [17059]; Fail to convert to internal representation: oracle.sql.ARRAY@2a081f8f; nested exception is java.sql.SQLException: Fail to convert to internal representation: oracle.sql.ARRAY@2a081f8f
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:83) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:969) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.call(JdbcTemplate.java:1003) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.simple.AbstractJdbcCall.executeCallInternal(AbstractJdbcCall.java:388) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.simple.AbstractJdbcCall.doExecute(AbstractJdbcCall.java:351) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.jdbc.core.simple.SimpleJdbcCall.execute(SimpleJdbcCall.java:181) [spring-jdbc-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at com.test.repository.TestContentsDao.isGood(TestContentsDao.java:147) [classes:]
Any advice or pointers to examples or docs will be appreciated.

I found the fix for this. Now I use this List of Strings:
List<String> ndcList;
I changed the array parameter from OracleTypes.ARRAY to java.sql.Types.ARRAY and specified the schema prefix on the array type name. I also changed the code and created a few new convenience methods at the bottom.
I needed the wrapped connection and had to add this dependency to my pom:
<dependency>
<groupId>jboss</groupId>
<artifactId>jboss-common-jdbc-wrapper</artifactId>
<version>3.2.3</version>
</dependency>
---- method code starts here------------
jdbcTemplate = new JdbcTemplate(dataSource);
jdbcTemplate.setResultsMapCaseInsensitive(true);
this.getTestContents = new SimpleJdbcCall(jdbcTemplate)
.withCatalogName("STAR")
.withoutProcedureColumnMetaDataAccess()
.withProcedureName("GET_TEST_CONTENTS")
.declareParameters(
new SqlParameter("IN_RR_ARRAY", java.sql.types.ARRAY, "STAR.RR_ARRAY"),
new SqlParameter("IN_ORDER_STATE", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_FLAG", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_VAL", OracleTypes.INTEGER),
new SqlOutParameter("OUT_RETURN_CODE", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_DESC", OracleTypes.VARCHAR),
new SqlOutParameter("OUT_RETURN_TYPE", OracleTypes.VARCHAR));
Map<String, Object> hm = new HashMap<String, Object>();
hm.put("IN_RR_ARRAY", new ScriptArray(ndcList));
hm.put("IN_ORDER_STATE", stateCode);
hm.put("OUT_FLAG", Types.VARCHAR);
hm.put("OUT_RETURN_CODE", Types.VARCHAR);
hm.put("OUT_RETURN_DESC", Types.VARCHAR);
hm.put("OUT_RETURN_TYPE", Types.VARCHAR);
SqlParameterSource in = new MapSqlParameterSource().addValues(hm);
Map out = getTestContents.execute(in);
---- method code ends here------------
public class ScriptArray extends AbstractSqlTypeValue {
private List<String> values;
public ScriptArray(List<String> values) {
this.values = values;
}
public Object createTypeValue(Connection con, int sqlType,
String typeName) throws SQLException {
oracle.jdbc.OracleConnection wrappedConnection = con
.unwrap(oracle.jdbc.OracleConnection.class);
con = wrappedConnection;
ArrayDescriptor desc = new ArrayDescriptor(typeName, con);
return new ARRAY(desc, con,
(String[]) values.toArray(new String[values.size()]));
}
}

I had been fighting a similar issue for a day. This article helped me.
Here's a backup of the code, in case the page becomes unavailable:
-- custom type
create or replace TYPE "MY_TYPE"
as object(name varchar(255),
value varchar(255))
-- array of MY_TYPE
create or replace
TYPE "MY_ARRAY"
as table of MY_TYPE
-- echo like SP, doesn't do too much
create or replace
procedure foo(
i_array in MY_ARRAY,
o_array out MY_ARRAY)
as
begin
o_array := MY_ARRAY();
for i in 1 .. i_array.count loop
o_array.extend;
o_array(i) := MY_TYPE(i_array(i).name, i_array(i).value);
end loop;
end;
Java code:
public class FooStoredProcedure {
private static final String SP_NAME = "FOO";
private static final String MY_ARRAY = "MY_ARRAY";
private static final String MY_TYPE = "MY_TYPE";
private static final String I_ARRAY = "i_array";
private static final String O_ARRAY = "o_array";
private final StoredProcedure storedProcedure;
public FooStoredProcedure(DataSource dataSource) {
storedProcedure = new StoredProcedure(dataSource, SP_NAME) {
{
declareParameter(new SqlParameter(I_ARRAY, Types.ARRAY, MY_ARRAY));
declareParameter(new SqlOutParameter(O_ARRAY, Types.ARRAY, MY_ARRAY, new SqlReturnType() {
@Override
public Object getTypeValue(CallableStatement cs, int paramIndex,
int sqlType, String typeName) throws SQLException {
Connection connection = cs.getConnection();
Map<String, Class<?>> typeMap = connection.getTypeMap();
typeMap.put(MY_TYPE, MyType.class);
return cs.getObject(paramIndex);
}
}));
compile();
}
};
}
/**
* @return array of {@link MyType} objects or <code>null</code>
*/
public MyType[] execute(final MyType[] values) {
Map<String, Object> params = new HashMap<String, Object>();
params.put(I_ARRAY, new AbstractSqlTypeValue() {
@Override
protected Object createTypeValue(Connection con, int sqlType, String typeName) throws SQLException {
ArrayDescriptor descriptor = new ArrayDescriptor(typeName, con);
return new ARRAY(descriptor, con, values);
}
});
Map<?, ?> result = storedProcedure.execute(params);
if ((!result.containsKey(O_ARRAY) || result.get(O_ARRAY) == null)) {
return null;
}
try {
Object[] resultArray = (Object[]) ((ARRAY) result.get(O_ARRAY)).getArray();
return Arrays.copyOf(resultArray, resultArray.length, MyType[].class);
} catch (SQLException e) {
throw new DataRetrievalFailureException("Unable to retrieve array", e);
}
}
public static class MyType implements SQLData {
private String name;
private String value;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
@Override
public String getSQLTypeName() throws SQLException {
return MY_TYPE;
}
@Override
public void readSQL(SQLInput stream, String typeName) throws SQLException {
name = stream.readString();
value = stream.readString();
}
@Override
public void writeSQL(SQLOutput stream) throws SQLException {
stream.writeString(name);
stream.writeString(value);
}
@Override
public String toString() {
return ToStringBuilder.reflectionToString(this, ToStringStyle.SHORT_PREFIX_STYLE);
}
}
}
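For reference, a hypothetical caller of the class above might look like this (dataSource is assumed to be a configured javax.sql.DataSource; the input values are made up):
FooStoredProcedure proc = new FooStoredProcedure(dataSource);
FooStoredProcedure.MyType in = new FooStoredProcedure.MyType();
in.setName("greeting");
in.setValue("hello");
// FOO echoes the input, so the returned array should mirror what was sent
FooStoredProcedure.MyType[] echoed = proc.execute(new FooStoredProcedure.MyType[] { in });
if (echoed != null) {
    for (FooStoredProcedure.MyType t : echoed) {
        System.out.println(t);
    }
}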

I looked around the internet and it was very difficult to get this working with the solutions many people have provided. Here is a working code example. In pom.xml add this dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-oracle</artifactId>
<version>2.0.0.M1</version>
</dependency>
Oracle sample code:
create table employee (EMPNO number(12) not null, FNAME varchar2(60), LNAME varchar2(60), EMAIL varchar2(120));
CREATE SEQUENCE empno_seq START WITH 1 INCREMENT BY 1 NOCACHE NOCYCLE;
CREATE OR REPLACE TYPE employee_type
AS OBJECT (EMPNO number(12), FNAME varchar2(60), LNAME varchar2(60), EMAIL varchar2(120));
/
CREATE OR REPLACE TYPE employee_table_type AS TABLE OF employee_type;
/
create or replace PROCEDURE SAVE_EMPLOYEES(p_emp_insert_array in employee_table_type) AS
BEGIN
FORALL i IN p_emp_insert_array.first .. p_emp_insert_array.last
insert into employee(
empno,
FNAME,
LNAME,
EMAIL)
values (
empno_seq.nextval,
p_emp_insert_array(i).FNAME,
p_emp_insert_array(i).LNAME,
p_emp_insert_array(i).EMAIL
);
END SAVE_EMPLOYEES;
/
import com.abc.employeepoc.domain.Employee;
import org.springframework.data.jdbc.support.oracle.StructMapper;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Struct;
/**
*
* @author rsharma
*/
public class EmployeeStructMapper implements StructMapper<Employee> {
@Override
public Struct toStruct(Employee emp, Connection conn, String oracleTypeName) throws SQLException {
Object[] attributes = {
emp.getEmpno(),
emp.getFirstName(),
emp.getLastName(),
emp.getEmailAddress()
};
return conn.createStruct(oracleTypeName, attributes);
}
@Override
public Employee fromStruct(Struct struct) throws SQLException {
Employee emp= new Employee();
Object[] attributes = struct.getAttributes();
emp.setEmpno(((Number) attributes[0]).longValue());
emp.setFirstName(String.valueOf(attributes[1]));
emp.setLastName(String.valueOf(attributes[2]));
emp.setEmailAddress(String.valueOf(attributes[3]));
return emp;
}
}
SqlStructArrayValue in Spring Data has an issue with the OracleConnection cast, so I created my own class similar to it.
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Struct;
import oracle.jdbc.OracleConnection;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.jdbc.support.oracle.SqlStructArrayValue;
import org.springframework.data.jdbc.support.oracle.StructMapper;
/**
*
* @author rsharma
*/
public class OracleSqlStructArrayValue<T> extends SqlStructArrayValue<T> {
private T[] values;
/**
* The object that will do the mapping *
*/
private StructMapper<T> mapper;
/**
* The type name of the STRUCT *
*/
private String structTypeName;
/**
* The type name of the ARRAY *
*/
private String arrayTypeName;
public OracleSqlStructArrayValue(T[] values, StructMapper<T> mapper, String structTypeName) {
super(values, mapper, structTypeName);
this.values = values;
this.mapper = mapper;
this.structTypeName = structTypeName;
}
public OracleSqlStructArrayValue(T[] values, StructMapper<T> mapper, String structTypeName, String arrayTypeName) {
super(values, mapper, structTypeName, arrayTypeName);
this.values = values;
this.mapper = mapper;
this.structTypeName = structTypeName;
this.arrayTypeName = arrayTypeName;
}
@Override
protected Object createTypeValue(Connection conn, int sqlType, String typeName) throws SQLException {
if (typeName == null && arrayTypeName == null) {
throw new InvalidDataAccessApiUsageException(
"The typeName for the array is null in this context. Consider setting the arrayTypeName.");
}
Struct[] structValues = new Struct[values.length];
for (int i = 0; i < values.length; i++) {
structValues[i] = mapper.toStruct(values[i], conn, structTypeName);
}
OracleConnection oracleConn = (OracleConnection) conn;
return oracleConn.createOracleArray(typeName != null ? typeName : arrayTypeName, structValues);
}
}
Now in your DAO Class do the following...
public class EmployeeDAO {
private static final Logger logger = LoggerFactory.getLogger(EmployeeDAO.class);
@Autowired
private DataSource dataSource;
private JdbcTemplate jdbcTemplate;
private SimpleJdbcCall saveEmployeesArrayCall;
@PostConstruct
private void postConstruct() {
jdbcTemplate = new JdbcTemplate(dataSource);
this.saveEmployeesArrayCall =
new SimpleJdbcCall(dataSource).withProcedureName(SQLConstants.SAVE_EMPLOYEES_STORE_PROC)
.withoutProcedureColumnMetaDataAccess()
.declareParameters(new SqlParameter("p_emp_insert_array", Types.ARRAY, SQLConstants.EMPLOYEE_OBJ_TABLE_TYPE));
}
public void saveEmployees(List<Employee> employees) {
Map<String, Object> in = new HashMap<>();
in.put("p_emp_insert_array", new OracleSqlStructArrayValue<>(employees.toArray(new Employee[0]), new EmployeeStructMapper(), SQLConstants.EMPLOYEE_OBJ_TYPE));
saveEmployeesArrayCall.execute(in);
}
}
import io.swagger.annotations.ApiModelProperty;
import java.util.Objects;
import org.springframework.data.annotation.Id;
/**
*
* @author rsharma
*/
public class Employee implements java.io.Serializable{
@Id
@ApiModelProperty(notes = "The database generated Employee Number")
private Long empno;
@ApiModelProperty(notes = "First Name of the Employee", required = true)
private String firstName;
@ApiModelProperty(notes = "Last Name of the Employee")
private String lastName;
private String emailAddress;
public Employee() {
super();
}
public Employee(Long empno, String emailAddress, String firstName, String lastName) {
this.empno = empno;
this.emailAddress = emailAddress;
this.firstName = firstName;
this.lastName = lastName;
}
public Long getEmpno() {
return empno;
}
public void setEmpno(Long empno) {
this.empno = empno;
}
public String getFirstName() {
return firstName;
}
public void setFirstName(String firstName) {
this.firstName = firstName;
}
public String getLastName() {
return lastName;
}
public void setLastName(String lastName) {
this.lastName = lastName;
}
public String getEmailAddress() {
return emailAddress;
}
public void setEmailAddress(String emailAddress) {
this.emailAddress = emailAddress;
}
@Override
public String toString() {
return "Employee{" + "empno=" + empno + ", firstName=" + firstName + ", lastName=" + lastName + ", emailAddress=" + emailAddress + '}';
}
@Override
public int hashCode() {
int hash = 7;
hash = 59 * hash + Objects.hashCode(this.empno);
return hash;
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null) {
return false;
}
if (getClass() != obj.getClass()) {
return false;
}
final Employee other = (Employee) obj;
if (!Objects.equals(this.empno, other.empno)) {
return false;
}
return true;
}
}
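For completeness, a hypothetical caller of the DAO above could look like this (it assumes EmployeeDAO is wired as a Spring bean; empno is left null because the stored procedure assigns it from empno_seq):
List<Employee> batch = Arrays.asList(
    new Employee(null, "jane.doe@example.com", "Jane", "Doe"),
    new Employee(null, "john.smith@example.com", "John", "Smith"));
employeeDAO.saveEmployees(batch);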

Related

Can we set the disable conversion value from config in @Converter

@Convert(converter = MsisdnEncryptor.class, disableConversion = true)
I have used this converter in my spring boot entity class.
package com.example.demo.entity;
This is the entity class:
@Entity
@Data
@AllArgsConstructor
@NoArgsConstructor
@Table(name="employee")
public class Employee {
@Id
@Column(name = "emp_id")
@SequenceGenerator(
name = "employee_sequence",
sequenceName = "employee_sequence",
allocationSize = 1
)
@GeneratedValue(strategy = GenerationType.SEQUENCE,
generator = "employee_sequence")
private Long EmpId;
@Column(name = "first_name")
private String firstName;
@Column(name = "last_name")
private String lastName;
@Column(name = "mobile_number")
@Convert(converter = MsisdnEncryptor.class, disableConversion = true)
// here I want to put values from application.properties in disableConversion
private String mobileNumber;
@Column(name= "date_time")
@JsonFormat(shape=JsonFormat.Shape.STRING, pattern="dd-MM-yyyy HH:mm:ss", timezone="GMT+5:30")
private Timestamp dateTime;
}
This is the Encryptor class:
@Component
public class MsisdnEncryptor implements AttributeConverter<String, String> {
private static final Logger logger = LoggerFactory.getLogger(MsisdnEncryptor.class);
@Value("${is.enabled:true}")
private boolean isEnabled;
private static final String AES = "AES";
private static final String SECRET = "secret-key-12345";
private final Key key;
private final Cipher cipher;
public MsisdnEncryptor() throws Exception {
key = new SecretKeySpec(SECRET.getBytes(), AES);
cipher = Cipher.getInstance(AES);
}
@Override
public String convertToDatabaseColumn(String attribute) {
try {
cipher.init(Cipher.ENCRYPT_MODE, key);
return Base64.getEncoder().encodeToString(cipher.doFinal(attribute.getBytes()));
} catch (IllegalBlockSizeException | BadPaddingException | InvalidKeyException e) {
throw new IllegalStateException(e);
}
}
@Override
public String convertToEntityAttribute(String dbData) {
try {
cipher.init(Cipher.DECRYPT_MODE, key);
return new String(cipher.doFinal(Base64.getDecoder().decode(dbData)));
} catch (InvalidKeyException | BadPaddingException | IllegalBlockSizeException e) {
throw new IllegalStateException(e);
}
}
}
I know we can use @Value for injecting the value, but it doesn't work here because it says the attribute value must be constant.
Or if there is any other way of doing the encryption and enabling/disabling it from config, I would be more than happy to go through it.
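Annotation attributes have to be compile-time constants, so disableConversion cannot be fed from application.properties directly. One workaround, sketched below under stated assumptions (the property name encryption.msisdn.enabled and the static-flag helper are inventions for illustration, not part of the question), is to leave the converter applied and short-circuit inside it based on a flag that Spring pushes into a static field at startup:
@Component
class MsisdnEncryptionSettings {
    // Made-up property name; defaults to true so behaviour is unchanged if it is absent
    MsisdnEncryptionSettings(@Value("${encryption.msisdn.enabled:true}") boolean enabled) {
        MsisdnEncryptor.setEnabled(enabled);
    }
}
// additions inside MsisdnEncryptor
private static volatile boolean enabled = true;
public static void setEnabled(boolean value) {
    enabled = value;
}
@Override
public String convertToDatabaseColumn(String attribute) {
    if (!enabled) {
        return attribute; // encryption disabled via config: store the value as-is
    }
    try {
        cipher.init(Cipher.ENCRYPT_MODE, key);
        return Base64.getEncoder().encodeToString(cipher.doFinal(attribute.getBytes()));
    } catch (IllegalBlockSizeException | BadPaddingException | InvalidKeyException e) {
        throw new IllegalStateException(e);
    }
}
The same guard would go into convertToEntityAttribute. A static field is used because, depending on the JPA provider configuration, the converter may be instantiated outside the Spring context, in which case the @Value field injection shown in the question never happens.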

How to query DynamoDB using ONLY Partition Key [Java]?

I am new to DynamoDB and want to know how we can query a table in DynamoDB using ONLY the partition key, in Java.
I have a table called "ervive-pdi-data-invalid-qa" and its schema is:
Partition key is "SubmissionId"
Sort key is "Id".
City (Attribute)
Errors (Attribute)
The table looks like this: [screenshot of the table]
I want to retrieve the sort key value and the remaining attribute data by using only the partition key, with the new version of the AWS SDK DynamoDB classes (software.amazon.awssdk).
Is it possible? If so, can anyone post an answer?
I have tried this:
DynamoDbClient ddb =
DynamoDbClient.builder().region(Region.US_EAST_1).build();
DynamoDbEnhancedClient enhancedClient =
DynamoDbEnhancedClient.builder()
.dynamoDbClient(ddb)
.build();
//Define table
DynamoDbTable<ErvivePdiDataInvalidQa> table =
enhancedClient.table("ervive-pdi-data-invalid-qa",
TableSchema.fromBean(ErvivePdiDataInvalidQa.class));
Key key = Key.builder().partitionValue(2023).build();
ErvivePdiDataInvalidQa result = table.getItem(r->r.key(key));
System.out.println("The record id is "+result.getId());
The ErvivePdiDataInvalidQa table class is shown further below.
It returns "The provided key element does not match the schema (Service: DynamoDb, Status Code: 400, Request ID: PE1MKPMQ9MLT51OLJQVDCURQGBVV4KQNSO5AEMVJF66Q9ASUAAJG, Extended Request ID: null)"
The query you need is documented in one of the examples of the AWS DynamoDB Query API for Java.
AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard()
.withRegion(Regions.US_WEST_2).build();
DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("ervive-pdi-data-invalid-qa");
QuerySpec spec = new QuerySpec()
.withKeyConditionExpression("SubmissionId = :v_id")
.withValueMap(new ValueMap()
.withInt(":v_id", 2146));
ItemCollection<QueryOutcome> items = table.query(spec);
Iterator<Item> iterator = items.iterator();
Item item = null;
while (iterator.hasNext()) {
item = iterator.next();
System.out.println(item.toJSONPretty());
}
A single Query operation can retrieve a maximum of 1 MB of data; see the documentation.
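Since the question targets the v2 SDK (software.amazon.awssdk), here is roughly what the same partition-key-only lookup looks like with the enhanced client. This is a sketch reusing the ErvivePdiDataInvalidQa bean and table name from the question; the point is that query, not getItem, is the operation that works when only the partition key is known:
DynamoDbClient ddb = DynamoDbClient.builder().region(Region.US_EAST_1).build();
DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
    .dynamoDbClient(ddb)
    .build();
DynamoDbTable<ErvivePdiDataInvalidQa> table =
    enhancedClient.table("ervive-pdi-data-invalid-qa", TableSchema.fromBean(ErvivePdiDataInvalidQa.class));
// Query by partition key only; every item with SubmissionId = 2146 is returned
QueryConditional queryConditional =
    QueryConditional.keyEqualTo(Key.builder().partitionValue(2146).build());
table.query(r -> r.queryConditional(queryConditional))
    .items()
    .forEach(item -> System.out.println("The record id is " + item.getId()));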
I have been working with Padma on this issue. We first tried A. Khan's code but could not get past authentication with v1. Instead we got "WARNING: Your profile name includes a 'profile ' prefix. This is considered part of the profile name in the Java SDK, so you will need to include this prefix in your profile name when you reference this profile from your Java code."
Ultimately it could not get the credentials. Our credentials assume IAM roles in the .aws/config-i2 file. This works fine in v2 but not in v1.
So then we tried v2 of the SDK and have no problems connecting, but we get NULL returned when trying to fetch all records from the table.
In all of the attempts below using v2 of the SDK, the table data returns NULL.
We created this table class
package data;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSortKey;
@DynamoDbBean
public class ErvivePdiDataInvalidQa {
private int submissionId;
private String id;
private String address1;
private String city;
private String dateOfBirth;
private String errors;
private String firstName;
private String firstNameNormalized;
private String gender;
private String lastName;
private String lastNameNormalized;
private String middleNameInitial;
private String postalCode;
private String rowNumber;
private String state;
private String submissionType;
@DynamoDbPartitionKey
public int getSubmissionId() {
return submissionId;
}
public void setSubmissionId(int submissionId) {
this.submissionId = submissionId;
}
@DynamoDbSortKey
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getAddress1() {
return address1;
}
public void setAddress1(String Address1) {
this.address1 = Address1;
}
public String getCity() {
return city;
}
public void setCity(String city) {
this.city = city;
}
public String getDateOfBirth() {
return dateOfBirth;
}
public void setDateOfBirth(String dateOfBirth) {
this.dateOfBirth = dateOfBirth;
}
public String getErrors() {
return errors;
}
public void setErrors(String errors) {
this.errors = errors;
}
public String getFirstName() {
return firstName;
}
public void setFirstName(String firstName) {
this.firstName = firstName;
}
public String getFirstNameNormalized() {
return firstNameNormalized;
}
public void setFirstNameNormalized(String firstNameNormalized) {
this.firstNameNormalized = firstNameNormalized;
}
public String getGender() {
return gender;
}
public void setGender(String gender) {
this.gender = gender;
}
public String getLastName() {
return lastName;
}
public void setLastName(String lastName) {
this.lastName = lastName;
}
public String getLastNameNormalized() {
return lastNameNormalized;
}
public void setLastNameNormalized(String lastNameNormalized) {
this.lastNameNormalized = lastNameNormalized;
}
public String getMiddleNameInitial() {
return middleNameInitial;
}
public void setMiddleNameInitial(String middleNameInitial) {
this.middleNameInitial = middleNameInitial;
}
public String getPostalCode() {
return postalCode;
}
public void setPostalCode(String postalCode) {
this.postalCode = postalCode;
}
public String getRowNumber() {
return rowNumber;
}
public void setRowNumber(String rowNumber) {
this.rowNumber = rowNumber;
}
public String getState() {
return state;
}
public void setState(String state) {
this.state = state;
}
public String getSubmissionType() {
return submissionType;
}
public void setSubmissionType(String submissionType) {
this.submissionType = submissionType;
}
}
DynamoDB code to get all records
//Connection
DynamoDbClient ddb = DynamoDbClient.builder().build();
DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
.dynamoDbClient(ddb)
.build();
//Define table
DynamoDbTable<ErvivePdiDataInvalidQa> table = enhancedClient.table("ervive-pdi-data-invalid-qa", TableSchema.fromBean(ErvivePdiDataInvalidQa.class));
//Get All Items from table - RETURNING NULL
Iterator<ErvivePdiDataInvalidQa> results = table.scan().items().iterator();
while (results.hasNext()) {
ErvivePdiDataInvalidQa rec = results.next();
System.out.println("The record id is "+rec.getId());
}
Also tried:
DynamoDB code to filter by SubmissionID
AttributeValue attr = AttributeValue.builder()
.n("1175")
.build();
// Build the expression value and name maps for the SubmissionId filter
Map<String, AttributeValue> myMap = new HashMap<>();
myMap.put(":val1", attr);
Map<String, String> myExMap = new HashMap<>();
myExMap.put("#sid", "SubmissionId");
// Set the expression so only items with SubmissionId = 1175 are returned
Expression expression = Expression.builder()
.expressionValues(myMap)
.expressionNames(myExMap)
.expression("#sid = :val1")
.build();
ScanEnhancedRequest enhancedRequest = ScanEnhancedRequest.builder()
.filterExpression(expression)
.limit(15)
.build();
// Scan the table with the filter and write out the Id value
Iterator<ErvivePdiDataInvalidQa> results = table.scan(enhancedRequest).items().iterator();
while (results.hasNext()) {
ErvivePdiDataInvalidQa record = results.next();
System.out.println("The record id is " + record.getId());
}
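One thing worth double-checking here (this is an assumption on my part, not a confirmed fix): the enhanced client derives DynamoDB attribute names from the bean getters, so getSubmissionId() maps to an attribute named submissionId while the table uses SubmissionId. When the names differ only in case like this, scans return items but the mapped fields stay empty. @DynamoDbAttribute pins the exact name:
// in ErvivePdiDataInvalidQa, assuming the table attributes are literally SubmissionId, Id, City, ...
@DynamoDbPartitionKey
@DynamoDbAttribute("SubmissionId")
public int getSubmissionId() {
    return submissionId;
}
@DynamoDbSortKey
@DynamoDbAttribute("Id")
public String getId() {
    return id;
}
The remaining getters would need the same treatment (City, Errors, and so on) if those attribute names also start with an upper-case letter in the table.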

Query with DynamoDB Secondary Index AWS SDK 2 Java exception creating DynamoDbIndex object

I'm having trouble running a query against a secondary index, getting an exception:
Ex getting dynamodb scan: java.lang.IllegalArgumentException: Attempt to execute an operation that requires a secondary index without defining the index attributes in the table metadata. Index name: category-timestamp-index
Can someone guide me on how I'm doing this wrong?
My table is idIT_RSS_Sources and I've created an index category-timestamp-index.
Screenshot of the index is attached.
My code is:
DynamoDbEnhancedClient enhancedClient = getEnhancedDBClient(region);
// Create a DynamoDbTable object
logger.debug("getting RSS Source category-timestamp-index");
//this throws the exception
DynamoDbIndex<RSS_Source> catIndex =
enhancedClient.table("idIT_RSS_Sources",
TableSchema.fromBean(RSS_Source.class))
.index("category-timestamp-index");
logger.debug("building query attributes");
AttributeValue att = AttributeValue.builder()
.s(theCategory)
.build();
Map<String, AttributeValue> expressionValues = new HashMap<>();
expressionValues.put(":value", att);
Expression expression = Expression.builder()
.expression("category = :value")
.expressionValues(expressionValues)
.build();
// Create a QueryConditional object that's used in the query operation
QueryConditional queryConditional = QueryConditional
.keyEqualTo(Key.builder().partitionValue(theCategory)
.build());
logger.debug("calling catIndex.query in getRSS...ForCategory");
Iterator<Page<RSS_Source>> dbFeedResults = (Iterator<Page<RSS_Source>>) catIndex.query(
QueryEnhancedRequest.builder()
.queryConditional(queryConditional)
.build());
Solved: I was not using the proper annotation in my model class:
@DynamoDbSecondaryPartitionKey(indexNames = { "category-index" })
public String getCategory() { return category; }
public void setCategory(String category) { this.category = category; }
Assume you have a model named Issues.
package com.example.dynamodb;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSecondaryPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSortKey;
@DynamoDbBean
public class Issues {
private String issueId;
private String title;
private String createDate;
private String description;
private String dueDate;
private String status;
private String priority;
private String lastUpdateDate;
@DynamoDbPartitionKey
public String getId() {
return this.issueId;
}
public void setId(String id) {
this.issueId = id;
}
@DynamoDbSortKey
public String getTitle() {
return this.title;
}
public void setTitle(String title) {
this.title = title;
}
public void setLastUpdateDate(String lastUpdateDate) {
this.lastUpdateDate = lastUpdateDate;
}
public String getLastUpdateDate() {
return this.lastUpdateDate;
}
public void setPriority(String priority) {
this.priority = priority;
}
public String getPriority() {
return this.priority;
}
public void setStatus(String status) {
this.status = status;
}
public String getStatus() {
return this.status;
}
public void setDueDate(String dueDate) {
this.dueDate = dueDate;
}
@DynamoDbSecondaryPartitionKey(indexNames = { "dueDateIndex" })
public String getDueDate() {
return this.dueDate;
}
public String getDate() {
return this.createDate;
}
public void setDate(String date) {
this.createDate = date;
}
public String getDescription() {
return this.description;
}
public void setDescription(String description) {
this.description = description;
}
}
Notice the annotation on getDueDate.
@DynamoDbSecondaryPartitionKey(indexNames = { "dueDateIndex" })
public String getDueDate() {
return this.dueDate;
}
This is because the Issues table has a secondary index named dueDateIndex.
To query on this secondary index, you can use this code that uses the Amazon DynamoDB Java API V2:
public static void queryIndex(DynamoDbClient ddb, String tableName, String indexName) {
try {
// Create a DynamoDbEnhancedClient and use the DynamoDbClient object
DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
.dynamoDbClient(ddb)
.build();
//Create a DynamoDbTable object based on Issues
DynamoDbTable<Issues> table = enhancedClient.table("Issues", TableSchema.fromBean(Issues.class));
String dateVal = "2013-11-19";
DynamoDbIndex<Issues> secIndex =
enhancedClient.table("Issues",
TableSchema.fromBean(Issues.class))
.index("dueDateIndex");
AttributeValue attVal = AttributeValue.builder()
.s(dateVal)
.build();
// Create a QueryConditional object that's used in the query operation
QueryConditional queryConditional = QueryConditional
.keyEqualTo(Key.builder().partitionValue(attVal)
.build());
// Get items in the Issues table
SdkIterable<Page<Issues>> results = secIndex.query(
QueryEnhancedRequest.builder()
.queryConditional(queryConditional)
.build());
// Print every matching item on every returned page
results.forEach(page -> {
for (Issues issue : page.items()) {
System.out.println("The issue title is " + issue.getTitle());
}
});
} catch (DynamoDbException e) {
System.err.println(e.getMessage());
System.exit(1);
}
}
For what it's worth, if your Global Secondary Index has a sort key, you must annotate that field in the DynamoDB bean with:
@DynamoDbSecondarySortKey(indexNames = { "<indexName>" })
public String getFieldName() {
return fieldName;
}
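If you then want to constrain on that sort key in the query itself, the enhanced client lets you build the key with both values, for example (a sketch; the partition value and sort prefix here are made up):
QueryConditional queryConditional = QueryConditional.sortBeginsWith(
    Key.builder()
        .partitionValue("open")
        .sortValue("2023-")
        .build());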
My working code is as below.
sortKey-index is the name of the GSI in DynamoDB:
List<Flow> flows = new ArrayList<>();
DynamoDbIndex<Flow> flowBySortKey = table().index("sortKey-index");
// Create a QueryConditional object that's used in the query operation
QueryConditional queryConditional = QueryConditional
.keyEqualTo(Key.builder()
.partitionValue(sortKey)
.build());
SdkIterable<Page<Flow>> dbFeedResults = flowBySortKey.query(
QueryEnhancedRequest.builder()
.queryConditional(queryConditional)
.build());
dbFeedResults.forEach(flowPage -> {
flows.addAll(flowPage.items());
});

Retrieving data from a SQLite database and displaying it on the ListView

I have already searched the internet and I don't understand what I need to do to display data from the database in a ListView. There are tutorials but I don't quite get them.
Here is my database handler code
package com.example.databasetest;
import android.content.ContentValues;
import android.content.Context;
import android.database.SQLException;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import android.util.Log;
import android.widget.Toast;
public class DBHandler {
public static final String TABLE_NAME = "tableKo";
public static final String DATABASE_NAME = "databaseKo";
private static final int DATABASE_VERSION = 1;
private static final String TAG = "DBHandler";
public static final String COL_ID = "_id";
public static final String COL_NAME = "name";
public static final String COL_ADDRESS = "address";
public static final String COL_PHONE = "phone";
public static final String COL_EMAIL = "email";
private final Context context;
private SQLiteDatabase db;
private MySQLiteOpenHelper DBHelper;
private String[] data;
private static final String CREATE_DATABASE ="create table "
+ TABLE_NAME + "(" + COL_ID
+ " integer primary key, " + COL_NAME
+ " text not null, " + COL_ADDRESS + " text not null,"
+ COL_PHONE + " text not null," + COL_EMAIL + " text not null);";
public DBHandler(Context ctx) {
this.context = ctx;
DBHelper = new MySQLiteOpenHelper(context);
}
private static class MySQLiteOpenHelper extends SQLiteOpenHelper{
public MySQLiteOpenHelper(Context context) {
super(context, DATABASE_NAME, null, DATABASE_VERSION);
}
@Override
public void onCreate(SQLiteDatabase db) {
// TODO Auto-generated method stub
try {
db.execSQL(CREATE_DATABASE);
} catch (SQLException e) {
e.printStackTrace();
}
}
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
// TODO Auto-generated method stub
Log.w(TAG, oldVersion + " to " + newVersion
+ ", which will destroy all old data");
db.execSQL("DROP TABLE IF EXISTS " + TABLE_NAME);
onCreate(db);
}
}
public DBHandler open() throws SQLException {
db = DBHelper.getWritableDatabase();
return this;
}
public void close() {
DBHelper.close();
}
public void insertData (String name, String address, String phone, String email) {
open();
ContentValues values = new ContentValues();
values.put(COL_NAME, name);
values.put(COL_ADDRESS, address);
values.put(COL_PHONE, phone);
values.put(COL_EMAIL, email);
db.insert(TABLE_NAME, null, values);
//db.execSQL("Insert into " +TABLE_NAME+ " VALUES('"+COL_ID+"','"+name+"','"+address+"','"+phone+"','"+email+"');");
db.close();
}
// here I want to make a method that returns something (I don't know what data type) that will fit the ListView
public void getData() {
DBHelper.getReadableDatabase();
}
}
I want to make a method that will return something that will fit the ListView. Should I return an ArrayAdapter or just a simple String array? Also, if it is an ArrayAdapter, I don't know what type parameter to give it (for clarification, in ArrayAdapter<...>, what exactly should I put there: the activity that will use it, or String?). What should the method look like?
I will really appreciate your help.
This solution works for me. You need to change the public void getData() method to return an ArrayList (or an ArrayAdapter) like this:
public ArrayList<String> getData() {
ArrayList<String> values = new ArrayList<String>();
String columns[] = new String[] { COL_NAME }; //considering you wanna return name
Cursor c = db.query(TABLE_NAME, columns, null, null, null, null,
null);
String result;
int iName = c.getColumnIndex(COL_NAME);
for (c.moveToFirst(); !c.isAfterLast(); c.moveToNext()) {
result = c.getString(iName);
values.add(result);
}
c.close(); // release the cursor once the values have been copied
return values;
}
Now in your main class where you wanna display the listview, put in the following code:
private DBHandler dbHandler;
@Override
protected void onCreate(Bundle savedInstanceState) {
// TODO Auto-generated method stub
super.onCreate(savedInstanceState);
dbHandler = new DBHandler(Classname.this);
dbHandler.open();
ArrayList<String> data = dbHandler.getData();
final ListView listView = getListView();
setListAdapter(new ArrayAdapter<String>(Classname.this,
android.R.layout.simple_list_item_1, data));
}
This should display the names in your ListView. I haven't used an XML layout to define the ListView; it is defined directly in Java. You can do it as required.
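If you want to show more than one column per row, an alternative (a sketch, assuming the same DBHandler and an Activity that owns the ListView) is to return the Cursor itself and bind it with a SimpleCursorAdapter; this works because the table already has the _id column (COL_ID) that CursorAdapter requires:
// in DBHandler: hand the whole result set to the adapter
public Cursor getAllContacts() {
    open();
    return db.query(TABLE_NAME,
        new String[] { COL_ID, COL_NAME, COL_PHONE },
        null, null, null, null, null);
}
// in the Activity:
Cursor cursor = dbHandler.getAllContacts();
SimpleCursorAdapter adapter = new SimpleCursorAdapter(this,
    android.R.layout.simple_list_item_2, cursor,
    new String[] { DBHandler.COL_NAME, DBHandler.COL_PHONE },
    new int[] { android.R.id.text1, android.R.id.text2 }, 0);
listView.setAdapter(adapter);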

Persisting interfaces using JDO/Datanucleus

I have the following class:
@PersistenceCapable(identityType = IdentityType.APPLICATION, detachable = "true")
public class TclRequest implements Comparable<TclRequest> {
#PrimaryKey
private String id;
@Persistent(types = { DNSTestData.class, POP3TestData.class, PPPoETestData.class, RADIUSTestData.class }, defaultFetchGroup = "true")
@Columns({ @Column(name = "dnstestdata_fk"), @Column(name = "pop3testdata_fk"), @Column(name = "pppoetestdata_fk"), @Column(name = "radiustestdata_fk") })
private TestData testData;
public String getId() {
return id;
}
public TestData getTestData() {
return testData;
}
public void setId(String id) {
this.id = id;
}
public void setTestData(TestData testData) {
this.testData = testData;
}
}
The TestData interface looks like this:
@PersistenceCapable(detachable = "true")
public interface TestData {
@PrimaryKey
public String getId();
public void setId(String id);
}
This interface is implemented by many classes, including this one:
@PersistenceCapable(detachable = "true")
public class RADIUSTestData implements TestData {
@PrimaryKey
private String id;
private String password;
private String username;
public RADIUSTestData() {
}
public RADIUSTestData(String password, String username) {
super();
this.password = password;
this.username = username;
}
@Override
public String getId() {
return id;
}
@Override
public void setId(String id) {
this.id = id;
}
}
When I try to persist the TclRequest object (after constructing it, of course, and using RADIUSTestData):
//'o' is the constructed TclRequest object.
PersistenceManager pm = null;
Transaction t = null;
try {
pm = getPM();
t = pm.currentTransaction();
t.begin();
pm.makePersistent(o);
t.commit();
} catch (Exception e) {
e.printStackTrace();
if (t != null && t.isActive()) {
t.rollback();
}
} finally {
closePM(pm);
}
The interface field isn't persisted, and the columns are not created in the table! I enabled debug logging and found two notable messages:
1)
-Class com.skycomm.cth.beans.ixload.radius.TestData specified to use "application identity" but no "objectid-class" was specified. Reverting to javax.jdo.identity.StringIdentity
2)
-Performing reachability on PC field "com.skycomm.cth.beans.TclRequest.testData"
-Could not find StateManager for PC object "" at field "com.skycomm.cth.beans.TclRequest.testData" - ignoring for reachability
What could this mean?
Thanks in advance.
I have figured out how to do it. It's not very scalable, but it works for now.
These are the annotations for the interface member variable. Note that the order of the declared types, columns, and class names in the extension value must be kept consistent:
@Persistent(types = { RADIUSTestData.class, POP3TestData.class, PPPoETestData.class, DNSTestData.class }, defaultFetchGroup = "true")
@Columns({ @Column(name = "radiustestdata_fk"), @Column(name = "pop3testdata_fk"), @Column(name = "pppoetestdata_fk"),
@Column(name = "dnstestdata_fk") })
@Extension(vendorName = "datanucleus", key = "implementation-classes", value = "com.skycomm.cth.tcl.beans.radius.RADIUSTestData, com.skycomm.cth.tcl.beans.pop3.POP3TestData, com.skycomm.cth.tcl.beans.pppoe.PPPoETestData, com.skycomm.cth.tcl.beans.dns.DNSTestData")
A sample class implementing one of the interfaces (just its "header"):
@PersistenceCapable(detachable = "true")
public class RADIUSTestData implements TestData {
So it's pretty normal here.
