I have to write JUnit test cases for a REST API built with Spring Boot and Oracle. There is no dedicated test DB environment, so I planned to use an in-memory database. I did a POC on the H2 database. Even after spending three days I was only able to do basic things; it is not fully compatible with Oracle. It didn't support IN and OUT parameters, and it didn't support "call schema.package.function(In, Out)". I was able to create a schema and a function, but I was not able to create a package. Could you please suggest an in-memory database which supports the following:
Should support schema, package and function creation.
Should support IN and OUT parameters
Should be lightweight
Should be compatible with Oracle and Java
HSQLDB supports IN and OUT parameters for PROCEDUREs. You can mimic schema.package.function(in, out) by renaming the database CATALOG to the name of the Oracle schema, creating a separate schema named after the package, and then creating the function in that schema.
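Below is a rough, untested sketch of that idea using plain JDBC; the names ORAUSER (the Oracle schema), MYPKG (the package) and MY_FUNC are made-up placeholders, and the exact HSQLDB DDL and three-part call syntax may need adjusting for your version.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;
    import java.sql.Types;

    public class HsqldbPackageMimicExample {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "SA", "")) {
                try (Statement st = conn.createStatement()) {
                    // Rename the catalog to the Oracle schema name (placeholder: ORAUSER)
                    st.execute("ALTER CATALOG PUBLIC RENAME TO ORAUSER");
                    // Create a schema named after the Oracle package (placeholder: MYPKG)
                    st.execute("CREATE SCHEMA MYPKG AUTHORIZATION DBA");
                    // Create a procedure with IN and OUT parameters inside that schema
                    st.execute(
                        "CREATE PROCEDURE MYPKG.MY_FUNC(IN p_in VARCHAR(50), OUT p_out VARCHAR(50)) "
                        + "BEGIN ATOMIC "
                        + "  SET p_out = 'Hello ' || p_in; "
                        + "END");
                }
                // Call it with the Oracle-style three-part name: schema.package.function
                try (CallableStatement cs = conn.prepareCall("{call ORAUSER.MYPKG.MY_FUNC(?, ?)}")) {
                    cs.setString(1, "world");
                    cs.registerOutParameter(2, Types.VARCHAR);
                    cs.execute();
                    System.out.println(cs.getString(2)); // prints "Hello world"
                }
            }
        }
    }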
Alternatively, HyperXtremeSQL (http://hyperxtreme.co.uk) supports the creation of packages, procedures and functions with Oracle syntax.
Related
As I understand it, it's not possible to enable at-rest encryption after a table has been created. I currently use the Java SDK to create tables but can't see any way to request that the table be created with encryption turned on.
I've used the latest SDK:
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-dynamodb</artifactId>
        <version>1.11.391</version>
    </dependency>
along with the classes com.amazonaws.services.dynamodbv2.model.CreateTableRequest and com.amazonaws.services.dynamodbv2.util.TableUtils#createTableIfNotExists.
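For reference, a minimal sketch of that style of table creation against a local DynamoDB looks roughly like this (the endpoint, credentials, table and key names here are made up for illustration):

    import com.amazonaws.auth.AWSStaticCredentialsProvider;
    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.client.builder.AwsClientBuilder;
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
    import com.amazonaws.services.dynamodbv2.model.AttributeDefinition;
    import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
    import com.amazonaws.services.dynamodbv2.model.KeySchemaElement;
    import com.amazonaws.services.dynamodbv2.model.KeyType;
    import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
    import com.amazonaws.services.dynamodbv2.model.ScalarAttributeType;
    import com.amazonaws.services.dynamodbv2.util.TableUtils;

    public class CreateTableExample {
        public static void main(String[] args) throws Exception {
            // Local DynamoDB endpoint and dummy credentials used in tests (placeholders)
            AmazonDynamoDB dynamo = AmazonDynamoDBClientBuilder.standard()
                    .withEndpointConfiguration(
                            new AwsClientBuilder.EndpointConfiguration("http://localhost:8000", "eu-west-1"))
                    .withCredentials(new AWSStaticCredentialsProvider(
                            new BasicAWSCredentials("dummy-key", "dummy-secret")))
                    .build();

            // The "schema" colocated with the code: table name, key attribute and throughput
            CreateTableRequest request = new CreateTableRequest()
                    .withTableName("my-table")
                    .withAttributeDefinitions(new AttributeDefinition("id", ScalarAttributeType.S))
                    .withKeySchema(new KeySchemaElement("id", KeyType.HASH))
                    .withProvisionedThroughput(new ProvisionedThroughput(5L, 5L));

            // Creates the table only if it does not exist yet
            TableUtils.createTableIfNotExists(dynamo, request);
        }
    }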
I know another option for creating tables would be to use something like the CLI or Terraform. However, what I liked about the Java SDK API was that it colocates the "schema" right there in the code along with the Java POJO mappings that use it, making it easy to run in tests against the local DynamoDB. If I need to write Terraform files or scripts that call the CLI, those definitions fall "out of band" with the actual code that uses them, which feels less ideal.
Is there any way to do this with the Java SDK?
How can I access another database (not OpenEdge) via ODBC from OpenEdge without using DataDirect?
The use case is data migration from one system to another, so performance cannot be neglected completely, but it's a one-time thing that is allowed to take a little longer.
Why without DataDirect? Extra cost. Our client doesn't have the license.
Why not dump and load (e.g. via CSV)? The client doesn't want to do the mapping between the systems that way, but with database views.
As far as I know there is no way to directly access another database if you're not using DataDirect or something like the DataServer for Oracle etc.
However, you could call a third-party ODBC library as external functions and handle your queries to the foreign database that way. This wouldn't allow you to use OpenEdge constructs like FOR EACH, buffers, etc., but it would allow you to retrieve the data, process it using custom functions, and then insert it into the OpenEdge tables.
See the following KB for accessing external library functions:
https://knowledgebase.progress.com/articles/Article/P183546
Another approach you could use, assuming your tables are in OpenEdge already, is to use the OpenEdge SQL92 ODBC driver from another language (C/VB/Java/whatever works for you), reading the data from the source database and inserting it into OpenEdge via SQL92 ODBC.
Looking at the website there are downloadable ODBC drivers for most platforms:
https://www.progress.com/odbc/openedge
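If Java happens to be the language of choice, a rough sketch of that second approach could look like the following, using JDBC on both sides rather than ODBC; the connection URLs, credentials and table/column names are placeholders to be replaced with the source database's JDBC details and the OpenEdge SQL92 connection details.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CopySourceToOpenEdge {
        public static void main(String[] args) throws Exception {
            // Placeholder URLs: use the source database's JDBC URL and the URL format
            // documented for the OpenEdge SQL92 driver.
            try (Connection source = DriverManager.getConnection(
                         "jdbc:source-db-url-here", "srcUser", "srcPassword");
                 Connection target = DriverManager.getConnection(
                         "jdbc:openedge-sql92-url-here", "oeUser", "oePassword")) {

                target.setAutoCommit(false);

                try (Statement read = source.createStatement();
                     // The mapping view on the source side (placeholder name)
                     ResultSet rs = read.executeQuery("SELECT cust_num, name FROM migration_view");
                     PreparedStatement write = target.prepareStatement(
                             "INSERT INTO pub.customer (cust_num, name) VALUES (?, ?)")) {

                    while (rs.next()) {
                        write.setInt(1, rs.getInt("cust_num"));
                        write.setString(2, rs.getString("name"));
                        write.addBatch();
                    }
                    write.executeBatch();
                }
                target.commit();
            }
        }
    }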
I'm just exploring the possibility of using Flyway to maintain my DDL statements against Amazon Athena using the Athena-provided JDBC driver. Athena supports only CREATE statements (Hive DDL) and no INSERTs.
So, if the database metadata table is the only one that Flyway creates and updates, is there any way I can externalize creating it and storing it in a totally different database?
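For reference, the setup being explored would look roughly like this with Flyway's fluent configuration API; the Athena JDBC URL, credentials and migration location are placeholders, and whether migrations actually run against Athena is exactly the open question here.

    import org.flywaydb.core.Flyway;

    public class AthenaMigrations {
        public static void main(String[] args) {
            // Placeholder Athena JDBC URL and S3 staging location; the schema history
            // table is managed over this same connection and cannot currently be
            // redirected to a different database.
            Flyway flyway = Flyway.configure()
                    .dataSource(
                            "jdbc:awsathena://AwsRegion=us-east-1;S3OutputLocation=s3://my-staging-bucket/",
                            "accessKey", "secretKey")
                    .locations("filesystem:sql/athena")
                    .load();
            flyway.migrate();
        }
    }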
Currently this is not possible in Flyway, as the schema history is read from/written to the database defined by the current JDBC connection. You can see this for yourself in the JDBCTableSchemaHistory file.
If you wish to add this support, you could create a pull request on the repo, or just add an issue detailing exactly what you want the behaviour to be.
I need to connect to a Cassandra database and query it from there.
I want to know whether there is an existing database library for Cassandra in Robot Framework.
Short answer: no, there isn't one.
One of the active (and good) Cassandra drivers for Python is from a company called DataStax; here is its repo: https://github.com/datastax/python-driver. Keep in mind it has some peculiarities getting installed and running on the various OSes.
But as it does not (regretfully) adhere to the Python Database API, you cannot just install it and use it straight away with RF's DatabaseLibrary.
You could/should create your own library wrapping the driver calls (which shouldn't be that hard...).
I'm retrospectively unit testing a Zend application and want to use an SQLite database for convenience. In production we use MySQL updated with DB migrations. Simple question: how do I create an SQLite schema? Is it possible to automatically recreate the schema inside PHPUnit?
Many thanks for your help.
Are you using Doctrine?
If so, you can generate the schema with Doctrine's orm:schema-tool:create command.
You can use this command to either dump the SQL, or generate your tables directly through the connection.