I am working on migrating Teradata databases to an open source database (which one is still under discussion). I came across the Apache Drill engine. My question is: can we use Drill to load data from Teradata? If yes, can we use it as a schema conversion tool?
In theory, yes, it can load data from Teradata: since Teradata has a JDBC driver, you can configure Teradata as a source in Drill. For an example of how to configure a JDBC data source in Drill, see the docs here.
Drill has a CTAS (CREATE TABLE AS SELECT) statement. I know it can be used to write Parquet, CSV, and JSON files, but I'm not sure what other output formats it supports.
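To make that concrete, here is a rough Java sketch of what the copy could look like through Drill's own JDBC driver. The storage plugin name teradata, the schema and table names, and the connection details are assumptions for illustration; you would need the drill-jdbc driver on the classpath and a JDBC storage plugin configured against the Teradata JDBC driver.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class DrillTeradataCopy {
    public static void main(String[] args) throws Exception {
        // Connect to a local Drillbit through Drill's JDBC driver.
        try (Connection conn = DriverManager.getConnection("jdbc:drill:drillbit=localhost:31010");
             Statement stmt = conn.createStatement()) {
            // Ask CTAS to write Parquet (it can also write CSV or JSON).
            stmt.execute("ALTER SESSION SET `store.format` = 'parquet'");
            // "teradata" is a hypothetical JDBC storage plugin configured with the
            // Teradata JDBC driver; dfs.tmp must point at a writable workspace.
            stmt.execute("CREATE TABLE dfs.tmp.customer_copy AS "
                       + "SELECT * FROM teradata.mydb.customer");
        }
    }
}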
To get more information about what Drill can do, and to request features, please get in touch with the Drill team on the user list.
How can I access another database (not OpenEdge) via ODBC from OpenEdge without using DataDirect?
The use case is a data migration from one system to another, so performance cannot be neglected completely, but it's a one-time job that is allowed to take a little longer.
Why without DataDirect? Extra cost. Our client doesn't have the license.
Why not dump and load (e.g., via CSV)? The client doesn't want to do the mapping between the systems that way; they want to use database views.
As far as I know there is no way to directly access another database if you're not using DataDirect or something like the DataServer for Oracle, etc.
However, you could call a third-party ODBC library via external functions and handle your queries to the foreign database that way. This wouldn't allow you to use OpenEdge constructs like FOR EACH, buffers, etc., but it would allow you to retrieve the data, process it using custom functions, and then insert it into the OpenEdge tables.
See the following KB for accessing external library functions:
https://knowledgebase.progress.com/articles/Article/P183546
Another approach you could use, assuming your tables are in OpenEdge already, is to use the OpenEdge SQL92 ODBC driver from another language (C/VB/Java/whatever works for you): read the data from the source database and insert it into OpenEdge via SQL92 (a rough sketch follows the driver link below).
Looking at the website, there are downloadable ODBC drivers for most platforms:
https://www.progress.com/odbc/openedge
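To illustrate that second approach, here is a rough Java sketch that reads from a source database and batch-inserts into OpenEdge through the SQL92 JDBC driver that ships with OpenEdge (JDBC rather than ODBC, but the same idea). The driver URLs, credentials, and table/column names below are placeholders for your environment.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class CopyToOpenEdge {
    public static void main(String[] args) throws Exception {
        // Hypothetical source: substitute the JDBC URL/driver of whatever
        // database you are migrating from.
        String srcUrl = "jdbc:oracle:thin:@//srchost:1521/SRCDB";
        // URL format of the SQL92 JDBC driver bundled with OpenEdge.
        String tgtUrl = "jdbc:datadirect:openedge://oehost:5555;databaseName=mydb";

        try (Connection src = DriverManager.getConnection(srcUrl, "srcuser", "srcpass");
             Connection tgt = DriverManager.getConnection(tgtUrl, "oeuser", "oepass")) {
            tgt.setAutoCommit(false);
            try (Statement read = src.createStatement();
                 ResultSet rs = read.executeQuery("SELECT id, name FROM customer");
                 PreparedStatement write = tgt.prepareStatement(
                         "INSERT INTO pub.customer (id, name) VALUES (?, ?)")) {
                while (rs.next()) {
                    write.setInt(1, rs.getInt(1));
                    write.setString(2, rs.getString(2));
                    write.addBatch();
                }
                write.executeBatch();
            }
            tgt.commit();
        }
    }
}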
I am investigating Apache Ignite as a way to pull data from Teradata and cache it so that I can display it in a UI. We currently do this with Cassandra, but we want to move away from it for various reasons. It would be helpful if I could get a few templates showing how this can be achieved, as I am not finding relevant code samples or docs to read through.
Here is the documentation page about loading the data into Apache Ignite: https://apacheignite.readme.io/docs/data-loading
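For the Teradata-to-Ignite part specifically, a minimal sketch could look like the following, assuming ignite-core and the Teradata JDBC driver are on the classpath. The cache name, connection details, and query are made up for illustration; IgniteDataStreamer is the usual API for bulk-loading a cache.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class TeradataToIgnite {
    public static void main(String[] args) throws Exception {
        // Start (or connect as) an Ignite node with default configuration.
        try (Ignite ignite = Ignition.start()) {
            ignite.getOrCreateCache("customers");

            // Stream rows from Teradata into the cache; the streamer batches
            // and distributes entries across the cluster.
            try (IgniteDataStreamer<Integer, String> streamer = ignite.dataStreamer("customers");
                 Connection conn = DriverManager.getConnection(
                         "jdbc:teradata://tdhost/DATABASE=mydb", "user", "pass");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT id, name FROM customer")) {
                while (rs.next()) {
                    streamer.addData(rs.getInt("id"), rs.getString("name"));
                }
            } // closing the streamer flushes any remaining buffered data
        }
    }
}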
I am facing a problem: how to do change data capture (CDC) with Teradata as a source.
I've found several tools, but all of them use Teradata only as a target; the sources are "traditional" database systems like Oracle or DB2.
The closest match I have found is Informatica PowerExchange for Teradata, but after reading several PDFs from Informatica I am still unsure whether Teradata can work as a source.
I'm looking for a way to capture changes on a Teradata warehouse and replicate them somewhere else (Hadoop or an RDBMS).
Thank you for any help.
I have been developing locally for some time and am now pushing everything to production. Of course, I was also adding data to the development server without considering that I hadn't reconfigured it to use Postgres.
Now I have a SQLite DB whose data needs to end up in a Postgres DB on a remote VPS.
I have tried dumping to a .sql file but am getting a lot of syntax complaints from Postgres. What's the best way to do this?
For pretty much any conversion between two databases the options are:
1. Do a schema-only dump from the source database. Hand-convert it and load it into the target database. Then do a data-only dump from the source DB in the most compatible form of SQL dump it offers. Try loading that into the target DB. When you hit problems, script transformations to the dump using sed/awk/perl/whatever and try again. Repeat until it loads and the results match.
2. Like (1), hand-convert the schema. Then write a script in your preferred language that connects to both databases, SELECTs from one, and INSERTs into the other, possibly with some transformations of data types and representations. (A Java sketch of this appears below.)
3. Use an ETL tool like Talend or Pentaho to connect to both databases and convert between them. ETL tools are like a "somebody else already wrote it" version of (2), but they can take some learning.
4. Hope that you can find a pre-written conversion tool. Heroku has one called sequel that will work for SQLite -> PostgreSQL; I'm not sure whether it's available outside Heroku and able to function without the rest of the Heroku infrastructure and code.
After any of those, some post-transfer steps, like using setval() to initialize sequences, are typically required.
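As an illustration of option 2, here is a bare-bones Java sketch that copies one table from SQLite to PostgreSQL over JDBC and then re-seeds the sequence as just mentioned. It assumes the xerial sqlite-jdbc and PostgreSQL JDBC drivers are on the classpath; the table and column names are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class SqliteToPostgres {
    public static void main(String[] args) throws Exception {
        try (Connection src = DriverManager.getConnection("jdbc:sqlite:tweets.db");
             Connection tgt = DriverManager.getConnection(
                     "jdbc:postgresql://localhost/pgdatabasename", "pguser", "pgpass")) {
            tgt.setAutoCommit(false);
            try (Statement read = src.createStatement();
                 ResultSet rs = read.executeQuery("SELECT id, body FROM tweets");
                 PreparedStatement write = tgt.prepareStatement(
                         "INSERT INTO tweets (id, body) VALUES (?, ?)")) {
                while (rs.next()) {
                    write.setLong(1, rs.getLong(1));
                    write.setString(2, rs.getString(2));
                    write.addBatch();
                }
                write.executeBatch();
            }
            // Post-transfer step from the list above: re-seed the sequence.
            try (Statement fix = tgt.createStatement()) {
                fix.execute("SELECT setval(pg_get_serial_sequence('tweets','id'), "
                          + "(SELECT max(id) FROM tweets))");
            }
            tgt.commit();
        }
    }
}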
Heroku's database conversion tool is called sequel. Here are the Ruby gems you need:
gem install sequel
gem install sqlite3
gem install pg
Then this worked for me for a SQLite database file named 'tweets.db' in the current working directory:
sequel -C sqlite://tweets.db postgres://pgusername:pgpassword@localhost/pgdatabasename
PostgreSQL supports "foreign data wrappers", which allow you to access other data sources directly through the database, including SQLite, even up to importing the schema automatically. You can then use create table localtbl as (select * from remotetbl) to get your data into actual PG storage.
https://wiki.postgresql.org/wiki/Foreign_data_wrappers
https://github.com/pgspider/sqlite_fdw
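As a sketch of that route (you could equally run these statements from psql), assuming the sqlite_fdw extension is installed on the server. The server name, file path, and table names are placeholders, and the exact options can vary between sqlite_fdw versions.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class SqliteFdwImport {
    public static void main(String[] args) throws Exception {
        try (Connection pg = DriverManager.getConnection(
                     "jdbc:postgresql://localhost/pgdatabasename", "pguser", "pgpass");
             Statement stmt = pg.createStatement()) {
            // One-time setup: point a foreign server at the SQLite file and
            // pull its table definitions into PostgreSQL.
            stmt.execute("CREATE EXTENSION IF NOT EXISTS sqlite_fdw");
            stmt.execute("CREATE SERVER sqlite_src FOREIGN DATA WRAPPER sqlite_fdw "
                       + "OPTIONS (database '/path/to/tweets.db')");
            stmt.execute("IMPORT FOREIGN SCHEMA public FROM SERVER sqlite_src INTO public");

            // Copy the foreign table's rows into native PostgreSQL storage.
            stmt.execute("CREATE TABLE localtbl AS SELECT * FROM remotetbl");
        }
    }
}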
I need to export the data from one SQLite database and import it into another SQLite database, but the two databases have different schemas.
Does an open source tool exist that can help me with that job?
The only open source tool I know of is opendbcopy, which I use for migrating from one database server to another, and also for a job similar to the one you want to do with SQLite (though I did it with PostgreSQL).
opendbcopy is JDBC compliant and can connect to any database that has a JDBC driver, so you can give it a try; if the schemas are not the same, you can use its column mapping feature.
In addition, I also know of a good commercial alternative (which is easier to use): ESF Database Migration Toolkit.