How to manage change data capture with Teradata?

I am facing a problem: how to do change data capture with Teradata as a source.
I've found several tools, but all of them use Teradata only as a target; the sources are "traditional" database systems like Oracle, DB2, etc.
The closest match I have found is Informatica PowerExchange for Teradata, but after reading several PDFs from Informatica I am still unsure whether Teradata is supported as a source.
I'm looking for a way to capture changes on a Teradata warehouse and replicate them somewhere else (Hadoop or an RDBMS).
Thank you for any help.

Related

How to migrate a Gupta (SQLBase) database to MySQL (including data)

I'm trying to migrate a Gupta DB to MySQL. I already have a script to create every table (with indexes and comments) and view I need in MySQL (table, view, and column names as well as column types are the same in Gupta and MySQL). But now I can't figure out how to read data from the old SQLBase database and "import it" into the new MySQL database.
I thought of reading the old data from SQLBase, writing it to a file, and then somehow importing that file into MySQL. The problem is that some tables have more than 1 million records, and I cannot afford to lose a single one of them.
You can ETL natively from Gupta SQLBase very easily, but you need to read the manual appropriate for your SQLBase version (Gupta SQLBase manuals, all versions); specifically, read up on the 'UNLOAD' command in the 'Language Reference' manual. The best advice, though, is: don't do it! Upgrade to SQLBase v12.2 instead.
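For the load side, here is a minimal Python sketch, assuming the data has already been unloaded from SQLBase into a pipe-delimited text file (the UNLOAD syntax itself is in the Language Reference manual); the file name, connection details, table, and columns are made-up placeholders:

```python
# Minimal sketch, not SQLBase-specific: load a pipe-delimited dump into MySQL.
# File name, credentials, table, and column layout below are placeholders.
import csv
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="etl",
                               password="secret", database="target_db")
cur = conn.cursor()

INSERT_SQL = "INSERT INTO customers (id, name, city) VALUES (%s, %s, %s)"
BATCH_SIZE = 10000

with open("customers.unl", newline="", encoding="latin-1") as fh:
    reader = csv.reader(fh, delimiter="|")
    batch = []
    for row in reader:
        batch.append(tuple(row))
        if len(batch) >= BATCH_SIZE:
            cur.executemany(INSERT_SQL, batch)
            batch.clear()
    if batch:
        cur.executemany(INSERT_SQL, batch)

# Commit once at the end so the load is all-or-nothing.
conn.commit()
cur.close()
conn.close()
```

Committing once at the very end keeps the load all-or-nothing, which matters when you cannot afford to lose a single record.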

Use Apache Drill as a schema conversion tool

I am working on migrating Teradata DBs to an open-source DB (which DB is still under discussion). I came across the Apache Drill engine. My question is: can we use Drill to load data from Teradata? If yes, can we use it as a schema conversion tool?
In theory, yes, it can load data from Teradata: since Teradata has a JDBC driver, you can configure Teradata as a source. For an example of how to configure a JDBC data source in Drill, see the docs here.
Drill has a CTAS (CREATE TABLE AS SELECT) statement. I know it can be used to write Parquet, CSV, and JSON files, but I'm not sure what other output formats it supports.
To get more information about what Drill can do, or to request features, please get in touch with the Drill team on the user mailing list.
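As a rough illustration, here is a minimal Python sketch that submits a CTAS statement through Drill's REST API, assuming a JDBC storage plugin named teradata has already been configured (with the Teradata JDBC driver on Drill's classpath); the schema and table names are made up:

```python
# Sketch: ask Drill to copy a Teradata table into Parquet files via CTAS.
# Assumes Drill's REST API on localhost:8047 and a JDBC storage plugin named "teradata".
import requests

CTAS = """
CREATE TABLE dfs.tmp.`orders_parquet` AS
SELECT *
FROM teradata.sales_db.orders
"""

resp = requests.post(
    "http://localhost:8047/query.json",
    json={"queryType": "SQL", "query": CTAS},
    timeout=600,
)
resp.raise_for_status()
print(resp.json())  # Drill returns the query outcome as JSON
```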

Bulk Load into SAP HANA Database

We were previously using Oracle in our data warehouse setup and used the SQL*Loader utility for bulk loading, invoked through Informatica. We are now shifting our DB to SAP HANA and are very new to it. We are looking for a similar command-line utility in SAP HANA for efficient bulk data loading. I came across a utility with a CTL file in SAP HANA.
But the problem we are facing is that we need to be able to specify the CTL file path, DATA file path, and BAD file path on the command line only. Is there a way to achieve this? Or is there a better mechanism in SAP HANA for scheduled bulk loading?
The EXPORT/IMPORT commands of the SAP HANA server are not as versatile as Oracle's command-line SQL*Loader.
They are mainly aimed at transports between HANA systems.
For proper ETL you rather want to use "Smart Data Integration" (https://help.sap.com/hana_options_eim) and/or "Smart Data Access" (https://help.sap.com/saphelp_hanaplatform/helpdata/en/71/0235120ce744c381176121d4c60b28/content.htm).
Specifically for typical EDW scenarios, there is also the option of "Data Warehousing Foundation" (https://help.sap.com/hana_options_dwf), which provides a lot of functionality for handling mass data: partitioning, data distribution, etc.
Knowing that many former Oracle users just want a 1:1 swap of tools, I want to give fair warning: data loading and transformation in HANA is a lot less command-line based.
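For a scheduled load, one option is to call HANA's IMPORT FROM CSV FILE statement from a small script instead of a CTL-file utility. A minimal sketch with the hdbcli Python driver, assuming the CSV file sits on the HANA server's file system and using made-up host, credentials, paths, and table names:

```python
# Sketch: server-side bulk load into SAP HANA via IMPORT FROM CSV FILE.
# Host, port, credentials, file paths, and table names are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hanahost", port=30015,
                     user="LOADER", password="secret")
cur = conn.cursor()

# Raw string so '\n' reaches HANA literally; rejected records go to the error log file.
cur.execute(r"""
    IMPORT FROM CSV FILE '/usr/sap/data/sales_fact.csv'
    INTO "EDW"."SALES_FACT"
    WITH RECORD DELIMITED BY '\n'
         FIELD DELIMITED BY ','
         ERROR LOG '/usr/sap/data/sales_fact.bad'
""")
conn.commit()

cur.close()
conn.close()
```

A script like this can be invoked from cron or any scheduler, which covers the scheduled bulk loading part.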

How to map a SQLite database to another?

I need to export the data of a SQLite database and import this data into another SQLite database, but the two databases have different schemas.
Does an open-source tool exist that can help me with that job?
The only open-source tool that I know of is opendbcopy, which I'm using to migrate from one database server to another, and also for a similar kind of job to the one you want to do with SQLite, though I've done it with PostgreSQL.
However, opendbcopy is JDBC-compliant and can connect to every database that has a JDBC driver, so you can give it a try; if the schemas are not the same, you can use the column mapping feature.
In addition, I also know of a good commercial alternative (that is easier to use): ESF Database Migration Toolkit.
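If the mapping is simple, the same column-mapping idea can also be scripted directly. A minimal Python sketch with the standard sqlite3 module, where the file names, table, and column mapping are made up for illustration:

```python
# Sketch: copy one table between two SQLite files, renaming columns along the way.
import sqlite3

# old column -> new column
COLUMN_MAP = {"cust_id": "customer_id", "cust_name": "name", "created": "created_at"}

src = sqlite3.connect("old.db")
dst = sqlite3.connect("new.db")

src_cols = ", ".join(COLUMN_MAP)
dst_cols = ", ".join(COLUMN_MAP.values())
placeholders = ", ".join("?" for _ in COLUMN_MAP)

rows = src.execute(f"SELECT {src_cols} FROM customers")
dst.executemany(f"INSERT INTO customers ({dst_cols}) VALUES ({placeholders})", rows)
dst.commit()

src.close()
dst.close()
```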

Free tool to sync data from MS Access to SQLite

Is there any tool that can compare data from an MS Access database table with a SQLite table, update rows it finds changed, and insert rows that are not found?
Thanks in advance.
There isn't any good tool that allows one-click conversion from one DB to another, so I created a utility in C# for the purpose.
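The C# utility itself isn't posted here, but the compare-and-sync idea looks roughly like this minimal Python sketch, assuming the Microsoft Access ODBC driver is installed; file paths, the table, its key, and its columns are placeholders:

```python
# Sketch: pull rows from an Access table and upsert them into SQLite.
# INSERT OR REPLACE inserts missing rows and overwrites existing ones by primary key,
# a blunt but simple way to cover both "update if changed" and "insert if new".
import sqlite3
import pyodbc

ACCESS_CONN = (r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
               r"DBQ=C:\data\source.accdb")
TABLE, COLS = "contacts", ["id", "name", "email"]

src = pyodbc.connect(ACCESS_CONN)
dst = sqlite3.connect("target.db")

col_list = ", ".join(COLS)
placeholders = ", ".join("?" for _ in COLS)

for row in src.execute(f"SELECT {col_list} FROM {TABLE}"):
    dst.execute(
        f"INSERT OR REPLACE INTO {TABLE} ({col_list}) VALUES ({placeholders})",
        tuple(row),
    )

dst.commit()
src.close()
dst.close()
```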
