Trying to figure out what the alternatives are in Snowflake for Teradata's BTEQ and TPT (Import/Export).
In Teradata, BTEQ scripts, which are run for an enterprise as scheduled jobs on UNIX/Linux, give us the flexibility of writing different SQL statements in a sequence.
What alternative do we have in Snowflake?
I'm not overly familiar with Teradata, but based on this overview of BTEQ I would say that the Snowflake equivalent is probably the SnowSQL CLI. In terms of import/export, the SnowSQL CLI can be used to upload data via the PUT command.
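SnowSQL can run a whole file of statements in order with snowsql -f script.sql, which covers the BTEQ-style "sequence of SQLs" use case. If you'd rather drive the same sequence programmatically, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, and table names are all placeholders:

# Minimal sketch: run a BTEQ-style sequence of SQL statements against Snowflake.
# Requires `pip install snowflake-connector-python`; all names here are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",                      # hypothetical account identifier
    user="etl_user",
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ETL_DB",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Statements run in order, like the steps of a BTEQ script.
    cur.execute("PUT file:///data/orders.csv @%ORDERS")  # upload to the table stage
    cur.execute("COPY INTO ORDERS FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
    cur.execute("DELETE FROM ORDERS WHERE LOAD_DATE < CURRENT_DATE - 30")
finally:
    cur.close()
    conn.close()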
Kusto (ADX) has the .export command to export data from Kusto to other data stores (Azure Storage is of particular interest to me). Here I am not referring to continuous export but to the plain .export command, which can also be used on demand. A couple of years back this was, I think, the only out-of-the-box option to export data out of Kusto. Now I see that the ADFv2 (Azure Data Factory v2) copy activity also supports ADX (Kusto) as a source, and with it we can copy data from Kusto to Azure Storage.
But which method is faster? Is there any recommendation? One could also orchestrate the .export command from ADFv2's ADX command activity, which is what we were doing earlier; now that the copy activity supports ADX as a source out of the box, that is much easier to work with. However, we only want to use whichever method gives the best performance.
Please see the comparison between the copy activity and the .export command here, specifically this tip:
"The reason for it is that the .export command performs better, as it is by default executed in parallel, as well as providing more customization options."
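For reference, if you do orchestrate .export yourself (e.g. instead of going through the ADX command activity), here is a minimal sketch of issuing the command with the azure-kusto-data Python package; the cluster URI, database, table, and storage connection string are all placeholders:

# Issue a Kusto .export management command from Python.
# Requires `pip install azure-kusto-data`; all names/URIs below are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://mycluster.westeurope.kusto.windows.net"
)
client = KustoClient(kcsb)

# async + compressed: the command runs server-side, in parallel, and returns
# an operation id that can be polled for completion.
export_cmd = """
.export async compressed to csv (
    h@"https://myaccount.blob.core.windows.net/exports;<storage-account-key>"
) with (namePrefix="mytable_export")
<| MyTable
"""
response = client.execute_mgmt("MyDatabase", export_cmd)
print(response.primary_results[0])  # contains the operation id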
I'm trying to find out whether one can connect to Teradata using H2O. Upon reading some of the basic H2O documentation, I found that H2O has the ability to connect to relational databases, provided they supply a JDBC driver.
Link: http://docs.h2o.ai/h2o/latest-stable/h2o-docs/getting-data-into-h2o.html?highlight=jdbc
However, the documentation states: "Currently supported SQL databases are MySQL, PostgreSQL, and MariaDB."
So I'm wondering whether H2O can connect to other databases like Teradata, because they do have a JDBC driver.
Link: https://downloads.teradata.com/download/connectivity/jdbc-driver
The core H2O function importSqlTable in the water.jdbc.SQLManager class is called by both h2o.import_sql_table and h2o.import_sql_select (H2O R API; the Python counterparts should be similar). After inspecting the importSqlTable source code, I found a problem that will likely prevent you from loading from Teradata, due to SELECT syntax.
Still, I'd suggest trying it and reporting the result (and the error, if it fails) in the comments. When starting the H2O server, put both jars on the classpath and register the driver class, e.g.:
java -cp <path_to_h2o_jar>:<path_to_Teradata_jdbc_driver_jar> -Djdbc.drivers=com.teradata.jdbc.TeraDriver water.H2OApp
UPDATE:
Use version Xia (3.22.0.1), released 10/26/2018, or later, which fixed JDBC support for Teradata.
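With the server started that way, a minimal sketch of the call from the H2O Python API would look like the following; the JDBC URL, query, and credentials are placeholders:

# Minimal sketch: pull a Teradata table into H2O over JDBC; names are placeholders.
import h2o

# Connect to an H2O server started with the Teradata driver on its classpath.
h2o.connect(ip="localhost", port=54321)

frame = h2o.import_sql_select(
    connection_url="jdbc:teradata://td-host/DATABASE=mydb",  # hypothetical URL
    select_query="SELECT * FROM mydb.my_table",
    username="td_user",
    password="td_password",
)
print(frame.head())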
I need to connect to a Cassandra database and query it from there.
I want to know whether there is an existing database library for Cassandra in Robot Framework.
Short answer: no, there isn't one.
One of the active (and good) Cassandra drivers for Python is from a company called DataStax; here is its repo: https://github.com/datastax/python-driver. Bear in mind that it has some peculiarities getting installed and running on the various OSes.
Regretfully, it does not adhere to the Python Database API, so you cannot just install it and use it straight away through RF's DatabaseLibrary.
You could/should create your own library wrapping the driver calls (which shouldn't be that hard...).
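For example, here is a minimal sketch of such a wrapper; the host, keyspace, and keyword names are illustrative placeholders, and the driver is installed with pip install cassandra-driver:

# CassandraLibrary.py - a minimal Robot Framework keyword library wrapping
# the DataStax driver; all names here are illustrative placeholders.
from cassandra.cluster import Cluster

class CassandraLibrary:
    """Exposes basic Cassandra operations as Robot Framework keywords."""
    ROBOT_LIBRARY_SCOPE = "SUITE"

    def __init__(self):
        self._cluster = None
        self._session = None

    def connect_to_cassandra(self, host, keyspace):
        """Open a session against the given host and keyspace."""
        self._cluster = Cluster([host])
        self._session = self._cluster.connect(keyspace)

    def execute_cql(self, statement):
        """Run a CQL statement and return the result rows as a list."""
        return list(self._session.execute(statement))

    def disconnect_from_cassandra(self):
        """Shut down the connection to the cluster."""
        if self._cluster:
            self._cluster.shutdown()

In a suite you would then import it with Library CassandraLibrary.py and call keywords like Connect To Cassandra, Execute Cql, and Disconnect From Cassandra.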
I have been developing locally for some time and am now pushing everything to production. Of course, I was also adding data to the development server without thinking about the fact that I hadn't reconfigured it to use Postgres.
Now I have a SQLite DB whose contents I need to get into a Postgres DB on a remote VPS.
I have tried dumping to a .sql file, but I'm getting a lot of syntax complaints from Postgres. What's the best way to do this?
For pretty much any conversion between two databases the options are:
1. Do a schema-only dump from the source database. Hand-convert it and load it into the target database. Then do a data-only dump from the source DB in the most compatible form of SQL dump it offers. Try loading that into the target DB. When you hit problems, script transformations to the dump using sed/awk/perl/whatever and try again. Repeat until it loads and the results match.
2. Like (1), hand-convert the schema. Then write a script in your preferred language that connects to both databases, SELECTs from one, and INSERTs into the other, possibly with some transformations of data types and representations (see the sketch after this list).
3. Use an ETL tool like Talend or Pentaho to connect to both databases and convert between them. ETL tools are like a "somebody else already wrote it" version of (2), but they can take some learning.
4. Hope that you can find a pre-written conversion tool. Heroku has one called sequel that will work for SQLite -> PostgreSQL, if it's available without Heroku and able to function without all the other Heroku infrastructure and code.
After any of those, some post-transfer steps, like using setval() to initialize sequences, are typically required.
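Here is a minimal sketch of option 2 in Python, using the standard sqlite3 module and psycopg2. It assumes the converted schema already exists in Postgres; the table/column names and connection details are placeholders:

# Copy rows table-by-table from SQLite into Postgres (option 2 above).
# Requires `pip install psycopg2`; all names below are placeholders.
import sqlite3
import psycopg2

src = sqlite3.connect("dev.db")
dst = psycopg2.connect("dbname=proddb user=pguser password=secret host=vps.example.com")

with dst, dst.cursor() as dcur:
    for row in src.execute("SELECT id, title, body FROM posts"):
        # Transform data types/representations here if the schemas differ.
        dcur.execute(
            "INSERT INTO posts (id, title, body) VALUES (%s, %s, %s)",
            row,
        )
    # The setval() step mentioned above: re-sync the sequence after explicit ids.
    dcur.execute("SELECT setval('posts_id_seq', (SELECT max(id) FROM posts))")

src.close()
dst.close()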
Heroku's database conversion tool is called sequel. Here are the Ruby gems you need:
gem install sequel
gem install sqlite3
gem install pg
Then this worked for me for a sqlite database file named 'tweets.db' in the current working directory:
sequel -C sqlite://tweets.db postgres://pgusername:pgpassword@localhost/pgdatabasename
PostgreSQL supports "foreign data wrappers", which allow you to directly access any data source through the DB, including SQLite, even up to automatically importing the schema. You can then use create table localtbl as (select * from remotetbl) to get your data into actual PG storage.
https://wiki.postgresql.org/wiki/Foreign_data_wrappers
https://github.com/pgspider/sqlite_fdw
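A minimal sketch of that route with the sqlite_fdw extension linked above, driven from Python via psycopg2; the paths, schema, and table names are placeholders, and the same statements can also be run directly in psql:

# Mount a SQLite file as foreign tables in Postgres, then materialize a table.
# Requires the sqlite_fdw extension to be installed on the Postgres server;
# all paths and table names below are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=proddb user=pguser")
with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS sqlite_fdw")
    cur.execute("""
        CREATE SERVER sqlite_srv FOREIGN DATA WRAPPER sqlite_fdw
        OPTIONS (database '/path/to/dev.db')
    """)
    cur.execute("CREATE SCHEMA staging")
    # Auto-import the SQLite schema as foreign tables.
    cur.execute("IMPORT FOREIGN SCHEMA public FROM SERVER sqlite_srv INTO staging")
    # Materialize a foreign table into real PG storage.
    cur.execute("CREATE TABLE localtbl AS SELECT * FROM staging.remotetbl")
conn.close()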
I need to export the data of one SQLite database and import it into another SQLite database, but the two databases have different schemas.
Is there an open source tool that can help me do that job?
The only open source tool that I know of is opendbcopy, which I'm using to migrate from one database server to another, and also for a similar kind of job to the one you want to do with SQLite (though I've done it with PostgreSQL).
opendbcopy is JDBC-compliant and can connect to any database that has a JDBC driver, so you can give it a try; even if the schemas are not the same, you can use its column mapping feature.
In addition, I also know of a good commercial alternative (which is easier to use): ESF Database Migration Toolkit.