Connection to Google Bigtable in Google Cloud Dataproc using an HBase ODBC driver

Has anyone already made a connection to Google Bigtable in Google Cloud Dataproc using any available HBase ODBC driver? If so, can you tell me which ODBC driver you used? Thanks

As noted by Solomon in the comments, we don't have a native ODBC or JDBC driver for Google Cloud Bigtable. If you're building a .NET application and want to interact with Bigtable, we recommend using the native client here:
https://github.com/GoogleCloudPlatform/google-cloud-dotnet
Another approach that may work would be to use BigQuery, which does have ODBC support, and combine that with BigQuery's support for federated data access to Bigtable. That's a bit of a Rube Goldberg construction, though, so it may be more painful to set up and debug.
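If ODBC itself isn't a hard requirement, note that Bigtable's native clients include an HBase-compatible Java client, which is often the most direct path from a Dataproc cluster. Below is a minimal sketch assuming the bigtable-hbase artifact (e.g. com.google.cloud.bigtable:bigtable-hbase-1.x) is on the classpath; the project, instance, table, and column names are placeholders.

    // Sketch: reading one row from Bigtable through the HBase API.
    import com.google.cloud.bigtable.hbase.BigtableConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class BigtableHBaseExample {
        public static void main(String[] args) throws Exception {
            // "my-project" and "my-instance" are placeholders for your IDs.
            try (Connection connection =
                     BigtableConfiguration.connect("my-project", "my-instance")) {
                Table table = connection.getTable(TableName.valueOf("my-table"));
                Result row = table.get(new Get(Bytes.toBytes("row-key-1")));
                byte[] value = row.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col"));
                System.out.println(value == null ? "not found" : Bytes.toString(value));
            }
        }
    }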

Related

Is encryption at rest supported on remote protocol in OrientDB?

The OrientDB documentation mentions that encryption at rest is not yet supported over the remote protocol; it can be used only with plocal.
We are currently using OrientDB version 2.2.22. Database encryption is mandatory for us. We were previously using OrientDB in plocal mode, but we now have a new requirement in which multiple processes from different JVMs need to connect to the same OrientDB database, which is not possible in plocal mode.
Is there any way we can achieve this? Is there any workaround? Is this feature going to be supported in upcoming releases?
If you start your server and provide the encryption key at startup, the database is accessible via the remote protocol from that point on, so it would work. I suggest encrypting the TCP/IP connection as well in that case.
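A rough sketch of that idea using the OrientDB 2.2.x Java API: in plocal mode the opening process must supply the AES key itself, while in remote mode the key is given to the server at startup (e.g. via -Dstorage.encryptionKey=...) and clients connect as usual. Paths, credentials, and the key below are placeholders.

    import com.orientechnologies.orient.core.config.OGlobalConfiguration;
    import com.orientechnologies.orient.core.db.document.ODatabaseDocumentTx;

    public class EncryptedOrientDb {
        public static void main(String[] args) {
            // plocal: the opening process must know the key
            ODatabaseDocumentTx local = new ODatabaseDocumentTx("plocal:/data/mydb");
            local.setProperty(OGlobalConfiguration.STORAGE_ENCRYPTION_KEY.getKey(),
                              "T1JJRU5UREJfSVNfQ09PTA=="); // base64-encoded AES key
            local.open("admin", "admin");
            local.close();

            // remote: the server already holds the key, so the client needs none
            ODatabaseDocumentTx remote = new ODatabaseDocumentTx("remote:localhost/mydb");
            remote.open("admin", "admin");
            remote.close();
        }
    }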
No, it cannot currently be done:
NOTE: Encryption at rest is not supported on remote protocol yet. It can be used only with plocal.
Given your new requirements, it seems like OrientDB is not the right choice for you anymore.

MSDTC is not supported by AWS RDS SQL Server

I have a TransactionScope in my code that promotes the transaction to MSDTC. But when I run this code in the AWS cloud, where the database is RDS SQL Server, MSDTC is not supported. How can I make this work, or what would be an alternative? I need MSDTC in my code.
Well, there isn't much you can do, since AWS does not support it (https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_SQLServer.html).
Features Not Supported and Features with Limited Support
The following Microsoft SQL Server features are not supported on Amazon RDS:
Stretch database
Backing up to Microsoft Azure Blob Storage
Buffer pool extension
Data Quality Services
Database Log Shipping
Database Mail
Distributed Transaction Coordinator (MSDTC)
File tables
FILESTREAM support
Maintenance Plans
Performance Data Collector
...
...
The alternative is to deploy/host/manage your own MSSQL server on AWS.
Looks like this is now supported by AWS: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.SQLServer.Options.MSDTC.html

Streaming data from Oracle 11g to Kafka

I am looking for a solution to stream data from Oracle 11g to Kafka. I was hoping to use GoldenGate, but that only seems to be available for Oracle 12c. Is the Confluent platform the best way to go?
Thanks!
First, the general answer would be: The best way to connect Oracle (databases) to Kafka is indeed to use Confluent Platform with Kafka's Connect API in combination with a ready-to-use connector for GoldenGate. See the GoldenGate/Oracle entry in section "Certified Connectors" at https://www.confluent.io/product/connectors/. The listed Kafka connector for GoldenGate is maintained by Oracle.
Is the Confluent platform the best way to go?
Hence, in general, the answer to the above question is: "Yes, it is."
However, regarding your specific question about Oracle versions, Oracle unfortunately has the following information in the README of their GoldenGate connector:
Supported Versions
The Oracle GoldenGate Kafka Connect Handler/Formatter is coded and tested with the following product versions.
Oracle GoldenGate for Big Data 12.2.0.1.1
Confluent IO Kafka/Kafka Connect 0.9.0.1-cp1
Porting may be required for Oracle GoldenGate Kafka Connect Handler/Formatter to work with other versions of Oracle GoldenGate for Big Data and/or Confluent IO Kafka/Kafka Connect
This means that the connector does not work with Oracle 11g, at least as far as I can tell.
Sorry if that doesn't answer your specific question. At least I wanted to give you some feedback on the general approach. If I do come across a more specific answer, I'll update this text.
Update Mar 15, 2017: The best option you have at the moment is to use Confluent's JDBC connector. That connector can't give you quite the same feature set as Oracle's native GoldenGate connector, though.
Oracle GoldenGate and Confluent Platform are not comparable.
Confluent Platform provides the complete streaming platform and is a collection of multiple pieces of software that can be used for streaming your data, whereas GoldenGate is replication and data-integration software.
Also, GoldenGate is highly reliable for database replication since it maintains transactional integrity; the same cannot be said for Kafka MirrorMaker or Confluent's Replicator at this time.
If you want just pure transactions, please also consider using OpenLogReplicator. It supports Oracle Database from version 11.2.0.1.
It can produce transactions to Kafka in two formats:
Classic format: every transaction is one Kafka message (multiple DMLs per Kafka message)
Debezium-style format: transactions are divided, so every DML is one Kafka message
There is already a working version. You can try it.
Right now I am using ojdbc6 to connect to Oracle 11g. It is good enough, but not perfect, especially when using polling mode to check whether there are new updates on the original tables.
I also tried reading all tables matching a certain pattern, but this did not work well.
The best way to connect an Oracle DB to Kafka, especially when the tables are very wide (column-wise), is to use queries for the connectors. This way, you ensure that you pick the right fields and do some casting for numbers if you are using Avro. A sketch of this query-based polling approach follows.
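For illustration, here is a hand-rolled sketch of that query-based polling approach, assuming ojdbc and kafka-clients are on the classpath; the JDBC source connector does essentially this (plus durable offset tracking) in its incrementing mode. Table, column, topic, and connection details are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class OraclePollingProducer {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer",
                      "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                      "org.apache.kafka.common.serialization.StringSerializer");

            long lastSeenId = 0; // a real connector persists this offset
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props);
                 Connection db = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password")) {
                PreparedStatement stmt = db.prepareStatement(
                    // pick exactly the fields you need, casting where necessary
                    "SELECT id, name, CAST(amount AS NUMBER(18,2)) AS amount " +
                    "FROM orders WHERE id > ? ORDER BY id");
                while (true) {
                    stmt.setLong(1, lastSeenId);
                    try (ResultSet rs = stmt.executeQuery()) {
                        while (rs.next()) {
                            lastSeenId = rs.getLong("id");
                            String value = rs.getString("name") + ","
                                         + rs.getBigDecimal("amount");
                            producer.send(new ProducerRecord<>(
                                "orders", Long.toString(lastSeenId), value));
                        }
                    }
                    Thread.sleep(5000); // poll interval
                }
            }
        }
    }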

How to use ODBC to connect to any DBMS

I'm developing a Java application and I'm using JDBC to connect to a MySQL database. Now I want to use ODBC to be able to retrieve data from any DBMS, provided of course that I have access to it. Is there an API or tool to do this?
What you are looking for is a JDBC-ODBC bridge. There are several available. This is not recommended, though; you should always prefer a native JDBC driver instead.
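For reference, a minimal sketch of the Sun JDBC-ODBC bridge that shipped with the JDK up to Java 7 (it was removed in Java 8, which is one more reason to prefer native JDBC drivers). "MyDsn" is a placeholder for an ODBC data source name configured on the machine.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class JdbcOdbcExample {
        public static void main(String[] args) throws Exception {
            Class.forName("sun.jdbc.odbc.JdbcOdbcDriver"); // JDK <= 7 only
            try (Connection con = DriverManager.getConnection("jdbc:odbc:MyDsn");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT 1")) {
                while (rs.next()) {
                    System.out.println(rs.getInt(1));
                }
            }
        }
    }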

HCatalog & Impala integration

Is there a way to use WebHcat to submit Impala queries?
As far as I understand, Impala uses the same metastore as Hive, and HCatalog gives unified access to this metastore.
Unfortunately Impala queries are submitted to a different service endpoint than Hive queries, so you can't use WebHCat to submit queries to Impala.
If you're curious, here's a bit more information about how to submit queries to Impala. First, read the Impala Concepts and Architecture documentation. As you now know, you can submit your query to any node running the impalad daemon. The interface exposed by this daemon is specified in ImpalaService.thrift. A number of open source clients have been implemented that let you submit queries to Impala from the command line, from a web interface, or from a library in your favorite programming language. Here are a few examples (a JDBC sketch follows the list):
impala-shell: command-line interface that ships with Impala
Impala app in Cloudera Hue: web interface
impyla: Python library
impala-ruby: Ruby library
php-impala: PHP library
ImpalaSharp: C# library
impala-java-client: Java library
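In addition to the clients above, Impala speaks the HiveServer2 protocol, so a Java program can also connect through the Hive JDBC driver against an impalad on the default JDBC port 21050. In this sketch the host name and the auth=noSasl URL setting are assumptions that depend on your cluster's security setup.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ImpalaJdbcExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection con = DriverManager.getConnection(
                     "jdbc:hive2://impalad-host:21050/;auth=noSasl");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM my_table")) {
                while (rs.next()) {
                    System.out.println(rs.getLong(1));
                }
            }
        }
    }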
