
Cloudera JDBC Driver for Apache Hive

Important Notice

© 2010-2019 Cloudera, Inc. All rights reserved.

Cloudera, the Cloudera logo, and any other product or service names or slogans contained in this
document, except as otherwise disclaimed, are trademarks of Cloudera and its suppliers or
licensors, and may not be copied, imitated or used, in whole or in part, without the prior written
permission of Cloudera or the applicable trademark holder.

Hadoop and the Hadoop elephant logo are trademarks of the Apache Software Foundation. All
other trademarks, registered trademarks, product names and company names or logos
mentioned in this document are the property of their respective owners. Reference to any
products, services, processes or other information, by trade name, trademark, manufacturer,
supplier or otherwise does not constitute or imply endorsement, sponsorship or
recommendation thereof by us.

Complying with all applicable copyright laws is the responsibility of the user. Without limiting the
rights under copyright, no part of this document may be reproduced, stored in or introduced into
a retrieval system, or transmitted in any form or by any means (electronic, mechanical,
photocopying, recording, or otherwise), or for any purpose, without the express written
permission of Cloudera.

Cloudera may have patents, patent applications, trademarks, copyrights, or other intellectual
property rights covering subject matter in this document. Except as expressly provided in any
written license agreement from Cloudera, the furnishing of this document does not give you any
license to these patents, trademarks, copyrights, or other intellectual property.

The information in this document is subject to change without notice. Cloudera shall not be liable
for any damages resulting from technical errors or omissions which may be present in this
document, or from use of this document.

Cloudera, Inc.
1001 Page Mill Road, Building 2
Palo Alto, CA 94304-1008
[email protected]
US: 1-888-789-1488
Intl: 1-650-843-0595
www.cloudera.com

Release Information

Version: 2.6.5

Date: February 21, 2019

Table of Contents
About the Cloudera JDBC Driver for Apache Hive 5
System Requirements 6
Cloudera JDBC Driver for Apache Hive Files 7
Installing and Using the Cloudera JDBC Driver for Apache Hive 8
Referencing the JDBC Driver Libraries 8
Registering the Driver Class 9
Building the Connection URL 10
Configuring Authentication 12
Using No Authentication 12
Using Kerberos 12
Using User Name 13
Using User Name And Password (LDAP) 14
Using a Hadoop Delegation Token 15
Authentication Mechanisms 17
Configuring Kerberos Authentication for Windows 19
Kerberos Encryption Strength and the JCE Policy Files Extension 23
Configuring SSL 25
Configuring Server-Side Properties 27
Configuring Logging 28
Features 30
SQL Query versus HiveQL Query 30
Data Types 30
Catalog and Schema Support 31
Write-back 31
IHadoopStatement 32
IHadoopConnection 35
Security and Authentication 38
Interfaces and Supported Methods 38
Driver Configuration Options 93
AllowSelfSignedCerts 93
AsyncExecPollInterval 93
AuthMech 94
CAIssuedCertsMismatch 94
CatalogSchemaSwitch 94
DecimalColumnScale 95
DefaultStringColumnLength 95
DelegationToken 95
DelegationUID 96
httpPath 96
KrbAuthType 96
KrbHostFQDN 97
KrbRealm 97
KrbServiceName 98
LogLevel 98
LogPath 99
PreparedMetaLimitZero 99
PWD 99
RowsFetchedPerBlock 100
SocketTimeout 100
SSL 100
SSLKeyStore 101
SSLKeyStorePwd 101
SSLTrustStore 101
SSLTrustStorePwd 102
TransportMode 102
UID 103
UseNativeQuery 103
ZK 103
Contact Us 105
Third-Party Trademarks 106
Third-Party Licenses 107

About the Cloudera JDBC Driver for Apache Hive


The Cloudera JDBC Driver for Apache Hive is used for direct SQL and HiveQL access to Apache
Hadoop / Hive distributions, enabling Business Intelligence (BI), analytics, and reporting on
Hadoop / Hive-based data. The driver efficiently transforms an application’s SQL query into the
equivalent form in HiveQL, which is a subset of SQL-92. If an application is Hive-aware, then the
driver is configurable to pass the query through to the database for processing. The driver
interrogates Hive to obtain schema information to present to a SQL-based application. Queries,
including joins, are translated from SQL to HiveQL. For more information about the differences
between HiveQL and SQL, see "Features" on page 30.

The Cloudera JDBC Driver for Apache Hive complies with the JDBC 4.0 and 4.1 data standards.
JDBC is one of the most established and widely supported APIs for connecting to and working with
databases. At the heart of the technology is the JDBC driver, which connects an application to the
database. For more information about JDBC, see Data Access Standards on the Simba
Technologies website: https://www.simba.com/resources/data-access-standards-glossary.

This guide is suitable for users who want to access data residing within Hive from their desktop
environment. Application developers might also find the information helpful. Refer to your
application for details on connecting via JDBC.

System Requirements
Each machine where you use the Cloudera JDBC Driver for Apache Hive must have Java Runtime
Environment (JRE) installed. The version of JRE that must be installed depends on the version of
the JDBC API you are using with the driver. The following table lists the required version of JRE for
each provided version of the JDBC API.

JDBC API Version JRE Version

4.0 6.0 or later

4.1 7.0 or later

The driver is recommended for Apache Hive versions 0.11 through 3.1, and CDH versions 5.0
through 5.15. The driver also supports later minor versions of CDH 5.

Cloudera JDBC Driver for Apache Hive Files


The Cloudera JDBC Driver for Apache Hive is delivered in the following ZIP archives, where
[Version] is the version number of the driver:
l HiveJDBC4_[Version].zip
l HiveJDBC41_[Version].zip

The archive contains the driver supporting the JDBC API version indicated in the archive name, as
well as release notes and third-party license information. In addition, the required third-party
libraries and dependencies are packaged and shared in the driver JAR file in the archive.

Installing and Using the Cloudera JDBC Driver for Apache Hive

To install the Cloudera JDBC Driver for Apache Hive on your machine, extract the files from the
appropriate ZIP archive to the directory of your choice.

To access a Hive data store using the Cloudera JDBC Driver for Apache Hive, you need to configure
the following:
l The list of driver library files (see "Referencing the JDBC Driver Libraries" on page 8)
l The Driver or DataSource class (see "Registering the Driver Class" on page 9)
l The connection URL for the driver (see "Building the Connection URL" on page 10)

Referencing the JDBC Driver Libraries


Before you use the Cloudera JDBC Driver for Apache Hive, the JDBC application or Java code that
you are using to connect to your data must be able to access the driver JAR files. In the application
or code, specify all the JAR files that you extracted from the ZIP archive.

Using the Driver in a JDBC Application

Most JDBC applications provide a set of configuration options for adding a list of driver library
files. Use the provided options to include all the JAR files from the ZIP archive as part of the driver
configuration in the application. For more information, see the documentation for your JDBC
application.

Using the Driver in Java Code

You must include all the driver library files in the class path. This is the path that the Java Runtime
Environment searches for classes and other resource files. For more information, see "Setting the
Class Path" in the appropriate Java SE Documentation.

For Java SE 6:


l For Windows:
https://docs.oracle.com/javase/6/docs/technotes/tools/windows/classpath.html
l For Linux and Solaris:
https://docs.oracle.com/javase/6/docs/technotes/tools/solaris/classpath.html

For Java SE 7:


l For Windows:
http://docs.oracle.com/javase/7/docs/technotes/tools/windows/classpath.html
l For Linux and Solaris:
http://docs.oracle.com/javase/7/docs/technotes/tools/solaris/classpath.html

For Java SE 8:
l For Windows:
http://docs.oracle.com/javase/8/docs/technotes/tools/windows/classpath.html


l For Linux and Solaris:
http://docs.oracle.com/javase/8/docs/technotes/tools/solaris/classpath.html

Registering the Driver Class


Before connecting to your data, you must register the appropriate class for your application.

The following is a list of the classes used to connect the Cloudera JDBC Driver for Apache Hive to
Hive data stores. The Driver classes extend java.sql.Driver, and the DataSource
classes extend javax.sql.DataSource and
javax.sql.ConnectionPoolDataSource.

To support JDBC 4.0, classes with the following fully-qualified class names (FQCNs) are available:
l com.cloudera.hive.jdbc4.HS1Driver
l com.cloudera.hive.jdbc4.HS2Driver
l com.cloudera.hive.jdbc4.HS1DataSource
l com.cloudera.hive.jdbc4.HS2DataSource

To support JDBC 4.1, classes with the following FQCNs are available:
l com.cloudera.hive.jdbc41.HS1Driver
l com.cloudera.hive.jdbc41.HS2Driver
l com.cloudera.hive.jdbc41.HS1DataSource
l com.cloudera.hive.jdbc41.HS2DataSource

The following sample code shows how to use the DriverManager to establish a connection for
JDBC 4.0:

Note:

In these examples, the line Class.forName(DRIVER_CLASS); is only required for JDBC 4.0.
private static Connection connectViaDM() throws Exception
{
    Connection connection = null;
    Class.forName(DRIVER_CLASS);
    connection = DriverManager.getConnection(CONNECTION_URL);
    return connection;
}

The following sample code shows how to use the DataSource class to establish a connection:
private static Connection connectViaDS() throws Exception
{
    Connection connection = null;
    Class.forName(DRIVER_CLASS);
    DataSource ds = new com.cloudera.hive.jdbc41.HS1DataSource();
    ds.setURL(CONNECTION_URL);
    connection = ds.getConnection();
    return connection;
}

Building the Connection URL


Use the connection URL to supply connection information to the data source that you are
accessing. The following is the format of the connection URL for the Cloudera JDBC Driver for
Apache Hive, where [Subprotocol] is hive if you are connecting to a Hive Server 1 instance or hive2
if you are connecting to a Hive Server 2 instance, [Host] is the DNS or IP address of the Hive server,
and [Port] is the number of the TCP port that the server uses to listen for client requests:
jdbc:[Subprotocol]://[Host]:[Port]

Note:

By default, Hive uses port 10000.

By default, the driver uses the schema named default and authenticates the connection using the
user name anonymous.

You can specify optional settings such as the schema to use or any of the
connection properties supported by the driver. For a list of the properties available in the driver,
see "Driver Configuration Options" on page 93.

Note:

If you specify a property that is not supported by the driver, then the driver attempts to apply
the property as a Hive server-side property for the client session. For more information, see
"Configuring Server-Side Properties" on page 27.

The following is the format of a connection URL that specifies some optional settings:
jdbc:[Subprotocol]://[Host]:[Port]/[Schema];[Property1]=[Value];
[Property2]=[Value];...

For example, to connect to port 11000 on a Hive Server 2 instance installed on the local machine,
use a schema named default2, and authenticate the connection using a user name and password,
you would use the following connection URL:
jdbc:hive2://localhost:11000/default2;AuthMech=3;
UID=cloudera;PWD=cloudera
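
The following is a minimal sketch, in the style of the earlier DriverManager sample, of how such a URL might be used from Java code. The port, schema, and credentials are the example values above, not values required by the driver:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class ConnectionUrlExample
{
    public static void main(String[] args) throws SQLException
    {
        // Example URL from above: Hive Server 2 on the local machine, port 11000,
        // schema default2, user name and password authentication (AuthMech=3).
        String url = "jdbc:hive2://localhost:11000/default2;AuthMech=3;"
            + "UID=cloudera;PWD=cloudera";

        // Open the connection; the driver JAR files must be on the class path.
        Connection connection = DriverManager.getConnection(url);

        // Work with the connection, then close it.
        connection.close();
    }
}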


Important:

l Properties are case-sensitive.


l Do not duplicate properties in the connection URL.

Note:

l If you specify a schema in the connection URL, you can still issue queries on other
schemas by explicitly specifying the schema in the query. To inspect your databases and
determine the appropriate schema to use, run the show databases command at the
Hive command prompt.
l If you set the transportMode property to http, then the port number specified in the
connection URL corresponds to the HTTP port rather than the TCP port. By default, Hive
servers use 10001 as the HTTP port number.

Configuring Authentication
The Cloudera JDBC Driver for Apache Hive supports the following authentication mechanisms:
l No Authentication
l Kerberos
l User Name
l User Name And Password
l Hadoop Delegation Token

You configure the authentication mechanism that the driver uses to connect to Hive by specifying
the relevant properties in the connection URL.

For information about selecting an appropriate authentication mechanism when using the
Cloudera JDBC Driver for Apache Hive, see "Authentication Mechanisms" on page 17.

For information about the properties you can use in the connection URL, see "Driver
Configuration Options" on page 93.

Note:

In addition to authentication, you can configure the driver to connect over SSL. For more
information, see "Configuring SSL" on page 25.

Using No Authentication

Note:

When connecting to a Hive server of type Hive Server 1, you must use No Authentication.

You provide this information to the driver in the connection URL. For more information about the
syntax of the connection URL, see "Building the Connection URL" on page 10.

To configure a connection without authentication:


1. Set the AuthMech property to 0.
2. Set the transportMode property to binary.

For example:
jdbc:hive2://localhost:10000;AuthMech=0;transportMode=binary;

Using Kerberos
Kerberos must be installed and configured before you can use this authentication mechanism. For
information about configuring and operating Kerberos on Windows, see "Configuring Kerberos
Authentication for Windows" on page 19. For other operating systems, see the MIT Kerberos
documentation: http://web.mit.edu/kerberos/krb5-latest/doc/.


You provide this information to the driver in the connection URL. For more information about the
syntax of the connection URL, see "Building the Connection URL" on page 10.

Note:

l This authentication mechanism is available only for Hive Server 2.


l For this authentication mechanism, the SASL and HTTP Thrift transport protocols are
supported. If the transportMode property is not set, the driver defaults to SASL. If the
hive.server2.transport.mode property has been set to HTTP on the server side, set the
transportMode property to http.

To configure default Kerberos authentication:


1. Set the AuthMech property to 1.
2. To use the default realm defined in your Kerberos setup, do not set the KrbRealm
property.

If your Kerberos setup does not define a default realm or if the realm of your Hive server is
not the default, then set the KrbRealm property to the realm of the Hive server.
3. Set the KrbHostFQDN property to the fully qualified domain name of the Hive server host.
4. Optionally, specify how the driver obtains the Kerberos Subject by setting the
KrbAuthType property as follows:
l To configure the driver to automatically detect which method to use for obtaining
the Subject, set the KrbAuthType property to 0. Alternatively, do not set the
KrbAuthType property.
l Or, to create a LoginContext from a JAAS configuration and then use the Subject
associated with it, set the KrbAuthType property to 1.
l Or, to create a LoginContext from a Kerberos ticket cache and then use the Subject
associated with it, set the KrbAuthType property to 2.

For more detailed information about how the driver obtains Kerberos Subjects based on
these settings, see "KrbAuthType" on page 96.

For example, the following connection URL connects to a Hive server with Kerberos enabled, but
without SSL enabled:
jdbc:hive2://node1.example.com:10000;AuthMech=1;
KrbRealm=EXAMPLE.COM;KrbHostFQDN=hs2node1.example.com;
KrbServiceName=hive;KrbAuthType=2

In this example, Kerberos is enabled for JDBC connections, the Kerberos service principal name is
hive/hs2node1.example.com@EXAMPLE.COM, the host name for the data source is
node1.example.com, and the server is listening on port 10000 for JDBC connections.
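
As a rough illustration, the sketch below combines this URL pattern with a JAAS configuration (KrbAuthType=1 instead of the ticket cache used above). The realm, host names, and file path are placeholders rather than required values:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class KerberosConnectionExample
{
    public static void main(String[] args) throws SQLException
    {
        // Point the JVM at a JAAS login configuration (hypothetical path).
        // With KrbAuthType=1 the driver creates a LoginContext from this
        // configuration and uses the Subject associated with it.
        System.setProperty("java.security.auth.login.config",
            "C:\\KerberosLoginConfig.ini");

        String url = "jdbc:hive2://node1.example.com:10000;AuthMech=1;"
            + "KrbRealm=EXAMPLE.COM;KrbHostFQDN=hs2node1.example.com;"
            + "KrbServiceName=hive;KrbAuthType=1";

        Connection connection = DriverManager.getConnection(url);

        // Work with the connection, then close it.
        connection.close();
    }
}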

Using User Name


This authentication mechanism requires a user name but does not require a password. The user
name labels the session, facilitating database tracking.


You provide this information to the driver in the connection URL. For more information about the
syntax of the connection URL, see "Building the Connection URL" on page 10.

Note:

This authentication mechanism is available only for Hive Server 2. Most default configurations of
Hive Server 2 require User Name authentication.

To configure User Name authentication:


1. Set the AuthMech property to 2.
2. Set the transportMode property to sasl.
3. Set the UID property to an appropriate user name for accessing the Hive server.

For example:
jdbc:hive2://node1.example.com:10000;AuthMech=2;
transportMode=sasl;UID=hs2

Using User Name And Password (LDAP)


This authentication mechanism requires a user name and a password. It is most commonly used
with LDAP authentication.

You provide this information to the driver in the connection URL. For more information about the
syntax of the connection URL, see "Building the Connection URL" on page 10.

Note:

This authentication mechanism is available only for Hive Server 2.

To configure User Name And Password authentication:


1. Set the AuthMech property to 3.
2. Set the transportMode property to the transport protocol that you want to use in the
Thrift layer.
3. If you set the transportMode property to http, then set the httpPath property to
the partial URL corresponding to the Hive server. Otherwise, do not set the httpPath
property.
4. Set the UID property to an appropriate user name for accessing the Hive server.
5. Set the PWD property to the password corresponding to the user name you provided.

For example, the following connection URL connects to a Hive server with LDAP authentication
enabled, but without SSL or SASL enabled:
jdbc:hive2://node1.example.com:10001;AuthMech=3;
transportMode=http;httpPath=cliservice;UID=hs2;PWD=cloudera;


In this example, user name and password (LDAP) authentication is enabled for JDBC connections,
the LDAP user name is hs2, the password is cloudera, and the server is listening on port 10001 for
JDBC connections.
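
A minimal sketch of how this URL might be used from Java code is shown below. The host, path, and credentials are the example values above, not required settings:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class LdapConnectionExample
{
    public static void main(String[] args) throws SQLException
    {
        // Example URL from above: HTTP transport on port 10001 with LDAP
        // credentials supplied through UID and PWD.
        String url = "jdbc:hive2://node1.example.com:10001;AuthMech=3;"
            + "transportMode=http;httpPath=cliservice;UID=hs2;PWD=cloudera";

        Connection connection = DriverManager.getConnection(url);

        // Run a simple query to verify the connection.
        Statement statement = connection.createStatement();
        ResultSet resultSet = statement.executeQuery("show databases");
        while (resultSet.next())
        {
            System.out.println(resultSet.getString(1));
        }

        connection.close();
    }
}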

Using a Hadoop Delegation Token


This authentication mechanism requires a Hadoop delegation token. This token must be provided
to the driver in the form of a Base64 URL-safe encoded string. It can be obtained from the driver
using the getDelegationToken() function, or by utilizing the Hadoop distribution .jar
files. For a code sample that demonstrates how to retrieve the token using the
getDelegationToken() function, see "Code Samples: Retrieving a Hadoop Delegation
Token" on page 15.

You provide this information to the driver in the connection URL. For more information about the
syntax of the connection URL, see "Building the Connection URL" on page 10.

Note:

l This authentication mechanism is available only for Hive Server 2.


l This authentication mechanism requires that Kerberos be configured on the server.

To configure Hadoop delegation token authentication:


1. Make sure Kerberos is configured on the server.
2. Set the AuthMech property to 6.
3. Set the delegationToken property to an appropriately encoded Hadoop delegation
token.

For example:
jdbc:hive2://node1.example.com:10000;AuthMech=6;delegationToken=kP9PcyQ7prK2LwUMZMpFQ4R+5VE

Code Samples: Retrieving a Hadoop Delegation Token

If you are using a Hadoop delegation token for authentication, the token must be provided to the
driver in the form of a Base64 URL-safe encoded string. This token can be obtained from the driver
using the getDelegationToken() function, or by utilizing the Hadoop distribution .jar
files.

The code samples below demonstrate the use of the getDelegationToken() function. For
more information about this function, see "IHadoopConnection" on page 35.

The sample below shows how to obtain the token string with the driver using a Kerberos
connection:
import com.cloudera.hiveserver2.hivecommon.core.IHadoopConnection;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class TestDriverGetDelegationTokenClass
{
    public static void main(String[] args) throws SQLException
    {
        // Create the connection object with Kerberos authentication.
        Connection kerbConnection = DriverManager.getConnection(
            "jdbc:hive2://localhost:10000;AuthMech=1;KrbRealm=YourRealm;"
            + "KrbHostFQDN=sample.com;KrbServiceName=hive;");

        // Unwrap the java.sql.Connection object to an implementation of
        // IHadoopConnection so the methods for delegation token can be called.
        String delegationToken = kerbConnection.unwrap(
            IHadoopConnection.class).getDelegationToken("owner_name", "renewer_name");

        // The token can then be used to connect with the driver.
        String tokenConnectionString =
            "jdbc:hive2://localhost:10000;AuthMech=6;DelegationToken=" + delegationToken;
        Connection tokenConnection =
            DriverManager.getConnection(tokenConnectionString);
    }
}

The sample below demonstrates how to obtain the encoded string form of the token if the
delegation token is saved to the UserGroupInformation. This sample requires the
hadoop-shims-common-[hadoop version].jar, hadoop-common-[hadoop version].jar, and all
their dependencies.
import org.apache.hadoop.hive.shims.Utils;
import org.apache.hive.service.auth.HiveAuthFactory;

import java.sql.Connection;
import java.sql.DriverManager;

public class TestHadoopDelegationTokenClass
{
    public static void main(String[] args) throws Exception
    {
        // Obtain the delegationToken stored in the current UserGroupInformation.
        String delegationToken = Utils.getTokenStrForm(HiveAuthFactory.HS2_CLIENT_TOKEN);

        // The token can then be used to connect with the driver.
        String tokenConnectionString =
            "jdbc:hive2://localhost:10000;AuthMech=6;DelegationToken=" + delegationToken;
        Connection tokenConnection =
            DriverManager.getConnection(tokenConnectionString);
    }
}

Authentication Mechanisms
To connect to a Hive server, you must configure the Cloudera JDBC Driver for Apache Hive to use
the authentication mechanism that matches the access requirements of the server and provides
the necessary credentials. To determine the authentication settings that your Hive server
requires, check the server configuration and then refer to the corresponding section below.

Hive Server 1

Hive Server 1 does not support authentication. You must configure the driver to use No
Authentication (see "Using No Authentication" on page 12).

Hive Server 2

Hive Server 2 supports the following authentication mechanisms:


l No Authentication (see "Using No Authentication" on page 12)
l Kerberos (see "Using Kerberos" on page 12)
l User Name (see "Using User Name" on page 13)
l User Name And Password (see "Using User Name And Password (LDAP)" on page 14)
l Hadoop Delegation Token (see "Using a Hadoop Delegation Token" on page 15)

Most default configurations of Hive Server 2 require User Name authentication. If you are unable
to connect to your Hive server using User Name authentication, then verify the authentication
mechanism configured for your Hive server by examining the hive-site.xml file. Examine the
following properties to determine which authentication mechanism your server is set to use:

l hive.server2.authentication: This property sets the authentication mode for Hive Server 2.
The following values are available:
o NONE enables plain SASL transport. This is the default value.
o NOSASL disables the Simple Authentication and Security Layer (SASL).
o KERBEROS enables Kerberos authentication and delegation token authentication.
o PLAINSASL enables user name and password authentication using a cleartext password mechanism.
o LDAP enables user name and password authentication using the Lightweight Directory Access Protocol (LDAP).
l hive.server2.enable.doAs: If this property is set to the default value of TRUE, then
Hive processes queries as the user who submitted the query. If this property is set to
FALSE, then queries are run as the user that runs the hiveserver2 process.


The following table lists the authentication mechanisms to configure for the driver based on the
settings in the hive-site.xml file.

hive.server2.authentication hive.server2.enable.doAs Driver Authentication Mechanism

NOSASL FALSE No Authentication

KERBEROS TRUE or FALSE Kerberos

KERBEROS TRUE Delegation Token

NONE TRUE or FALSE User Name

PLAINSASL or LDAP TRUE or FALSE User Name And Password

Note:

It is an error to set hive.server2.authentication to NOSASL and hive.server2.enable.doAs to
true. This configuration will not prevent the service from starting up, but results in an
unusable service.

For more information about authentication mechanisms, refer to the documentation for your
Hadoop / Hive distribution. See also "Running Hadoop in Secure Mode" in the Apache Hadoop
documentation: http://hadoop.apache.org/docs/r0.23.7/hadoop-project-dist/hadoop-common/ClusterSetup.html#Running_Hadoop_in_Secure_Mode.

Using No Authentication

When hive.server2.authentication is set to NOSASL, you must configure your connection to use
No Authentication.

Using Kerberos

When connecting to a Hive Server 2 instance and hive.server2.authentication is set to
KERBEROS, you must configure your connection to use Kerberos or Delegation Token
authentication.

Using User Name

When connecting to a Hive Server 2 instance and hive.server2.authentication is set to
NONE, you must configure your connection to use User Name authentication. Validation of the
credentials that you include depends on hive.server2.enable.doAs:
l If hive.server2.enable.doAs is set to TRUE, then the server attempts to map the
user name provided by the driver from the driver configuration to an existing operating
system user on the host running Hive Server 2. If this user name does not exist in the
operating system, then the user group lookup fails and existing HDFS permissions are used.
For example, if the current user group is allowed to read and write to the location in HDFS,

then read and write queries are allowed.


l If hive.server2.enable.doAs is set to FALSE, then the user name in the driver
configuration is ignored.

If no user name is specified in the driver configuration, then the driver defaults to using hive as
the user name.

Using User Name And Password

When connecting to a Hive Server 2 instance and the server is configured to use the SASL-PLAIN
authentication mechanism with a user name and a password, you must configure your
connection to use User Name And Password authentication.

Configuring Kerberos Authentication for Windows


You can configure your Kerberos setup so that you use the MIT Kerberos Ticket Manager to get
the Ticket Granting Ticket (TGT), or configure the setup so that you can use the driver to get the
ticket directly from the Key Distribution Center (KDC). Also, if a client application obtains a Subject
with a TGT, it is possible to use that Subject to authenticate the connection.

Downloading and Installing MIT Kerberos for Windows

To download and install MIT Kerberos for Windows 4.0.1:


1. Download the appropriate Kerberos installer:
l For a 64-bit machine, use the following download link from the MIT Kerberos
website: http://web.mit.edu/kerberos/dist/kfw/4.0/kfw-4.0.1-amd64.msi.
l For a 32-bit machine, use the following download link from the MIT Kerberos
website: http://web.mit.edu/kerberos/dist/kfw/4.0/kfw-4.0.1-i386.msi.

Note:

The 64-bit installer includes both 32-bit and 64-bit libraries. The 32-bit installer includes 32-
bit libraries only.

2. To run the installer, double-click the .msi file that you downloaded.
3. Follow the instructions in the installer to complete the installation process.
4. When the installation completes, click Finish.

Using the MIT Kerberos Ticket Manager to Get Tickets

Setting the KRB5CCNAME Environment Variable

You must set the KRB5CCNAME environment variable to your credential cache file.


To set the KRB5CCNAME environment variable:


1. Click Start, then right-click Computer, and then click Properties.
2. Click Advanced System Settings.
3. In the System Properties dialog box, on the Advanced tab, click Environment Variables.
4. In the Environment Variables dialog box, under the System Variables list, click New.
5. In the New System Variable dialog box, in the Variable Name field, type KRB5CCNAME.
6. In the Variable Value field, type the path for your credential cache file. For example, type
C:\KerberosTickets.txt.
7. Click OK to save the new variable.
8. Make sure that the variable appears in the System Variables list.
9. Click OK to close the Environment Variables dialog box, and then click OK to close the
System Properties dialog box.
10. Restart your machine.

Getting a Kerberos Ticket

To get a Kerberos ticket:


1. Click Start, then click All Programs, and then click the Kerberos for Windows (64-bit) or
Kerberos for Windows (32-bit) program group.
2. Click MIT Kerberos Ticket Manager.
3. In the MIT Kerberos Ticket Manager, click Get Ticket.
4. In the Get Ticket dialog box, type your principal name and password, and then click OK.

If the authentication succeeds, then your ticket information appears in the MIT Kerberos Ticket
Manager.

Authenticating to the Hive Server

You provide this information to the driver in the connection URL. For more information about the
syntax of the connection URL, see "Building the Connection URL" on page 10.

To authenticate to the Hive server:


Use a connection URL that has the following properties defined:
l AuthMech
l KrbHostFQDN
l KrbRealm
l KrbServiceName

For detailed information about these properties, see "Driver Configuration Options" on page 93.


Using the Driver to Get Tickets

Deleting the KRB5CCNAME Environment Variable

To enable the driver to get Ticket Granting Tickets (TGTs) directly, make sure that the
KRB5CCNAME environment variable has not been set.

To delete the KRB5CCNAME environment variable:


1. Click the Start button, then right-click Computer, and then click Properties.
2. Click Advanced System Settings.
3. In the System Properties dialog box, click the Advanced tab and then click Environment
Variables.
4. In the Environment Variables dialog box, check if the KRB5CCNAME variable appears in the
System variables list. If the variable appears in the list, then select the variable and click
Delete.
5. Click OK to close the Environment Variables dialog box, and then click OK to close the
System Properties dialog box.

Setting Up the Kerberos Configuration File

To set up the Kerberos configuration file:


1. Create a standard krb5.ini file and place it in the C:\Windows directory.
2. Make sure that the KDC and Admin server specified in the krb5.ini file can be resolved
from your terminal. If necessary, modify
C:\Windows\System32\drivers\etc\hosts.

Setting Up the JAAS Login Configuration File

To set up the JAAS login configuration file:


1. Create a JAAS login configuration file that specifies a keytab file and doNotPrompt=true.

For example:
Client {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="PathToTheKeyTab"
principal="cloudera@CLOUDERA"
doNotPrompt=true;
};

2. Set the java.security.auth.login.config system property to the location of the JAAS file.

For example: C:\KerberosLoginConfig.ini.
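
If it is more convenient to set this system property from application code instead of on the Java command line, a call such as the following (using the same example path) can be made before the connection is opened. This is only a sketch of that alternative:

public class JaasConfigExample
{
    public static void main(String[] args)
    {
        // Point the JVM at the JAAS login configuration before any
        // connection is made. The path is the example value used above.
        System.setProperty("java.security.auth.login.config",
            "C:\\KerberosLoginConfig.ini");
    }
}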


Authenticating to the Hive Server

You provide this information to the driver in the connection URL. For more information about the
syntax of the connection URL, see "Building the Connection URL" on page 10.

To authenticate to the Hive server:


Use a connection URL that has the following properties defined:
l AuthMech
l KrbHostFQDN
l KrbRealm
l KrbServiceName

For detailed information about these properties, see "Driver Configuration Options" on
page 93.

Using an Existing Subject to Authenticate the Connection

If the client application obtains a Subject with a TGT, then that Subject can be used to
authenticate the connection to the server.

To use an existing Subject to authenticate the connection:


1. Create a PrivilegedAction for establishing the connection to the database.

For example:
import java.security.PrivilegedAction;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Contains logic to be executed as a privileged action.
public class AuthenticateDriverAction implements PrivilegedAction<Void>
{
    // The connection, which is established as a PrivilegedAction.
    Connection con;

    // Define a string as the connection URL.
    static String ConnectionURL = "jdbc:hive2://192.168.1.1:10000";

    /**
     * Logic executed in this method will have access to the
     * Subject that is used to "doAs". The driver will get
     * the Subject and use it for establishing a connection
     * with the server.
     */
    @Override
    public Void run()
    {
        try
        {
            // Establish a connection using the connection URL.
            con = DriverManager.getConnection(ConnectionURL);
        }
        catch (SQLException e)
        {
            // Handle errors that are encountered during
            // interaction with the data store.
            e.printStackTrace();
        }
        catch (Exception e)
        {
            // Handle other errors.
            e.printStackTrace();
        }
        return null;
    }
}

2. Run the PrivilegedAction using the existing Subject, and then use the connection.

For example:

// Create the action.
AuthenticateDriverAction authenticateAction = new AuthenticateDriverAction();

// Establish the connection using the Subject for authentication.
Subject.doAs(loginConfig.getSubject(), authenticateAction);

// Use the established connection.
Connection con = authenticateAction.con;

Kerberos Encryption Strength and the JCE Policy Files Extension


If the encryption being used in your Kerberos environment is too strong, you might encounter the
error message "Unable to connect to server: GSS initiate failed" when trying to use the driver to
connect to a Kerberos-enabled cluster. Typically, Java vendors only allow encryption strength up
to 128 bits by default. If you are using greater encryption strength in your environment (for
example, 256-bit encryption), then you might encounter this error.

Diagnosing the Issue


If you encounter the error message "Unable to connect to server: GSS initiate failed", confirm that
it is occurring due to encryption strength by enabling Kerberos layer logging in the JVM and then
checking if the log output contains the error message "KrbException: Illegal key size".

To enable Kerberos layer logging in a Sun JVM:


Choose one:
l In the Java command you use to start the application, pass in the following

argument:
-Dsun.security.krb5.debug=true

l Or, add the following code to the source code of your application:
System.setProperty("sun.security.krb5.debug","true")

To enable Kerberos layer logging in an IBM JVM:


Choose one:
l In the Java command you use to start the application, pass in the following
arguments:
-Dcom.ibm.security.krb5.Krb5Debug=all
-Dcom.ibm.security.jgss.debug=all

l Or, add the following code to the source code of your application:
System.setProperty("com.ibm.security.krb5.Krb5Debug", "all");
System.setProperty("com.ibm.security.jgss.debug", "all");

Resolving the Issue


After you confirm that the error is occurring due to encryption strength, you can resolve the issue
by downloading and installing the Java Cryptography Extension (JCE) Unlimited Strength
Jurisdiction Policy Files extension from your Java vendor. Refer to the instructions from the
vendor to install the files to the correct location.

Important:

Consult your company’s policy to make sure that you are allowed to enable encryption
strengths in your environment that are greater than what the JVM allows by default.

If the issue is not resolved after you install the JCE policy files extension, then restart your
machine and try your connection again. If the issue persists even after you restart your machine,
then verify which directories the JVM is searching to find the JCE policy files extension. To print
out the search paths that your JVM currently uses to find the JCE policy files extension, modify
your Java source code to print the return value of the following call:
System.getProperty("java.ext.dirs")

Configuring SSL
Note:

In this documentation, "SSL" indicates both TLS (Transport Layer Security) and SSL (Secure
Sockets Layer). The driver supports industry-standard versions of TLS/SSL.

If you are connecting to a Hive server that has Secure Sockets Layer (SSL) enabled, you can
configure the driver to connect to an SSL-enabled socket. When connecting to a server over SSL,
the driver uses one-way authentication to verify the identity of the server.

One-way authentication requires a signed, trusted SSL certificate for verifying the identity of the
server. You can configure the driver to access a specific TrustStore or KeyStore that contains the
appropriate certificate. If you do not specify a TrustStore or KeyStore, then the driver uses the
default Java TrustStore named jssecacerts. If jssecacerts is not available, then the driver
uses cacerts instead.

You provide this information to the driver in the connection URL. For more information about the
syntax of the connection URL, see "Building the Connection URL" on page 10.

To configure SSL:
1. Set the SSL property to 1.
2. If you are not using one of the default Java TrustStores, then do one of the following:
l Create a TrustStore and configure the driver to use it:
a. Create a TrustStore containing your signed, trusted server certificate.
b. Set the SSLTrustStore property to the full path of the TrustStore.
c. Set the SSLTrustStorePwd property to the password for accessing the
TrustStore.
l Or, create a KeyStore and configure the driver to use it:
a. Create a KeyStore containing your signed, trusted server certificate.
b. Set the SSLKeyStore property to the full path of the KeyStore.
c. Set the SSLKeyStorePwd property to the password for accessing the
KeyStore.
3. Optionally, to allow the SSL certificate used by the server to be self-signed, set the
AllowSelfSignedCerts property to 1.

Important:

When the AllowSelfSignedCerts property is set to 1, SSL verification is disabled.


The driver does not verify the server certificate against the trust store, and does not verify
if the server's host name matches the common name or subject alternative names in the
server certificate.

4. Optionally, to allow the common name of a CA-issued certificate to not match the host
name of the Hive server, set the CAIssuedCertNamesMismatch property to 1.


For example, the following connection URL connects to a data source using username and
password (LDAP) authentication, with SSL enabled:
jdbc:hive2://localhost:10000;AuthMech=3;SSL=1;
SSLKeyStore=C:\\Users\\bsmith\\Desktop\\keystore.jks;
SSLKeyStorePwd=clouderaSSL123;UID=hs2;PWD=cloudera123
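
If the trusted certificate is kept in a TrustStore rather than a KeyStore, a connection might be opened from Java code along the following lines; the TrustStore path and password are hypothetical placeholders, not values required by the driver:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class SslTrustStoreExample
{
    public static void main(String[] args) throws SQLException
    {
        // SSL=1 enables SSL; SSLTrustStore and SSLTrustStorePwd point the
        // driver at a TrustStore containing the trusted server certificate.
        String url = "jdbc:hive2://localhost:10000;AuthMech=3;SSL=1;"
            + "SSLTrustStore=/home/bsmith/truststore.jks;"
            + "SSLTrustStorePwd=clouderaSSL123;UID=hs2;PWD=cloudera123";

        Connection connection = DriverManager.getConnection(url);

        // Work with the connection, then close it.
        connection.close();
    }
}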

Note:

For more information about the connection properties used in SSL connections, see "Driver
Configuration Options" on page 93.

Configuring Server-Side Properties


You can use the driver to apply configuration properties to the Hive server by setting the
properties in the connection URL.

For example, to set the mapreduce.job.queuename property to myQueue, you would use a
connection URL such as the following:
jdbc:hive2://localhost:18000/default2;AuthMech=3;
UID=cloudera;PWD=cloudera;mapreduce.job.queuename=myQueue

Note:

For a list of all Hadoop and Hive server-side properties that your implementation supports, run
the set -v command at the Hive CLI command line or Beeline. You can also execute the set
-v query after connecting using the driver.
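
As a rough sketch of the second approach, the set -v query can be issued through a regular Statement after connecting. The connection URL is simply the earlier example reused, and reading the result as a single string column per row is an assumption made for illustration:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class ShowServerSidePropertiesExample
{
    public static void main(String[] args) throws SQLException
    {
        String url = "jdbc:hive2://localhost:18000/default2;AuthMech=3;"
            + "UID=cloudera;PWD=cloudera;mapreduce.job.queuename=myQueue";

        Connection connection = DriverManager.getConnection(url);

        // Run "set -v" through the driver and print each row the server returns.
        Statement statement = connection.createStatement();
        ResultSet resultSet = statement.executeQuery("set -v");
        while (resultSet.next())
        {
            System.out.println(resultSet.getString(1));
        }

        connection.close();
    }
}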

Configuring Logging
To help troubleshoot issues, you can enable logging in the driver.

Important:

Only enable logging long enough to capture an issue. Logging decreases performance and can
consume a large quantity of disk space.

In the connection URL, set the LogLevel key to enable logging at the desired level of detail. The
following table lists the logging levels provided by the Cloudera JDBC Driver for Apache Hive, in
order from least verbose to most verbose.

LogLevel Value Description

0 Disable all logging.

1 Log severe error events that lead the driver to abort.

2 Log error events that might allow the driver to continue running.

3 Log events that might result in an error if action is not taken.

4 Log general information that describes the progress of the driver.

5 Log detailed information that is useful for debugging the driver.

6 Log all driver activity.

To enable logging:
1. Set the LogLevel property to the desired level of information to include in log files.
2. Set the LogPath property to the full path to the folder where you want to save log files. To
make sure that the connection URL is compatible with all JDBC applications, escape the
backslashes (\) in your file path by typing another backslash.

For example, the following connection URL enables logging level 3 and saves the log files in
the C:\temp folder:
jdbc:hive://localhost:11000;LogLevel=3;LogPath=C:\\temp

3. To make sure that the new settings take effect, restart your JDBC application and reconnect
to the server.

The Cloudera JDBC Driver for Apache Hive produces the following log files in the location specified
in the LogPath property:
l A HiveJDBC_driver.log file that logs driver activity that is not specific to a
connection.

l A HiveJDBC_connection_[Number].log file for each connection made to the database, where
[Number] is a number that identifies each log file. This file logs driver activity that is
specific to the connection.

If the LogPath value is invalid, then the driver sends the logged information to the standard
output stream (System.out).

To disable logging:
1. Set the LogLevel property to 0.
2. To make sure that the new setting takes effect, restart your JDBC application and reconnect
to the server.

Features
More information is provided on the following features of the Cloudera JDBC Driver for Apache
Hive:
l "SQL Query versus HiveQL Query" on page 30
l "Data Types" on page 30
l "Catalog and Schema Support" on page 31
l "Write-back" on page 31
l "IHadoopStatement" on page 32
l "IHadoopConnection" on page 35
l "Security and Authentication" on page 38
l "Interfaces and Supported Methods" on page 38

SQL Query versus HiveQL Query


The native query language supported by Hive is HiveQL. HiveQL is a subset of SQL-92. However,
the syntax is different enough that most applications do not work with native HiveQL.

Data Types
The Cloudera JDBC Driver for Apache Hive supports many common data formats, converting
between Hive, SQL, and Java data types.

The following table lists the supported data type mappings.

Hive Type      SQL Type      Java Type

BIGINT         BIGINT        java.math.BigInteger

BINARY         VARBINARY     byte[]

BOOLEAN        BOOLEAN       Boolean

CHAR           CHAR          String
(Available only in Hive 0.13.0 or later)

DATE           DATE          java.sql.Date

DECIMAL        DECIMAL       java.math.BigDecimal
(In Hive 0.13 and later, you can specify scale and precision when creating tables using the DECIMAL data type.)

DOUBLE         DOUBLE        Double

FLOAT          REAL          Float

INT            INTEGER       Long

SMALLINT       SMALLINT      Integer

TIMESTAMP      TIMESTAMP     java.sql.Timestamp

TINYINT        TINYINT       Short

VARCHAR        VARCHAR       String
(Available only in Hive 0.12.0 or later)

The aggregate types (ARRAY, MAP, STRUCT, and UNIONTYPE) are not yet supported. Columns of
aggregate types are treated as VARCHAR columns in SQL and STRING columns in Java.
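
The sketch below shows how these mappings surface through the standard ResultSet getter methods; the table and column names are hypothetical:

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.sql.Timestamp;

public class DataTypeMappingExample
{
    public static void main(String[] args) throws SQLException
    {
        Connection connection = DriverManager.getConnection(
            "jdbc:hive2://localhost:10000");
        Statement statement = connection.createStatement();

        // Hypothetical table with a DECIMAL column named price and a
        // TIMESTAMP column named updated_at.
        ResultSet resultSet = statement.executeQuery(
            "select price, updated_at from example_table");

        while (resultSet.next())
        {
            // DECIMAL maps to java.math.BigDecimal, TIMESTAMP to java.sql.Timestamp.
            BigDecimal price = resultSet.getBigDecimal("price");
            Timestamp updatedAt = resultSet.getTimestamp("updated_at");
            System.out.println(price + " @ " + updatedAt);
        }

        connection.close();
    }
}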

Catalog and Schema Support


The Cloudera JDBC Driver for Apache Hive supports both catalogs and schemas to make it easy for
the driver to work with various JDBC applications. Since Hive only organizes tables into
schemas/databases, the driver provides a synthetic catalog named HIVE under which all of the
schemas/databases are organized. The driver also maps the JDBC schema to the Hive
schema/database.

Note:

Setting the CatalogSchemaSwitch connection property to 1 will cause Hive catalogs to be
treated as schemas in the driver as a restriction for filtering.
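
As a brief illustration of how this appears through standard JDBC metadata calls, the sketch below lists the catalogs and schemas reported by the driver. With default settings this should include the synthetic HIVE catalog described above, though the exact output depends on your Hive databases:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CatalogSchemaExample
{
    public static void main(String[] args) throws SQLException
    {
        Connection connection = DriverManager.getConnection(
            "jdbc:hive2://localhost:10000");
        DatabaseMetaData metadata = connection.getMetaData();

        // List catalogs; the driver exposes a synthetic catalog named HIVE.
        ResultSet catalogs = metadata.getCatalogs();
        while (catalogs.next())
        {
            System.out.println("Catalog: " + catalogs.getString("TABLE_CAT"));
        }

        // List schemas, which map to Hive schemas/databases.
        ResultSet schemas = metadata.getSchemas();
        while (schemas.next())
        {
            System.out.println("Schema: " + schemas.getString("TABLE_SCHEM"));
        }

        connection.close();
    }
}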

Write-back
The Cloudera JDBC Driver for Apache Hive supports translation for the following syntax when
connecting to a Hive Server 2 instance that is running Hive 0.14 or later:


l INSERT
l UPDATE
l DELETE
l CREATE
l DROP

If the statement contains non-standard SQL-92 syntax, then the driver is unable to translate the
statement to SQL and instead falls back to using HiveQL.

IHadoopStatement
IHadoopStatement is an interface implemented by the driver's statement class. It provides access
to methods that allow for asynchronous execution of queries and the retrieval of the Yarn ATS
GUID associated with the execution.

The IHadoopStatement interface is defined by the IHadoopStatement.java file. This file should look like the following example:

// =============================================================================
/// @file IHadoopStatement.java
///
/// Exposed interface for asynchronous query execution.
///
/// Copyright (C) 2017 Simba Technologies Incorporated.
// =============================================================================
package com.cloudera.hiveserver2.hivecommon.core;

import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

/**
 * An interface that extends the standard SQL Statement Interface but allows for
 * asynchronous query execution.
 * The polling for query execution will occur when {@link ResultSet#next()} or
 * {@link ResultSet#getMetaData()} is called.
 */
public interface IHadoopStatement extends Statement
{
    /**
     * Executes the given SQL statement asynchronously.
     * <p>
     * Sends the query off to the server but does not wait for query execution to
     * complete. A ResultSet with empty columns is returned. The polling for
     * completion of query execution is done when {@link ResultSet#next()} or
     * {@link ResultSet#getMetaData()} is called.
     * </p>
     *
     * @param sql An SQL statement to be sent to the database, typically a
     *            static SQL SELECT statement.
     *
     * @return A ResultSet object that DOES NOT contain the data produced by the
     *         given query; never null.
     *
     * @throws SQLException If a database access error occurs, or the given SQL
     *                      statement produces anything other than a single
     *                      <code>ResultSet</code> object.
     */
    public ResultSet executeAsync(String sql) throws SQLException;

    /**
     * Returns the Yarn ATS guid.
     *
     * @return String The yarn ATS guid from the operation if execution has
     *                started, else null.
     */
    public String getYarnATSGuid();
}

The following methods are available for use in IHadoopStatement:


l executeAsync(String sql)

The driver sends a request to the server for statement execution and returns immediately
after receiving a response from the server for the execute request without waiting for the
server to complete the execution.

The driver does not wait for the server to complete query execution unless the getMetaData()
or next() APIs are called.

Note that this feature does not work with prepared statements.

For example:
import com.cloudera.hiveserver2.hivecommon.core.IHadoopStatement;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

public class TestExecuteAsyncClass
{
    public static void main(String[] args) throws SQLException
    {
        // Create the connection object.
        Connection connection = DriverManager.getConnection(
            "jdbc:hive2://localhost:10000");

        // Create the statement object.
        Statement statement = connection.createStatement();

        // Unwrap the java.sql.Statement object to an implementation of
        // IHadoopStatement so the execution can be done asynchronously.
        //
        // The driver will return from this call as soon as it gets a response
        // from the server for the execute request, without waiting for the
        // server to complete query execution.
        ResultSet resultSet = statement.unwrap(
            IHadoopStatement.class).executeAsync(
                "select * from example_table");

        // Calling getMetaData() on the ResultSet here will cause the driver to
        // wait for the server to complete query execution before proceeding
        // with the rest of the operation.
        ResultSetMetaData rsMetadata = resultSet.getMetaData();

        // Excluding code for work on the result set metadata ...

        // Calling next() on the ResultSet here, if getMetaData() was not called
        // prior to this, will cause the driver to wait for the server to
        // complete query execution before proceeding with the rest of the
        // operation.
        resultSet.next();

        // Excluding code for work on the result set ...
    }
}
l getYarnATSGuid()

Returns the Yarn ATS GUID associated with the current execution. Returns null if the Yarn
ATS GUID is not available.

For example:
import com.cloudera.hiveserver2.hivecommon.core.IHadoopStatement;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class TestYarnGUIDClass
{
    public static void main(String[] args) throws SQLException
    {
        // Create the connection object.
        Connection connection = DriverManager.getConnection(
            "jdbc:hive2://localhost:10000");

        // Create the statement object.
        Statement statement = connection.createStatement();

        // Execute a query.
        ResultSet resultSet = statement.executeQuery(
            "select * from example_table");

        // Unwrap the java.sql.Statement object to an implementation of
        // IHadoopStatement to access the getYarnATSGuid() API call.
        String guid = statement.unwrap(
            IHadoopStatement.class).getYarnATSGuid();
    }
}

IHadoopConnection
IHadoopConnection is an interface implemented by the driver's connection class. It provides
access to methods that allow for the retrieval, deletion, and renewal of delegation tokens.

The IHadoopConnection interface is defined by the IHadoopConnection.java file. This file should look like the following example:
// =============================================================================
/// @file IHadoopConnection.java
///
/// Exposed interface for the retrieval of delegation tokens.
///
/// Copyright (C) 2017 Simba Technologies Incorporated.
// =============================================================================
package com.cloudera.hiveserver2.hivecommon.core;

import java.sql.Connection;
import java.sql.SQLException;

/**
 * An interface that extends the standard SQL Connection Interface but allows
 * for the retrieval/renewal/cancellation of delegation tokens.
 */
public interface IHadoopConnection extends Connection
{
    /**
     * Sends a cancel delegation token request to the server.
     *
     * @param tokenString The token to cancel.
     * @throws SQLException If an error occurs while sending the request.
     */
    public void cancelDelegationToken(String tokenString) throws SQLException;

    /**
     * Sends a get delegation token request to the server and returns the token
     * as an encoded string.
     *
     * @param owner The owner of the token.
     * @param renewer The renewer of the token.
     *
     * @return The token as an encoded string.
     * @throws SQLException If an error occurs while getting the token.
     */
    public String getDelegationToken(String owner, String renewer) throws SQLException;

    /**
     * Sends a renew delegation token request to the server.
     *
     * @param tokenString The token to renew.
     * @throws SQLException If an error occurs while sending the request.
     */
    public void renewDelegationToken(String tokenString) throws SQLException;
}

The following methods are available for use in IHadoopConnection:


l getDelegationToken(String owner, String renewer)

The driver sends a request to the server to obtain a delegation token with the given owner
and renewer.

The method should be called on a Kerberos-authenticated connection.


l cancelDelegationToken()

The driver sends a request to the server to cancel the provided delegation token.
l renewDelegationToken()

The driver sends a request to the server to renew the provided delegation token.

36 | Cloudera JDBC Driver for Apache Hive


Features

The following is a basic code sample that demonstrates how to use the above functions:
import com.cloudera.hiveserver2.hivecommon.core.IHadoopConnection;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class TestDelegationTokenClass
{
    public static void main(String[] args) throws SQLException
    {
        // Create the connection object with Kerberos authentication.
        Connection kerbConnection = DriverManager.getConnection(
            "jdbc:hive2://localhost:10000;AuthMech=1;KrbRealm=YourRealm;"
            + "KrbHostFQDN=sample.com;KrbServiceName=hive;");

        // Unwrap the java.sql.Connection object to an implementation
        // of IHadoopConnection so the methods for delegation token
        // can be called.
        String delegationToken = kerbConnection.unwrap(
            IHadoopConnection.class).getDelegationToken("owner_name", "renewer_name");

        // The token can then be used to connect with the driver.
        String tokenConnectionString =
            "jdbc:hive2://localhost:10000;AuthMech=6;DelegationToken=" + delegationToken;
        Connection tokenConnection =
            DriverManager.getConnection(tokenConnectionString);

        // Excluding code for work with the tokenConnection ...

        // The original token (delegationToken) can be cancelled or renewed by
        // unwrapping the java.sql.Connection object again to an implementation
        // of IHadoopConnection.

        // Renewing the token:
        kerbConnection.unwrap(
            IHadoopConnection.class).renewDelegationToken(delegationToken);

        // Cancelling the token:
        kerbConnection.unwrap(
            IHadoopConnection.class).cancelDelegationToken(delegationToken);
    }
}


Security and Authentication


To protect data from unauthorized access, some Hive data stores require connections to be
authenticated with user credentials or the SSL protocol. The Cloudera JDBC Driver for Apache Hive
provides full support for these authentication protocols.

Note:

In this documentation, "SSL" indicates both TLS (Transport Layer Security) and SSL (Secure
Sockets Layer). The driver supports industry-standard versions of TLS/SSL.

The driver provides mechanisms that allow you to authenticate your connection using the
Kerberos protocol, your Hive user name only, or your Hive user name and password. You must
use the authentication mechanism that matches the security requirements of the Hive server. For
information about determining the appropriate authentication mechanism to use based on the
Hive server configuration, see "Authentication Mechanisms" on page 17. For detailed driver
configuration instructions, see "Configuring Authentication" on page 12.

Additionally, the driver supports SSL connections with one-way authentication. If the server has
an SSL-enabled socket, then you can configure the driver to connect to it.

It is recommended that you enable SSL whenever you connect to a server that is configured to
support it. SSL encryption protects data and credentials when they are transferred over the
network, and provides stronger security than authentication alone. For detailed configuration
instructions, see "Configuring SSL" on page 25.

The SSL version that the driver supports depends on the JVM version that you are using. For
information about the SSL versions that are supported by each version of Java, see "Diagnosing
TLS, SSL, and HTTPS" on the Java Platform Group Product Management Blog:
https://blogs.oracle.com/java-platform-group/entry/diagnosing_tls_ssl_and_https.

Note:

The SSL version used for the connection is the highest version that is supported by both the
driver and the server, which is determined at connection time.
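
For example, a minimal sketch of opening an SSL-enabled connection with user name and password authentication. The SSL, SSLTrustStore, and SSLTrustStorePwd property names are assumed to be the ones described in "Configuring SSL" on page 25, and the host, credentials, and trust store path are placeholders:

Connection sslConnection = DriverManager.getConnection(
    "jdbc:hive2://localhost:10000;AuthMech=3;UID=cloudera;PWD=cloudera;"
    + "SSL=1;SSLTrustStore=/path/to/truststore.jks;SSLTrustStorePwd=changeit");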

Interfaces and Supported Methods


The Cloudera JDBC Driver for Apache Hive implements the following JDBC interfaces:

l "CallableStatement" on page 39 l "PooledConnection" on page 68


l "Connection" on page 48 l "PreparedStatement" on page 69
l "DatabaseMetaData" on page 53 l "ResultSet" on page 74
l "DataSource" on page 65 l "ResultSetMetaData" on page 87
l "Driver" on page 66 l "Statement" on page 89
l "ParameterMetaData" on page 67


However, the driver does not support every method from these interfaces. For information about
whether a specific method is supported by the driver and which version of the JDBC API is the
earliest version that supports the method, refer to the following sections.

The driver does not support the following JDBC features:

l Array
l Blob
l Clob
l Ref
l Savepoint
l SQLData
l SQLInput
l SQLOutput
l Struct

CallableStatement

The CallableStatement interface extends the PreparedStatement interface.

The following table lists the methods that belong to the CallableStatement interface, and
describes whether each method is supported by the Cloudera JDBC Driver for Apache Hive and
which version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the CallableStatement interface, see the
Java API documentation:
http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/CallableStatement.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

Array getArray(int i) 3.0 No

Array getArray(String 3.0 No


parameterName)

BigDecimal getBigDecimal 3.0 Yes


(int parameterIndex)

BigDecimal getBigDecimal 3.0 Yes Deprecated.


(int parameterIndex, int
scale)

BigDecimal getBigDecimal 3.0 Yes


(String parameterName)

Blob getBlob(int i) 3.0 No

Blob getBlob(String 3.0 No


parameterName)

boolean getBoolean(int 3.0 Yes


parameterIndex)

boolean getBoolean(String 3.0 Yes


parameterName)

byte getByte(int 3.0 Yes


parameterIndex)

byte getByte(String 3.0 Yes


parameterName)

byte[] getBytes(int 3.0 Yes


parameterIndex)

byte[] getBytes(String 3.0 Yes


parameterName)

Clob getClob(int i) 3.0 No

Clob getClob(String 3.0 No


parameterName)

Date getDate(int 3.0 Yes


parameterIndex)

Date getDate(int 3.0 Yes


parameterIndex, Calendar
cal)

Date getDate(String 3.0 Yes


parameterName)

Date getDate(String 3.0 Yes


parameterName, Calendar
cal)

double getDouble(int 3.0 Yes


parameterIndex)

double getDouble(String 3.0 Yes


parameterName)

float getFloat(int 3.0 Yes


parameterIndex)

float getFloat(String 3.0 Yes


parameterName)

int getInt(int 3.0 Yes


parameterIndex)

int getInt(String 3.0 Yes


parameterName)

long getLong(int 3.0 Yes


parameterIndex)

long getLong(String 3.0 Yes


parameterName)

Reader getNCharacterStream 4.0 No


(int parameterIndex)

Reader getNCharacterStream 4.0 No


(String parameterName)

NClob getNClob(int 4.0 No


parameterIndex)

NClob getNClob(String 4.0 No


parameterName)

String getNString(int 4.0 No


parameterIndex)

String getNString(String 4.0 No


parameterName)

Object getObject(int 3.0 Yes


parameterIndex)

<T> T getObject(int 4.1 No


parameterIndex, Class<T>
type)

Object getObject(int i, 3.0 No


Map<String,Class<?>> map)

Object getObject(String 3.0 Yes


parameterName)

<T> T getObject(String 4.1 No


parameterName, Class<T>
type)

Object getObject(String 3.0 Yes


parameterName,
Map<String,Class<?>> map)

Ref getRef(int i) 3.0 No

Ref getRef(String 3.0 No


parameterName)

RowId getRowId(int 4.0 No


parameterIndex)

RowId getRowId(String 4.0 No


parameterName)

short getShort(int 3.0 Yes


parameterIndex)

short getShort(String 3.0 Yes


parameterName)

SQLXML getSQLXML(int 4.0 No


parameterIndex)

SQLXML getSQLXML(String 4.0 No


parameterName)

String getString(int 3.0 Yes


parameterIndex)

String getString(String 3.0 Yes


parameterName)

Time getTime(int 3.0 Yes


parameterIndex)

Time getTime(int 3.0 Yes


parameterIndex, Calendar
cal)

Time getTime(String 3.0 Yes


parameterName)

Time getTime(String 3.0 Yes


parameterName, Calendar
cal)

Timestamp getTimestamp(int 3.0 Yes


parameterIndex)

Timestamp getTimestamp(int 3.0 Yes


parameterIndex, Calendar
cal)

Timestamp getTimestamp 3.0 Yes


(String parameterName)

Timestamp getTimestamp 3.0 Yes


(String parameterName,
Calendar cal)

URL getURL(int 3.0 No


parameterIndex)

URL getURL(String 3.0 No


parameterName)

void registerOutParameter 3.0 Yes


(int parameterIndex, int
sqlType)

void registerOutParameter 3.0 Yes


(int parameterIndex, int
sqlType, int scale)

void registerOutParameter 3.0 Yes


(int paramIndex, int
sqlType, String typeName)

void registerOutParameter 3.0 Yes


(String parameterName, int
sqlType)

void registerOutParameter 3.0 Yes


(String parameterName, int
sqlType, int scale)

void registerOutParameter 3.0 Yes


(String parameterName, int
sqlType, String typeName)

void setAsciiStream(String 4.0 Yes


parameterName, InputStream
x)

void setAsciiStream(String 3.0 Yes


parameterName, InputStream
x, int length)

void setAsciiStream(String 4.0 Yes


parameterName, InputStream
x, long length)

void setBigDecimal(String 3.0 Yes


parameterName, BigDecimal
x)

void setBinaryStream(String 4.0 Yes


parameterName, InputStream
x)

void setBinaryStream(String 3.0 Yes


parameterName, InputStream
x, int length)

void setBinaryStream(String 4.0 Yes


parameterName, InputStream
x, long length)

void setBlob(String 4.0 Yes


parameterName, Blob x)

void setBlob(String 4.0 Yes


parameterName, InputStream
inputStream)

void setBlob(String 4.0 Yes


parameterName, InputStream
inputStream, long length)

void setBoolean(String 3.0 Yes


parameterName, boolean x)

void setByte(String 3.0 Yes


parameterName, byte x)

void setBytes(String 3.0 Yes


parameterName, byte[] x)

void setCharacterStream 4.0 Yes


(String parameterName,
Reader reader)

void setCharacterStream 3.0 Yes


(String parameterName,
Reader reader, int length)

void setCharacterStream 4.0 Yes


(String parameterName,
Reader reader, long length)

void setClob(String 4.0 Yes


parameterName, Clob x)

void setClob(String 4.0 Yes


parameterName, Reader
reader)

void setClob(String 4.0 Yes


parameterName, Reader
reader, long length)

void setDate(String 3.0 Yes


parameterName, Date x)

void setDate(String 3.0 Yes


parameterName, Date x,
Calendar cal)

void setDouble(String 3.0 Yes


parameterName, double x)

void setFloat(String 3.0 Yes


parameterName, float x)

void setInt(String 3.0 Yes


parameterName, int x)

void setLong(String 3.0 Yes


parameterName, long x)

void setNCharacterStream 4.0 Yes


(String parameterName,
Reader value)

void setNCharacterStream 4.0 Yes


(String parameterName,
Reader value, long length)

void setNClob(String 4.0 Yes


parameterName, NClob value)

void setNClob(String 4.0 Yes


parameterName, Reader
reader)

void setNClob(String 4.0 Yes


parameterName, Reader
reader, long length)

void setNString(String 4.0 Yes


parameterName, String
value)

void setNull(String 3.0 Yes


parameterName, int sqlType)

void setNull(String 3.0 Yes


parameterName, int sqlType,
String typeName)

void setObject(String 3.0 Yes


parameterName, Object x)

void setObject(String 3.0 Yes


parameterName, Object x,
int targetSqlType)

void setObject(String 3.0 Yes


parameterName, Object x,
int targetSqlType, int
scale)

void setRowId(String 4.0 Yes


parameterName, RowId x)

void setShort(String 3.0 Yes


parameterName, short x)

void setSQLXML(String 4.0 Yes


parameterName, SQLXML
xmlObject)

void setString(String 3.0 Yes


parameterName, String x)

void setTime(String 3.0 Yes


parameterName, Time x)

void setTime(String 3.0 Yes


parameterName, Time x,
Calendar cal)

void setTimestamp(String 3.0 Yes


parameterName, Timestamp x)

void setTimestamp(String 3.0 Yes


parameterName, Timestamp x,
Calendar cal)

void setURL(String 3.0 Yes


parameterName, URL val)

boolean wasNull() 3.0 Yes

boolean isWrapperFor 4.0 Yes


(Class<?> iface)

<T> T unwrap(Class<T> 4.0 Yes


iface)

Connection

The following table lists the methods that belong to the Connection interface, and describes
whether each method is supported by the Cloudera JDBC Driver for Apache Hive and which
version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the Connection interface, see the Java API
documentation: http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/Connection.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

void clearWarnings() 3.0 Yes

void close() 3.0 Yes

void commit() 3.0 Yes Auto-commit cannot be set to


false because it is hard-
coded to true.

Array createArrayOf(String 4.0 No


typeName, Object[]
elements)

Blob createBlob() 4.0 No

Clob createClob() 4.0 No

NClob createNClob() 4.0 No

SQLXML createSQLXML() 4.0 No

Statement createStatement() 3.0 Yes

Statement createStatement 3.0 No


(int resultSetType, int
resultSetConcurrency)

Statement createStatement 3.0 No


(int resultSetType, int
resultSetConcurrency, int
resultSetHoldability)

Struct createStruct(String 4.0 No


typeName, Object[]
attributes)

boolean getAutoCommit() 3.0 Yes Hard-coded to true.

String getCatalog() 3.0 Yes

Properties getClientInfo() 4.0 Yes

String getClientInfo(String 4.0 Yes


name)

int getHoldability() 3.0 Yes Hard-coded to CLOSE_


CURSORS_AT_COMMIT.

DatabaseMetaData 3.0 Yes


getMetaData()

int getNetworkTimeout() 4.1 No

String getSchema() 4.1 Yes The returned schema name


does not always match the
one used by statements.
Statements use the schema
name defined in the
connection URL.

int getTransactionIsolation 3.0 Yes Hard-coded to


() TRANSACTION_READ_
UNCOMMITTED.

Map<String,Class<?>> 3.0 No
getTypeMap()

SQLWarning getWarnings() 3.0 Yes

boolean isClosed() 3.0 Yes

boolean isReadOnly() 3.0 Yes Returns true.

boolean isValid(int 4.0 Yes


timeout)

String nativeSQL(String 3.0 Yes


sql)

CallableStatement 3.0 No
prepareCall(String sql)

CallableStatement 3.0 No
prepareCall(String sql, int
resultSetType, int
resultSetConcurrency)

CallableStatement 3.0 No
prepareCall(String sql, int
resultSetType, int
resultSetConcurrency, int
resultSetHoldability)

PreparedStatement 3.0 Yes


prepareStatement(String
sql)

PreparedStatement 3.0 No
prepareStatement(String
sql, int autoGeneratedKeys)

PreparedStatement 3.0 No
prepareStatement(String
sql, int[] columnIndexes)

PreparedStatement 3.0 No
prepareStatement(String
sql, int resultSetType, int
resultSetConcurrency)

PreparedStatement 3.0 No
prepareStatement(String
sql, int resultSetType, int
resultSetConcurrency, int
resultSetHoldability)

PreparedStatement 3.0 No
prepareStatement(String
sql, String[] columnNames)

void releaseSavepoint 3.0 No Savepoints are not available


(Savepoint savepoint) because transactions are not
supported.

void rollback() 3.0 No Savepoints are not available


because transactions are not
supported.

void rollback(Savepoint 3.0 No Savepoints are not available


savepoint) because transactions are not
supported.

void setAutoCommit(boolean 3.0 Yes Ignored because auto-commit


autoCommit) is hard-coded to true.

void setCatalog(String 3.0 Yes


catalog)

void setClientInfo 4.0 Yes


(Properties properties)

void setClientInfo(String 4.0 Yes


name, String value)

void setHoldability(int 3.0 Yes


holdability)

void setNetworkTimeout 4.1 No


(Executor executor, int
milliseconds)

void setReadOnly(boolean 3.0 Yes


readOnly)

Savepoint setSavepoint() 3.0 No Savepoints are not available


because transactions are not
supported.

Savepoint setSavepoint 3.0 No Savepoints are not available


(String name) because transactions are not
supported.

void setSchema(String 4.1 Yes Does not actually change the


schema) schema name used by newly
created statements; only
changes the value returned by
getSchema().

void 3.0 Yes


setTransactionIsolation(int
level)

void setTypeMap 3.0 No


(Map<String,Class<?>> map)

boolean isWrapperFor 4.0 Yes


(Class<?> iface)

<T> T unwrap(Class<T> 4.0 Yes


iface)
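
For example, a brief sketch that exercises some of the supported Connection methods. The url variable is a placeholder for a connection URL built as described in "Building the Connection URL" on page 10:

Connection conn = DriverManager.getConnection(url);
DatabaseMetaData metadata = conn.getMetaData();      // supported; see DatabaseMetaData below
boolean autoCommit = conn.getAutoCommit();           // always returns true (hard-coded)
Statement stmt = conn.createStatement();             // only the zero-argument form is supported
stmt.close();
conn.close();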

DatabaseMetaData

The following table lists the methods that belong to the DatabaseMetaData interface, and
describes whether each method is supported by the Cloudera JDBC Driver for Apache Hive and
which version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the DatabaseMetaData interface, see the Java
API
documentation:http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/DatabaseMetaData.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

boolean allProceduresAreCallable() 3.0 Yes Returns true.

boolean allTablesAreSelectable() 3.0 Yes Returns true.

boolean 4.0 Yes Returns true.


autoCommitFailureClosesAllResultSets()

boolean 3.0 Yes Returns false.


dataDefinitionCausesTransactionCommit
()

boolean 3.0 Yes Returns false.


dataDefinitionIgnoredInTransactions()

boolean deletesAreDetected(int type) 3.0 Yes Returns true.

boolean doesMaxRowSizeIncludeBlobs() 3.0 Yes Returns false.

boolean generatedKeyAlwaysReturned() 4.1 Yes

ResultSet getAttributes(String 3.0 Yes


catalog, String schemaPattern, String
typeNamePattern, String
attributeNamePattern)

ResultSet getBestRowIdentifier(String 3.0 Yes


catalog, String schema, String table,
int scope, boolean nullable)

ResultSet getCatalogs() 3.0 Yes

String getCatalogSeparator() 3.0 Yes

String getCatalogTerm() 3.0 Yes

ResultSet getClientInfoProperties() 4.0 Yes

ResultSet getColumnPrivileges(String 3.0 Yes


catalog, String schema, String table,
String columnNamePattern)

ResultSet getColumns(String catalog, 3.0 Yes


String schemaPattern, String
tableNamePattern, String
columnNamePattern)

Connection getConnection() 3.0 Yes

ResultSet getCrossReference(String 3.0 Yes


primaryCatalog, String primarySchema,
String primaryTable, String
foreignCatalog, String foreignSchema,
String foreignTable)

int getDatabaseMajorVersion() 3.0 Yes

int getDatabaseMinorVersion() 3.0 Yes

String getDatabaseProductName() 3.0 Yes Hard-coded to Hive.

String getDatabaseProductVersion() 3.0 Yes

int getDefaultTransactionIsolation() 3.0 Yes Hard-coded to


TRANSACTION_
READ_
UNCOMMITTED.

int getDriverMajorVersion() 3.0 Yes

int getDriverMinorVersion() 3.0 Yes

String getDriverName() 3.0 Yes Hard-coded to HiveJDBC.

String getDriverVersion() 3.0 Yes

ResultSet getExportedKeys(String 3.0 Yes


catalog, String schema, String table)

String getExtraNameCharacters() 3.0 Yes Returns an empty


String.

ResultSet getFunctionColumns(String 4.0 Yes


catalog, String schemaPattern, String
functionNamePattern, String
columnNamePattern)

ResultSet getFunctions(String catalog, 4.0 Yes


String schemaPattern, String
functionNamePattern)

String getIdentifierQuoteString() 3.0 Yes Returns a


backquote (`)

ResultSet getImportedKeys(String 3.0 Yes


catalog, String schema, String table)

ResultSet getIndexInfo(String catalog, 3.0 Yes


String schema, String table, boolean
unique, boolean approximate)

int getJDBCMajorVersion() 3.0 Yes

int getJDBCMinorVersion() 3.0 Yes

int getMaxBinaryLiteralLength() 3.0 Yes Returns 0.

int getMaxCatalogNameLength() 3.0 Yes Returns 128.

int getMaxCharLiteralLength() 3.0 Yes Returns 0.

int getMaxColumnNameLength() 3.0 Yes Returns 128.

int getMaxColumnsInGroupBy() 3.0 Yes Returns 0.

int getMaxColumnsInIndex() 3.0 Yes Returns 0.

int getMaxColumnsInOrderBy() 3.0 Yes Returns 0.

int getMaxColumnsInSelect() 3.0 Yes Returns 0.

int getMaxColumnsInTable() 3.0 Yes Returns 0.

int getMaxConnections() 3.0 Yes Returns 0.

int getMaxCursorNameLength() 3.0 Yes Returns 0.

int getMaxIndexLength() 3.0 Yes Returns 0.

int getMaxProcedureNameLength() 3.0 Yes Returns 0.

int getMaxRowSize() 3.0 Yes Returns 0.

int getMaxSchemaNameLength() 3.0 Yes Returns 128.

int getMaxStatementLength() 3.0 Yes Returns 0.

int getMaxStatements() 3.0 Yes Returns 0.

int getMaxTableNameLength() 3.0 Yes Returns 128.

int getMaxTablesInSelect() 3.0 Yes Returns 0.

int getMaxUserNameLength() 3.0 Yes Returns 0.

String getNumericFunctions() 3.0 Yes Returns the


Numeric Functions
list from the
specification related
to the JDBC version
of the driver.

ResultSet getPrimaryKeys(String 3.0 Yes


catalog, String schema, String table)

ResultSet getProcedureColumns(String 3.0 Yes


catalog, String schemaPattern, String
procedureNamePattern, String
columnNamePattern)

ResultSet getProcedures(String 3.0 Yes


catalog, String schemaPattern, String
procedureNamePattern)

String getProcedureTerm() 3.0 Yes Returns


procedure.

ResultSet getPseudoColumns(String 4.1 Yes


catalog, String schemaPattern, String
tableNamePattern, String
columnNamePattern)

int getResultSetHoldability() 3.0 Yes Returns CLOSE_


CURSORS_AT_
COMMIT.

RowIdLifetime getRowIdLifetime() 4.0 Yes Returns ROWID_


UNSUPPORTED.

ResultSet getSchemas() 3.0 Yes

ResultSet getSchemas(String catalog, 4.0 Yes


String schemaPattern)

String getSchemaTerm() 3.0 Yes Returns schema.

String getSearchStringEscape() 3.0 Yes Returns a backslash


(\).

String getSQLKeywords() 3.0 Yes Returns an empty


String.

int getSQLStateType() 3.0 Yes Returns


sqlStateSQL99.

String getStringFunctions() 3.0 Yes Returns the String


Functions list from
the specification
related to the JDBC
version of the
driver.

ResultSet getSuperTables(String 3.0 Yes


catalog, String schemaPattern, String
tableNamePattern)

ResultSet getSuperTypes(String 3.0 Yes


catalog, String schemaPattern, String
typeNamePattern)

String getSystemFunctions() 3.0 Yes Returns


DATABASE,IFNUL
L,USER.

ResultSet getTablePrivileges(String 3.0 Yes


catalog, String schemaPattern, String
tableNamePattern)

ResultSet getTables(String catalog, 3.0 Yes


String schemaPattern, String
tableNamePattern, String[] types)

ResultSet getTableTypes() 3.0 Yes

String getTimeDateFunctions() 3.0 Yes Returns the Time


and Date Functions
list from the
specification related
to the JDBC version
of the driver.

ResultSet getTypeInfo() 3.0 Yes

ResultSet getUDTs(String catalog, 3.0 Yes


String schemaPattern, String
typeNamePattern, int[] types)

String getURL() 3.0 Yes

String getUserName() 3.0 Yes

ResultSet getVersionColumns(String 3.0 Yes


catalog, String schema, String table)

boolean insertsAreDetected(int type) 3.0 Yes

boolean isCatalogAtStart() 3.0 Yes

boolean isReadOnly() 3.0 Yes Returns true.

boolean locatorsUpdateCopy() 3.0 Yes Returns false.

boolean nullPlusNonNullIsNull() 3.0 Yes Returns true.

boolean nullsAreSortedAtEnd() 3.0 Yes Returns false.

boolean nullsAreSortedAtStart() 3.0 Yes Returns false.

boolean nullsAreSortedHigh() 3.0 Yes Returns false.

boolean nullsAreSortedLow() 3.0 Yes Returns true.

boolean othersDeletesAreVisible(int 3.0 Yes


type)

boolean othersInsertsAreVisible(int 3.0 Yes


type)

boolean othersUpdatesAreVisible(int 3.0 Yes


type)

boolean ownDeletesAreVisible(int type) 3.0 Yes

boolean ownInsertsAreVisible(int type) 3.0 Yes

boolean ownUpdatesAreVisible(int type) 3.0 Yes

boolean storesLowerCaseIdentifiers() 3.0 Yes Returns false.

boolean 3.0 Yes Returns false.


storesLowerCaseQuotedIdentifiers()

boolean storesMixedCaseIdentifiers() 3.0 Yes Returns true.

boolean 3.0 Yes Returns true.


storesMixedCaseQuotedIdentifiers()

boolean storesUpperCaseIdentifiers() 3.0 Yes Returns false.

boolean 3.0 Yes Returns false.


storesUpperCaseQuotedIdentifiers()

boolean 3.0 Yes Returns false.


supportsAlterTableWithAddColumn()

boolean 3.0 Yes Returns false.


supportsAlterTableWithDropColumn()

boolean supportsANSI92EntryLevelSQL() 3.0 Yes Returns true.

boolean supportsANSI92FullSQL() 3.0 Yes Returns false.

boolean supportsANSI92IntermediateSQL 3.0 Yes Returns false.


()

boolean supportsBatchUpdates() 3.0 Yes Returns false.

boolean 3.0 Yes Returns true.


supportsCatalogsInDataManipulation()

boolean 3.0 Yes Returns true.


supportsCatalogsInIndexDefinitions()

boolean 3.0 Yes Returns true.


supportsCatalogsInPrivilegeDefinitions
()

boolean 3.0 Yes Returns true.


supportsCatalogsInProcedureCalls()

boolean 3.0 Yes Returns true.


supportsCatalogsInTableDefinitions()

boolean supportsColumnAliasing() 3.0 Yes Returns true.

boolean supportsConvert() 3.0 Yes Returns true.

boolean supportsConvert(int fromType, 3.0 Yes


int toType)

boolean supportsCoreSQLGrammar() 3.0 Yes Returns true.

boolean supportsCorrelatedSubqueries() 3.0 Yes Returns true.

boolean 3.0 Yes Returns false.


supportsDataDefinitionAndDataManipulat
ionTransactions()

boolean 3.0 Yes Returns false.


supportsDataManipulationTransactionsOn
ly()

boolean 3.0 Yes Returns false.


supportsDifferentTableCorrelationNames
()

boolean supportsExpressionsInOrderBy() 3.0 Yes Returns true.

boolean supportsExtendedSQLGrammar() 3.0 Yes Returns false.

boolean supportsFullOuterJoins() 3.0 Yes Returns true.

boolean supportsGetGeneratedKeys() 3.0 Yes Returns false.

boolean supportsGroupBy() 3.0 Yes Returns true.

boolean supportsGroupByBeyondSelect() 3.0 Yes Returns true.

boolean supportsGroupByUnrelated() 3.0 Yes Returns false.

boolean 3.0 Yes Returns false.


supportsIntegrityEnhancementFacility()

boolean supportsLikeEscapeClause() 3.0 Yes Returns true.

boolean supportsLimitedOuterJoins() 3.0 Yes Returns false.

boolean supportsMinimumSQLGrammar() 3.0 Yes Returns true.

boolean supportsMixedCaseIdentifiers() 3.0 Yes Returns false.

boolean 3.0 Yes Returns true.


supportsMixedCaseQuotedIdentifiers()

boolean supportsMultipleOpenResults() 3.0 Yes Returns false.

boolean supportsMultipleResultSets() 3.0 Yes Returns false.

boolean supportsMultipleTransactions() 3.0 Yes Returns true.

boolean supportsNamedParameters() 3.0 Yes Returns false.

boolean supportsNonNullableColumns() 3.0 Yes Returns false.

boolean 3.0 Yes Returns false.


supportsOpenCursorsAcrossCommit()

boolean 3.0 Yes Returns false.


supportsOpenCursorsAcrossRollback()

boolean 3.0 Yes Returns true.


supportsOpenStatementsAcrossCommit()

boolean 3.0 Yes Returns true.


supportsOpenStatementsAcrossRollback()

boolean supportsOrderByUnrelated() 3.0 Yes Returns false.

boolean supportsOuterJoins() 3.0 Yes Returns false.

boolean supportsPositionedDelete() 3.0 Yes Returns false.

boolean supportsPositionedUpdate() 3.0 Yes Returns false.

boolean supportsResultSetConcurrency 3.0 Yes


(int type, int concurrency)

boolean supportsResultSetHoldability 3.0 Yes


(int holdability)

boolean supportsResultSetType(int 3.0 Yes


type)

boolean supportsSavepoints() 3.0 Yes Returns false.

boolean 3.0 Yes Returns true.


supportsSchemasInDataManipulation()

boolean 3.0 Yes Returns true.


supportsSchemasInIndexDefinitions()

boolean 3.0 Yes Returns true.


supportsSchemasInPrivilegeDefinitions
()

boolean 3.0 Yes Returns false.


supportsSchemasInProcedureCalls()

boolean 3.0 Yes Returns true.


supportsSchemasInTableDefinitions()

boolean supportsSelectForUpdate() 3.0 Yes Returns false.

boolean supportsStatementPooling() 3.0 Yes Returns false.

boolean 4.0 Yes Returns false.


supportsStoredFunctionsUsingCallSyntax
()

boolean supportsStoredProcedures() 3.0 Yes Returns true.

boolean 3.0 Yes Returns true.


supportsSubqueriesInComparisons()

boolean supportsSubqueriesInExists() 3.0 Yes Returns true.

boolean supportsSubqueriesInIns() 3.0 Yes Returns true.

boolean 3.0 Yes Returns true.


supportsSubqueriesInQuantifieds()

boolean supportsTableCorrelationNames 3.0 Yes Returns true.


()

boolean 3.0 Yes


supportsTransactionIsolationLevel(int
level)

boolean supportsTransactions() 3.0 Yes Returns false.

boolean supportsUnion() 3.0 Yes Returns true.

boolean supportsUnionAll() 3.0 Yes Returns true.

boolean updatesAreDetected(int type) 3.0 Yes Returns true.

boolean usesLocalFilePerTable() 3.0 Yes Returns false.

boolean usesLocalFiles() 3.0 Yes Returns false.

boolean isWrapperFor(Class<?> iface) 4.0 Yes

<T> T unwrap(Class<T> iface) 4.0 Yes
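
For example, a minimal sketch that lists tables and columns through DatabaseMetaData. The connection conn is assumed to be open, and the schema and table names are placeholders:

DatabaseMetaData metadata = conn.getMetaData();
ResultSet tables = metadata.getTables(null, "default", "%", new String[] {"TABLE"});
while (tables.next()) {
    System.out.println(tables.getString("TABLE_NAME"));     // table names in the default schema
}
ResultSet columns = metadata.getColumns(null, "default", "sample_table", "%");
while (columns.next()) {
    System.out.println(columns.getString("COLUMN_NAME") + " " + columns.getString("TYPE_NAME"));
}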

DataSource

The following table lists the methods that belong to the DataSource interface, and describes
whether each method is supported by the Cloudera JDBC Driver for Apache Hive and which
version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the DataSource interface, see the Java API
documentation: http://docs.oracle.com/javase/1.5.0/docs/api/javax/sql/DataSource.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

Connection getConnection() | 3.0 | Yes
Connection getConnection(String username, String password) | 3.0 | Yes
int getLoginTimeout() | 3.0 | Yes
PrintWriter getLogWriter() | 3.0 | Yes
Logger getParentLogger() | 4.1 | No | The driver does not use java.util.logging.
void setLoginTimeout(int seconds) | 3.0 | Yes
void setLogWriter(PrintWriter out) | 3.0 | Yes
boolean isWrapperFor(Class<?> iface) | 4.0 | Yes
<T> T unwrap(Class<T> iface) | 4.0 | Yes
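
For example, a minimal DataSource sketch. The class name com.cloudera.hive.jdbc41.HS2DataSource and its setURL method are assumptions here; use the DataSource class that matches the driver package you installed, as described in "Registering the Driver Class" on page 9:

// Assumed class and setter names; adjust them to the installed driver package.
com.cloudera.hive.jdbc41.HS2DataSource ds = new com.cloudera.hive.jdbc41.HS2DataSource();
ds.setURL("jdbc:hive2://localhost:10000;AuthMech=3;UID=cloudera;PWD=cloudera");
Connection conn = ds.getConnection();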

Driver

The following table lists the methods that belong to the Driver interface, and describes whether
each method is supported by the Cloudera JDBC Driver for Apache Hive and which version of the
JDBC API is the earliest version that supports the method.

For detailed information about each method in the Driver interface, see the Java API
documentation: http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/Driver.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

boolean acceptsURL(String url) | 3.0 | Yes
Connection connect(String url, Properties info) | 3.0 | Yes
int getMajorVersion() | 3.0 | Yes
int getMinorVersion() | 3.0 | Yes
Logger getParentLogger() | 4.1 | No
DriverPropertyInfo[] getPropertyInfo(String url, Properties info) | 3.0 | Yes
boolean jdbcCompliant() | 3.0 | Yes

ParameterMetaData

The following table lists the methods that belong to the ParameterMetaData interface, and
describes whether each method is supported by the Cloudera JDBC Driver for Apache Hive and
which version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the ParameterMetaData interface, see the
Java API documentation:
http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/ParameterMetaData.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

String getParameterClassName(int param) | 3.0 | Yes
int getParameterCount() | 3.0 | Yes
int getParameterMode(int param) | 3.0 | Yes
int getParameterType(int param) | 3.0 | Yes
String getParameterTypeName(int param) | 3.0 | Yes
int getPrecision(int param) | 3.0 | Yes
int getScale(int param) | 3.0 | Yes
int isNullable(int param) | 3.0 | Yes
boolean isSigned(int param) | 3.0 | Yes
boolean isWrapperFor(Class<?> iface) | 4.0 | Yes
<T> T unwrap(Class<T> iface) | 4.0 | Yes
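
For example, a short sketch that inspects parameter metadata on a prepared statement. The connection conn is assumed to be open and the SQL text is a placeholder:

PreparedStatement ps = conn.prepareStatement("SELECT * FROM sample_table WHERE id = ?");
ParameterMetaData pmd = ps.getParameterMetaData();
int count = pmd.getParameterCount();                 // number of parameter markers
String typeName = pmd.getParameterTypeName(1);       // type name of the first parameter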

PooledConnection

The following table lists the methods that belong to the PooledConnection interface, and
describes whether each method is supported by the Cloudera JDBC Driver for Apache Hive and
which version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the PooledConnection interface, see the Java
API documentation:
http://docs.oracle.com/javase/1.5.0/docs/api/javax/sql/PooledConnection.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

void addConnectionEventListener(ConnectionEventListener listener) | 3.0 | Yes
void addStatementEventListener(StatementEventListener listener) | 4.0 | Yes
void close() | 3.0 | Yes
Connection getConnection() | 3.0 | Yes
void removeConnectionEventListener(ConnectionEventListener listener) | 3.0 | Yes
void removeStatementEventListener(StatementEventListener listener) | 4.0 | Yes | Removes the specified StatementEventListener from the list of components that will be notified when the driver detects that a PreparedStatement has been closed or is invalid.

PreparedStatement

The PreparedStatement interface extends the Statement interface.

The following table lists the methods that belong to the PreparedStatement interface, and
describes whether each method is supported by the Cloudera JDBC Driver for Apache Hive and
which version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the PreparedStatement interface, see the Java
API documentation:
http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/PreparedStatement.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

void addBatch() 3.0 Yes

void clearParameters() 3.0 Yes

boolean execute() 3.0 Yes

ResultSet executeQuery() 3.0 Yes

int executeUpdate() 3.0 Yes

ResultSetMetaData 3.0 Yes


getMetaData()

ParameterMetaData 3.0 Yes


getParameterMetaData()

void setArray(int 3.0 No


parameterIndex, Array x)

void setAsciiStream(int 4.0 Yes


parameterIndex, InputStream
x)

void setAsciiStream(int 3.0 Yes


parameterIndex, InputStream
x, int length)

void setAsciiStream(int 4.0 Yes


parameterIndex, InputStream
x, long length)

void setBigDecimal(int 3.0 Yes


parameterIndex, BigDecimal
x)

void setBinaryStream(int 4.0 Yes


parameterIndex, InputStream
x)

void setBinaryStream(int 3.0 Yes


parameterIndex, InputStream
x, int length)

void setBinaryStream(int 4.0 Yes


parameterIndex, InputStream
x, long length)

void setBlob(int 3.0 No


parameterIndex, Blob x)

void setBlob(int 4.0 No


parameterIndex, InputStream
inputStream)

void setBlob(int 4.0 No


parameterIndex, InputStream
inputStream, long length)

void setBoolean(int 3.0 Yes


parameterIndex, boolean x)

void setByte(int 3.0 Yes


parameterIndex, byte x)

void setBytes(int 3.0 Yes


parameterIndex, byte[] x)

void setCharacterStream(int 4.0 Yes


parameterIndex, Reader
reader)

void setCharacterStream(int 3.0 Yes


parameterIndex, Reader
reader, int length)

void setCharacterStream(int 4.0 Yes


parameterIndex, Reader
reader, long length)

void setClob(int 3.0 No


parameterIndex, Clob x)

void setClob(int 4.0 No


parameterIndex, Reader
reader)

void setClob(int 4.0 No


parameterIndex, Reader
reader, long length)

void setDate(int 3.0 Yes


parameterIndex, Date x)

void setDate(int 3.0 Yes


parameterIndex, Date x,
Calendar cal)

void setDouble(int 3.0 Yes


parameterIndex, double x)

void setFloat(int 3.0 Yes


parameterIndex, float x)

void setInt(int 3.0 Yes


parameterIndex, int x)

void setLong(int 3.0 Yes


parameterIndex, long x)

void setNCharacterStream 4.0 No


(int parameterIndex, Reader
value)

void setNCharacterStream 4.0 No


(int parameterIndex, Reader
value, long length)

void setNClob(int 4.0 No


parameterIndex, NClob
value)

void setNClob(int 4.0 No


parameterIndex, Reader
reader)

void setNClob(int 4.0 No


parameterIndex, Reader
reader, long length)

void setNString(int 4.0 No


parameterIndex, String
value)

void setNull(int 3.0 Yes


paramIndex, int sqlType,
String typeName)

void setObject(int 3.0 Yes


parameterIndex, Object x)

void setObject(int 3.0 Yes


parameterIndex, Object x,
int targetSqlType)

void setObject(int 3.0 Yes


parameterIndex, Object x,
int targetSqlType, int
scale)

void setRef(int 3.0 No


parameterIndex, Ref x)

void setRowId(int 4.0 No


parameterIndex, RowId x)

void setShort(int 3.0 No


parameterIndex, short x)

void setSQLXML(int 4.0 Yes


parameterIndex, SQLXML
xmlObject)

void setString(int 3.0 Yes


parameterIndex, String x)

void setTime(int 3.0 Yes


parameterIndex, Time x)

void setTime(int 3.0 Yes


parameterIndex, Time x,
Calendar cal)

void setTimestamp(int 3.0 Yes


parameterIndex, Timestamp
x)

void setTimestamp(int 3.0 Yes


parameterIndex, Timestamp
x, Calendar cal)

void setUnicodeStream(int 3.0 Yes Deprecated.


parameterIndex, InputStream
x, int length)

void setURL(int 3.0 No


parameterIndex, URL x)

boolean isWrapperFor 4.0 Yes


(Class<?> iface)

<T> T unwrap(Class<T> 4.0 Yes


iface)
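
For example, a minimal sketch of a parameterized query. The connection conn is assumed to be open, and the table and values are placeholders:

PreparedStatement ps = conn.prepareStatement(
    "SELECT name FROM sample_table WHERE id = ? AND active = ?");
ps.setLong(1, 42L);                // bind the first parameter marker
ps.setBoolean(2, true);            // bind the second parameter marker
ResultSet rs = ps.executeQuery();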

ResultSet

The following table lists the methods that belong to the ResultSet interface, and describes
whether each method is supported by the Cloudera JDBC Driver for Apache Hive and which
version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the ResultSet interface, see the Java API
documentation: http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/ResultSet.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

boolean absolute(int row) 3.0 No

void afterLast() 3.0 No

void beforeFirst() 3.0 No

void cancelRowUpdates() 3.0 No Not valid because the driver is


read-only.

void clearWarnings() 3.0 Yes

void close() 3.0 Yes

void deleteRow() 3.0 No Not valid because the driver is


read-only.

int findColumn(String 3.0 Yes


columnName)

boolean first() 3.0 No

Array getArray(int i) 3.0 No

Array getArray(String 3.0 No


colName)

InputStream getAsciiStream 3.0 Yes


(int columnIndex)

InputStream getAsciiStream 3.0 Yes


(String columnName)

BigDecimal getBigDecimal 3.0 Yes


(int columnIndex)

BigDecimal getBigDecimal 3.0 Yes Deprecated.


(int columnIndex, int
scale)

BigDecimal getBigDecimal 3.0 Yes


(String columnName)

BigDecimal getBigDecimal 3.0 Yes Deprecated.


(String columnName, int
scale)

InputStream getBinaryStream 3.0 Yes


(int columnIndex)

InputStream getBinaryStream 3.0 Yes


(String columnName)

Blob getBlob(int i) 3.0 No

Blob getBlob(String 3.0 No


colName)

boolean getBoolean(int 3.0 Yes


columnIndex)

boolean getBoolean(String 3.0 Yes


columnName)

byte getByte(int columnIndex) 3.0 Yes

byte getByte(String 3.0 Yes


columnName)

byte[] getBytes(int 3.0 Yes


columnIndex)

byte[] getBytes(String 3.0 Yes


columnName)

Reader getCharacterStream 3.0 Yes


(int columnIndex)

Reader getCharacterStream 3.0 Yes


(String columnName)

Clob getClob(int i) 3.0 No

Clob getClob(String 3.0 No


colName)

int getConcurrency() 3.0 Yes

String getCursorName() 3.0 Yes

Date getDate(int 3.0 Yes


columnIndex)

Date getDate(int 3.0 Yes


columnIndex, Calendar cal)

Date getDate(String 3.0 Yes


columnName)

Date getDate(String 3.0 Yes


columnName, Calendar cal)

double getDouble(int 3.0 Yes


columnIndex)

double getDouble(String 3.0 Yes


columnName)

int getFetchDirection() 3.0 Yes

int getFetchSize() 3.0 Yes

float getFloat(int 3.0 Yes


columnIndex)

float getFloat(String 3.0 Yes


columnName)

int getHoldability() 4.0 Yes

int getInt(int columnIndex) 3.0 Yes

int getInt(String 3.0 Yes


columnName)

long getLong(int 3.0 Yes


columnIndex)

long getLong(String 3.0 Yes


columnName)

ResultSetMetaData 3.0 Yes


getMetaData()

Reader getNCharacterStream 4.0 No


(int columnIndex)

Reader getNCharacterStream(String columnLabel) 4.0 No

NClob getNClob(int 4.0 No


columnIndex)

NClob getNClob(String 4.0 No


columnLabel)

String getNString(int 4.0 No


columnIndex)

String getNString(String 4.0 No


columnLabel)

Object getObject(int 3.0 Yes


columnIndex)

<T> T getObject(int 4.1 No


columnIndex, Class<T> type)

Object getObject(int i, 3.0 No


Map<String,Class<?>> map)

Object getObject(String 3.0 No


columnName)

<T> T getObject(String 4.1 No


columnName, Class<T> type)

Object getObject(String 3.0 Yes


colName,
Map<String,Class<?>> map)

Ref getRef(int i) 3.0 No

Ref getRef(String colName) 3.0 No

int getRow() 3.0 Yes

RowId getRowId(int 4.0 No


columnIndex)

RowId getRowId(String 4.0 No


columnLabel)

short getShort(int 3.0 Yes


columnIndex)

short getShort(String 3.0 Yes


columnName)

SQLXML getSQLXML(int 4.0 No


columnIndex)

SQLXML getSQLXML(String 4.0 No


columnLabel)

Statement getStatement() 3.0 Yes

String getString(int 3.0 Yes


columnIndex)

String getString(String 3.0 Yes


columnName)

Time getTime(int 3.0 Yes


columnIndex)

Time getTime(int 3.0 Yes


columnIndex, Calendar cal)

Time getTime(String 3.0 Yes


columnName)

Time getTime(String 3.0 Yes


columnName, Calendar cal)

Timestamp getTimestamp(int 3.0 Yes


columnIndex)

Timestamp getTimestamp(int 3.0 Yes


columnIndex, Calendar cal)

Timestamp getTimestamp 3.0 Yes


(String columnName)

Timestamp getTimestamp 3.0 Yes


(String columnName,
Calendar cal)

int getType() 3.0 Yes

InputStream 3.0 Yes Deprecated.


getUnicodeStream(int
columnIndex)

InputStream 3.0 Yes Deprecated.


getUnicodeStream(String
columnName)

URL getURL(int columnIndex) 3.0 No

URL getURL(String 3.0 No


columnName)

SQLWarning getWarnings() 3.0 Yes

void insertRow() 3.0 No Not valid because the driver is


read-only.

boolean isAfterLast() 3.0 Yes

boolean isBeforeFirst() 3.0 Yes

boolean isClosed() 4.0 Yes

boolean isFirst() 3.0 Yes

boolean isLast() 3.0 No

boolean last() 3.0 No

void moveToCurrentRow() 3.0 No Not valid because the driver is


read-only.

void moveToInsertRow() 3.0 No Not valid because the driver is


read-only.

boolean next() 3.0 Yes

boolean previous() 3.0 No

void refreshRow() 3.0 No

boolean relative(int rows) 3.0 No

boolean rowDeleted() 3.0 Yes Hard-coded to false.

boolean rowInserted() 3.0 Yes Hard-coded to false.

boolean rowUpdated() 3.0 Yes Hard-coded to false.

void setFetchDirection(int 3.0 No Not valid because the driver is


direction) forward-only.

void setFetchSize(int rows) 3.0 Yes

void updateArray(int 3.0 No


columnIndex, Array x)

void updateArray(String 3.0 No


columnName, Array x)

void updateAsciiStream(int 4.0 No Not valid because the driver is


columnIndex, InputStream x) read-only.

void updateAsciiStream(int 3.0 No Not valid because the driver is


columnIndex, InputStream x, read-only.
int length)

void updateAsciiStream(int 4.0 No Not valid because the driver is


columnIndex, InputStream x, read-only.
long length)

void updateAsciiStream 4.0 No Not valid because the driver is


(String columnName, read-only.
InputStream x)

void updateAsciiStream 3.0 No Not valid because the driver is


(String columnName, read-only.
InputStream x, int length)

void updateAsciiStream 4.0 No Not valid because the driver is


(String columnName, read-only.
InputStream x, long length)

void updateBigDecimal(int 3.0 No Not valid because the driver is


columnIndex, BigDecimal x) read-only.

void updateBigDecimal 3.0 No Not valid because the driver is


(String columnName, read-only.
BigDecimal x)

void updateBinaryStream(int 4.0 No Not valid because the driver is


columnIndex, InputStream x) read-only.

void updateBinaryStream(int 3.0 No Not valid because the driver is


columnIndex, InputStream x, read-only.
int length)

void updateBinaryStream(int 4.0 No Not valid because the driver is


columnIndex, InputStream x, read-only.
long length)

void updateBinaryStream 4.0 No Not valid because the driver is


(String columnName, read-only.
InputStream x)

void updateBinaryStream 3.0 No Not valid because the driver is


(String columnName, read-only.
InputStream x, int length)

void updateBinaryStream 4.0 No Not valid because the driver is


(String columnName, read-only.
InputStream x, long length)

void updateBlob(int 4.0 No


columnIndex, InputStream
inputStream)

void updateBlob(int 3.0 No


columnIndex, Blob x)

void updateBlob(int 4.0 No


columnIndex, InputStream
inputStream, long length)

void updateBlob(String 4.0 No


columnName, InputStream
inputStream)

void updateBlob(String 3.0 No


columnName, Blob x)

void updateBlob(String 4.0 No


columnLabel, InputStream
inputStream, long length)

void updateBoolean(int 3.0 No Not valid because the driver is


columnIndex, boolean x) read-only.

void updateBoolean(String 3.0 No Not valid because the driver is


columnName, boolean x) read-only.

void updateByte(int 3.0 No Not valid because the driver is


columnIndex, byte x) read-only.

void updateByte(String 3.0 No Not valid because the driver is


columnName, byte x) read-only.

void updateBytes(int 3.0 No Not valid because the driver is


columnIndex, byte[] x) read-only.

void updateBytes(String 3.0 No Not valid because the driver is


columnName, byte[] x) read-only.

void updateCharacterStream 3.0 No Not valid because the driver is


(int columnIndex, Reader x, read-only.
int length)

void updateCharacterStream 3.0 No Not valid because the driver is


(String columnName, Reader read-only.
reader, int length)

void updateBlob(int 4.0 No


columnIndex, InputStream
inputStream)

void updateClob(int 3.0 No


columnIndex, Clob x)

void updateBlob(int 4.0 No


columnIndex, InputStream
inputStream, long length)

void updateBlob(String 4.0 No


columnName, InputStream
inputStream)

void updateClob(String 3.0 No


columnName, Clob x)

void updateBlob(String 4.0 No


columnName, InputStream
inputStream, long length)

void updateDate(int 3.0 No Not valid because the driver is


columnIndex, Date x) read-only.

void updateDate(String 3.0 No Not valid because the driver is


columnName, Date x) read-only.

void updateDouble(int 3.0 No Not valid because the driver is


columnIndex, double x) read-only.

void updateDouble(String 3.0 No Not valid because the driver is


columnName, double x) read-only.

void updateFloat(int 3.0 No Not valid because the driver is


columnIndex, float x) read-only.

void updateFloat(String 3.0 No Not valid because the driver is


columnName, float x) read-only.

void updateInt(int 3.0 No Not valid because the driver is


columnIndex, int x) read-only.

void updateInt(String 3.0 No Not valid because the driver is


columnName, int x) read-only.

void updateLong(int 3.0 No Not valid because the driver is


columnIndex, long x) read-only.

void updateLong(String 3.0 No Not valid because the driver is


columnName, long x) read-only.

void updateNCharacterStream 4.0 No


(int columnIndex, Reader x)

void updateNCharacterStream 4.0 No


(int columnIndex, Reader x,
long length)

void updateNCharacterStream 4.0 No


(String columnName, Reader
reader)

void updateNCharacterStream 4.0 No


(String columnName, Reader
reader, long length)

void updateNClob(int 4.0 No


columnIndex, NClob nClob)

void updateNClob(int 4.0 No


columnIndex, Reader reader)

void updateNClob(int 4.0 No


columnIndex, Reader reader,
long length)

void updateNClob(String 4.0 No


columnName, NClob nClob)

void updateNClob(String 4.0 No


columnName, Reader reader)

void updateNClob(String 4.0 No


columnName, Reader reader,
long length)

void updateNString(int 4.0 No


columnIndex, String
nString)

void updateNString(String 4.0 No


columnName, String nString)

void updateNull(int 3.0 No Not valid because the driver is


columnIndex) read-only.

void updateNull(String 3.0 No Not valid because the driver is


columnName) read-only.

void updateObject(int 3.0 No Not valid because the driver is


columnIndex, Object x) read-only.

void updateObject(int 3.0 No Not valid because the driver is


columnIndex, Object x, int read-only.
scale)

void updateObject(String 3.0 No Not valid because the driver is


columnName, Object x) read-only.

void updateObject(String 3.0 No Not valid because the driver is


columnName, Object x, int read-only.
scale)

void updateRef(int 3.0 No Not valid because the driver is


columnIndex, Ref x) read-only.

void updateRef(String 3.0 No Not valid because the driver is


columnName, Ref x) read-only.

void updateRow() 3.0 No Not valid because the driver is


read-only.

void updateRowId(int 4.0 No


columnIndex, RowId x)

void updateRowId(String 4.0 No


columnName, RowId x)

void updateShort(int 3.0 No Not valid because the driver is


columnIndex, short x) read-only.

void updateShort(String 3.0 No Not valid because the driver is


columnName, short x) read-only.

void updateSQLXML(int 4.0 No


columnIndex, SQLXML
xmlObject)

void updateSQLXML(String 4.0 No


columnName, SQLXML
xmlObject)

void updateString(int 3.0 No Not valid because the driver is


columnIndex, String x) read-only.

void updateString(String 3.0 No Not valid because the driver is


columnName, String x) read-only.

void updateTime(int 3.0 No Not valid because the driver is


columnIndex, Time x) read-only.

void updateTime(String 3.0 No Not valid because the driver is


columnName, Time x) read-only.

void updateTimestamp(int 3.0 No Not valid because the driver is


columnIndex, Timestamp x) read-only.

void updateTimestamp(String 3.0 No Not valid because the driver is


columnName, Timestamp x) read-only.

boolean wasNull() 3.0 Yes

boolean isWrapperFor 4.0 Yes


(Class<?> iface)

<T> T unwrap(Class<T> 4.0 Yes


iface)
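
Because result sets returned by the driver are forward-only and read-only, rows are consumed with next() and the getter methods. For example, given a Statement stmt created from an open connection (the table and column names are placeholders):

ResultSet rs = stmt.executeQuery("SELECT id, name FROM sample_table");
while (rs.next()) {                          // forward-only iteration
    long id = rs.getLong("id");
    String name = rs.getString("name");
    System.out.println(id + ": " + name);
}
rs.close();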

ResultSetMetaData

The following table lists the methods that belong to the ResultSetMetaData interface, and
describes whether each method is supported by the Cloudera JDBC Driver for Apache Hive and
which version of the JDBC API is the earliest version that supports the method.


For detailed information about each method in the ResultSetMetaData interface, see the
Java API documentation:
http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/ResultSetMetaData.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

String getCatalogName(int column) | 3.0 | Yes
String getColumnClassName(int column) | 3.0 | Yes
int getColumnCount() | 3.0 | Yes
int getColumnDisplaySize(int column) | 3.0 | Yes
String getColumnLabel(int column) | 3.0 | Yes
String getColumnName(int column) | 3.0 | Yes
int getColumnType(int column) | 3.0 | Yes
String getColumnTypeName(int column) | 3.0 | Yes
int getPrecision(int column) | 3.0 | Yes
int getScale(int column) | 3.0 | Yes
String getSchemaName(int column) | 3.0 | Yes
String getTableName(int column) | 3.0 | Yes
boolean isAutoIncrement(int column) | 3.0 | Yes
boolean isCaseSensitive(int column) | 3.0 | Yes
boolean isCurrency(int column) | 3.0 | Yes
boolean isDefinitelyWritable(int column) | 3.0 | Yes
int isNullable(int column) | 3.0 | Yes
boolean isReadOnly(int column) | 3.0 | Yes
boolean isSearchable(int column) | 3.0 | Yes
boolean isSigned(int column) | 3.0 | Yes
boolean isWritable(int column) | 3.0 | Yes
boolean isWrapperFor(Class<?> iface) | 4.0 | Yes
<T> T unwrap(Class<T> iface) | 4.0 | Yes
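
For example, a short sketch that prints the column names and type names of a result set rs obtained as shown in the ResultSet section:

ResultSetMetaData rsmd = rs.getMetaData();
for (int i = 1; i <= rsmd.getColumnCount(); i++) {
    System.out.println(rsmd.getColumnName(i) + " " + rsmd.getColumnTypeName(i));
}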

Statement

The following table lists the methods that belong to the Statement interface, and describes
whether each method is supported by the Cloudera JDBC Driver for Apache Hive and which
version of the JDBC API is the earliest version that supports the method.

For detailed information about each method in the Statement interface, see the Java API
documentation: http://docs.oracle.com/javase/1.5.0/docs/api/java/sql/Statement.html.

Method | Supported Since JDBC Version | Supported by the Driver | Notes

void addBatch(String sql) 3.0 Yes

void cancel() 3.0 Yes

void clearBatch() 3.0 Yes

void clearWarnings() 3.0 Yes

void close() 3.0 Yes

void closeOnCompletion() 4.1 Yes

boolean execute(String sql) 3.0 Yes

boolean execute(String sql, 3.0 No


int autoGeneratedKeys)

boolean execute(String sql, 3.0 No


int[] columnIndexes)

boolean execute(String sql, 3.0 No


String[] columnNames)

int[] executeBatch() 3.0 No

ResultSet executeQuery 3.0 Yes


(String sql)

int executeUpdate(String 3.0 Yes


sql)

int executeUpdate(String 3.0 No


sql, int autoGeneratedKeys)

int executeUpdate(String 3.0 No


sql, int[] columnIndexes)

int executeUpdate(String 3.0 No


sql, String[] columnNames)

Connection getConnection() 3.0 Yes

int getFetchDirection() 3.0 Yes

int getFetchSize() 3.0 Yes

ResultSet getGeneratedKeys 3.0 Yes


()

int getMaxFieldSize() 3.0 Yes

int getMaxRows() 3.0 Yes

boolean getMoreResults() 3.0 Yes

boolean getMoreResults(int 3.0 No


current)

int getQueryTimeout() 3.0 Yes

ResultSet getResultSet() 3.0 Yes

int getResultSetConcurrency 3.0 Yes Hard-coded to CONCUR_


() READ_ONLY.

int getResultSetHoldability 3.0 Yes Hard-coded to CLOSE_


() CURSORS_AT_COMMIT.

int getResultSetType() 3.0 Yes Hard-coded to TYPE_


FORWARD_ONLY.

int getUpdateCount() 3.0 Yes

SQLWarning getWarnings() 3.0 Yes

boolean isClosed() 4.0 Yes

boolean isCloseOnCompletion 4.1 Yes


()

boolean isPoolable() 4.0 Yes

void setCursorName(String 3.0 No


name)

void setEscapeProcessing 3.0 Yes


(boolean enable)

void setFetchDirection(int 3.0 No


direction)

void setFetchSize(int rows) 3.0 Yes

void setMaxFieldSize(int 3.0 Yes


max)

void setMaxRows(int max) 3.0 Yes

void setPoolable(boolean 4.0 Yes


poolable)

void setQueryTimeout(int 3.0 Yes


seconds)

boolean isWrapperFor 4.0 Yes


(Class<?> iface)

<T> T unwrap(Class<T> 4.0 Yes


iface)
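
For example, a minimal Statement sketch. The connection conn is assumed to be open, the table name is a placeholder, and the fetch size and row limit shown are arbitrary values:

Statement stmt = conn.createStatement();
stmt.setFetchSize(10000);          // rows fetched per server round trip
stmt.setMaxRows(100);              // cap the number of rows returned
ResultSet rs = stmt.executeQuery("SELECT * FROM sample_table");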


Driver Configuration Options


This section lists and describes the properties that you can use to configure the behavior of the
Cloudera JDBC Driver for Apache Hive.

You can set configuration properties using the connection URL. For more information, see
"Building the Connection URL" on page 10.

Note:

Property names and values are case-sensitive.

AllowSelfSignedCerts
Default Value Data Type Required

0 Integer No

Description

This property specifies whether the driver allows the server to use self-signed SSL certificates.
l 1: The driver allows self-signed certificates.

Important:

When this property is set to 1, SSL verification is disabled. The driver does not verify the
server certificate against the trust store, and does not verify if the server's host name
matches the common name in the server certificate.

l 0: The driver does not allow self-signed certificates.

Note:

This property is applicable only when SSL connections are enabled.
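
For example, the following connection URL (the host, port, and credentials are placeholders) enables
SSL and allows a self-signed server certificate:
jdbc:hive2://localhost:10000;AuthMech=3;UID=hs2;PWD=cloudera;SSL=1;AllowSelfSignedCerts=1;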

AsyncExecPollInterval
Default Value | Data Type | Required
10 | Integer | No

Description

The time in milliseconds between each poll for the asynchronous query execution status.

"Asynchronous" refers to the fact that the RPC call used to execute a query against Hive is
asynchronous. It does not mean that JDBC asynchronous operations are supported.


Note:

This option is applicable only to HDInsight clusters.

AuthMech
Default Value | Data Type | Required
Depends on the transportMode setting. For more information, see "transportMode" on page 102. | Integer | No

Description

The authentication mechanism to use. Set the property to one of the following values:
l 0 for No Authentication.
l 1 for Kerberos.
l 2 for User Name.
l 3 for User Name And Password.
l 6 for Hadoop Delegation Token.
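
For example, the following connection URLs (with placeholder host, port, and credentials) show the
No Authentication and User Name And Password mechanisms:
jdbc:hive2://localhost:10000;AuthMech=0;
jdbc:hive2://localhost:10000;AuthMech=3;UID=hs2;PWD=cloudera;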

CAIssuedCertsMismatch
Default Value | Data Type | Required
0 | Integer | No

Description

This property specifies whether the driver requires the name of the CA-issued SSL certificate to
match the host name of the Hive server.
l 0: The driver requires the names to match.
l 1: The driver allows the names to mismatch.

Note:

This property is applicable only when SSL connections are enabled.

CatalogSchemaSwitch
Default Value | Data Type | Required
0 | Integer | No


Description

This property specifies whether the driver treats Hive catalogs as schemas or as catalogs.
l 1: The driver treats Hive catalogs as schemas as a restriction for filtering.
l 0: Hive catalogs are treated as catalogs, and Hive schemas are treated as schemas.

DecimalColumnScale
Default Value | Data Type | Required
10 | Integer | No

Description

The maximum number of digits to the right of the decimal point for numeric data types.

DefaultStringColumnLength
Default Value | Data Type | Required
255 | Integer | No

Description

The maximum number of characters that can be contained in STRING columns. The range of
DefaultStringColumnLength is 0 to 32767.

By default, the column metadata for Hive does not specify a maximum data length for STRING
columns.

DelegationToken
Default Value | Data Type | Required
None | String | Yes, if AuthMech is set to 6 (Hadoop Delegation Token).

Description

A Hadoop delegation token for authentication.

This token must be provided to the driver in the form of a Base64 URL-safe encoded string. It can
be obtained from the driver using the getDelegationToken() function, or by utilizing the
Hadoop distribution .jar files.
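
The following is a minimal sketch (not the only way to connect) that passes a previously obtained
token to the driver. The host, port, and token value are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;

public class DelegationTokenExample {
    public static void main(String[] args) throws Exception {
        // Placeholder: supply a real Base64 URL-safe encoded delegation token
        // obtained from the driver or from the Hadoop distribution .jar files.
        String token = "<base64-url-safe-token>";
        String url = "jdbc:hive2://localhost:10000;AuthMech=6;DelegationToken=" + token + ";";
        try (Connection con = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + !con.isClosed());
        }
    }
}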


DelegationUID
Default Value | Data Type | Required
None | String | No

Description

Use this option to delegate all operations against Hive to a user that is different than the
authenticated user for the connection.

Note:

This option is applicable only when connecting to a Hive Server 2 instance that supports this
feature.
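
For example, the following connection URL (placeholder host, port, and credentials; hive_batch is a
hypothetical delegated user) authenticates as hs2 but runs all operations as hive_batch:
jdbc:hive2://localhost:10000;AuthMech=3;UID=hs2;PWD=cloudera;DelegationUID=hive_batch;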

httpPath
Default Value | Data Type | Required
None | String | Yes, if transportMode=http.

Description

The partial URL corresponding to the Hive server.

The driver forms the HTTP address to connect to by appending the httpPath value to the host
and port specified in the connection URL. For example, to connect to the HTTP address
http://localhost:10002/cliservice, you would use the following connection URL:
jdbc:hive2://localhost:10002;AuthMech=3;transportMode=http;httpPath=cliservice;UID=hs2;PWD=cloudera;

Note:

By default, Hive servers use cliservice as the partial URL.

KrbAuthType
Default Value | Data Type | Required
0 | Integer | No

Description

This property specifies how the driver obtains the Subject for Kerberos authentication.


l 0: The driver automatically detects which method to use for obtaining the Subject:
1. First, the driver tries to obtain the Subject from the current thread's inherited
AccessControlContext. If the AccessControlContext contains multiple Subjects, the
driver uses the most recent Subject.
2. If the first method does not work, then the driver checks the
java.security.auth.login.config system property for a JAAS
configuration. If a JAAS configuration is specified, the driver uses that information to
create a LoginContext and then uses the Subject associated with it.
3. If the second method does not work, then the driver checks the KRB5_CONFIG and
KRB5CCNAME system environment variables for a Kerberos ticket cache. The driver
uses the information from the cache to create a LoginContext and then uses the
Subject associated with it.
l 1: The driver checks the java.security.auth.login.config system property for a
JAAS configuration. If a JAAS configuration is specified, the driver uses that information to
create a LoginContext and then uses the Subject associated with it.
l 2: The driver checks the KRB5_CONFIG and KRB5CCNAME system environment variables for
a Kerberos ticket cache. The driver uses the information from the cache to create a
LoginContext and then uses the Subject associated with it.
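
As an illustration only, the following sketch forces the JAAS-based approach by setting KrbAuthType=1
and pointing the JVM at a JAAS configuration file before connecting. The file path, realm, host name,
and service name are placeholders for your environment:

import java.sql.Connection;
import java.sql.DriverManager;

public class KerberosJaasExample {
    public static void main(String[] args) throws Exception {
        // Placeholder path to a JAAS configuration file that defines a Kerberos login entry.
        System.setProperty("java.security.auth.login.config", "/etc/hive/jaas.conf");
        String url = "jdbc:hive2://hs2host.example.com:10000;AuthMech=1;KrbAuthType=1;"
                + "KrbRealm=EXAMPLE.COM;KrbHostFQDN=hs2host.example.com;KrbServiceName=hive;";
        try (Connection con = DriverManager.getConnection(url)) {
            System.out.println("Connected with Kerberos authentication.");
        }
    }
}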

KrbHostFQDN
Default Value | Data Type | Required
None | String | Yes, if AuthMech=1.

Description

The fully qualified domain name of the Hive Server 2 host.

KrbRealm
Default Value | Data Type | Required
Depends on your Kerberos configuration | String | No

Description

The realm of the Hive Server 2 host.

If your Kerberos configuration already defines the realm of the Hive Server 2 host as the default
realm, then you do not need to configure this property.


KrbServiceName
Default Value | Data Type | Required
None | String | Yes, if AuthMech=1.

Description

The Kerberos service principal name of the Hive server.

LogLevel
Default Value | Data Type | Required
0 | Integer | No

Description

Use this property to enable or disable logging in the driver and to specify the amount of detail
included in log files.

Important:

Only enable logging long enough to capture an issue. Logging decreases performance and can
consume a large quantity of disk space.

Set the property to one of the following numbers:


l 0: Disable all logging.
l 1: Enable logging on the FATAL level, which logs very severe error events that will lead the
driver to abort.
l 2: Enable logging on the ERROR level, which logs error events that might still allow the
driver to continue running.
l 3: Enable logging on the WARNING level, which logs events that might result in an error if
action is not taken.
l 4: Enable logging on the INFO level, which logs general information that describes the
progress of the driver.
l 5: Enable logging on the DEBUG level, which logs detailed information that is useful for
debugging the driver.
l 6: Enable logging on the TRACE level, which logs all driver activity.

When logging is enabled, the driver produces the following log files in the location specified in the
LogPath property:
l A HiveJDBC_driver.log file that logs driver activity that is not specific to a
connection.


l A HiveJDBC_connection_[Number].log file for each connection made to the database, where
[Number] is a number that distinguishes each log file from the others. This file logs driver
activity that is specific to the connection.

If the LogPath value is invalid, then the driver sends the logged information to the standard
output stream (System.out).

LogPath
Default Value | Data Type | Required
The current working directory. | String | No

Description

The full path to the folder where the driver saves log files when logging is enabled.
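
For example, the following connection URL (placeholder host, port, credentials, and path) enables
DEBUG-level logging and writes the log files to /tmp/hivejdbclogs:
jdbc:hive2://localhost:10000;AuthMech=3;UID=hs2;PWD=cloudera;LogLevel=5;LogPath=/tmp/hivejdbclogs;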

PreparedMetaLimitZero
Default Value | Data Type | Required
0 | Integer | No

Description

This property specifies whether the PreparedStatement.getMetaData() call will request metadata from
the server with LIMIT 0.
l 1: The PreparedStatement.getMetaData() call uses LIMIT 0.
l 0: The PreparedStatement.getMetaData() call does not use LIMIT 0.

PWD
Default Value | Data Type | Required
anonymous | String | Yes, if AuthMech=3.

Description

The password corresponding to the user name that you provided using the property "UID" on
page 103.

Important:

If you set the AuthMech to 3, the default PWD value is not used and you must specify a
password.


RowsFetchedPerBlock
Default Value | Data Type | Required
10000 | Integer | No

Description

The maximum number of rows that a query returns at a time.

Any positive 32-bit integer is a valid value, but testing has shown that performance gains are
marginal beyond the default value of 10000 rows.

SocketTimeout
Default Value | Data Type | Required
0 | Integer | No

Description

The number of seconds that the TCP socket waits for a response from the server before raising an
error on the request.

When this property is set to 0, the connection does not time out.

SSL
Default Value | Data Type | Required
0 | Integer | No

Description

This property specifies whether the driver communicates with the Hive server through an SSL-
enabled socket.
l 1: The driver connects to SSL-enabled sockets.
l 0: The driver does not connect to SSL-enabled sockets.

Note:

SSL is configured independently of authentication. When authentication and SSL are both
enabled, the driver performs the specified authentication method over an SSL connection.
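
For example, the following connection URL (placeholder host, port, credentials, trust store path, and
password) enables one-way SSL using a Java TrustStore; see also "SSLTrustStore" and
"SSLTrustStorePwd" later in this section:
jdbc:hive2://localhost:10000;AuthMech=3;UID=hs2;PWD=cloudera;SSL=1;SSLTrustStore=/opt/certs/truststore.jks;SSLTrustStorePwd=changeit;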


SSLKeyStore
Default Value | Data Type | Required
None | String | No

Description

The full path of the Java KeyStore containing the server certificate for one-way SSL authentication.

See also the property "SSLKeyStorePwd" on page 101.

Note:

The Cloudera JDBC Driver for Apache Hive accepts TrustStores and KeyStores for one-way SSL
authentication. See also the property "SSLTrustStore" on page 101.

SSLKeyStorePwd
Default Value | Data Type | Required
None | String | Yes, if you are using a KeyStore for connecting over SSL.

Description

The password for accessing the Java KeyStore that you specified using the property "SSLKeyStore"
on page 101.

SSLTrustStore
Default Value | Data Type | Required
jssecacerts, if it exists. If jssecacerts does not exist, then cacerts is used. The default location of cacerts is jre\lib\security\. | String | No

Description

The full path of the Java TrustStore containing the server certificate for one-way SSL
authentication.

See also the property "SSLTrustStorePwd" on page 102.


Note:

The Cloudera JDBC Driver for Apache Hive accepts TrustStores and KeyStores for one-way SSL
authentication. See also the property "SSLKeyStore" on page 101.

SSLTrustStorePwd
Default Value | Data Type | Required
None | String | Yes, if using a TrustStore.

Description

The password for accessing the Java TrustStore that you specified using the property
"SSLTrustStore" on page 101.

transportMode
Default Value | Data Type | Required
sasl | String | No

Description

The transport protocol to use in the Thrift layer.


l binary: The driver uses the Binary transport protocol.

When connecting to a Hive Server 1 instance, you must use this setting. If you use this
setting but do not specify the AuthMech property, then the driver uses AuthMech=0 by
default. This setting is valid only when the AuthMech property is set to 0 or 3.
l sasl: The driver uses the SASL transport protocol.

If you use this setting but do not specify the AuthMech property, then the driver uses
AuthMech=2 by default. This setting is valid only when the AuthMech property is set to
1, 2, or 3.
l http: The driver uses the HTTP transport protocol.

If you use this setting but do not specify the AuthMech property, then the driver uses
AuthMech=3 by default. This setting is valid only when the AuthMech property is set to
3.

If you set this property to http, then the port number in the connection URL corresponds
to the HTTP port rather than the TCP port, and you must specify the httpPath property.
For more information, see "httpPath" on page 96.
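
For example, the following connection URLs (placeholder hosts, ports, and credentials) use the Binary
and HTTP transport protocols, respectively:
jdbc:hive2://localhost:10000;AuthMech=3;UID=hs2;PWD=cloudera;transportMode=binary;
jdbc:hive2://localhost:10001;AuthMech=3;UID=hs2;PWD=cloudera;transportMode=http;httpPath=cliservice;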


UID
Default Value | Data Type | Required
anonymous | String | Yes, if AuthMech=3.

Description

The user name that you use to access the Hive server.

Important:

If you set the AuthMech to 3, the default UID value is not used and you must specify a user
name.

UseNativeQuery
Default Value | Data Type | Required
0 | Integer | No

Description

This property specifies whether the driver transforms the queries emitted by applications.
l 1: The driver does not transform the queries emitted by applications, so the native query is
used.
l 0: The driver transforms the queries emitted by applications and converts them into an
equivalent form in HiveQL.

Note:

If the application is Hive-aware and already emits HiveQL, then enable this option to avoid the
extra overhead of query transformation.
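
For example, an application that already emits HiveQL might connect with the following URL
(placeholder host, port, and credentials) so that its queries are passed through unmodified:
jdbc:hive2://localhost:10000;AuthMech=3;UID=hs2;PWD=cloudera;UseNativeQuery=1;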

zk
Default Value | Data Type | Required
None | String | No

Description

The connection string to one or more ZooKeeper quorums, written in the following format where
[ZK_IP] is the IP address, [ZK_Port] is the port number, and [ZK_Namespace] is the namespace:
[ZK_IP]:[ZK_Port]/[ZK_Namespace]


For example:
jdbc:hive2://zk=192.168.0.1:2181/hiveserver2

Use this option to enable the Dynamic Service Discovery feature, which allows you to connect to
Hive servers that are registered against a ZooKeeper service by connecting to the ZooKeeper
service.

You can specify multiple quorums in a comma-separated list. If connection to a quorum fails, the
driver will attempt to connect to the next quorum in the list.
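
For example, the following connection URL (with hypothetical host names) lists three quorums, each
written in the [ZK_IP]:[ZK_Port]/[ZK_Namespace] format shown above:
jdbc:hive2://zk=zk1.example.com:2181/hiveserver2,zk2.example.com:2181/hiveserver2,zk3.example.com:2181/hiveserver2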


Contact Us
If you are having difficulties using the driver, our Community Forum may have your solution. In
addition to providing user-to-user support, our forums are a great place to share your questions,
comments, and feature requests with us.

If you are a Subscription customer, you may also use the Cloudera Support Portal to search the
Knowledge Base or file a Case.

Important:

To help us assist you, prior to contacting Cloudera Support please prepare a detailed summary
of the client and server environment including operating system version, patch level, and
configuration.


Third-Party Trademarks
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be
trademarks of their respective owners.

Apache Hive, Apache, and Hive are trademarks or registered trademarks of The Apache Software
Foundation or its subsidiaries in Canada, the United States, and/or other countries.

All other trademarks are trademarks of their respective owners.


Third-Party Licenses
The licenses for the third-party libraries that are included in this product are listed below.

Simple Logging Façade for Java (SLF4J) License

Copyright © 2004-2015 QOS.ch

All rights reserved.

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and
associated documentation files (the "Software"), to deal in the Software without restriction,
including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial
portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH
THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Apache License, Version 2.0

The following notice is included in compliance with the Apache License, Version 2.0 and is
applicable to all software licensed under the Apache License, Version 2.0.

Apache License

Version 2.0, January 2004

http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION


1. Definitions.

"License" shall mean the terms and conditions for use, reproduction, and distribution as
defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that
is granting the License.

"Legal Entity" shall mean the union of the acting entity and all other entities that control,
are controlled by, or are under common control with that entity. For the purposes of this
definition, "control" means (i) the power, direct or indirect, to cause the direction or
management of such entity, whether by contract or otherwise, or (ii) ownership of fifty
percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.


"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by
this License.

"Source" form shall mean the preferred form for making modifications, including but not
limited to software source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation
of a Source form, including but not limited to compiled object code, generated
documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made
available under the License, as indicated by a copyright notice that is included in or
attached to the work (an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object form, that is based
on (or derived from) the Work and for which the editorial revisions, annotations,
elaborations, or other modifications represent, as a whole, an original work of authorship.
For the purposes of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of, the Work and
Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version of the
Work and any modifications or additions to that Work or Derivative Works thereof, that is
intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by
an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the
purposes of this definition, "submitted" means any form of electronic, verbal, or written
communication sent to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems, and issue tracking
systems that are managed by, or on behalf of, the Licensor for the purpose of discussing
and improving the Work, but excluding communication that is conspicuously marked or
otherwise designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a
Contribution has been received by Licensor and subsequently incorporated within the
Work.
2. Grant of Copyright License. Subject to the terms and conditions of this License, each
Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge,
royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the Work and such Derivative
Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of this License, each
Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge,
royalty-free, irrevocable (except as stated in this section) patent license to make, have
made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license
applies only to those patent claims licensable by such Contributor that are necessarily
infringed by their Contribution(s) alone or by combination of their Contribution(s) with the
Work to which such Contribution(s) was submitted. If You institute patent litigation against
any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a
Contribution incorporated within the Work constitutes direct or contributory patent
infringement, then any patent licenses granted to You under this License for that Work shall
terminate as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works
thereof in any medium, with or without modifications, and in Source or Object form,
provided that You meet the following conditions:

(a) You must give any other recipients of the Work or Derivative Works a copy of this
License; and

(b) You must cause any modified files to carry prominent notices stating that You
changed the files; and

(c) You must retain, in the Source form of any Derivative Works that You distribute,
all copyright, patent, trademark, and attribution notices from the Source form of
the Work, excluding those notices that do not pertain to any part of the
Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its distribution, then any
Derivative Works that You distribute must include a readable copy of the
attribution notices contained within such NOTICE file, excluding those notices
that do not pertain to any part of the Derivative Works, in at least one of the
following places: within a NOTICE text file distributed as part of the Derivative
Works; within the Source form or documentation, if provided along with the
Derivative Works; or, within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents of the NOTICE
file are for informational purposes only and do not modify the License. You may
add Your own attribution notices within Derivative Works that You distribute,
alongside or as an addendum to the NOTICE text from the Work, provided that
such additional attribution notices cannot be construed as modifying the License.

You may add Your own copyright statement to Your modifications and may provide
additional or different license terms and conditions for use, reproduction, or distribution of
Your modifications, or for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with the conditions stated
in this License.
5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution
intentionally submitted for inclusion in the Work by You to the Licensor shall be under the
terms and conditions of this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify the terms of any
separate license agreement you may have executed with Licensor regarding such
Contributions.
6. Trademarks. This License does not grant permission to use the trade names, trademarks,
service marks, or product names of the Licensor, except as required for reasonable and
customary use in describing the origin of the Work and reproducing the content of the
NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor
provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including,
without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT,
MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for
determining the appropriateness of using or redistributing the Work and assume any risks
associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory, whether in tort (including
negligence), contract, or otherwise, unless required by applicable law (such as deliberate
and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for
damages, including any direct, indirect, special, incidental, or consequential damages of any
character arising as a result of this License or out of the use or inability to use the Work
(including but not limited to damages for loss of goodwill, work stoppage, computer failure
or malfunction, or any and all other commercial damages or losses), even if such
Contributor has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative
Works thereof, You may choose to offer, and charge a fee for, acceptance of support,
warranty, indemnity, or other liability obligations and/or rights consistent with this License.
However, in accepting such obligations, You may act only on Your own behalf and on Your
sole responsibility, not on behalf of any other Contributor, and only if You agree to
indemnify, defend, and hold each Contributor harmless for any liability incurred by, or
claims asserted against, such Contributor by reason of your accepting any such warranty or
additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following boilerplate notice, with the
fields enclosed by brackets "[]" replaced with your own identifying information. (Don't
include the brackets!) The text should be enclosed in the appropriate comment syntax for
the file format. We also recommend that a file or class name and description of purpose be
included on the same "printed page" as the copyright notice for easier identification within
third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this
file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under
the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
CONDITIONS OF ANY KIND, either express or implied. See the License for the specific
language governing permissions and limitations under the License.


This product includes software that is licensed under the Apache License, Version 2.0 (listed
below):

Apache Commons
Copyright © 2001-2015 The Apache Software Foundation

Apache Commons Codec
Copyright © 2002-2014 The Apache Software Foundation

Apache Hadoop Common
Copyright © 2014 The Apache Software Foundation

Apache Hive
Copyright © 2008-2015 The Apache Software Foundation

Apache HttpComponents Client
Copyright © 1999-2012 The Apache Software Foundation

Apache HttpComponents Core
Copyright © 1999-2012 The Apache Software Foundation

Apache Logging Services
Copyright © 1999-2012 The Apache Software Foundation

Apache Thrift
Copyright © 2006-2010 The Apache Software Foundation

Apache ZooKeeper
Copyright © 2010 The Apache Software Foundation

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in
compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is
distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either
express or implied. See the License for the specific language governing permissions and limitations
under the License.
