Wednesday, October 5, 2016

Java JDBC Application to connect to IBM BigSQL or IBM DB2 in a Kerberos Cluster

This blog provides the complete code to connect to an IBM BigSQL database in a Kerberos-enabled cluster. The same code also works for IBM DB2.

In the code below, we do not pass a database user/password to connect to the database. Instead, we use Kerberos authentication.

1) Java - JDBC Code

public class KerberosTest {

       public static void main(String[] args) {
             if (args == null || args.length != 2) {
                    throw new IllegalArgumentException(
                                 "Invalid arguments. " +
                                 "Specify headnode hostname/ip " +
                                 "& database port");
             }

             String serverName = args[0];
             int portNumber = Integer.parseInt(args[1]);
             String databaseName = "bigsql";

             java.util.Properties properties = new java.util.Properties();
             // 11 is the integer value for Kerberos security
             properties.put("securityMechanism", "11");
             // Provide the Kerberos Principal

             String url = String.format("jdbc:db2://%s:%d/%s", serverName,
                           portNumber, databaseName);

             java.sql.Connection con = null;
             try {
                    // Load the DB2 JCC driver
                    Class.forName("com.ibm.db2.jcc.DB2Driver");
             } catch (Exception e) {
                    System.out.println("Error: " +
                                 "failed to load Db2 jcc driver.");
                    return;
             }

             try {
                    System.out.println("url: " + url);
                    con = java.sql.DriverManager.
                                 getConnection(url, properties);
                    java.sql.Statement s2 = con.createStatement();

                    try {
                           s2.executeUpdate("drop table t1");
                           s2.executeUpdate("drop table tbint");

                           System.out.println("Drop Hadoop & DB2 " +
                                        "tables successfull!!!");
                    } catch (Exception e) {
                           System.out.println("drop is failing");
                    }

                    try {
                           // Create DB2 Table
                           s2.executeUpdate("create table t1 (c1 int)");
                           // Create BigSQL Table
                           s2.executeUpdate(
                                        "create hadoop table " +
                                        "if not exists " +
                                        "tbint " +
                                        "(col1 INT, col2 INT, col3 INT)");

                           System.out.println(
                                        "Created Hadoop & DB2 tables " +
                                        "successfully!!!");
                    } catch (Exception e) {
                           System.out.println("create is failing");
                    }

                    // Insert to DB2 Table
                    String str = "insert into t1 values (100)";
                    s2.executeUpdate(str);

                    // Query the DB2 Table
                    java.sql.PreparedStatement ps = con
                                 .prepareStatement("select * from t1");
                    java.sql.ResultSet rs = ps.executeQuery();

                    while (rs.next()) {
                           System.out.println(rs.getString(1));
                    }

                    // Insert to BIGSQL Table
                    str = "insert into tbint values(1,2,3),(1,2,3),(1,2,3)";
                    s2.executeUpdate(str);

                    // Query the BIGSQL Table
                    ps = con.prepareStatement("select * from tbint");
                    rs = ps.executeQuery();

                    while (rs.next()) {
                           System.out.printf("%s,%s,%s%n", rs.getString(1),
                                        rs.getString(2), rs.getString(3));
                    }

                    con.close();
             } catch (Exception e) {
                    e.printStackTrace();
             }
       }
}
To compile the above code, ensure the DB2 jars (db2jcc_license_cu.jar, db2jcc.jar and db2jcc4.jar) are in the classpath.
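For reference, a typical compile-and-run invocation might look like the following sketch. The jar locations, class name and hostname are assumptions; adjust them to your installation:

# Paths to the DB2 jcc jars are examples; adjust to your installation.
DB2_JARS=/opt/ibm/db2/java/db2jcc.jar:/opt/ibm/db2/java/db2jcc4.jar:/opt/ibm/db2/java/db2jcc_license_cu.jar

# Compile the class with the DB2 jars on the classpath
javac -cp "$DB2_JARS" KerberosTest.java

# Run it, passing the headnode hostname and the database port
java -cp "$DB2_JARS:." KerberosTest bigsql-headnode.example.com 51000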

2) Testing the JDBC Application in Kerberos Cluster

Do a kinit and obtain a valid ticket.

[biadmin@btestserver ~]$
[biadmin@btestserver ~]$ date
Wed Oct  5 16:50:18 PDT 2016
[biadmin@btestserver ~]$
[biadmin@btestserver ~]$
[biadmin@btestserver ~]$ /opt/ibm/biginsights/jdk/jre/bin/kinit -k -t /opt/ibm/biginsights/conf/security/keytabs/biadmin.keytab biadmin/
New ticket is stored in cache file /home/biadmin/krb5cc_biadmin
[biadmin@btestserver ~]$
[biadmin@btestserver ~]$
[biadmin@btestserver ~]$ klist -c krb5cc_biadmin
Ticket cache: FILE:krb5cc_biadmin
Default principal: biadmin/

Valid starting     Expires            Service principal
10/05/16 16:50:22  10/06/16 16:50:22  krbtgt/IBM.COM@IBM.COM
[biadmin@btestserver ~]$
[biadmin@btestserver ~]$

Run the Java code.

[biadmin@btestserver ~]$
[biadmin@btestserver ~]$ java -jar /opt/nisan/BigSQL_jdbc_kerberos.jar "" 51000
url: jdbc:db2://
Drop Hadoop & DB2 tables successfull!!!
Created Hadoop & DB2 tables successfully!!!
[biadmin@btestserver ~]$

3) Verifying the records inserted to the table

[biadmin@btestserver ~]$
[biadmin@btestserver ~]$ su bigsql
[bigsql@btestserver biadmin]$
[bigsql@btestserver biadmin]$ db2 "connect to bigsql"

   Database Connection Information

 Database server        = DB2/LINUXX8664 10.6.2
 SQL authorization ID   = BIGSQL
 Local database alias   = BIGSQL

[bigsql@btestserver biadmin]$
[bigsql@btestserver biadmin]$ db2 "select * from biadmin.t1"

  1 record(s) selected.

[bigsql@btestserver biadmin]$
[bigsql@btestserver biadmin]$ db2 "select * from biadmin.tbint"
COL1        COL2        COL3
----------- ----------- -----------
          1           2           3
          1           2           3
          1           2           3

  3 record(s) selected.

[bigsql@btestserver biadmin]$

I will be covering the JDBC Application to connect to Apache Hive in my next blog.

