Sunday, March 5, 2017

Configuring and Running Apache Kafka in IBM BigInsights

This blog describes how to configure and run Kafka in IBM BigInsights.

Apache Kafka is an open-source messaging system that provides a publish-subscribe model. Refer: https://kafka.apache.org/

I assume that you are already familiar with terminology like Producer, Consumer, Kafka Broker, Topic and Partition. Here, I will focus on creating multiple brokers in BigInsights, then creating a topic, publishing messages to it from the command line, and consuming them from the broker.


Environment: BigInsights 4.2

Step 1: Creating Kafka Brokers from Ambari

By default, Ambari has one Kafka Broker configured. Depending on your use case, you may need to create multiple brokers.

Log in to the Ambari UI --> click on Hosts and add a Kafka Broker to each node where you need to install a broker.


You can then see multiple brokers running in the Kafka service UI.
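If you prefer the command line, you can also confirm that each broker has registered itself in ZooKeeper. This is an optional check and only a sketch: it assumes ZooKeeper is listening on localhost:2181, and it uses the zookeeper-shell.sh utility that ships in the same Kafka bin directory used in the next steps.

cd /usr/iop/4.2.0.0/kafka/bin
su kafka -c "./zookeeper-shell.sh localhost:2181 ls /brokers/ids"

Each broker that is up should appear as an entry under /brokers/ids.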

Step 2: Create a Topic

Log in to one of the nodes where a broker is running, then create a topic.

cd /usr/iop/4.2.0.0/kafka/bin

su kafka -c "./kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 2 --partitions 1 --topic CustomerOrder"

You can get the details of the topic using the describe command below.

su kafka -c "./kafka-topics.sh --describe --zookeeper localhost:2181 --topic CustomerOrder"
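
You can also list all the topics registered in the cluster with the --list option (an optional check, using the same ZooKeeper address as above).

su kafka -c "./kafka-topics.sh --list --zookeeper localhost:2181"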

Step 3: Start the Producer

In the --broker-list argument, pass all of the brokers that are running.

su kafka -c "./kafka-console-producer.sh --broker-list bi1.test.com:6667,bi2.test.com:6667 --topic CustomerOrder"

When you run the above command, it waits for user input. You can type a sample message such as:

{"ID":99, "CUSTOMID":234,"ADDRESS":"12,5-7,westmead", "ORDERID":99, "ITEM":"iphone6", "COST":980}
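
Instead of typing messages interactively, you can also redirect a file of messages into the console producer, one message per line. This is just a sketch; /tmp/orders.json is a hypothetical file name and must be readable by the kafka user.

su kafka -c "./kafka-console-producer.sh --broker-list bi1.test.com:6667,bi2.test.com:6667 --topic CustomerOrder < /tmp/orders.json"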

Step 4: Start the Consumer

Open another Linux terminal and start the consumer. It will display all the messages sent to the producer.

su kafka -c "./kafka-console-consumer.sh --zookeeper localhost:2181 --from-beginning --topic CustomerOrder"
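
For a quick test, you can also ask the console consumer to read a fixed number of messages and then exit using the --max-messages option; the value 10 below is just an example.

su kafka -c "./kafka-console-consumer.sh --zookeeper localhost:2181 --from-beginning --topic CustomerOrder --max-messages 10"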

 

Thus, we are able to configure and run a sample pub-sub system using Kafka on BigInsights.

