Sunday, March 5, 2017

Configuring and Running Apache Kafka in IBM BigInsights

This blog describes how to configure and run Kafka in IBM BigInsights.

Apache Kafka is an open source messaging system based on a publish-subscribe model. Refer to: https://kafka.apache.org/

I assume that you are already familiar with terminology such as producer, consumer, Kafka broker, topic, and partition. Here, I will focus on creating multiple brokers in BigInsights, then creating a topic, publishing messages from the command line, and consuming them from the broker.


Environment: BigInsights 4.2

Step 1: Create Kafka Brokers from Ambari

By default, Ambari configures one Kafka broker. Depending on your use case, you may need to create multiple brokers.

Log in to the Ambari UI --> click on Hosts and add the Kafka Broker component to each node where you need to install a broker.


You can then see multiple brokers running in the Kafka UI.
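
If you prefer to verify from the command line as well, you can check which broker IDs have registered in ZooKeeper using the zookeeper-shell utility that ships with Kafka. A quick check, assuming ZooKeeper is listening on the default port 2181:

cd /usr/iop/4.2.0.0/kafka/bin

su kafka -c "./zookeeper-shell.sh localhost:2181 ls /brokers/ids"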
Step 2: Create a Topic

Log in to one of the nodes where a broker is running, then create a topic.

cd /usr/iop/4.2.0.0/kafka/bin

su kafka -c "./kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 2 --partitions 1 --topic CustomerOrder"
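
To confirm that the topic was created, you can list all topics known to the cluster. A quick check, using the same ZooKeeper connection string:

su kafka -c "./kafka-topics.sh --list --zookeeper localhost:2181"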

You can get the details of the topic using the describe command below.

su kafka -c "./kafka-topics.sh --describe --zookeeper localhost:2181 --topic CustomerOrder"
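
If you later need more partitions for this topic, the same script can alter it (the partition count can only be increased, not decreased). A sketch, assuming we want to grow the topic to 2 partitions:

su kafka -c "./kafka-topics.sh --alter --zookeeper localhost:2181 --topic CustomerOrder --partitions 2"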

Step 3: Start the Producer

In the --broker-list argument, pass all the brokers that are running.

su kafka -c "./kafka-console-producer.sh --broker-list bi1.test.com:6667,bi2.test.com:6667 --topic CustomerOrder"

When you run the above command, it waits for user input. You can enter a sample message:

{"ID":99, "CUSTOMID":234,"ADDRESS":"12,5-7,westmead", "ORDERID":99, "ITEM":"iphone6", "COST":980}

Step 4: Start the Consumer

Open another Linux terminal and start the consumer. It will display all the messages sent to the producer.

su kafka -c "./kafka-console-consumer.sh --zookeeper localhost:2181 --from-beginning --topic CustomerOrder"
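
If you only want to verify the first few records and have the consumer exit instead of running continuously, you can add --max-messages. A sketch, assuming the same topic:

su kafka -c "./kafka-console-consumer.sh --zookeeper localhost:2181 --from-beginning --topic CustomerOrder --max-messages 10"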

 

Thus, we are able to configure and run a sample pub-sub system using Kafka.

