With the amount of data generated in the modern world, data must be processed in real time and also stored long term for analytical insight. A data point has real-time value that must be extracted immediately; any delay in processing results in loss of value. The same data point also needs to be processed and stored for long-term analytics. The challenge is to deal with the volume and velocity of data, with variety as a possible third dimension.

Apache Kafka and Apache Storm are two technologies that provide a framework for dealing with big and fast data. Apache Kafka provides a distributed publish/subscribe messaging system, and Apache Storm provides a distributed real-time computation system. The talk will focus on integrating the two frameworks to handle big and fast data.
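To make the integration pattern concrete, here is a toy sketch of how the two pieces fit together: a Kafka topic decouples producers from consumers, and a Storm topology pulls from the topic via a spout and streams tuples through a chain of bolts. This is a minimal in-memory illustration only; the `Topic`, `spout`, `parse_bolt`, and `threshold_bolt` names are stand-ins invented for this sketch, not the real Kafka or Storm APIs (in practice one would use a Kafka consumer inside Storm's `KafkaSpout`).

```python
from collections import deque

# Stand-in for a Kafka topic: an ordered log that decouples the
# producers (publishers) from the consumers (subscribers).
class Topic:
    def __init__(self):
        self.log = deque()

    def publish(self, message):
        self.log.append(message)

    def poll(self):
        return self.log.popleft() if self.log else None

# Stand-ins for Storm components: a "spout" reads from the topic,
# and "bolts" transform each tuple as it streams through.
def spout(topic):
    while (msg := topic.poll()) is not None:
        yield msg

def parse_bolt(stream):
    for raw in stream:                 # e.g. "sensor-1,42"
        sensor, value = raw.split(",")
        yield sensor, int(value)

def threshold_bolt(stream, limit=40):
    for sensor, value in stream:       # emit only anomalous readings
        if value > limit:
            yield sensor, value

# Producers publish raw events to the topic...
topic = Topic()
for raw in ["sensor-1,42", "sensor-2,17", "sensor-3,55"]:
    topic.publish(raw)

# ...and the "topology" consumes them in real time.
alerts = list(threshold_bolt(parse_bolt(spout(topic))))
print(alerts)  # [('sensor-1', 42), ('sensor-3', 55)]
```

The key design point the sketch mirrors is the decoupling: producers only know about the topic, and the processing topology only pulls from it, so either side can scale or fail independently, which is what the Kafka/Storm combination provides at cluster scale.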


Lalit Bhatt is an IT professional with 16+ years of experience. He holds a Master's degree from IIT Delhi and has wide exposure to various standards, open source frameworks, and technologies. He has extensive experience in building products from conception to delivery in startups as well as large organizations.

Handling Big and Fast Data With Kafka and Storm