With the amount of data generated in the modern world, there is a need both to process data in real time and to store it long term for analytics insight. Whenever a data point is generated, it has a real-time value that must be extracted immediately; any delay in processing results in loss of value. The same data point also needs to be processed and stored for long-term analytics. The challenge is to deal with the volume and velocity of data, with variety as a possible third dimension.
Apache Kafka and Apache Storm are two technologies that provide frameworks for dealing with big and fast data. Apache Kafka provides a distributed publish/subscribe messaging system, and Apache Storm provides a distributed real-time computation system. The talk will focus on the integration of the two frameworks to handle big and fast data.