Case Study 2: Real-time CO2 Level Data Streaming & Analytics with Apache Kafka & Apache Druid

The customer is a leading supplier of industrial measurement instruments in India, offering a wide range of measuring tools and testing devices for various industries.

This project involved collecting and analysing various types of real-time events from professional-grade CO2 detectors installed at multiple locations in a hospital. These detectors measure the air quality of rooms served by an HVAC system and provide an accurate reading of the air pollutant CO2.

Based on these readings, appropriate measures could be taken to improve the air quality.

Challenge

  • Measure and analyse the real-time level of carbon dioxide in the air of each room of the hospital.
  • The events generated in real time are consolidated and sent to multiple adapters over HTTP/HTTPS. This requires immediate, seamless communication between the event-processing system and a real-time analytics database that delivers sub-second queries on streaming CO2 and air-quality data.
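To make the adapter step concrete, below is a minimal sketch of an HTTP adapter that accepts posted CO2 readings. The endpoint, port, and field names (`sensor_id`, `room`, `co2_ppm`, `timestamp`) are illustrative assumptions, not the customer's actual schema:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_reading(raw: bytes) -> dict:
    """Validate one CO2 reading posted by a detector (hypothetical schema)."""
    reading = json.loads(raw)
    for field in ("sensor_id", "room", "co2_ppm", "timestamp"):
        if field not in reading:
            raise ValueError(f"missing field: {field}")
    return reading

class AdapterHandler(BaseHTTPRequestHandler):
    """Minimal HTTP adapter: accepts readings for downstream consolidation."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        try:
            parse_reading(self.rfile.read(length))
        except (ValueError, json.JSONDecodeError):
            self.send_response(400)
            self.end_headers()
            return
        # In the real pipeline, the reading would be forwarded to Kafka here.
        self.send_response(202)
        self.end_headers()

# To run the adapter (blocking):
# HTTPServer(("0.0.0.0", 8080), AdapterHandler).serve_forever()
```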

Solution

  • We integrated Apache Kafka with third-party systems through their APIs to collect the real-time events continuously generated and consolidated by the CO2 transmitters and CO2 sensors installed in various rooms of the hospital.
  • We developed a producer to fetch the data from the CO2 transmitters and CO2 sensors and continuously publish it to the multi-cluster brokers.
  • We developed the Kafka supervisor specifications that Apache Druid requires for streaming data consumption.
  • We configured Druid with Kafka's brokers to continuously fetch the published events for analysis with Druid's SQL query engine.
  • We configured Apache Druid with HDFS as deep storage, so events are retained for future analysis.
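A Kafka supervisor specification of the kind mentioned above might look like this. The datasource, topic, field names, and granularities are illustrative assumptions, not the project's actual spec:

```json
{
  "type": "kafka",
  "spec": {
    "ioConfig": {
      "type": "kafka",
      "consumerProperties": { "bootstrap.servers": "localhost:9092" },
      "topic": "co2-readings",
      "inputFormat": { "type": "json" },
      "useEarliestOffset": true
    },
    "dataSchema": {
      "dataSource": "co2-readings",
      "timestampSpec": { "column": "timestamp", "format": "millis" },
      "dimensionsSpec": {
        "dimensions": [
          "sensor_id",
          "room",
          { "type": "long", "name": "co2_ppm" }
        ]
      },
      "granularitySpec": {
        "segmentGranularity": "hour",
        "queryGranularity": "none",
        "rollup": false
      }
    },
    "tuningConfig": { "type": "kafka" }
  }
}
```

Submitting this spec to the Overlord's `/druid/indexer/v1/supervisor` endpoint starts supervised indexing tasks that consume the topic continuously.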
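For the HDFS deep-storage step, the relevant Druid settings live in `common.runtime.properties`. The storage directory below is an illustrative path:

```properties
# Load the HDFS deep-storage and Kafka indexing extensions
druid.extensions.loadList=["druid-hdfs-storage", "druid-kafka-indexing-service"]

# Persist finished segments to HDFS for future analysis
druid.storage.type=hdfs
druid.storage.storageDirectory=/druid/segments
```

Druid also needs the Hadoop configuration files (`core-site.xml`, `hdfs-site.xml`) on its classpath so it can reach the HDFS cluster.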
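The producer step above can be sketched as follows. This assumes the kafka-python client and an illustrative topic name and event schema; any Kafka client library follows the same publish pattern:

```python
import json
import time
import uuid

def co2_event(sensor_id: str, room: str, ppm: float) -> dict:
    """One reading as fetched from a CO2 transmitter (illustrative schema)."""
    return {
        "event_id": str(uuid.uuid4()),
        "sensor_id": sensor_id,
        "room": room,
        "co2_ppm": ppm,
        "timestamp": int(time.time() * 1000),  # epoch millis, Druid-friendly
    }

def publish(events, topic="co2-readings", bootstrap="localhost:9092"):
    """Publish readings to the Kafka brokers, keyed by sensor for ordering."""
    # kafka-python is assumed; imported lazily so the event helper stands alone.
    from kafka import KafkaProducer
    producer = KafkaProducer(
        bootstrap_servers=bootstrap,
        key_serializer=str.encode,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    for event in events:
        producer.send(topic, key=event["sensor_id"], value=event)
    producer.flush()
```

Keying each record by `sensor_id` keeps every sensor's readings in order within one partition while still spreading load across the brokers.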

Outcome

  • Kafka installation and configuration
  • Data/event consumption from the adapters and ingestion into the Kafka brokers
  • Apache Druid installation, configuration, and integration with the Kafka brokers
  • Kafka supervisor specifications developed for Apache Druid
  • Querying and analysing all real-time data/events for decision making
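The querying step can be illustrated with Druid's SQL-over-HTTP API. The endpoint shown is the Router's default; the datasource and column names match the illustrative schema assumed above, not the customer's actual one:

```python
import json
from urllib import request

# The Router's default SQL endpoint; host and port are deployment-specific.
DRUID_SQL_URL = "http://localhost:8888/druid/v2/sql"

def sql_payload(room: str, minutes: int = 5) -> bytes:
    """Request body for Druid's SQL API: average CO2 per room, recent window."""
    return json.dumps({
        "query": (
            'SELECT room, AVG("co2_ppm") AS avg_ppm '
            'FROM "co2-readings" '
            "WHERE room = ? "
            f"AND __time > CURRENT_TIMESTAMP - INTERVAL '{minutes}' MINUTE "
            "GROUP BY room"
        ),
        # Parameterized to avoid interpolating user input into the SQL string
        "parameters": [{"type": "VARCHAR", "value": room}],
    }).encode("utf-8")

def avg_co2(room: str) -> list:
    """POST the query; Druid returns one JSON row per group."""
    req = request.Request(
        DRUID_SQL_URL,
        data=sql_payload(room),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Because Druid indexes the streaming data as it arrives, a query like this returns in sub-second time, which is what lets staff react to rising CO2 levels room by room.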