Druid Summit – Bengaluru, India
September 6-7, 2022

Today, digital organizations need real-time insights delivered through internal and external applications. This requires developers like you to build apps that bring together streaming data and interactive analytics to push real-time visibility to the next level. But with hundreds of databases out there, how do you know what the right architecture is?

Do you find yourself asking: How can I ingest millions of events per second? How can I analyze billions of rows interactively in seconds? How can I scale to thousands of concurrent end users without a steep expense?

Apache Druid provides the flexibility, efficiency, and resilience needed to support your analytics applications at scale, for streaming and batch data.

Join us on September 6-7 at Druid Summit, an insightful event where you will learn from data professionals and industry experts about building analytics architectures to power your applications.

During this event you will:

  • Learn what makes a modern analytics application
  • See how organizations use analytics apps to prevent loss, improve customer experience, and even drive new revenue
  • Get practical actions you can take, no matter your company size or budget
  • Get hands-on experience with Druid through our Apache Druid Basics Training Accreditation Program

Speakers

Sachin Govind / Engineering Manager (Tech lead) / Yellow.ai

Sachin started out at yellow.ai, a chatbot provider, when the young startup was struggling to keep up with marketing insights on chatbot interactions. The requirement was an analytics solution providing filters and aggregations on billions of rows of data. His team built a query language (similar to Metabase) and a BI explorer on top of Druid to provide rich visualizations and alerts on the fly. Sachin has been using Druid for about five years and has built many projects on top of it.


Karan Kumar / Software Engineer / Imply

Karan has been working on data teams for more than 10 years, contributing to various OSS projects such as Falcon, Pig, Kafka, and Presto. These days, he is actively working on core Druid development at Imply.


Prathiba Murugesan / Senior Engineering Manager / Lowe's

Prathiba leads an engineering team for the Data Visualization and Analytics Platform within the Data, Analytics & Computational Intelligence division of Lowe's.


Jayasankar Vijay Narayanan / Senior Solutions Engineer / Imply

Vijay is a Senior Solutions Engineer with Imply and has been working with customers on big data and analytics for over 15 years. Before Imply, Vijay was with Cloudera and Informatica. At present, he works with customers in the APAC region, helping them create modern analytics applications using Druid.


Asit Parija / CTO / Whiteklay

A big data technology enthusiast with over 10 years of experience in designing, implementing, and managing highly scalable real-time systems. Asit has worked as a technical solution architect on end-to-end big data projects, with varied technology experience across Hadoop, Druid, HBase, Kafka, and graph technologies. He is also the chief architect for IZAC, an open-source data exchange product built on Kafka and Flink.


Tijo Thomas / Senior Solutions Architect / Imply

Tijo Thomas is a Senior Solutions Architect at Imply and an experienced data engineer. He has over 18 years of experience in software development, mostly in big data and streaming technologies, and has spent the last couple of years helping customers set up their stream-processing infrastructures using Apache Druid. During this time, he has collected best practices, patterns, and anti-patterns applied in production environments.


Agenda

Day 1 – September 6 – Main Sessions

09:15 Registration
09:45 Opening
10:00 Keynote: Become the Next Analytics Hero

We are in an era where off-the-shelf BI products can’t provide everything you need. Enter the developer–the next analytics hero–who will build the analytics applications their organizations need, using Apache Druid as their database foundation.

In this session we will discuss what it takes to build modern analytics applications.

10:45 Analyzing Billions of Transactions, Every Single Day

Paytm, the largest fintech in India, serves 300+ million customers and 21+ million merchants and has processed 7.4+ billion transactions. In this talk, Manoj Kumar, Senior VP and Head of Tech & Data Platforms at Paytm, will share how Paytm leverages Druid to solve its data challenges.

11:15 Refreshment break
11:30 Apache Druid Data Modeling Best Practices

Druid is one of the fastest analytical databases and is capable of handling real-time and streaming data. Its architecture enables it to respond to analytical queries with sub-second latencies. However, these capabilities can be enhanced further by following best-practice guidelines, and in many cases the data can be modeled so that the expected query latency, concurrency, and the like can be achieved.
In this presentation we talk about data modeling best practices and how data modeling helps handle concurrency, latency, high cardinality, and more (see the roll-up sketch after this session listing).

Tijo Thomas, Senior Solutions Architect, Imply
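
As context for the data modeling discussion above, here is a minimal sketch of a native batch ingestion spec with roll-up enabled; pre-aggregating rows by dimension values and truncated timestamp is one of the main modeling levers for latency and concurrency. The datasource, directory, and column names are hypothetical, and exact tuning options vary by Druid version and workload.

```python
import json

# Hypothetical ingestion spec illustrating roll-up; names and paths are placeholders.
ingestion_spec = {
    "type": "index_parallel",
    "spec": {
        "ioConfig": {
            "type": "index_parallel",
            "inputSource": {"type": "local", "baseDir": "/data", "filter": "events-*.json"},
            "inputFormat": {"type": "json"},
        },
        "dataSchema": {
            "dataSource": "chat_events",  # hypothetical datasource
            "timestampSpec": {"column": "ts", "format": "iso"},
            "dimensionsSpec": {"dimensions": ["bot_id", "channel", "intent"]},
            # Roll-up pre-aggregates rows that share the same dimension values and
            # truncated timestamp, trading per-event detail for far fewer rows.
            "metricsSpec": [
                {"type": "count", "name": "events"},
                {"type": "doubleSum", "name": "latency_ms_sum", "fieldName": "latency_ms"},
            ],
            "granularitySpec": {
                "segmentGranularity": "day",
                "queryGranularity": "hour",
                "rollup": True,
            },
        },
        "tuningConfig": {"type": "index_parallel", "partitionsSpec": {"type": "dynamic"}},
    },
}

print(json.dumps(ingestion_spec, indent=2))
```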

 

12:00 How Druid Enables yellow.ai to Provide Analytics on 10B+ Rows in 10ms

yellow.ai, a chatbot provider, was a young startup struggling to keep up with marketing insights on chatbot interactions. The requirement was an analytics solution providing filters and aggregations on billions of rows. The team built a query language (similar to Metabase) and a BI explorer on top of Druid, providing rich visualizations and alerts on the fly. yellow.ai has been using Druid for about 5 years and has built many projects on top of it (a small query sketch follows this session listing).

Sachin Govind, Engineering Manager (Tech lead), Yellow.ai
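
To illustrate the kind of request a BI explorer built on Druid might issue, here is a minimal sketch that posts a filtered aggregation to Druid's SQL endpoint. It assumes a local router on port 8888 and the hypothetical rolled-up chat_events datasource sketched earlier; adjust the names and URL for a real deployment.

```python
import requests

DRUID_SQL_URL = "http://localhost:8888/druid/v2/sql"  # assumes a local Druid router

# Hypothetical datasource and columns; sums the "events" roll-up metric per bot and intent.
sql = """
SELECT bot_id,
       intent,
       SUM("events") AS interactions
FROM chat_events
WHERE __time >= CURRENT_TIMESTAMP - INTERVAL '7' DAY
  AND channel = 'whatsapp'
GROUP BY bot_id, intent
ORDER BY interactions DESC
LIMIT 20
"""

resp = requests.post(DRUID_SQL_URL, json={"query": sql}, timeout=30)
resp.raise_for_status()
for row in resp.json():  # the SQL endpoint returns a JSON array of result rows
    print(row)
```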

12:30 Lunch break
14:00 Adding Horsepower to Real-time Analytics – Going Beyond Hadoop

Hadoop architectures are falling short of the requirements of the real-time analytics world. A whole new world of real-time data warehousing and real-time analytics is being built on Kafka. It is important to understand how to architect a stream-first design and why time-series databases like Druid play an important role in addressing the world of "Data Mesh" (a streaming-ingestion sketch follows this session listing).

Asit Parija, CTO, Whiteklay
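
As a concrete illustration of a stream-first setup, here is a minimal sketch of a Kafka ingestion supervisor that has Druid consume a topic directly, with no Hadoop batch layer in between. The broker address, topic, datasource, and columns are hypothetical; the spec is submitted to the Overlord, which the router proxies on port 8888 in a default setup.

```python
import requests

# Hypothetical Kafka supervisor spec: Druid tails the "orders" topic and rolls it up per minute.
supervisor_spec = {
    "type": "kafka",
    "spec": {
        "ioConfig": {
            "type": "kafka",
            "topic": "orders",
            "consumerProperties": {"bootstrap.servers": "kafka-01:9092"},
            "inputFormat": {"type": "json"},
            "taskCount": 2,
            "useEarliestOffset": False,
        },
        "dataSchema": {
            "dataSource": "orders",
            "timestampSpec": {"column": "order_time", "format": "iso"},
            "dimensionsSpec": {"dimensions": ["city", "category", "payment_method"]},
            "metricsSpec": [
                {"type": "count", "name": "orders"},
                {"type": "doubleSum", "name": "amount_sum", "fieldName": "amount"},
            ],
            "granularitySpec": {
                "segmentGranularity": "hour",
                "queryGranularity": "minute",
                "rollup": True,
            },
        },
        "tuningConfig": {"type": "kafka"},
    },
}

resp = requests.post(
    "http://localhost:8888/druid/indexer/v1/supervisor",
    json=supervisor_spec,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # returns the supervisor id on success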

14:30 Analyzing Data at Scale with Druid in Lowe’s Tech

Lowe's is a Fortune® 50 company and the world's second-largest home improvement retailer. We operate roughly 2,200 home improvement and hardware stores in the U.S. and Canada, spread across 200 million square feet of retail selling space, and serve approximately 20 million customers each week.

Within Lowe's Data, Analytics & Computational Intelligence (DACI) organization, the Discover, Visualize & Analyze (DVA) teams build and manage data platforms providing Data Governance, Data Cataloging, Data Discovery, Data Quality, Data Modeling, Data Movement, Data Warehousing, Data Query, Business Intelligence, and Master Data Management.

In this talk we will cover why Lowe's chose Druid as a platform to analyze data at scale and our strategy of dedicated clusters for each business portfolio. Furthermore, we will cover how Druid's versatility and speed allow us to offer interactive analytics for unique business use cases.

Prathiba Murugesan, Senior Engineering Manager, Lowe’s

15:00 Refreshment break
15:15 Multi-Stage Query Architecture Overview

This session will deep-dive into the architecture of the new multi-stage query engine that Druid is getting (a short SQL-based ingestion sketch follows this session listing).

Karan Kumar, Software Engineer, Imply
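
To make the multi-stage query (MSQ) engine concrete, here is a minimal sketch of SQL-based ingestion submitted as an asynchronous MSQ task; the statement is planned into stages and executed across worker tasks. The file URL, table, and columns are hypothetical, and the /druid/v2/sql/task endpoint assumes a Druid release that ships the MSQ engine (around Druid 24.0).

```python
import requests

# Hypothetical MSQ ingestion: read an external JSON file and write it into a Druid table.
sql = """
INSERT INTO pageviews
SELECT
  TIME_PARSE("timestamp") AS __time,
  "user",
  page,
  country
FROM TABLE(
  EXTERN(
    '{"type": "http", "uris": ["https://example.com/pageviews.json.gz"]}',
    '{"type": "json"}',
    '[{"name": "timestamp", "type": "string"},
      {"name": "user", "type": "string"},
      {"name": "page", "type": "string"},
      {"name": "country", "type": "string"}]'
  )
)
PARTITIONED BY DAY
"""

resp = requests.post(
    "http://localhost:8888/druid/v2/sql/task",  # MSQ task endpoint via the router
    json={"query": sql},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # contains the taskId to poll for ingestion status
```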

15:45 Next-level Druid: Awesome New Features in the Works

Project Shapeshift is our strategic initiative that reimagines the Druid experience in a cloud-first, developer-centric world. In this session, get a look at new features available now and in the near future.

16:20 Closing

Day 2 – September 7 – Apache Druid Basics Training

09:15 Registration
09:30 Imply Accreditation Program – Apache Druid Basics Training

A hands-on, instructor-led introductory course where you will learn about Druid's fully scalable database architecture, plus how to ingest, roll up, and query data.

Besides some amazing new knowledge and skills, you will also get a certificate of completion to add to your LinkedIn (and other social media profiles) once you complete the training.

Register

Location