Delivery Hero is a company operating in 40+ countries, where several entities run their own systems and storage around the world. Building new global microservices with real-time data processing becomes complicated in such an environment. Therefore, we built a global data pipeline called “Data Fridge”, which provides data normalization and data validation to downstream consumers.
Our data pipeline is fully serverless and provides three different ways to consume data: HTTP push, HTTP pull, and SQL batch for big-data processing.
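As a rough illustration of the normalization and validation steps mentioned above, a single pipeline stage might look like the following Python sketch. All field names and rules here (entity_id, country, amount_cents) are illustrative assumptions, not the actual Data Fridge schema:

```python
from dataclasses import dataclass

# Hypothetical fields a normalized event might require; the real
# Data Fridge schema is not shown in this abstract.
REQUIRED_FIELDS = {"entity_id", "country", "amount_cents"}


@dataclass
class NormalizedEvent:
    entity_id: str
    country: str       # normalized to an upper-case country code
    amount_cents: int


def normalize_and_validate(raw: dict) -> NormalizedEvent:
    """Reject events with missing fields; normalize the rest."""
    missing = REQUIRED_FIELDS - raw.keys()
    if missing:
        raise ValueError(f"invalid event, missing fields: {sorted(missing)}")
    return NormalizedEvent(
        entity_id=str(raw["entity_id"]),
        country=str(raw["country"]).strip().upper(),
        amount_cents=int(raw["amount_cents"]),
    )


event = normalize_and_validate(
    {"entity_id": "42", "country": " de ", "amount_cents": "1999"}
)
```

A stage like this lets every downstream consumer, regardless of whether it reads via HTTP push, HTTP pull, or SQL batch, rely on the same validated shape.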
In this session, I will cover in detail how we built this data pipeline using several AWS and GCP services. I will describe why we chose these services and the challenges and limitations we have faced so far.
Vicente Valls Rios is currently leading the streaming data pipeline at Delivery Hero. After finishing his degree in Telecommunications Engineering, he moved to Berlin 9 years ago. During this time he has gained extensive experience as a frontend, backend, and DevOps engineer and as an architect. Since 2018, his team has developed a service ingesting several million events per day, which are consumed in real time by 15+ teams within the company.