About the job
Teamwork makes the stream work.
Roku is revolutionizing the way the world experiences television.
As the leading TV streaming platform in the U.S., Canada, and Mexico, Roku aims to power every television globally. With a pioneering spirit, we connect users to their favorite content, support content creators in growing their audiences, and provide advertisers with innovative ways to engage viewers.
From your first day at Roku, you'll be an integral part of our journey. As a fast-growing public company, everyone here plays a crucial role. Join us in delighting millions of TV streamers worldwide while gaining invaluable experience across various disciplines.
About the Team
The Roku Data Engineering team is dedicated to building a state-of-the-art big data platform that empowers both internal and external stakeholders to leverage data for business growth. Our team collaborates closely with business partners and engineering teams to gather metrics on essential initiatives for success. As a Senior Data Engineer focused on device metrics, you will design data models and develop scalable data pipelines to capture critical business metrics across our diverse range of Roku devices.
About the Role
At Roku, we connect users to the streaming content they love and enable content publishers to monetize large audiences while providing advertisers with unique capabilities to engage consumers. Our Roku streaming players and Roku TV™ models are available worldwide through direct retail sales and partnerships with TV brands and pay-TV operators. With millions of devices sold across numerous countries, thousands of streaming channels, and billions of hours of content consumed, building a scalable, highly available, fault-tolerant big data platform is crucial to our success. This role is based in Bengaluru, India, and follows a hybrid schedule with three days per week in the office.
What You'll Be Doing
- Develop and maintain highly scalable, fault-tolerant distributed data processing systems (both batch and streaming) handling terabytes of data ingested daily and managing a petabyte-sized data warehouse.
- Design and implement efficient data models and pipelines that support business growth and decision-making.