About Us:
Zora is a revolutionary on-chain social network that empowers individuals to express themselves freely, connect with a diverse community, and unlock the potential of their creativity. We are committed to utilizing cryptocurrency to create a more equitable ecosystem for creators, fundamentally reimagining the power of the internet.
Our goal is to make Zora accessible to everyone through an engaging, user-friendly, and rewarding consumer application. Our mission is to ensure that creating online is both free and valuable.
About The Role:
We are seeking a talented Lead Data Engineer to build the data infrastructure that powers our Trade Platform. As our on-chain social network grows, we need robust data systems that can process billions of trading events, calculate real-time earnings, optimize liquidity, and surface the best trading opportunities for our users. You will work at the intersection of blockchain data, high-volume trading systems, and consumer social products, establishing the data foundation that supports our creator economy.
This is a high-impact position with full ownership of the trading data architecture, from ingesting blockchain events and user activity to delivering real-time analytics to millions of users.
What You'll Do:
- Design and build scalable data pipelines for ingesting, processing, and transforming blockchain data, trading events, user activities, and market signals at high volumes with low latency.
- Architect and maintain data infrastructure that powers real-time trading analytics, P&L calculations, leaderboards, market cap tracking, and liquidity monitoring across the platform.
- Manage ETL/ELT processes that transform raw on-chain data from multiple blockchains into clean, reliable datasets used by product, engineering, analytics, and ML teams.
- Create and optimize data models and schemas that support both operational systems (providing live trading data) and analytical use cases (analyzing market dynamics and user behavior).
- Establish data quality frameworks that include monitoring, alerting, testing, and validation to ensure pipeline reliability and data accuracy at scale.
- Collaborate with backend engineers to design event schemas, data contracts, and APIs that facilitate real-time data flows between systems.
- Partner with product and analytics teams to understand data requirements and translate them into effective engineering solutions.

