About the job
About Pantheon
The Pantheon WebOps Platform is at the forefront of the open web, hosting over 300,000 sites in the cloud for clients such as Google, Princeton, Salesloft, and Doctors Without Borders. Every day, thousands of developers and marketers create, iterate on, and scale WordPress and Drupal sites to reach billions of people globally. Pantheon’s multi-tenant, container-based platform lets organizations manage all their websites from a unified dashboard. Leading organizations like Clorox and the United Nations leverage Pantheon’s collaborative workflows to accelerate development and enable real-time publishing.
The Role
Join our Data Platform team, which powers Pantheon’s data infrastructure and analytics capabilities. We are committed to delivering reliable, scalable, and high-performance data solutions that drive business intelligence and operational excellence across the organization. We are looking for a passionate and skilled Software Engineer II to become an integral part of our team and help shape the future of data at Pantheon.
What You Will Do
- Develop scalable and reliable data systems while upholding service level objectives for critical business pipelines.
- Collaborate with internal teams such as Product, Sales Operations, and Finance to create impactful data solutions.
- Promote self-service tooling and foster a data-driven culture across Pantheon’s teams.
- Contribute to the technical strategy and operational excellence of Pantheon’s data platform.
- Stay current with industry trends and advancements in data engineering, analytics, and modern data platforms.
- Enhance our engineering standards by implementing best practices for data architecture, testing, pipeline reliability, and documentation.
What You Need to Succeed
- Customer/Product Focus: A strong understanding of the business value of your work, ensuring data solutions align with company goals and deliver significant impact.
- Understanding of Distributed Data Systems: Experience processing large-scale datasets across distributed systems and reasoning about the trade-offs involved in designing for high throughput and low latency.
- Data Modeling and Architecture: The ability to design, implement, and optimize scalable data models for both data warehouse and transactional systems.

