About the job
About Reach Security:
At Reach Security (reach.security), we are pioneering the future of cybersecurity with our innovative self-driving technology. Our platform employs cutting-edge generative AI to enhance your organization's existing security stack, striving to achieve an unparalleled security posture using the tools you already have.
About the Role:
We are on the lookout for talented Data Platform Engineers at various experience levels to design, develop, and manage the infrastructure that drives our powerful data pipelines and analytic query engines. You will be instrumental in crafting scalable, high-performance solutions utilizing technologies such as Trino (Presto), Redshift, BigQuery, Apache Iceberg, and other advanced columnar technologies that facilitate sophisticated analytics and reporting.
The ideal candidate is a proactive problem solver focused on delivering high-quality solutions and adept at navigating complex challenges. As an early team member, you will take ownership of numerous backend components from day one. Your contributions will be vital in establishing engineering best practices, aligning engineering initiatives with business objectives, and discovering innovative techniques to provide exceptional value to our users. You will leverage your engineering expertise to create superior architectures, offer constructive feedback on technical designs, tackle complex problems, and conduct meticulous code reviews to ensure that our software remains maintainable and reliable.
Your Responsibilities:
- Design, build, and maintain a scalable and dependable data platform infrastructure.
- Optimize analytic query engines utilizing technologies like Trino (Presto), Redshift, and BigQuery.
- Develop and support robust data management solutions using Apache Iceberg.
- Collaborate with Data Engineering and Analytics teams to ensure seamless integration and to support schema detection and schema evolution.
- Create and maintain observability frameworks for monitoring and troubleshooting data pipelines and platform performance.
- Apply best practices for data modeling, schema design, and pipeline fan-out strategies.
- Guarantee data integrity, quality, and consistency across Medallion architectures, star schemas, and Lakehouse environments.
- Proactively identify opportunities to improve platform scalability, efficiency, and reliability.
Qualifications:
Candidates should have at least 3 years of experience in software engineering or a related field, with demonstrated proficiency in data platform technologies and practices.