About the job
About Atria
Atria Health is a pioneering membership-based preventive healthcare practice that provides state-of-the-art primary and specialty care across New York, South Florida, Los Angeles (anticipated in 2026), and globally via 24/7 telemedicine services.
Our dedicated team of over 60 leading physicians specializes in delivering proactive, preventive, and precision care tailored to the unique needs of Atria members and their families. We are committed to enhancing both lifespan and healthspan through comprehensive screenings, innovative therapeutics, and customized interventions aimed at disease prevention and early detection.
Each member's healthcare journey is guided by a dedicated Chief Medical Officer collaborating with specialists in various fields, including cardiology, neurology, pediatrics, women's health, endocrinology, integrative health, performance and movement, nutrition, and more.
Through the nonprofit Atria Research Institute and Public Health Institute, we are also focused on expediting the application of medical breakthroughs and broadening access to preventive care for a greater number of individuals.
Atria Health is looking for a Senior Software Engineer to join our Data Engineering team. This role will be instrumental in advancing our data infrastructure, pipelines, and analytics platforms. You will design and build scalable data systems, mentor fellow engineers, and collaborate with cross-functional product teams (Clinical Experience, Member Experience, and Care Delivery) to deliver reliable, high-quality data that informs insights and decision-making. Your work will be critical in supporting Atria's long-term research goals and empowering our physicians to surface novel clinical insights. You will own complex data initiatives from conception through execution while establishing best practices that strengthen the entire engineering organization.
Technology Stack
- Languages: Python, SQL
- Data Processing: Dagster, dbt
- Infrastructure: Google Cloud Platform, Terraform, Kubernetes, Docker
- Data Storage: Snowflake, PostgreSQL, MySQL
- Streaming: Pub/Sub, Kafka
- CI/CD: GitHub Actions
- Monitoring & Observability: Datadog, Grafana, OpenTelemetry
Key Responsibilities
Data Infrastructure & Pipelines
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows.
- Help own and enhance our Snowflake data warehouse architecture.
- Implement data modeling best practices to support analytics and reporting requirements.
- Lead initiatives to enhance data quality, including validation, testing, and monitoring frameworks.
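To make the data-quality responsibility above concrete, here is a minimal, dependency-free sketch of a validation step a pipeline might run before loading records into the warehouse. The record shape and field names are hypothetical, chosen only for illustration; in practice this kind of check would typically live in a Dagster asset check or a dbt test.

```python
# Hypothetical required fields for an inbound record; not from the posting.
REQUIRED_FIELDS = ("member_id", "visit_type", "visit_date")

def validate_records(records):
    """Split records into (valid, rejected) based on required fields.

    Rejected records would normally be routed to a quarantine table
    and surfaced in monitoring rather than silently dropped.
    """
    valid, rejected = [], []
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if rec.get(f) in (None, "")]
        (rejected if missing else valid).append(rec)
    return valid, rejected
```

A validation gate like this is the kind of building block a testing and monitoring framework composes: each check is a pure function over a batch, so it is easy to unit-test and to report on.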
Cross-Team Collaboration
- Collaborate with product engineering teams to create data contracts and integrate data capture into applications.
- Work closely with Analytics and Business Intelligence teams to understand reporting needs and deliver reliable data solutions.

