About the job
Role Overview
Paytm seeks a Principal Engineer - Data DevOps for its Noida, Uttar Pradesh office. This role leads the development, management, and optimization of secure, large-scale big data platforms. The position calls for a blend of technical depth and leadership, driving best practices in cloud infrastructure, automation, CI/CD, and big data technologies. The Principal Engineer will drive cross-functional initiatives, mentor engineers, and ensure delivery of scalable data solutions.
Main Responsibilities
- Lead, mentor, and develop a high-performing Data DevOps team with a focus on technical excellence and accountability.
- Direct the architecture, design, and implementation of cloud and data infrastructures to meet scalability, performance, and security needs.
- Work closely with Data Engineering, Data Science, Analytics, and Product teams to deliver reliable and efficient data platforms.
- Manage and optimize AWS-based infrastructure, including VPC, EC2, S3, EMR, EKS, SageMaker, Lambda, CloudFront, CloudWatch, and IAM.
- Scale and oversee big data platforms using Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Looker, and Jupyter Notebooks.
- Establish and maintain CI/CD pipelines and infrastructure automation with Terraform, Ansible, and CloudFormation.
- Ensure observability, proactive incident management, and compliance with SLAs.
- Promote cloud security practices, including API security, TLS/HTTPS, and access control policies.
- Collaborate with stakeholders to set priorities, manage budgets, and optimize cloud and operational spending.
Required Qualifications
- At least 8 years of experience in DevOps, Data DevOps, or related fields, with a minimum of 4 years in a leadership role.
- Demonstrated success managing large-scale big data infrastructure and leading engineering teams.
- Hands-on expertise with AWS services and infrastructure automation tools such as Terraform, Ansible, and CloudFormation.
- Extensive experience with Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Jupyter Notebooks, and Looker.
- Proficiency with Kubernetes/EKS, Docker, ECS, and CI/CD tools.
- Strong understanding of networking, cloud security, and compliance requirements.
- Excellent communication, stakeholder management, and decision-making skills.
- Familiarity with SQL and data query optimization is considered a plus.