Senior Lead Data Engineer jobs in Chennai – Browse 563 openings on RoboApply Jobs

Senior Lead Data Engineer jobs in Chennai

Open roles matching “Senior Lead Data Engineer” with location signals for Chennai. 563 active listings on RoboApply Jobs.

1 - 20 of 563 jobs
Brillio

Full-time | On-site | Chennai, Tamil Nadu, India

Join Our Team as a Senior Lead Data Engineer

Are you passionate about data and technology? At Brillio, we are seeking a highly skilled Senior Lead Data Engineer to guide our data engineering initiatives. You will play a pivotal role in driving data strategy, architecture, and implementation across our projects, ensuring that we leverage data to deliver exceptional solutions for our clients.

Jan 6, 2026
Brillio

Full-time | On-site | Chennai, Tamil Nadu, India

Lead Data Engineer

Join our innovative team at Brillio as a Lead Data Engineer, where you will spearhead data engineering initiatives and lead a talented group of engineers. Your expertise will guide our projects, ensuring the development of robust data solutions that drive business decisions. Collaborate with AI and data engineering experts, and leverage cutting-edge technologies to transform data into actionable insights.

Jan 6, 2026
Lead Data Engineer

Forbes Advisor

Full-time | On-site | Chennai

Role overview

Forbes Advisor seeks a Lead Data Engineer based in Chennai. The position centers on designing and building data infrastructure to support business strategy and deliver insights. Maintaining data integrity and ensuring that information is available to those who need it are key priorities.

Main responsibilities

- Direct the architecture and implementation of data systems
- Convert raw data into insights that inform business decisions
- Collaborate with teams across the company to understand and meet their data needs
- Create solutions that enable better decision-making through data

Collaboration

This role works closely with cross-functional teams. Gathering input, clarifying requirements, and delivering tailored data solutions are part of daily work to address real business challenges.

Apr 27, 2026
Brillio

Full-time | On-site | Chennai, Tamil Nadu, India

Lead Data Engineer

As a Lead Data Engineer at Brillio, you will play a pivotal role in designing, implementing, and managing robust data pipelines and architectures that drive our data-driven decision-making processes. You will lead a team of talented engineers and collaborate with cross-functional teams to deliver innovative data solutions that empower our clients and enhance their operational efficiencies.

Jan 6, 2026
Infotel India

Full-time | On-site | Chennai, Tamil Nadu, India

Join Infotel India as a seasoned ETL, Python, and Visualization Lead in our vibrant team. In this pivotal role, you'll spearhead the design and execution of robust ETL processes, craft innovative Python solutions, and develop insightful data visualizations to drive strategic business decisions. As a leader, you will work in synergy with stakeholders, data engineers, and analysts to establish efficient data workflows that yield impactful insights. This is a remarkable opportunity for a technical leader to significantly influence our data initiatives and enhance client outcomes.

Key Responsibilities

- Design, develop, and maintain scalable ETL pipelines.
- Manage and optimize large datasets to guarantee data quality and integrity.
- Create data processing solutions utilizing Python.
- Develop interactive dashboards and reports leveraging data visualization tools such as Power BI, Tableau, or equivalents.
- Mentor and guide junior team members.
- Collaborate with cross-functional teams to foster data-driven decision-making.
- Enhance data workflows and boost system performance.

Apr 9, 2026
Minderacraft

Full-time | On-site | Chennai, Tamil Nadu, India

Join the dynamic team at Minderacraft as a Team Lead Data Engineer. We are looking for a motivated and talented individual who possesses extensive experience in AWS cloud services, Databricks, Apache Spark, Python, and SQL. Your role will be pivotal in designing, developing, and optimizing data pipelines and analytical solutions that drive key business initiatives.

Primary Responsibilities

- Architect and maintain robust data pipelines and transformation workflows utilizing Databricks and Spark.
- Develop and enhance ETL/ELT processes to efficiently manage large datasets.
- Leverage AWS services such as S3, Lambda, IAM, ECS, and CloudWatch to meet data architecture and operational requirements.
- Collaborate with data analysts, scientists, and business stakeholders to gather requirements and convert them into effective technical solutions.
- Ensure the integrity, reliability, and performance of all data systems.
- Advocate for and implement best practices in coding, version control, continuous integration/deployment, and environment management.
- Monitor, troubleshoot, and ensure the high availability of data pipelines.
- Contribute to architectural design decisions, documentation, and process enhancements.

Qualifications

- Proven experience with AWS Cloud (including S3, Lambda, IAM, EC2 or similar).
- In-depth knowledge of Databricks, Delta Lake, and Unity Catalog.
- Strong proficiency in Apache Spark (preferably PySpark).
- Excellent programming skills in Python.
- Advanced SQL skills, with experience in performance tuning and managing large datasets.
- Adept at thriving in fast-paced, agile environments.
- Strong analytical and problem-solving capabilities, with a proactive mindset to drive improvements.
- Exceptional communication and stakeholder management skills, capable of engaging with diverse teams.
- Familiarity with data governance tools and frameworks (e.g., DataHub, Soda).
- Experience with CI/CD tools like GitHub Actions.

Availability: Must accommodate meetings and calls in US Pacific Time (PT).

Personal Attributes

- Self-motivated, accountable, and proactive.
- Strong ownership mentality with the ability to work independently.
- Adept at engaging with both technical and non-technical stakeholders.
- Passionate about building reliable, scalable, and high-quality data systems.

Jan 2, 2026
Brillio

Full-time | On-site | Chennai, Tamil Nadu, India

Senior Data Science Lead

Join our dynamic team at Brillio as a Senior Data Science Lead in Chennai! In this pivotal role, you will spearhead innovative data science initiatives, leveraging advanced analytics and machine learning techniques to drive business outcomes. You will collaborate with cross-functional teams to develop cutting-edge solutions that harness the power of data.

Jun 5, 2025
Full-time | On-site | Chennai, Tamil Nadu, India

Senior Data Science Lead

As a Senior Data Science Lead, you will spearhead innovative data solutions and guide our talented team of data scientists. Your expertise will be crucial in leveraging data analytics to drive strategic decision-making and enhance business outcomes. You will collaborate with cross-functional teams, employing advanced data models and machine learning techniques to solve complex problems and deliver actionable insights.

Jun 5, 2025
Brillio

Full-time | On-site | Chennai, Tamil Nadu, India

About the Role

Join our innovative team at Brillio as a Lead Data Engineer specializing in Microsoft Fabric. We are looking for a seasoned professional with over 7 years of experience in data engineering. In this pivotal role, you will spearhead the design, development, and deployment of data pipelines, optimizing data movement and integration within the Microsoft Fabric ecosystem.

Key Responsibilities

- Data Pipeline Development: Create, develop, and implement data pipelines using Microsoft Fabric, incorporating OneLake, Data Factory, and Apache Spark for efficient, scalable, and secure data operations.
- ETL Architecture: Design and execute ETL workflows tailored for Fabric’s integrated data platform to enhance data ingestion, transformation, and storage processes.
- Data Integration: Develop and maintain solutions that consolidate both structured and unstructured data sources into Fabric’s OneLake environment, utilizing SQL, Python, Scala, and R for advanced data handling.
- Fabric OneLake & Synapse: Utilize OneLake as the central data repository to facilitate enterprise-level analytics, seamlessly working with Synapse Data Warehousing for comprehensive big data processing and reporting.
- Cross-functional Collaboration: Collaborate with Data Scientists, Analysts, and BI Engineers to ensure that Fabric’s data infrastructure effectively supports Power BI, AI workloads, and advanced analytics.
- Performance Optimization: Oversee, troubleshoot, and enhance Fabric pipelines to ensure high availability, rapid query performance, and minimized downtime.
- Data Governance & Security: Enforce governance and compliance frameworks within Fabric, ensuring data lineage, privacy, and security across the unified platform.
- Leadership & Mentorship: Lead and mentor a talented team of engineers, overseeing Fabric workspace design, code reviews, and the implementation of new Fabric features.
- Automation & Monitoring: Automate workflows using Fabric Data Factory, Azure DevOps, and Airflow to ensure operational efficiency.
- Documentation & Standards: Thoroughly document Fabric pipeline architecture, data models, and ETL processes while contributing to engineering best practices and enterprise guidelines.
- Innovation: Stay abreast of Fabric’s evolving functionalities (such as Real-Time Analytics and AI integration) and foster a culture of innovation within the team.

Jan 6, 2026
ValGenesis

Full-time | On-site | Chennai

About ValGenesis

ValGenesis stands at the forefront of digital validation solutions tailored for life sciences. Our innovative platform is embraced by 30 of the top 50 pharmaceutical and biotech companies globally, enabling them to drive digital transformation, ensure total compliance, and achieve manufacturing excellence throughout their product lifecycle. Discover more about the exceptional work environment at ValGenesis, recognized as the industry standard for paperless validation in Life Sciences: valgenesis.com/about

Nov 19, 2025
Brillio

Full-time | On-site | Chennai, Tamil Nadu, India

Join Our Team as a Senior Data Science Lead!

Are you ready to take your data science expertise to the next level? At Brillio, we are seeking a passionate Senior Data Science Lead to guide our innovative projects in AI & Data Engineering. You will play a pivotal role in leveraging advanced analytical techniques to drive business solutions. If you're eager to make an impact and lead a talented team, we want to hear from you!

Jun 5, 2025
ValGenesis

Full-time | On-site | Chennai

ValGenesis builds digital validation platforms for life sciences organizations. Its products support pharmaceutical and biotech companies as they move toward digital processes, maintain regulatory compliance, and ensure manufacturing quality throughout the product lifecycle. Thirty of the top fifty global firms in this industry use ValGenesis solutions. More details about the company's work in paperless validation are available at valgenesis.com/about.

Role overview

The Senior Software Engineer - Data Engineering position is based in Chennai. This role focuses on developing and maintaining data engineering solutions to support ValGenesis platforms. Work will center on building systems that help life sciences clients manage and analyze data for compliance and quality throughout their operations.

Apr 24, 2026
minderacraft

Full-time | On-site | Chennai, Tamil Nadu, India

Join our dynamic team at minderacraft as a Senior Data Engineer, where your expertise will be pivotal in shaping our data infrastructure. We are seeking a highly skilled individual with a deep understanding of big data technologies, ETL/ELT processes, and data modeling methodologies. Your primary focus will be to design, optimize, and maintain robust data pipelines, ensuring the integrity of our data and supporting our analytics initiatives.

Jan 13, 2026
Mindera

Full-time | On-site | Chennai, Tamil Nadu, India

Join our dynamic Data Team at Mindera, where as a Senior Data Engineer, you will play a pivotal role in creating the data pipelines and tables that drive our business-critical dashboards, empower self-service analytics, and support advanced machine learning models and real-time data products. Utilizing state-of-the-art tools such as DBT, Spark, and Airflow, you will convert high-volume raw event data into user-friendly, impactful datasets.

You will collaborate cross-functionally with Machine Learning Engineers, Data Scientists, and BI Developers to facilitate data-driven decision-making throughout the organization. Our engineers benefit from a culture of autonomy, innovation, and continuous learning, supported by structured career progression paths and access to training resources.

As a Senior Data Engineer, your responsibilities will include:

- Designing and constructing scalable data pipelines, models, and feature stores to support analytics and machine learning workloads.
- Deploying and managing cloud-native data applications on AWS, leveraging CI/CD pipelines to automate builds, tests, and releases.
- Ensuring the technical quality, performance, and reliability of production-grade data pipelines through robust observability and engineering best practices.

Mar 17, 2026
ValGenesis

Full-time | On-site | Chennai

About ValGenesis

ValGenesis stands at the forefront of digital validation solutions for the life sciences sector. Our comprehensive suite of products empowers 30 of the top 50 global pharmaceutical and biotech firms to embrace digital transformation, ensure total compliance, and achieve excellence in manufacturing intelligence across their product lifecycle.

Discover the opportunity to be part of ValGenesis, the leading standard for paperless validation in the Life Sciences industry: valgenesis.com/about

About the Role

As a Senior Software Engineer in Data Engineering, you will play a pivotal role in developing scalable data solutions that support our innovative products.

Nov 19, 2025
Full-time | On-site | Chennai, Tamil Nadu, India

We are on the lookout for an exceptional Senior Data Engineer to architect, construct, and sustain scalable data pipelines for our enterprise-level data platforms focused on the Risk & Compliance sector. The successful candidate will possess a deep understanding of PySpark, Python, and data engineering best practices, emphasizing data quality, governance, and security.

Key Responsibilities

- Design, develop, and enhance scalable data pipelines leveraging PySpark and Python
- Create robust ETL/ELT workflows to manage substantial volumes of both structured and unstructured data
- Collaborate with data scientists, analysts, and business stakeholders to produce high-quality datasets
- Guarantee data integrity, accuracy, and reliability through comprehensive validation frameworks and monitoring
- Implement data security and access control mechanisms that align with compliance standards
- Partner closely with Risk & Compliance teams to fulfill regulatory and reporting obligations
- Optimize the performance of data processing jobs and queries
- Maintain and upgrade existing data architecture and pipelines

Required Skills & Experience

- 6+ years of experience in Data Engineering
- Extensive hands-on experience with PySpark and Python
- Solid background in SQL and Oracle databases
- Proven experience in constructing and managing large-scale data pipelines
- Strong understanding of data warehousing concepts and ETL frameworks
- Experience with data validation, data quality, and governance frameworks
- Familiarity with cloud platforms (AWS/Azure/GCP) is a plus
- Experience in the banking, financial services, or risk & compliance domain is preferred

Key Competencies

- Strong analytical and problem-solving skills
- Adept at working in a fast-paced, collaborative environment
- Excellent communication and stakeholder management abilities
- Meticulous attention to detail, especially regarding data quality and security

Nice to Have

- Experience with Big Data ecosystems (Hadoop, Spark)
- Knowledge of data security and regulatory compliance frameworks
- Prior experience with enterprise data platforms

Apr 6, 2026
Full-time | On-site | Chennai, Tamil Nadu, India

We are seeking a talented Senior Data Engineer with extensive knowledge of real-time data streaming and distributed data processing to architect, develop, and enhance state-of-the-art data platforms. This pivotal role is essential for advancing event-driven architecture and real-time analytics within critical banking systems, particularly in risk and compliance domains.

In this position, you will work synergistically with data architects, platform engineers, and business stakeholders to create low-latency, high-throughput data pipelines that empower sophisticated analytics and informed decision-making.

Key Responsibilities

- Design, develop, and maintain robust real-time streaming pipelines utilizing Apache Kafka, PySpark, and Flink
- Construct scalable and fault-tolerant event-driven data architectures
- Handle high-volume streaming data ensuring low latency and high reliability
- Integrate diverse data sources into centralized data platforms (Data Lake / Lakehouse)
- Enhance data pipelines for performance, scalability, and cost-effectiveness
- Uphold data quality, governance, and compliance in line with banking regulations
- Collaborate with cross-functional teams to convert business needs into technical solutions
- Monitor and debug streaming jobs and production pipelines

Required Skills & Experience

- 5+ years of experience in Data Engineering
- Demonstrated proficiency in PySpark / Spark Streaming, Apache Kafka (Producers, Consumers, Kafka Streams), and Apache Flink or other real-time processing frameworks
- Proven experience in building real-time / near real-time data pipelines
- Strong understanding of distributed systems and event-driven architecture
- Proficiency in Python / Java / Scala
- Experience with data lakes, ETL/ELT pipelines, and big data ecosystems
- Familiarity with cloud platforms (AWS / Azure / GCP) is advantageous
- Knowledge of banking, risk, or compliance data systems is highly preferred

Preferred Qualifications

- Experience in the financial services or banking domain
- Exposure to data governance, regulatory reporting, or compliance systems
- Understanding of CI/CD pipelines and DevOps practices for data platforms

Apr 6, 2026
Gen Digital

Full-time | On-site | IND - Chennai

About Gen Digital

Gen Digital is a global company focused on digital freedom and security. Our brands include Norton, Avast, LifeLock, and MoneyLion, serving nearly 500 million users in over 150 countries. We provide cybersecurity, online privacy, identity protection, and financial wellness products. Our mission centers on helping people manage and secure their digital and financial lives. We value diverse experiences and ideas, and we see AI as a partner for innovation. Gen Digital encourages autonomy, supports career growth, and offers flexible work options, generous time off, competitive pay, and wellness programs. The company culture emphasizes customer satisfaction, open discussion, experimentation, and continuous learning. Team members collaborate in an environment that respects and values differences as strengths.

Senior Staff Data Engineer – Role Overview

The Senior Staff Data Engineer will serve as a senior technical leader within the organization. This role focuses on designing and implementing large-scale data solutions that support Gen Digital’s cybersecurity platform strategy. The position combines deep technical skill with organizational influence. Key responsibilities include:

- Designing complex data architectures for enterprise-scale needs
- Implementing solutions that support a multi-petabyte data infrastructure
- Mentoring and guiding engineering teams
- Shaping the technical vision for data systems serving millions of users

Location: Chennai, India

Apr 14, 2026
Gen Digital Inc.

Full-time | On-site | IND - Chennai

About Gen Digital Inc.

Gen Digital Inc. brings together trusted consumer brands like Norton, Avast, LifeLock, and MoneyLion, serving nearly 500 million users in over 150 countries. The company’s mission centers on digital freedom, cybersecurity, online privacy, identity protection, and financial wellness. Gen’s legacy is built on helping people secure and manage their digital and financial lives. Employees at Gen benefit from flexible work options, comprehensive support, and resources to help them succeed. The company values open communication, experimentation, and continuous learning, and actively welcomes diverse backgrounds and perspectives. Gen offers competitive pay, benefits, and wellness programs to support work-life balance.

Role Overview: Senior Data Platform Engineer

Location: Chennai, India

The Senior Data Platform Engineer joins the Data Platform Operations team, focusing on the daily management and monitoring of Gen’s data platforms and pipelines. This role is key to ensuring smooth operations, ongoing improvements, and reliable maintenance of data infrastructure.

What You Will Do

- Oversee daily operations and health of data platforms and pipelines
- Drive enhancements for operational efficiency and platform maintenance
- Work closely with data engineers, analysts, and platform teams
- Maintain platform stability, observability, and cost-effectiveness
- Support readiness for new data use cases and business needs

What Gen Values

- Customer focus and a collaborative approach
- Openness to new ideas and continuous improvement
- Respect for diverse experiences and backgrounds
- Supportive teamwork and recognition of individual strengths

If the mission and values at Gen resonate, consider exploring a career with the team in Chennai.

Apr 15, 2026
BigID

Full-time | On-site | Chennai

About Us

BigID is a pioneering tech startup specializing in cutting-edge solutions for data security, compliance, privacy, and AI data management. We are at the forefront of the data landscape, empowering our customers to mitigate risks, foster business innovation, achieve compliance, build trust, make informed decisions, and maximize the value of their data.

We are committed to building a global team united by a passion for innovation and advanced technology. BigID has received numerous accolades, including:

- Named a Hot Company in Artificial Intelligence and Machine Learning at the Global InfoSec Awards
- Listed in Citizens JMP Cyber 66 as one of the Hottest Privately Held Cybersecurity Companies
- Recognized on the CRN 100 list as one of the 20 Coolest Identity Access Management and Data Protection Companies for three consecutive years
- Ranked among the DUNS 100 Best Tech Companies to Work For
- Featured as a Top 3 Big Data and AI Vendor to Watch in the 2023 BigDATAwire Readers' and Editors' Choice Awards
- Included in the 2024 Inc. 5000 list for the fourth consecutive year
- Shortlisted for the 2024 AI Awards in the Best Use of AI in Cybersecurity category

At BigID, our team is the cornerstone of our success. Join our dynamic, people-centric culture where you’ll have the opportunity to collaborate with some of the most talented professionals in the industry who prioritize innovation, diversity, integrity, and teamwork.

Who We Are Looking For

We are seeking a Senior Data Platform Engineer to strengthen our Data Platform team. The ideal candidate will possess substantial experience in data engineering, particularly with Kafka and Elasticsearch, to design and maintain our robust data platforms. You will collaborate closely with cross-functional teams to ensure the scalability and reliability of our data solutions.

Role Overview

As a Senior Data Platform Engineer, you will be instrumental in the design, development, maintenance, troubleshooting, and implementation of our big data architecture. Your proficiency in Elastic, Kafka, and Node.js will play a vital role in ensuring the scalability and performance of our data systems.

Key Responsibilities

- Develop data processing pipelines utilizing Kafka for real-time data streaming.
- Enhance and manage search functionalities leveraging Elastic technologies.
- Work alongside product managers, data analysts, and stakeholders to gather requirements and translate them into technical specifications.
- Lead code reviews and promote best practices in coding and data handling.

Feb 12, 2026
