About the job
Own the technical implementation of data pipelines and migrations, designing and developing applications tailored to business stakeholder needs while ensuring optimal use of the platform.
Engage directly with stakeholders to gather essential requirements, and oversee automated end-to-end data engineering solutions.
Develop and implement data pipelines that facilitate the ingestion, transformation, and enhancement of structured, unstructured, and real-time data, while adhering to best practices for pipeline operations.
Identify, design, and implement internal process improvements, including automating manual tasks, optimizing data delivery, and redesigning infrastructure for greater scalability.
Diagnose and resolve data quality issues raised by pipeline alerts or downstream consumers.
Apply data governance best practices, drawing on thorough functional and technical impact analysis.
Offer insights and recommendations for technical solutions and enhancements to existing data systems.
Draft and maintain clear documentation on data models/schemas, including transformation and validation rules.
Build pipeline-backed tools that enable data consumers to extract, analyze, and visualize data more efficiently.
Lead the entire software lifecycle, including hands-on development, code reviews, testing, deployment, and documentation for batch ETL processes.
Collaborate with internal product and technical teams to ensure seamless integration of our technology infrastructure, while migrating existing data applications and pipelines to the Cloud (AWS) using PaaS technologies.