About the job
Key Responsibilities
- Design, test, deploy, orchestrate, monitor, and troubleshoot cloud-based data pipelines and automation workflows, adhering to best practices and security standards.
- Collaborate with data scientists, architects, ETL developers, and business stakeholders to capture, format, and integrate data from internal systems, external sources, and data warehouses.
- Investigate and assess batch and streaming data technologies to gauge their business impact and relevance to current use cases.
- Contribute to defining and continuously enhancing data engineering processes and methodologies.
- Ensure data integrity, accuracy, and security across all corporate data assets.
- Uphold high data quality standards for Data Services, Analytics, and Master Data Management.
- Create automated, scalable, and test-driven data pipelines.
- Apply software development practices, including Git-based version control, automated testing, and release management, to build and enhance CI/CD pipelines on AWS.
- Partner with DevOps engineers and architects to refine DataOps tools and frameworks.