
Search for Lead Data Engineer - LATAM (Python/PySpark/AWS Glue)

59,044 results

wizdaa
Full-time|Remote

We are seeking a top-tier Data Engineer to join our team at wizdaa. If you are a developer who excels in:
- Leading your team with technical expertise
- Resolving complex challenges that others find difficult
- Delivering intricate features at an accelerated pace
- Creating exceptionally clean and maintainable code
- Enhancing our codebase with pride and diligence
Your skills and experience will help us drive efficiency and innovation in data processing.

Key Responsibilities:
- Develop, enhance, and scale data pipelines and infrastructure utilizing Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
- Design, implement, and monitor data ingestion and transformation workflows, ensuring optimal performance and reliability.
- Work collaboratively with platform and AI/ML teams to automate data workflows and develop a comprehensive feature store.
- Integrate health metrics into engineering dashboards for enhanced visibility and operational insight.
- Model data and execute scalable transformations in Snowflake and PostgreSQL.
- Create reusable frameworks and connectors to streamline internal data processes.

Sep 8, 2025
pridelogic
Full-time|Remote

Are you an exceptional Data Engineer with a flair for problem-solving and a passion for optimizing data processes? At pridelogic, we are on the lookout for a technical powerhouse to join our innovative team. If you pride yourself on being the technical leader who consistently delivers complex features ahead of schedule, and you write code that stands as an example for others, we want to hear from you!

This position is designed for those who know they are extraordinary in their field. We seek developers with a proven track record of success in data engineering.

Your Responsibilities:
- Develop, optimize, and scale data pipelines and infrastructure utilizing technologies such as Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
- Design, implement, and monitor data ingestion and transformation workflows, including DAGs, alerting systems, retries, SLAs, lineage, and cost management.
- Collaborate with platform and AI/ML teams to automate ingestion, validation, and real-time compute workflows, aiming towards a feature store.
- Enhance engineering dashboards with pipeline health metrics and observability features for comprehensive insight.
- Model data and execute efficient, scalable transformations in Snowflake and PostgreSQL.
- Create reusable frameworks and connectors to standardize internal data publishing and consumption processes.
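Several of these postings name the same pipeline-reliability mechanisms: retries, alerting, and SLAs. As a rough, stdlib-only illustration of what those terms mean in practice (the function names, limits, and the flaky task below are hypothetical, not taken from any posting or from Airflow's actual API):

```python
import time

def run_with_retries(task, retries=3, delay_s=0.01, sla_s=5.0, alert=print):
    """Run `task`, retrying on failure; alert when retries are exhausted
    or the SLA (total runtime budget) is missed. Illustrative sketch only."""
    start = time.monotonic()
    for attempt in range(1, retries + 1):
        try:
            result = task()
            if time.monotonic() - start > sla_s:
                alert(f"SLA missed after {attempt} attempt(s)")
            return result
        except Exception as exc:
            if attempt == retries:
                alert(f"task failed after {retries} attempts: {exc}")
                raise
            time.sleep(delay_s)  # back off before the next attempt

# A flaky task that fails twice, then succeeds:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

print(run_with_retries(flaky))  # → ok
```

In a real orchestrator such as Airflow these concerns are configured declaratively on tasks rather than hand-rolled; the sketch just makes the vocabulary concrete.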

Sep 8, 2025
wizdaa
Full-time|Remote

We are in search of a highly skilled and innovative Data Engineer to join our dynamic team. As a pivotal technical leader, you will:
- Be the go-to expert in your team, guiding projects with your technical acumen.
- Conquer complex challenges that others find daunting.
- Deliver intricate features at an unparalleled pace.
- Produce exceptionally clean and maintainable code.
- Enhance the quality of our entire codebase.
If you're an exceptional developer with a proven track record, we want to hear from you! This role requires a unique blend of skills and experience, designed for the best in the field.

Responsibilities:
- Develop, optimize, and scale data pipelines and infrastructure utilizing technologies such as Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
- Design, operationalize, and oversee ingestion and transformation workflows, including DAGs, alerting, retries, SLAs, lineage, and cost controls.
- Partner with platform and AI/ML teams to automate ingestion, validation, and real-time compute workflows, contributing towards a feature store.
- Integrate pipeline health and metrics into engineering dashboards for enhanced visibility and observability.
- Model data and execute efficient, scalable transformations using Snowflake and PostgreSQL.
- Create reusable frameworks and connectors to standardize internal data publishing and consumption.

Sep 8, 2025
pridelogic
Full-time|Remote

We are on the lookout for an exceptional Data Engineer, a technical leader who thrives on challenges and excels in coding. If you are the person who:
- Acts as the definitive technical authority within your team
- Solves complex technical problems with ease
- Delivers intricate features at a remarkable speed
- Writes code that exemplifies best practices and clarity
- Is dedicated to enhancing the overall quality of the codebase
then we want to hear from you! We are not looking for just anyone; we want developers who are confident in their skills and have proven their excellence.

What you will be responsible for:
- Designing, optimizing, and expanding data pipelines and infrastructure leveraging Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
- Creating, operationalizing, and monitoring data ingestion and transformation workflows, including DAGs, alerting mechanisms, retries, SLAs, lineage, and cost management.
- Partnering with platform and AI/ML teams to streamline ingestion, validation, and real-time compute workflows; contributing towards the development of a feature store.
- Incorporating pipeline health metrics into engineering dashboards to ensure complete visibility and observability.
- Modeling data and executing efficient, scalable transformations within Snowflake and PostgreSQL.
- Establishing reusable frameworks and connectors to standardize internal data publishing and consumption.

Sep 8, 2025
Tiger Analytics
Full-time|Remote — Dallas, Texas, United States

Tiger Analytics is a rapidly expanding advanced analytics consulting firm that specializes in delivering exceptional insights through Data Science, Machine Learning, and Artificial Intelligence. Our team possesses profound expertise, making us a trusted analytics partner for numerous Fortune 500 companies, empowering them to derive substantial business value from their data. Our leadership and contribution to the analytics field have been recognized by prominent market research firms such as Forrester and Gartner. We are on the lookout for outstanding talent to enhance our global analytics consulting team.

As a Lead Data Engineer, you will play a pivotal role in architecting, constructing, and sustaining scalable data pipelines within the AWS cloud ecosystem. You will collaborate with diverse teams to facilitate data analytics, machine learning, and business intelligence projects. The ideal candidate will bring extensive experience with AWS services, Databricks, and Apache Airflow.

Apr 28, 2025
qodeworld
Full-time|Hybrid|Pennsylvania, United States

About the Role
qodeworld is hiring a Senior Data Engineer with deep experience in Informatica and PySpark. This position focuses on building and optimizing data solutions for large-scale analytics. The role is based in either Cleveland, OH or Pittsburgh, PA and requires onsite work three days per week.

Requirements
- 8–10 years of experience in data engineering and data analysis.
- Hands-on expertise with Informatica PowerCenter and Informatica Data Quality (IDQ) for ETL design, development, and optimization.
- Advanced proficiency with PySpark for processing, transforming, and analyzing large datasets.
- Strong knowledge of Hadoop technologies, including HDFS, Hive, Sqoop, and MapReduce.
- Solid programming skills in Python and Kafka to build both streaming and batch data pipelines.
- Thorough understanding of database concepts, data modeling, data design, and ETL workflows.
- Experience across all phases of the ETL lifecycle: data extraction, ingestion, quality checks, normalization, and loading.
- Comfort working in Agile environments, using tools such as Jira to support project delivery.
- Proven background in client-facing roles, with strong communication and leadership skills to manage the software development lifecycle.

Preferred Skills
- Familiarity with AWS data components and analytics.
- Understanding of machine learning models and AI concepts.
- Experience with data modeling tools like Erwin.

Qualifications
- Bachelor’s or Master’s degree in Computer Science or a related discipline.
- Strong problem-solving skills and a collaborative approach to cross-functional teamwork.

Apr 20, 2026
qodeworld
Full-time|Hybrid|Pennsylvania, United States

Role Overview
qodeworld is hiring a Senior Data Engineer with deep experience in Informatica and PySpark. This role is based in Pennsylvania and requires working onsite three days each week.

What You Will Do
- Apply 8–10 years of hands-on experience in data engineering and data analysis.
- Design, develop, and optimize ETL processes using Informatica PowerCenter and Informatica Data Quality (IDQ).
- Build and maintain large-scale data processing and analytics solutions with advanced PySpark skills.
- Work with Hadoop technologies, including HDFS, Hive, Sqoop, and MapReduce.
- Develop streaming and batch data pipelines using Python and Kafka.
- Use strong knowledge of database concepts, data modeling, and ETL workflows to support data architecture and design.
- Manage the full ETL lifecycle: data extraction, ingestion, quality checks, normalization, and loading.
- Contribute to Agile projects, using Jira for tracking and delivery.
- Engage directly with clients, coordinating across the software development lifecycle and communicating clearly with stakeholders.

Preferred Qualifications
- Experience with AWS data services and analytics tools.
- Familiarity with machine learning models and AI concepts.
- Knowledge of data modeling tools such as Erwin.
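The ETL-lifecycle stages this posting names (extraction, ingestion, quality checks, normalization, loading) can be sketched end to end in stdlib Python. Every function name and record below is an illustrative assumption, not qodeworld's actual stack:

```python
def extract():
    """Pretend source extract: raw records as dicts (illustrative data)."""
    return [{"name": " Alice ", "age": "34"}, {"name": "Bob", "age": ""}]

def quality_check(records):
    """Drop records failing a basic completeness check (non-empty age)."""
    return [r for r in records if r["age"].strip()]

def normalize(records):
    """Trim strings and cast types into the target schema."""
    return [{"name": r["name"].strip(), "age": int(r["age"])} for r in records]

def load(records, target):
    """Append normalized records to an in-memory 'table'; return row count."""
    target.extend(records)
    return len(records)

table = []
loaded = load(normalize(quality_check(extract())), table)
print(loaded, table)  # → 1 [{'name': 'Alice', 'age': 34}]
```

Real pipelines would swap each stage for a PySpark job or an Informatica mapping, but the staged, composable shape is the same.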

Apr 20, 2026
dev2
Full-time|Hybrid|Albany

We are seeking a highly skilled Manager of Data Engineering to lead our data engineering projects at dev2. This is a unique opportunity to work in a hybrid environment where you can collaborate with a talented team while having the flexibility to work remotely. You will be responsible for designing, building, and maintaining our data architecture using technologies such as Python, AWS, Airflow, and Snowflake. Your leadership will guide the successful execution of data strategies and ensure the availability of high-quality data for analysis and decision-making.

Dec 11, 2023
dev2
Full-time|Hybrid|New Haven

dev2 is seeking an experienced and innovative Data Engineering Manager to lead our data engineering team in a hybrid work environment. In this role, you will be responsible for overseeing the design, development, and implementation of scalable data pipelines and architecture using technologies such as Python, AWS, Airflow, and Snowflake. Your leadership will drive the data strategy and empower the team to enhance data availability and quality.

Dec 11, 2023
dev2
Full-time|Hybrid|Longview

Join dev2 as a Manager of Data Engineering, where you will lead a talented team in designing and implementing data solutions using cutting-edge technologies such as Python, AWS, Airflow, and Snowflake. This hybrid role allows you to enjoy a flexible work environment while collaborating closely with stakeholders to drive data initiatives that enhance business operations. Your expertise will help shape data strategies that promote efficiency and innovation within our organization. If you're passionate about data engineering and leadership, we want to hear from you!

Dec 11, 2023
dev2
Full-time|Hybrid|Portland

Join dev2 as a Data Engineering Manager, where you will lead a dynamic team in harnessing the power of data to drive strategic decisions and innovations. In this hybrid role, you will be responsible for overseeing data engineering projects, ensuring the implementation of best practices in data management and cloud solutions using technologies such as Python, AWS, Airflow, and Snowflake. Your leadership will guide the team in developing scalable data pipelines, optimizing data architecture, and enhancing data quality. Collaborate with cross-functional teams to align data solutions with business objectives, while mentoring junior engineers and fostering a culture of continuous learning.

Dec 11, 2023
dev2
Full-time|Hybrid|Poughkeepsie

We are seeking a talented and experienced Data Engineering Manager to lead our data engineering team at dev2. In this hybrid role, you will be responsible for overseeing the development and implementation of robust data pipelines and architectures using Python, AWS, Airflow, and Snowflake. Your expertise will help drive data-driven decision-making across the organization.

Dec 11, 2023
dev2
Full-time|Hybrid|Concord

Join dev2 as a Manager of Data Engineering, where you will lead a talented team in designing and implementing data solutions leveraging Python, AWS, Airflow, and Snowflake. This hybrid role offers the flexibility of working remotely while collaborating with your team on-site. In this dynamic position, you will be responsible for guiding the data engineering team through complex data challenges, ensuring optimal performance, and driving innovation in our data architecture. Your leadership will be crucial in fostering a collaborative environment that encourages creativity and efficiency.

Dec 11, 2023
qodeworld
Full-time|Remote — United States

Job Overview
Join our dynamic team at qodeworld as an AWS Data Engineer. In this pivotal role, you will architect, develop, and sustain scalable data pipelines utilizing AWS technologies. Collaborate with technical analysts, client stakeholders, data scientists, and various team members to ensure the accuracy and integrity of data while optimizing storage solutions for both performance and cost-effectiveness. Your expertise with AWS native technologies and Databricks will be essential for efficient data transformations and processing.

Key Responsibilities
• Lead and support projects aimed at modernizing our data platform.
• Design and implement robust, scalable data pipelines leveraging AWS native services.
• Optimize ETL workflows to enhance data transformation efficiency.
• Transition workflows from on-premise systems to AWS cloud, maintaining data quality and consistency.
• Create automations and integrations to address data inconsistencies and quality concerns.
• Conduct system testing and validation to ensure seamless integration and functionality.
• Establish security and compliance measures within the cloud environment.
• Validate data quality before and after migration, addressing issues related to completeness, consistency, and accuracy.
• Collaborate with data architects and lead developers to document manual data movement and design automation strategies.

Qualifications
• A minimum of 10 years of experience in data engineering with a strong focus on AWS native technologies (AWS Glue, Python, Snowflake, S3, Redshift).
• Proven ability to design and implement data pipelines and ETL processes.
• Strong analytical skills and attention to detail for data quality assurance.
• Experience in cloud migration strategies and automation of data workflows.
• Excellent collaboration skills to work with cross-functional teams and stakeholders.
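The "validate data quality before and after migration" duty can be made concrete with a small stdlib-only sketch that compares row counts (completeness) and order-insensitive content checksums (consistency) between a source and target table. The table data and function names are illustrative assumptions, not from the posting:

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive checksum over a table's rows: hash each row,
    then XOR the digests so row order does not affect the result."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(tuple(row)).encode()).digest()
        acc ^= int.from_bytes(digest[:8], "big")
    return acc

def validate_migration(source_rows, target_rows):
    """Return completeness/consistency checks between a pre-migration
    source and a post-migration target."""
    return {
        "completeness": len(source_rows) == len(target_rows),
        "consistency": table_fingerprint(source_rows) == table_fingerprint(target_rows),
    }

source = [(1, "alice"), (2, "bob"), (3, "carol")]
target = [(2, "bob"), (1, "alice"), (3, "carol")]  # same rows, different order
print(validate_migration(source, target))  # → {'completeness': True, 'consistency': True}
```

At warehouse scale the same idea is typically expressed as COUNT and HASH_AGG-style aggregate queries run on both sides of the migration rather than in-process hashing.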

Jan 30, 2026
Full-time|On-site|Suitland-Silver Hill, Maryland, United States

Position Overview
As the Lead Data Engineer, you will spearhead the design and development of advanced, scalable data architectures to facilitate the transition of outdated, file-based analytical systems to modern AWS Cloud Native environments. This pivotal role emphasizes transforming legacy SAS-based data storage models—including flat files, batch outputs, and subsystem-specific data artifacts—into structured, governed, and scalable data frameworks optimized for cloud-native processing.

You will ensure data integrity, performance, and visibility across a comprehensive modernization initiative while providing technical leadership in data modeling, ingestion patterns, validation frameworks, and transparency reporting. Expert-level proficiency in Python and substantial experience in architecting AWS-based data solutions are essential for this role.

Feb 25, 2026
dev2
Full-time|Hybrid|Philadelphia

Join our innovative team at dev2 as a Data Engineering Manager where you'll lead initiatives in data architecture and engineering. In this hybrid role, you will harness your expertise in Python, AWS, Airflow, and Snowflake to drive strategic projects and enhance our data infrastructure. Your leadership will guide a talented team as we tackle complex data challenges and deliver high-quality solutions that empower our business decisions.

Dec 11, 2023
AscentAI
Full-time|Remote (U.S. Based)

Lead Python Engineer - Data Infrastructure

About AscentAI
AscentAI is at the forefront of developing intelligent software solutions tailored for risk and compliance teams within financial institutions. Our innovative platform simplifies complex regulatory information into actionable insights, empowering teams to mitigate risks, enhance operational efficiency, and proactively adapt to changes in global regulations. As a vibrant, mission-driven organization, we are pushing the limits of machine learning and artificial intelligence, combined with human-in-the-loop systems, to tackle some of the most challenging issues in regulatory compliance.

The Role
We are seeking a skilled Python Engineer to join our dynamic team. In this pivotal role, you will lead the design and development of robust, large-scale web scraping platforms that underpin AscentAI's data infrastructure. You will work collaboratively with fellow engineers and analysts to define data requirements, architect efficient data pipelines, and ensure the delivery of reliable, high-quality data at scale. Your expertise will also be critical in advising on scraping strategies, counteracting anti-bot measures, and implementing best practices in data extraction for cross-functional stakeholders in engineering, data science, and product development. This is a significant role that offers ownership and visibility, providing an opportunity to influence our technical architecture and overall business success.

What You’ll Do
- Lead the design and development of large-scale web scraping platforms using Python and related frameworks.
- Mentor junior developers, providing technical guidance and conducting code reviews to ensure high-quality and maintainable code.
- Devise advanced strategies to navigate and overcome sophisticated anti-bot defenses such as CAPTCHAs, Cloudflare, and IP blocking, while adhering to legal and ethical standards and website terms of service.
- Collaborate with data analysts and engineers to establish data requirements and facilitate seamless data integration into databases.
- Optimize scrapers for performance, speed, and stability; set up real-time monitoring and alert systems to quickly respond to failures or changes in target sites.
- Create comprehensive technical documentation and engage effectively with cross-functional teams to ensure alignment and manage expectations.

Mar 10, 2026
Captivation Software
Full-time|$130K/yr - $270K/yr|Hybrid|Columbia, MD

Build something to be proud of.

Captivation Software has established a strong reputation for delivering timely, tailored solutions to our clients. Our dedicated team of engineers takes pride in their work, continuously innovating to provide the best possible outcomes. We are seeking talented software developers who are passionate about making a difference in our mission to protect our country.

Position Overview
As a Cloud Platform Engineer at Captivation Software, you will leverage your expertise in cloud technology (AWS), Python programming, and Linux systems to develop custom software components and integrate open-source solutions. Your work will focus on tackling complex time series analysis challenges using cutting-edge Big Data and Cloud technologies. The ideal candidate will work independently to identify and resolve issues while aligning with the strategic goals set by our architectural team. This position offers a hybrid work environment and the opportunity to collaborate with a dynamic team.

May 19, 2025
qodeworld
Full-time|Remote — United States

Job Overview
Join our team as an AWS Data Engineer, where you will be instrumental in designing, developing, and maintaining robust data pipelines on the AWS platform. Collaborating with technical analysts, client stakeholders, and data scientists, you will ensure the highest standards of data quality and integrity while optimizing storage solutions for efficiency and cost-effectiveness. Your expertise in AWS native technologies and Databricks will play a key role in transforming and processing data at scale.

Key Responsibilities
• Drive the modernization of data platforms through effective project leadership and support.
• Architect and build scalable data pipelines utilizing AWS native services.
• Enhance ETL processes to guarantee efficient data transformation.
• Oversee the migration of workflows from on-premises systems to the AWS cloud, maintaining data quality and consistency throughout the process.
• Develop automations and integrations to address data quality issues and inconsistencies.
• Conduct thorough system testing and validation to confirm successful integration and functionality.
• Implement strong security and compliance measures in the cloud environment.
• Ensure data integrity before and after migration through comprehensive validation checks addressing completeness, consistency, and accuracy of data sets.
• Work closely with data architects and lead developers to document manual data workflows and devise automation strategies.

Jan 30, 2026
Truelogic
Full-time|Remote|LatAm

Join our dynamic team at Truelogic as a Senior Back End Engineer specializing in Python, where you will play a crucial role in designing and implementing robust software solutions. We are looking for an innovative problem solver who is passionate about technology and eager to contribute to exciting projects. Your expertise will help us to enhance our back-end systems, ensuring scalability and performance.

Mar 20, 2026
