Qualifications
Required Technical Skills
• AWS QuickSight: Extensive experience in creating dashboards, calculated fields, and managing reporting pipelines.
• Advanced SQL: Demonstrated capability to write complex queries for large-scale reporting datasets.
• AWS CDK: Comprehensive experience in deploying and managing cloud infrastructure using Python or TypeScript.
• Amazon Aurora: Profound understanding of querying and integrating Aurora as a primary data source for BI tools.
• General AWS Ecosystem: Familiarity with IAM, Lambda, or Glue as they relate to reporting workflows is an advantage.
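As an illustration of the kind of reporting SQL this posting asks for, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names (`orders`, `region`, `revenue`) are hypothetical, chosen only to demonstrate a window-function pattern common in dashboard datasets.

```python
import sqlite3

# Hypothetical reporting table; names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (region TEXT, order_day TEXT, revenue REAL);
INSERT INTO orders VALUES
  ('east', '2024-01-01', 100.0),
  ('east', '2024-01-02', 150.0),
  ('west', '2024-01-01', 80.0),
  ('west', '2024-01-02', 120.0);
""")

# Window function: a running revenue total per region, a typical
# building block for a QuickSight-style dashboard dataset.
rows = conn.execute("""
    SELECT region, order_day, revenue,
           SUM(revenue) OVER (
               PARTITION BY region ORDER BY order_day
           ) AS running_total
    FROM orders
    ORDER BY region, order_day
""").fetchall()

for r in rows:
    print(r)
```

The same query shape carries over to Aurora-backed reporting; only the connection layer and dialect details change.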
About the job
Cloudary specializes in strengthening cloud capabilities by connecting Principal-Vetted professionals with partner technical teams. Founded and led by seasoned cloud architects, the company focuses on delivering strong engineering solutions. Team members work as agile, embedded experts to help move essential projects forward quickly.
What you will do
Dashboard and report development: Build and design interactive dashboards in QuickSight. Set up automated pipelines for generating reports.
Complex data modeling: Write and refine advanced SQL queries tailored for high-performance reporting needs.
Infrastructure as Code: Use AWS CDK to implement, manage, and scale the reporting infrastructure.
Data integration: Connect QuickSight with Amazon Aurora databases to enable efficient data retrieval and synchronization.
Role overview
This AWS Data Developer position is fully remote. The role centers on building reporting solutions, optimizing data models, and automating infrastructure for Cloudary's partners using AWS technologies.
About Cloudary
Cloudary is a leader in cloud solutions, dedicated to optimizing cloud growth by deploying highly skilled talent directly into the technical teams of our partners. Our team, led by seasoned cloud architects, ensures the delivery of superior engineering practices that drive innovation and efficiency.
Job Overview
Join our team as an AWS Data Engineer, where you will be instrumental in designing, developing, and maintaining robust data pipelines on the AWS platform. Collaborating with technical analysts, client stakeholders, and data scientists, you will ensure the highest standards of data quality and integrity while optimizing storage solutions for efficiency and cost-effectiveness. Your expertise in AWS native technologies and Databricks will play a key role in transforming and processing data at scale.
Key Responsibilities
• Drive the modernization of data platforms through effective project leadership and support.
• Architect and build scalable data pipelines utilizing AWS native services.
• Enhance ETL processes to guarantee efficient data transformation.
• Oversee the migration of workflows from on-premises systems to the AWS cloud, maintaining data quality and consistency throughout the process.
• Develop automations and integrations to address data quality issues and inconsistencies.
• Conduct thorough system testing and validation to confirm successful integration and functionality.
• Implement strong security and compliance measures in the cloud environment.
• Ensure data integrity before and after migration through comprehensive validation checks addressing completeness, consistency, and accuracy of data sets.
• Work closely with data architects and lead developers to document manual data workflows and devise automation strategies.
Job Overview
Join our dynamic team at qodeworld as an AWS Data Engineer. In this pivotal role, you will architect, develop, and sustain scalable data pipelines utilizing AWS technologies. Collaborate with technical analysts, client stakeholders, data scientists, and various team members to ensure the accuracy and integrity of data while optimizing storage solutions for both performance and cost-effectiveness. Your expertise with AWS native technologies and Databricks will be essential for efficient data transformations and processing.
Key Responsibilities
• Lead and support projects aimed at modernizing our data platform.
• Design and implement robust, scalable data pipelines leveraging AWS native services.
• Optimize ETL workflows to enhance data transformation efficiency.
• Transition workflows from on-premise systems to AWS cloud, maintaining data quality and consistency.
• Create automations and integrations to address data inconsistencies and quality concerns.
• Conduct system testing and validation to ensure seamless integration and functionality.
• Establish security and compliance measures within the cloud environment.
• Validate data quality before and after migration, addressing issues related to completeness, consistency, and accuracy.
• Collaborate with data architects and lead developers to document manual data movement and design automation strategies.
Qualifications
• A minimum of 10 years of experience in data engineering with a strong focus on AWS native technologies (AWS Glue, Python, Snowflake, S3, Redshift).
• Proven ability to design and implement data pipelines and ETL processes.
• Strong analytical skills and attention to detail for data quality assurance.
• Experience in cloud migration strategies and automation of data workflows.
• Excellent collaboration skills to work with cross-functional teams and stakeholders.
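The migration-validation duties listed above (checking completeness, consistency, and accuracy before and after a move) can be sketched in plain Python. This is a minimal, hypothetical illustration using an order-insensitive checksum plus a row count; real migrations would validate far more (schemas, nulls, referential integrity).

```python
import hashlib

def dataset_fingerprint(rows):
    """Row count plus an order-insensitive checksum of a dataset.
    Comparing fingerprints of source and target is a cheap first-pass
    completeness/consistency check after a migration."""
    digests = sorted(
        hashlib.sha256(repr(r).encode()).hexdigest() for r in rows
    )
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(rows), combined

# Illustrative data: same rows land in the target in a different order.
source = [("a", 1), ("b", 2), ("c", 3)]
migrated = [("c", 3), ("a", 1), ("b", 2)]

assert dataset_fingerprint(source) == dataset_fingerprint(migrated)
print("row counts and checksums match")
```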
Capstone Integrated Solutions is hiring a Senior Data Engineer with a focus on AWS. This position is fully remote. Role overview This role centers on designing and building data pipelines using AWS tools and services. The Senior Data Engineer will work with teams across the company to make sure data remains accurate, reliable, and accessible. In addition, the position involves contributing to data architecture decisions and promoting engineering best practices. Collaboration and team approach The team values practical solutions and close collaboration. Colleagues from different functions work together to deliver data systems that help the company meet its goals. What you will do Design and build data pipelines with AWS services Partner with teams to ensure data quality and accessibility Influence data architecture and engineering standards
About Egen
Egen is a rapidly expanding company driven by a data-centric philosophy. We unite top engineering talent with cutting-edge technology platforms such as Google Cloud and Salesforce, empowering our clients to leverage data and insights for impactful decision-making. Our commitment to fostering a workplace where exceptional individuals can thrive allows them to harness their engineering and technology skills to innovate how data and platforms can positively transform the world. We emphasize continual learning, relish tackling complex challenges, and are dedicated to driving rapid, effective outcomes. If this resonates with you, we invite you to join our team.
NOTE: This is a 6-month contract.
Mactores has established itself as a frontrunner in delivering innovative data platform solutions since 2008. We empower businesses to unlock their potential through automated, agile, and secure End-to-End Data Solutions. Collaborating closely with our clients, we strategize and guide them on their digital transformation journey, whether through assessments, migrations, or modernization efforts.
Mactores is on the lookout for a Senior AWS Data Engineer to enhance our dynamic team. The ideal candidate will possess a profound expertise in PySpark and SQL, coupled with hands-on experience in managing data pipelines utilizing Amazon EMR or Amazon Glue. Proficiency in data modeling and end-user querying via Amazon Redshift or Snowflake, along with familiarity with Amazon Athena, Presto, and orchestration tools such as Airflow, is essential for this role.
We are seeking a skilled AWS Data Architect to join our innovative team at dev2. This fully remote position offers you the flexibility to work from anywhere in the USA while contributing to exciting data-driven projects. As an AWS Data Architect, you will leverage your expertise in cloud solutions to design and implement scalable data architectures that meet our clients’ needs.
Your role will involve collaborating with cross-functional teams to define technical requirements and ensure optimal data flow and storage solutions. If you are passionate about cloud technologies and data architecture, we invite you to apply!
Mactores stands as a leading provider of modern data platform solutions, empowering businesses since 2008 to enhance their value through automation. We specialize in comprehensive End-to-End Data Solutions that are agile, automated, and secure. Our collaborative approach allows us to partner with clients to strategize and accelerate their digital transformations through assessments, migrations, and modernizations.
As an AWS Data Engineer, you will function as a full-stack data engineer dedicated to solving real-world business challenges. Collaborating closely with business leaders, analysts, and data scientists, you will gain a deep understanding of the business domain and work alongside fellow engineers to develop data products that facilitate improved decision-making. Your passion for data quality and your ability to create scalable solutions will be key in addressing broader business inquiries.
If you thrive on problem-solving and possess a passion for innovation, we invite you to become a part of Team Mactores. Our workspace is designed to be casual and enjoyable, consciously steering clear of rigid corporate norms. We prioritize productivity and creativity, allowing you to be part of a world-class team while remaining true to yourself.
Full-time|Remote|Dallas, Texas, United States
Tiger Analytics is a rapidly expanding advanced analytics consulting firm that specializes in delivering exceptional insights through Data Science, Machine Learning, and Artificial Intelligence. Our team possesses profound expertise, making us a trusted analytics partner for numerous Fortune 500 companies, empowering them to derive substantial business value from their data. Our leadership and contribution to the analytics field have been recognized by prominent market research firms such as Forrester and Gartner. We are on the lookout for outstanding talent to enhance our global analytics consulting team.
As a Lead Data Engineer, you will play a pivotal role in architecting, constructing, and sustaining scalable data pipelines within the AWS cloud ecosystem. You will collaborate with diverse teams to facilitate data analytics, machine learning, and business intelligence projects. The ideal candidate will bring extensive experience with AWS services, Databricks, and Apache Airflow.
Join our innovative team at Casino Cash Trac as an AWS Cloud Data Engineer. In this dynamic role, you will be responsible for designing, developing, and maintaining robust data pipelines and cloud architecture that support our business goals. You will collaborate with cross-functional teams to optimize our data processing infrastructure and ensure data integrity and security.
Full-time|On-site|Atlanta, Georgia, United States; Center Valley, Pennsylvania, United States; Las Vegas, Nevada, United States; Tampa, Florida, United States
Role overview
Shift4 Payments is looking for an AWS Cloud Data Engineer to help build and maintain cloud-based data processing systems. The position involves designing, developing, and supporting solutions that keep data flowing smoothly and analytics reliable.
Locations
Atlanta, Georgia; Center Valley, Pennsylvania; Las Vegas, Nevada; Tampa, Florida
What you will do
• Design and develop data processing systems on AWS
• Maintain and improve cloud-based analytics infrastructure
• Support consistent data flow for analytics and reporting
Senior DevOps Engineer (L5)
We are seeking a highly skilled Senior DevOps Engineer to join our dynamic team at Megazone. As a pivotal member of our tech department, you will lead initiatives to enhance our infrastructure and streamline our application deployment processes.
Key Responsibilities:
• Take charge in a senior capacity within our DevOps team, guiding projects to success.
• Collaborate on substantial projects while also managing smaller tasks independently.
• Demonstrate technical expertise and hands-on experience across various tools and practices within the DevOps toolchain.
• Engage in the design and coding of modules that address infrastructure, application, and process needs.
Key Requirements:
• Bachelor's Degree or a minimum of 5 years of relevant professional or military experience.
• At least 5 years as a technical specialist with a solid background in DevOps practices.
• Proficiency in programming languages such as Python, Ruby, Go, Swift, Java, .Net, or C++, with at least 2 years of hands-on experience.
• Experience in automating cloud-native technologies, application deployment, and infrastructure provisioning.
• Practical knowledge of Infrastructure as Code using tools like CloudFormation and Terraform.
• Experience in developing cloud-native CI/CD workflows and tools, including Jenkins, Bamboo, TeamCity, CodeDeploy (AWS), and/or GitLab.
• Hands-on experience with microservices and distributed architectures, including containers, Kubernetes, and/or serverless technologies.
• Familiarity with the complete software development lifecycle and delivery utilizing Agile methodologies.
• Competency with Chef, Puppet, Salt, or Ansible in production settings.
• Understanding of IP networking, including VPNs, DNS, load balancing, and firewalls.
• Experience with monitoring and log aggregation frameworks such as Kafka, Logstash, Splunk, Elasticsearch, and Kibana.
• Expertise in implementing and designing cloud-native security concepts, DevSecOps, or MLOps.
• AWS Certifications such as Solutions Architect Pro, DevOps Engineer Pro, SysOps Admin, or Developer Associate are highly desirable.
• Excellent presentation, verbal communication, and written communication skills.
• Proven ability to lead effectively across diverse organizations and engagements.
Join TechBiz Global as we provide exceptional recruitment services to top-tier clients in our portfolio. We are currently in search of a DevOps Engineer to become a vital part of one of our esteemed client’s teams. This is an outstanding opportunity for professionals eager to thrive in a dynamic and innovative environment.
Key Responsibilities
• Lead the migration of CI/CD pipelines from Jenkins to GitHub Actions.
• Work with Groovy and Bash-based shared libraries, Jenkinsfiles, and workflows in shared repositories.
• Configure and manage Jenkins shared libraries, credentials, pipelines, and automation scripts.
• Implement GitHub Actions workflows across multiple repositories and jobs.
• Manage GitHub Actions runners, secrets, and secure communications between GitHub and runners.
• Configure AWS IAM roles for EC2, IRSA for EKS, STS, and SSH access.
• Collaborate closely with development teams to enhance pipeline efficiency, security, and reliability.
• Troubleshoot CI/CD issues and optimize build and deployment processes.
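To make the Jenkins-to-GitHub-Actions migration concrete, here is a toy Python sketch that maps a list of Jenkins-style stages onto the dictionary shape of a GitHub Actions workflow. Stage names and commands are invented for illustration; a real migration would also have to port credentials, shared libraries, triggers, and runner configuration.

```python
import json

# Hypothetical Jenkins stages: each had a name and a shell step.
jenkins_stages = [
    {"name": "build", "sh": "make build"},
    {"name": "test", "sh": "make test"},
]

# The same stages expressed in GitHub Actions workflow structure:
# one job per stage, each checking out the repo then running the step.
workflow = {
    "name": "ci",
    "on": {"push": {"branches": ["main"]}},
    "jobs": {
        stage["name"]: {
            "runs-on": "ubuntu-latest",
            "steps": [
                {"uses": "actions/checkout@v4"},
                {"run": stage["sh"]},
            ],
        }
        for stage in jenkins_stages
    },
}

print(json.dumps(workflow, indent=2))
```

Serialized to YAML instead of JSON, this dictionary is a valid minimal workflow file; the sketch leaves out inter-job dependencies (`needs`), matrices, and secrets handling.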
Full-time|On-site|Atlanta, Georgia, United States; Center Valley, Pennsylvania, United States; Las Vegas, Nevada, United States; Tampa, Florida, United States
About Shift4 Payments
Shift4 Payments, Inc. streamlines complex payment systems for businesses around the world. As a leader in commerce-enabling technology, Shift4 supports billions of transactions each year for hundreds of thousands of clients across many industries. Learn more at www.shift4.com.
Role Overview
Shift4 is growing quickly and is hiring a Senior AWS Cloud Data Engineer to join its in-house engineering team. This position plays a key part in shaping technology, architecture, and engineering delivery within the AWS Cloud Data Services group.
What You Will Do
• Design, build, and maintain data infrastructure solutions on AWS
• Support data-driven projects across the organization
• Collaborate with project managers, business analysts, and software engineers
• Help ensure systems are available, reliable, and scalable
• Take ownership of assigned projects
Who We’re Looking For
• Motivated, self-starting engineers who enjoy working on challenging projects
• Strong background in AWS cloud data engineering
• Comfort working with cross-functional teams
Location
This is an on-site role based at one of the following Shift4 offices: Tampa, FL; Atlanta, GA; Las Vegas, NV; or Center Valley, PA.
Are you an exceptional Data Engineer with a flair for problem-solving and a passion for optimizing data processes? At pridelogic, we are on the lookout for a technical powerhouse to join our innovative team. If you pride yourself on being the technical leader who consistently delivers complex features ahead of schedule, and you write code that stands as an example for others, we want to hear from you!
This position is designed for those who know they are extraordinary in their field. We seek developers with a proven track record of success in data engineering.
Your Responsibilities:
• Develop, optimize, and scale data pipelines and infrastructure utilizing technologies such as Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
• Design, implement, and monitor data ingestion and transformation workflows including DAGs, alerting systems, retries, SLAs, lineage, and cost management.
• Collaborate with platform and AI/ML teams to automate ingestion, validation, and real-time compute workflows, aiming towards a feature store.
• Enhance engineering dashboards with pipeline health metrics and observability features for comprehensive insight.
• Model data and execute efficient, scalable transformations in Snowflake and PostgreSQL.
• Create reusable frameworks and connectors to standardize internal data publishing and consumption processes.
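The retry-and-alerting concerns named in the responsibilities above can be illustrated with a minimal plain-Python wrapper. This is a hypothetical sketch of the concept only; in practice an orchestrator such as Airflow provides retries, backoff, SLAs, and alerting declaratively on each task.

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01, alert=print):
    """Run a task, retrying with exponential backoff and emitting an
    alert on each failure; re-raise once attempts are exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception as exc:
            alert(f"attempt {attempt} failed: {exc}")
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # backoff grows each retry

# Simulated flaky ingestion step: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

assert run_with_retries(flaky) == "ok"
```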
At CapTech, our Data Engineering consultants empower clients to create and sustain sophisticated data systems that consolidate information from various sources, enabling informed decision-making. We design and implement data pipelines, preparing data for utilization by data scientists, analysts, and other data systems. Passionate about problem-solving, we strive to deliver innovative solutions that meet our clients' needs. Our Cloud Data Engineers harness cloud infrastructure to provide immediate value while ensuring scalability for the future. We foster a collaborative environment that encourages knowledge-sharing among developers, architects, and clients.
Key responsibilities for the Cloud Data Engineer position include:
• Creating data pipelines and data products using Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).
• Consulting with clients on technology and methodologies for effectively ingesting and processing data.
• Applying engineering best practices to tackle complex data challenges.
• Working closely with end users, development teams, and business analysts to ensure data architecture maximizes organizational value.
• Explaining architectural distinctions between solution approaches along with their pros and cons.
We are seeking a top-tier Data Engineer to join our team at wizdaa. If you are a developer who excels in:
• Leading your team with technical expertise
• Resolving complex challenges that others find difficult
• Delivering intricate features at an accelerated pace
• Creating exceptionally clean and maintainable code
• Enhancing our codebase with pride and diligence
Your skills and experience will help us drive efficiency and innovation in data processing.
Key Responsibilities:
• Develop, enhance, and scale data pipelines and infrastructure utilizing Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
• Design, implement, and monitor data ingestion and transformation workflows, ensuring optimal performance and reliability.
• Work collaboratively with platform and AI/ML teams to automate data workflows and develop a comprehensive feature store.
• Integrate health metrics into engineering dashboards for enhanced visibility and operational insight.
• Model data and execute scalable transformations in Snowflake and PostgreSQL.
• Create reusable frameworks and connectors to streamline internal data processes.
At CapTech, our Data Engineering consultants empower clients to construct and sustain sophisticated data systems that integrate data from diverse sources to facilitate informed decision-making. We specialize in developing pipelines and preparing data for utilization by data scientists, analysts, and other data systems. Our passion lies in problem-solving and delivering innovative solutions tailored to our clients' needs. As Cloud Data Engineers, we leverage the cloud infrastructure of our clients to provide immediate value and ensure scalability for future growth. We thrive in a collaborative environment, offering numerous opportunities to learn from and share insights with fellow developers, architects, and clients.
Your Impact and Responsibilities:
• Serve as a trusted advisor to customers, offering best practices, methodologies, and technologies for implementing robust data engineering solutions.
• Design, implement, and maintain modern data pipelines that deliver optimal solutions using appropriate cloud technologies.
• Collaborate with product owners and business subject matter experts (SMEs) to analyze customer requirements and deliver sustainable engineered solutions.
• Provide technical leadership and collaborate across teams to ensure alignment of technical solutions with customer needs.
• Stay abreast of the latest cloud technologies, patterns, and methodologies; effectively communicate results and ideas to key stakeholders.
We are in search of a highly skilled and innovative Data Engineer to join our dynamic team. As a pivotal technical leader, you will:
• Be the go-to expert in your team, guiding projects with your technical acumen.
• Conquer complex challenges that others find daunting.
• Deliver intricate features at an unparalleled pace.
• Produce exceptionally clean and maintainable code.
• Enhance the quality of our entire codebase.
If you're an exceptional developer with a proven track record, we want to hear from you! This role requires a unique blend of skills and experience, designed for the best in the field.
Responsibilities:
• Develop, optimize, and scale data pipelines and infrastructure utilizing technologies such as Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
• Design, operationalize, and oversee ingestion and transformation workflows, including DAGs, alerting, retries, SLAs, lineage, and cost controls.
• Partner with platform and AI/ML teams to automate ingestion, validation, and real-time compute workflows, contributing towards a feature store.
• Integrate pipeline health and metrics into engineering dashboards for enhanced visibility and observability.
• Model data and execute efficient, scalable transformations using Snowflake and PostgreSQL.
• Create reusable frameworks and connectors to standardize internal data publishing and consumption.
As a Senior Data Engineer at CapTech Consulting, you will empower our clients to develop and sustain sophisticated data systems that integrate information from various sources, thereby facilitating informed decision-making for stakeholders. You will be responsible for constructing data pipelines and preparing data for utilization by data scientists, analysts, and other data systems. We thrive on tackling challenges and delivering innovative solutions to our clients. Our Cloud Data Engineers utilize clients' cloud infrastructures to provide immediate value while ensuring scalability for future endeavors. We foster a collaborative atmosphere rich in opportunities to learn from and share insights with fellow developers, architects, and clients.
The Value You Will Deliver:
• Serve as a trusted advisor, guiding clients with best practices, methodologies, and technologies for effective data engineering solutions.
• Design, implement, and maintain cutting-edge data pipelines that leverage suitable cloud technologies for optimal outcomes.
• Collaborate with product owners and business subject matter experts to assess client requirements and deliver sustainable engineered solutions.
• Provide technical leadership and work collaboratively within teams to align overall technical solutions with client needs.
• Stay abreast of the latest cloud technologies, patterns, and methodologies; share insights by clearly articulating findings and ideas to key stakeholders.
Mar 17, 2026