Experience Level
Manager
Qualifications
• Proven experience in data engineering and management.
• Strong proficiency in Python programming and data manipulation.
• Experience with AWS services and cloud infrastructure.
• Familiarity with Airflow for workflow automation and orchestration.
• Knowledge of Snowflake for data warehousing solutions.
• Excellent communication and leadership skills.
• Ability to work collaboratively in a hybrid team environment.
About the job
dev2 is seeking an experienced and innovative Data Engineering Manager to lead our data engineering team in a hybrid work environment. In this role, you will be responsible for overseeing the design, development, and implementation of scalable data pipelines and architecture using technologies such as Python, AWS, Airflow, and Snowflake. Your leadership will drive the data strategy and empower the team to enhance data availability and quality.
About dev2
dev2 is a forward-thinking technology company dedicated to providing innovative solutions in the data space. We are committed to fostering a culture of creativity and excellence, empowering our employees to drive impactful change. Join us in our mission to transform the way businesses leverage data.
About Egen
Egen is a rapidly expanding company driven by a data-centric philosophy. We unite top engineering talent with cutting-edge technology platforms such as Google Cloud and Salesforce, empowering our clients to leverage data and insights for impactful decision-making. Our commitment to fostering a workplace where exceptional individuals can thrive allows them to harness their engineering and technology skills to innovate how data and platforms can positively transform the world. We emphasize continual learning, relish tackling complex challenges, and are dedicated to driving rapid, effective outcomes. If this resonates with you, we invite you to join our team.

Interested in discovering more about life at Egen? Explore these resources alongside the job description.
-> Meet Egen
-> Life at Egen
-> Culture and Values at Egen
-> Career Development at Egen

NOTE: This is a 6-month contract.
We are seeking a highly skilled Manager of Data Engineering to lead our data engineering projects at dev2. This is a unique opportunity to work in a hybrid environment where you can collaborate with a talented team while having the flexibility to work remotely. You will be responsible for designing, building, and maintaining our data architecture using technologies such as Python, AWS, Airflow, and Snowflake. Your leadership will guide the successful execution of data strategies and ensure the availability of high-quality data for analysis and decision-making.
Join dev2 as a Manager of Data Engineering, where you will lead a talented team in designing and implementing data solutions using cutting-edge technologies such as Python, AWS, Airflow, and Snowflake. This hybrid role allows you to enjoy a flexible work environment while collaborating closely with stakeholders to drive data initiatives that enhance business operations.

Your expertise will help shape data strategies that promote efficiency and innovation within our organization. If you're passionate about data engineering and leadership, we want to hear from you!
Join dev2 as a Data Engineering Manager, where you will lead a dynamic team in harnessing the power of data to drive strategic decisions and innovations. In this hybrid role, you will be responsible for overseeing data engineering projects, ensuring the implementation of best practices in data management and cloud solutions using technologies such as Python, AWS, Airflow, and Snowflake.

Your leadership will guide the team in developing scalable data pipelines, optimizing data architecture, and enhancing data quality. Collaborate with cross-functional teams to align data solutions with business objectives, while mentoring junior engineers and fostering a culture of continuous learning.
We are seeking a talented and experienced Data Engineering Manager to lead our data engineering team at dev2. In this hybrid role, you will be responsible for overseeing the development and implementation of robust data pipelines and architectures using Python, AWS, Airflow, and Snowflake. Your expertise will help drive data-driven decision-making across the organization.
About RevenueBase
At RevenueBase, we are transforming the data infrastructure landscape to empower AI agents with reliability over error-prone systems. Our platform delivers consistently updated, verified B2B data that fuels autonomous AI agents and go-to-market workflows. With a remarkable growth trajectory, we've tripled our growth while achieving 100% gross dollar retention and maintaining positive cash flow. We are proud to support AI agents for industry leaders such as Clay, ZoomInfo, and Dun & Bradstreet, and we are at the forefront of developing the next generation of AI-driven GTM tools.

Position Overview
We are in search of a Senior Data & AI Platform Engineer to design and implement internal tools and services that enhance our large-scale data infrastructure. Your main objective will be to create systems that utilize vector embeddings, LLM APIs, and semantic search to extract value from both structured and unstructured data. This is a hands-on role tailored for an engineer who is passionate about building impactful AI-driven tools and deploying them in a dynamic startup environment.

Your Responsibilities
• Craft and develop data-centric tools that function on extensive datasets housed in S3 and Snowflake.
• Establish pipelines that:
  - Extract targeted columns or datasets from Snowflake.
  - Create vector embeddings using APIs such as OpenAI.
  - Store and manage these embeddings in vector databases like Pinecone.
  - Facilitate semantic search and similarity-based retrieval.
• Develop enrichment workflows that:
  - Query structured data efficiently.
  - Leverage LLM APIs to produce new derived columns.
  - Write enriched outcomes back into Snowflake.
• Create reusable internal services and SDKs for embedding generation, prompt orchestration, and data augmentation.
• Optimize performance and cost-effectiveness across AWS infrastructure.
• Collaborate closely with product and data teams to translate use cases into scalable engineering solutions.
• Guarantee the reliability, observability, and maintainability of AI-powered pipelines.

Sample Projects
• Develop a tool to extract a specific Snowflake column, generate embeddings, push to Pinecone, and provide a semantic search API.
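The pipeline this posting describes (extract a column, embed each value, store the vectors, serve semantic search) can be sketched in miniature. This is an illustrative sketch, not the company's implementation: `embed()` is a token-hashing stand-in for a real embedding API such as OpenAI's, and `VectorIndex` is an in-memory stand-in for a vector database like Pinecone; a production version would read rows from Snowflake and call the providers' clients.

```python
import math

# Stand-in for an embedding API (assumption: a real pipeline would call a
# provider such as OpenAI). Hashing tokens into a fixed-size unit vector
# keeps the sketch self-contained and runnable offline.
def embed(text: str, dim: int = 16) -> list[float]:
    vec = [0.0] * dim
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # embed() returns unit vectors, so the dot product is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

class VectorIndex:
    """Minimal in-memory stand-in for a vector database such as Pinecone."""

    def __init__(self) -> None:
        self.items: list[tuple[int, list[float], str]] = []

    def upsert(self, item_id: int, vector: list[float], payload: str) -> None:
        self.items.append((item_id, vector, payload))

    def query(self, vector: list[float], top_k: int = 3) -> list[tuple[int, str]]:
        # Rank stored vectors by similarity to the query and return the best matches.
        ranked = sorted(self.items, key=lambda it: cosine(vector, it[1]), reverse=True)
        return [(item_id, payload) for item_id, _, payload in ranked[:top_k]]

# Pipeline: "extract" column values (a list stands in for a Snowflake query
# result), embed each value, upsert into the index, then serve semantic search
# by embedding the query text and retrieving the nearest neighbours.
rows = ["snowflake data warehouse", "airflow dag scheduling", "aws lambda function"]
index = VectorIndex()
for i, text in enumerate(rows):
    index.upsert(i, embed(text), text)

hits = index.query(embed("snowflake data warehouse"), top_k=1)
```

Swapping the two stand-ins for real API clients preserves the overall shape: the extract, embed, upsert, and query steps stay the same regardless of which embedding provider or vector store sits behind them.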
Join dev2 as a Manager of Data Engineering where you will lead a talented team in designing and implementing data solutions leveraging Python, AWS, Airflow, and Snowflake. This hybrid role offers the flexibility of working remotely while collaborating with your team on-site.

In this dynamic position, you will be responsible for guiding the data engineering team through complex data challenges, ensuring optimal performance, and driving innovation in our data architecture. Your leadership will be crucial in fostering a collaborative environment that encourages creativity and efficiency.
Join our innovative team at dev2 as a Data Engineering Manager where you'll lead initiatives in data architecture and engineering. In this hybrid role, you will harness your expertise in Python, AWS, Airflow, and Snowflake to drive strategic projects and enhance our data infrastructure. Your leadership will guide a talented team as we tackle complex data challenges and deliver high-quality solutions that empower our business decisions.
Job Overview
Join our dynamic team at qodeworld as an AWS Data Engineer. In this pivotal role, you will architect, develop, and sustain scalable data pipelines utilizing AWS technologies. Collaborate with technical analysts, client stakeholders, data scientists, and various team members to ensure the accuracy and integrity of data while optimizing storage solutions for both performance and cost-effectiveness. Your expertise with AWS native technologies and Databricks will be essential for efficient data transformations and processing.

Key Responsibilities
• Lead and support projects aimed at modernizing our data platform.
• Design and implement robust, scalable data pipelines leveraging AWS native services.
• Optimize ETL workflows to enhance data transformation efficiency.
• Transition workflows from on-premise systems to AWS cloud, maintaining data quality and consistency.
• Create automations and integrations to address data inconsistencies and quality concerns.
• Conduct system testing and validation to ensure seamless integration and functionality.
• Establish security and compliance measures within the cloud environment.
• Validate data quality before and after migration, addressing issues related to completeness, consistency, and accuracy.
• Collaborate with data architects and lead developers to document manual data movement and design automation strategies.

Qualifications
• A minimum of 10 years of experience in data engineering with a strong focus on AWS native technologies (AWS Glue, Python, Snowflake, S3, Redshift).
• Proven ability to design and implement data pipelines and ETL processes.
• Strong analytical skills and attention to detail for data quality assurance.
• Experience in cloud migration strategies and automation of data workflows.
• Excellent collaboration skills to work with cross-functional teams and stakeholders.
Join our innovative team at dev2 as a Senior Snowflake Administrator/Engineer. In this role, you will play a pivotal role in managing and optimizing our Snowflake data warehouse environment. You will work collaboratively with cross-functional teams to enhance data management practices and ensure data integrity and availability.

We are looking for a passionate individual who thrives in a fast-paced environment, embraces challenges, and possesses strong problem-solving skills. Your expertise will help us leverage Snowflake's capabilities to drive our data initiatives forward.
Please note: This position ideally operates on a hybrid schedule in Charlotte, NC (Tuesday-Thursday in-office). Preference will be given to candidates currently residing in Charlotte, NC, or those willing to relocate.

About the Role
LendingTree is excited to welcome a Snowflake & Data Platform Operations Engineer to our dynamic Platform Operations team. In this hands-on role, you will play a critical part in automating, scaling, and supporting our cloud-based data ecosystem utilizing Snowflake, as well as AWS and Azure services. Your focus will be on reducing operational overhead through Continuous Integration/Continuous Deployment (CI/CD), implementing robust monitoring solutions, and employing an automation-first engineering approach.

Key Responsibilities
• Snowflake Administration: Oversee environments, manage security policies (RBAC), and conduct performance tuning to optimize costs and query efficiency.
• Platform Automation: Design and maintain CI/CD pipelines for database deployments using GitLab along with tools such as Liquibase or Red Gate.
• Cloud Integration: Facilitate integrations between Snowflake and AWS/Azure services, including object storage, messaging, and serverless functions (Lambda/Azure Functions).
• Pipeline & Streaming Operations: Sustain batch and streaming architectures (Confluent Kafka), troubleshoot ETL/ELT failures, and ensure data observability.
• Security & Reliability: Implement logging, auditing, and IAM controls; engage in incident response and root cause analysis (RCA).
• Collaboration: Work closely with Data Engineering and stakeholders to translate business requirements into scalable technical solutions.
About Us
Founded in 2011, Modus Create is a global leader in digital product engineering, comprising a fully remote team of exceptional technologists. We specialize in collaborating with forward-thinking businesses to design, build, and scale custom solutions that deliver measurable results and drive lasting change. Our partnerships with industry giants such as AWS, GitHub, and Atlassian highlight our commitment to innovation. We embraced remote work long before it became a trend! Our recognition as one of the Inc. 5000 Fastest Growing Private Companies for nine consecutive years and being named a top remote work company by FlexJobs exemplifies our ability to help some of the world's largest brands deliver impactful digital experiences. As an award-winning Atlassian partner, we empower organizations to innovate and tackle complex challenges, and we invite talented individuals to join our journey.

Opportunity
The Senior Customer Data Platform (CDP) Engineer will be pivotal in our client’s data engineering team, leading the design, implementation, and optimization of scalable data solutions. This role emphasizes building robust data pipelines and curated data models within Snowflake, utilizing event-driven architectures and modern serverless infrastructure on Amazon Web Services, while facilitating seamless integration with Salesforce Data Cloud. In this role, you will ensure high data quality, governance, and secure access, supporting advanced analytics and activation use cases that drive business growth and strategic decision-making. The ideal candidate will possess extensive expertise in event-driven architectures, serverless integration patterns, modern data engineering practices, and exhibit strong analytical thinking and communication skills. This is a fully remote role with collaboration across distributed teams and daily overlap with the US Eastern Time Zone.
Full-time | Remote — Maitland, Florida, United States
About AssistRx
At AssistRx, we are dedicated to revolutionizing the patient experience through innovative technology. Our platform streamlines access to life-saving therapies, connecting patients, healthcare providers, payers, and manufacturers through secure and scalable data-driven solutions. By facilitating better decision-making, we aim to enhance outcomes across the entire healthcare ecosystem.

We invite you to join our team if you are passionate about constructing cutting-edge data platforms, engaging with large-scale cloud data systems, and applying data engineering best practices within a regulated healthcare context.

Role Overview
We are actively seeking a Senior Data Engineer to become an integral part of our Cloud Data Platform (CDP | Architecture) team. This position is highly hands-on and centers on the design, construction, and expansion of our modern data platform, with a particular focus on Snowflake and dbt as essential technologies. Your contributions will be pivotal in shaping AssistRx’s enterprise data architecture, as you will be responsible for transformation logic, performance optimization, and the implementation of data modeling patterns that support analytics, reporting, and downstream products across our organization. This is not an entry-level opportunity; we are looking for a candidate who can immediately contribute in a production Snowflake + dbt environment and establish technical standards for the wider data organization.

What You’ll Do
Cloud Data Platform & Architecture
• Design, build, and optimize Snowflake-centric data architectures to facilitate enterprise analytics, reporting, and operational use cases.
• Own dbt transformation layers, including model design, testing, documentation, and deployment best practices.
• Implement scalable data modeling patterns (star schemas, data vault, dimensional models) aligned with business requirements.

Data Engineering & Pipelines
• Develop and maintain robust data pipelines integrating sources such as Salesforce, application databases, and external client data.
• Ensure data integrity through validation, testing, monitoring, and observability frameworks.
• Optimize Snowflake performance and cost through query tuning, warehouse design, and efficient data modeling.

Collaboration & Enablement
• Collaborate closely with Analytics, BI, Product, and Engineering teams to deliver trusted, analytics-ready datasets.
• Contribute to establishing architectural standards, code reviews, and best practices across the CDP team.
• Document data flows, models, and platform decisions to support long-term scalability and knowledge sharing.

Governance, Security & Compliance
• Ensure data pipelines and models adhere to PHI / PII / HIPAA compliance requirements.
• Support secure access patterns, role-based permissions, and data governance initiatives.
Job Overview
Join our team as an AWS Data Engineer, where you will be instrumental in designing, developing, and maintaining robust data pipelines on the AWS platform. Collaborating with technical analysts, client stakeholders, and data scientists, you will ensure the highest standards of data quality and integrity while optimizing storage solutions for efficiency and cost-effectiveness. Your expertise in AWS native technologies and Databricks will play a key role in transforming and processing data at scale.

Key Responsibilities
• Drive the modernization of data platforms through effective project leadership and support.
• Architect and build scalable data pipelines utilizing AWS native services.
• Enhance ETL processes to guarantee efficient data transformation.
• Oversee the migration of workflows from on-premises systems to the AWS cloud, maintaining data quality and consistency throughout the process.
• Develop automations and integrations to address data quality issues and inconsistencies.
• Conduct thorough system testing and validation to confirm successful integration and functionality.
• Implement strong security and compliance measures in the cloud environment.
• Ensure data integrity before and after migration through comprehensive validation checks addressing completeness, consistency, and accuracy of data sets.
• Work closely with data architects and lead developers to document manual data workflows and devise automation strategies.
Contract | On-site — San Jose, California, United States
Job Title: Snowflake Developer / Data Engineer (1–3 Years Experience)
Employment Type: Contract
Work Authorization: US Citizen / Green Card / H4 EAD
Location: Open to relocate anywhere within the United States

Job Description:
We are seeking a passionate Snowflake Developer / Data Engineer with 1–3 years of dedicated experience in data engineering and cloud-based data platforms. The ideal candidate will possess hands-on experience with Snowflake, SQL, and various data integration tools to enhance our data warehousing and analytics initiatives.

Key Responsibilities:
• Design, develop, and maintain data pipelines and solutions using Snowflake.
• Craft efficient SQL queries for data extraction, transformation, and loading (ETL/ELT) processes.
• Work with structured and semi-structured data in the Snowflake environment.
• Assist with data warehousing and migration projects.
• Collaborate with data analysts, developers, and business stakeholders to identify and fulfill data requirements.
• Optimize data performance, query efficiency, and storage utilization in Snowflake.
• Support data validation, testing, and troubleshooting of data pipelines.
• Engage with cloud platforms such as AWS, Azure, or GCP (basic knowledge preferred).
Capstone Integrated Solutions is hiring a Senior Data Engineer with a focus on AWS. This position is fully remote.

Role overview
This role centers on designing and building data pipelines using AWS tools and services. The Senior Data Engineer will work with teams across the company to make sure data remains accurate, reliable, and accessible. In addition, the position involves contributing to data architecture decisions and promoting engineering best practices.

Collaboration and team approach
The team values practical solutions and close collaboration. Colleagues from different functions work together to deliver data systems that help the company meet its goals.

What you will do
• Design and build data pipelines with AWS services
• Partner with teams to ensure data quality and accessibility
• Influence data architecture and engineering standards
Mactores has established itself as a frontrunner in delivering innovative data platform solutions since 2008. We empower businesses to unlock their potential through automated, agile, and secure End-to-End Data Solutions. Collaborating closely with our clients, we strategize and guide them on their digital transformation journey, whether through assessments, migrations, or modernization efforts.

Mactores is on the lookout for a Senior AWS Data Engineer to enhance our dynamic team. The ideal candidate will possess a profound expertise in PySpark and SQL, coupled with hands-on experience in managing data pipelines utilizing Amazon EMR or AWS Glue. Proficiency in data modeling and end-user querying via Amazon Redshift or Snowflake, along with familiarity with Amazon Athena, Presto, and orchestration tools such as Airflow, is essential for this role.
Mactores stands as a leading provider of modern data platform solutions, empowering businesses since 2008 to enhance their value through automation. We specialize in comprehensive End-to-End Data Solutions that are agile, automated, and secure. Our collaborative approach allows us to partner with clients to strategize and accelerate their digital transformations through assessments, migrations, and modernizations.

As an AWS Data Engineer, you will function as a full-stack data engineer dedicated to solving real-world business challenges. Collaborating closely with business leaders, analysts, and data scientists, you will gain a deep understanding of the business domain and work alongside fellow engineers to develop data products that facilitate improved decision-making. Your passion for data quality and your ability to create scalable solutions will be key in addressing broader business inquiries.

If you thrive on problem-solving and possess a passion for innovation, we invite you to become a part of Team Mactores. Our workspace is designed to be casual and enjoyable, consciously steering clear of rigid corporate norms. We prioritize productivity and creativity, allowing you to be part of a world-class team while remaining true to yourself.
Cloudary specializes in strengthening cloud capabilities by connecting Principal-Vetted professionals with partner technical teams. Founded and led by seasoned cloud architects, the company focuses on delivering strong engineering solutions. Team members work as agile, embedded experts to help move essential projects forward quickly.

Role overview
This AWS Data Developer position is fully remote. The role centers on building reporting solutions, optimizing data models, and automating infrastructure for Cloudary's partners using AWS technologies.

What you will do
• Dashboard and report development: Build and design interactive dashboards in QuickSight. Set up automated pipelines for generating reports.
• Complex data modeling: Write and refine advanced SQL queries tailored for high-performance reporting needs.
• Infrastructure as Code: Use AWS CDK to implement, manage, and scale the reporting infrastructure.
• Data integration: Connect QuickSight with Amazon Aurora databases to enable efficient data retrieval and synchronization.
Apr 29, 2026