Your Responsibilities
- Build and maintain distributed data pipelines utilizing Scala, Spark, and cloud technologies.
- Work collaboratively with engineers, data scientists, and product teams to deliver reliable, scalable data systems.
- Design and optimize data ingestion and transformation workflows across both blockchain and traditional datasets.
- Ensure system accuracy, scalability, and efficiency while processing hundreds of millions of daily data points.
- Evaluate design options and trade-offs in terms of performance, scalability, reliability, and cost.
- Contribute to the full lifecycle of data platform development, from design and deployment to ongoing enhancements.
- Enhance pipeline reliability, observability, and automation via code and tooling improvements.
- Expand your influence and take on increasing responsibilities as you gain a deeper understanding of our distributed systems and platform architecture.

Technical Environment
Technologies we use include: Scala, Spark, Databricks, AWS, Airflow, Kubernetes, Terraform, and Functional Programming. (No Scala experience yet? If you are an experienced data engineer eager to learn, we will support you.)

Ideal Candidate Attributes
- Passionate about writing clean, well-tested, and efficient code.
- Leverage data and experimentation to guide informed decisions.
- Thrive in a collaborative environment and embrace challenges with enthusiasm.
About the job
Join Us in Shaping the Future of Blockchain Intelligence
At Elliptic, we are pioneering the intelligence framework for the future of finance. Our dedicated teams are committed to transforming intricate blockchain and off-chain data into actionable insights, equipping financial institutions, regulators, and businesses with the confidence to innovate. Our mission is to make digital-asset intelligence seamlessly accessible, as we design and scale the data streams and services that drive Elliptic’s analytics and decision-making products.
As a Data Engineer, you will architect and optimize systems that process vast blockchain and off-chain datasets, enabling organizations across the globe to make informed, data-driven decisions.
Your role will involve engaging with platform-focused teams or those directly handling product data, as you tackle challenges related to batch and streaming processing while developing high-quality, scalable solutions in a rapidly changing ecosystem.
About Elliptic
Elliptic is at the forefront of blockchain intelligence, dedicated to building the intelligence layer for the future of finance. We empower organizations with insights derived from complex blockchain data, enabling them to innovate with confidence.
Join our dynamic team at OctoEnergy as a Data Engineer, where you will play a pivotal role in transforming data into actionable insights. We are seeking a motivated and innovative individual who thrives on tackling complex challenges and delivering high-quality solutions.
Role overview
Octoenergy is hiring a Senior Treasury Analyst in London. This role manages and improves treasury operations, focusing on both oversight and optimization. The Senior Treasury Analyst reviews financial data and delivers insights that support the company’s financial goals.
Join Us in Pioneering the Renewable Energy Revolution!
At Octopus Energy, we're committed to making a significant impact in the green energy sector, and we need talented individuals like you to help turn our ambitious vision into reality. We are seeking skilled Dual Fuel Smart Meter Engineers with a proven dedication to safety and exceptional customer service. As a representative of our brand in customers' homes, your ability to build rapport and engage with customers is as vital as your technical skills in installations.

In this role, you will install smart meters, educate customers on the functionalities of their meters and in-home displays, and demonstrate how they can utilize these tools to save energy. You will also address any inquiries related to the green energy movement we are championing. As a member of the Octopus Energy Services team, you'll receive unparalleled support from an office team that shares our performance goals. Together, we work cohesively to deliver the highest standards of service. Leveraging Octopus Energy's advanced technology, we continuously adapt and improve our processes, allowing you to minimize administrative tasks and maximize your focus on exceptional customer service.
About YouLend
YouLend is an innovative and rapidly expanding FinTech company, recognized as the leading embedded financing platform for top-tier e-commerce platforms, technology firms, and Payment Service Providers. Our cutting-edge software empowers partners to enhance their offerings by providing customized financing solutions to their merchants under their own branding, all while mitigating capital risk. Backed by EQT, a prominent Private Equity firm, YouLend has achieved remarkable growth, exceeding 100% year-over-year since 2020. Based in London, we also operate across various European countries and the United States, serving renowned partners such as eBay, Amazon, Just Eat, Shopify, and Stripe.

Position Overview
We are on the lookout for a Data Engineer to become an integral part of our expanding Data Engineering & Platform team. This role is pivotal, bridging infrastructure, DevOps, and advanced data tools, with a primary focus on facilitating rapid, secure, and scalable analytics. You will play a crucial role in constructing and scaling a premier data platform that supports a wide array of functions, including dashboards, experimentation, machine learning, and compliance.

Key Responsibilities:
- Develop and oversee the infrastructure for our data platform, utilizing technologies such as AWS, Snowflake, dbt, and Airflow.
- Design and execute CI/CD pipelines for dbt and various data workflows.
- Automate data platform operations using Python and infrastructure-as-code tools like Pulumi and Terraform.
- Collaborate with analytics, machine learning, product, and engineering teams to enhance data solutions.
- Maintain data quality, lineage, and governance through rigorous testing and monitoring.
- Utilize cost observability tools to promote efficient platform usage.

Qualifications:
The ideal candidate will possess the following:
- Demonstrable experience with cloud-based data platforms such as Snowflake, Redshift, or BigQuery.
- Strong proficiency in Python and SQL for automation and analytics purposes.
- Familiarity with CI/CD processes, particularly in dbt or similar data platforms.
- Practical experience with Infrastructure as Code (IaC) tools including Pulumi, Terraform, or CloudFormation.
- Solid understanding of orchestration tools like Airflow.
- Exceptional communication skills with a history of cross-functional collaboration.

Desirable Skills:
- Experience with AWS services (S3, Lambda, MWAA) and Azure DevOps.
- Familiarity with monitoring tools such as DataDog.

Why Choose YouLend?
Recognized as one of the “Best Places to Work in 2024”, YouLend offers a dynamic workplace environment.
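The "rigorous testing and monitoring" responsibility mentioned above is commonly implemented as automated checks that run after each pipeline load. The table name and rules below are hypothetical, and Python's built-in sqlite3 stands in for a warehouse like Snowflake purely for illustration; in practice such checks often live in dbt tests.

```python
import sqlite3

# Sketch of post-load data-quality checks (hypothetical table and rules).
# sqlite3 substitutes for a cloud warehouse so the example is runnable.

def run_quality_checks(conn: sqlite3.Connection, table: str) -> list[str]:
    """Return a list of failed-check descriptions (empty means all passed)."""
    failures = []
    cur = conn.cursor()

    # Check 1: the table must not be empty.
    (rows,) = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    if rows == 0:
        failures.append(f"{table}: no rows loaded")

    # Check 2: the primary-key column must be unique.
    (dupes,) = cur.execute(
        f"SELECT COUNT(*) - COUNT(DISTINCT id) FROM {table}"
    ).fetchone()
    if dupes:
        failures.append(f"{table}: {dupes} duplicate id values")

    # Check 3: amount must never be NULL.
    (nulls,) = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE amount IS NULL"
    ).fetchone()
    if nulls:
        failures.append(f"{table}: {nulls} NULL amounts")
    return failures

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE loans (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO loans VALUES (?, ?)",
                     [(1, 100.0), (2, 250.0), (2, None)])
    print(run_quality_checks(conn, "loans"))
```

Returning failures as data, rather than raising on the first one, lets an orchestrator report all broken expectations from a single run.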
About Wheely
Wheely is revolutionizing premium transportation in major cities across Europe, the US, and the Middle East. Our blend of innovative technology and exquisite chauffeuring services has garnered the trust of over 100,000 active riders and 1,200 corporate accounts. As a profitable and rapidly expanding scale-up with $43M raised and more than $100M in annual revenue, we have recently launched in New York City and are ambitiously growing throughout the US and EMEA. If you are passionate about your craft and eager to contribute to our next growth phase, we want to hear from you.

We are seeking a Data Engineer to enhance our Data Team at Wheely, delivering a top-notch and seamless data experience to Business Users and Data Scientists.
As a Data Engineer at dev2, you will play a pivotal role in designing, building, and maintaining data pipelines that enable our teams to make data-driven decisions. You will work closely with data scientists and analysts to ensure the data infrastructure is robust and scalable. Your responsibilities will include optimizing data flow, ensuring data quality, and implementing best practices in data management. This is an exciting opportunity to be part of a forward-thinking company that values innovation and creativity.
ASOS Plc seeks a Data Engineer in London to support data processing and analytics for its online fashion platform. This role centers on improving how data is managed and used across the company.

Role overview
The Data Engineer will work with modern data technologies, contributing to projects that influence data-driven decisions throughout the business. The position involves collaborating with teams to strengthen data workflows and ensure reliable analytics.

Location
This is a London-based role, working onsite with ASOS Plc's technology and analytics teams.
About Swap
At Swap, we are revolutionizing modern commerce with our unique AI-native platform that seamlessly integrates backend operations with an innovative storefront experience. Our solution is designed for brands aspiring to sell anything, anywhere, by centralizing global operations and enabling intelligent workflows. With real-time data and capabilities, our products cover cross-border transactions, tax management, returns, demand planning, and our state-of-the-art agentic storefront. This empowers merchants with complete transparency and the confidence to make informed decisions. We are fostering a culture at Swap that emphasizes clarity, creativity, and a shared sense of ownership as we redefine the landscape of global commerce.

About the Role
We are seeking enthusiastic, detail-oriented, and adaptable Data Engineers to join our platform team during an exciting phase of growth. This role is ideal for proactive engineers who thrive in dynamic environments and are eager to take charge, contribute significantly, and collaborate closely with product teams. In this pivotal position, you will play a key role in designing, optimizing, and scaling our premier data platform, which fuels our advanced customer-facing agentic systems. You will engage proactively with Product Managers and stakeholders to advocate for best practices, ensuring our platform remains robust and poised for future advancements. This hands-on builder role is suited for knowledgeable and enthusiastic team players excited about the opportunity to accelerate and expand a greenfield platform.

Key Responsibilities
- Data Pipeline Engineering: Develop, optimize, and sustain comprehensive end-to-end data pipelines for essential business operations. Prioritize low latency, strong observability, and effective alerting for batch and stream processing to guarantee data reliability.
- API Development: Support the design and implementation of new features within the API layer, collaborating with product teams to ensure efficient data access.
- Platform Contribution: Own specific components of the data platform, actively contributing technical enhancements for scalability, future readiness, and alignment with Swap's long-term product strategy.
- Data Quality & Governance: Implement and advocate for best practices in data quality to ensure the data's integrity and reliability.

What We Would Like to See
- A minimum of 3 years of experience in data engineering or related roles, demonstrating a strong understanding of data systems.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Strong problem-solving skills and a passion for data-driven decision-making.
About Tilt
At Tilt, we are on a mission to Make Commerce Alive. Our platform revolutionizes the traditional e-commerce landscape, moving away from outdated website builders and faceless marketplaces to create vibrant, community-driven experiences for the new generation of merchants. With millions of engaged shoppers across the UK, from sneaker enthusiasts to collectors, Tilt empowers sellers to achieve remarkable earnings, often exceeding £1M. We are just getting started!

Your Mission
As our Lead Data Engineer, you will take charge of shaping and advancing our comprehensive data platform. This is a hands-on role with significant ownership where you will oversee the architecture, reliability, and scalability of our data warehouse and analytics stack. Your expertise will help refine our existing systems, including Snowflake, Dagster, dbt, Postgres, Metabase, and ContextFlow, while establishing standards for intelligent growth. We seek an individual who not only writes SQL but deeply understands data, thinks critically, and designs enduring systems.

What You’ll Do
0–3 months
- Gain an in-depth understanding of our Snowflake warehouse architecture
- Audit and enhance existing dbt models for clarity and performance
- Stabilize and review Dagster orchestration pipelines
- Identify data quality gaps and implement rigorous testing standards
- Ensure key dashboards in Metabase are based on reliable datasets
- Document current systems and define architectural standards

3+ months
- Redesign and optimize warehouse architecture where necessary
- Implement scalable data modeling patterns across various domains
- Enhance cost efficiency and performance in Snowflake
- Introduce advanced observability, testing, and lineage practices
- Enable AI and product features by structuring high-quality, production-ready datasets
- Establish clear version control and deployment workflows for data changes
- Collaborate with product and engineering teams to define data contracts and ownership
Abound is changing how consumer lending works in the UK and beyond. By combining AI with Open Banking data, the company offers personal finance options that go beyond traditional credit scores. Each applicant is evaluated based on real financial habits and repayment history, aiming for fairer access to credit. Since launch, Abound has issued over £1.3bn in loans and kept default rates well below industry averages. Profitability came within 2.5 years. With more than £2bn in investment from Citi, GSR Ventures, Deutsche Bank, and others, Abound is recognized as one of Europe's fastest-growing fintechs. As the company expands into new markets and products, it seeks people eager to learn, take ownership, and grow alongside the team.

Role overview
The Data Engineer will join the Platform team in London, focusing on the development and improvement of Abound’s Data Lake. This position acts as a connector between Platform and Data Science, maintaining the infrastructure and data pipelines that drive decision-making across the business. The role is individual contributor level, offering autonomy and the chance to make a measurable impact.

Main focus for the first 6–12 months
- Upgrade the Data Lake into a scalable, high-performance platform for analytics and data science.
- Migrate production workloads, including model calibration, ad-hoc analytics, and reporting, into the Data Lake.
- Redesign data structures to improve query speed, lower AWS costs, and ensure timely data availability.
- Collaborate closely with the Data Science team to support rapid experimentation and delivery through reliable data systems.

Technology stack
- Cloud & Compute: AWS, ECS Fargate, AWS Lambda
- Databases & Data Lake: Aurora (PostgreSQL, MySQL), Athena, DMS, Glue, Iceberg
- Languages & Infrastructure as Code: Python, Spark, SQL
- Observability & Tooling: Amazon Managed Prometheus (AMP), incident.io, GitLab
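Redesigning data structures for query speed in a lake like the one described above often starts with date-partitioned file layouts, so engines such as Athena scan only the partitions a query touches. The bucket and table names below are hypothetical, and the sketch shows only the path convention, not the write itself.

```python
from datetime import date

# Sketch of Hive-style date partitioning for a data lake (hypothetical
# bucket/table names). Query engines such as Athena can prune partitions
# excluded by a WHERE clause, cutting both scan time and cost.

def partition_key(event_date: date) -> str:
    """Hive-style partition directory for one day of data."""
    return f"dt={event_date.isoformat()}"

def object_path(bucket: str, table: str, event_date: date, filename: str) -> str:
    """Full object path for a file landing in the partitioned layout."""
    return f"s3://{bucket}/{table}/{partition_key(event_date)}/{filename}"

if __name__ == "__main__":
    print(object_path("lake", "loan_events", date(2025, 7, 1), "part-0000.parquet"))
    # s3://lake/loan_events/dt=2025-07-01/part-0000.parquet
```

With this layout, a query filtered to `dt = '2025-07-01'` reads one directory instead of the whole table; table formats like Iceberg track the same partition metadata in manifests rather than in path names.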
About The Dot Collective
At The Dot Collective, we are a forward-thinking consultancy operating across the UK and EU, dedicated to engineering excellence and empowering individuals to create significant impact. Our team utilizes the latest technology stacks and embraces agile scrum methodologies for all our projects.

About You
Are you driven by a passion for data and its transformative potential? Do you thrive on making a substantial difference in a short time frame? If so, we may be the perfect fit for you.
Join Our Team!
Contentful is seeking a Senior Data Engineer to play a pivotal role in designing, developing, and scaling advanced data solutions that drive analytics, operational reporting, and strategic decision-making within our organization. You will collaborate closely with our data, analytics, and business teams to create robust data pipelines, models, and integrations that ensure clean, reliable, and actionable data. You will be an integral part of a small, geographically diverse engineering team that operates with a product mindset, delivering iterative value while partnering closely with business and analytics stakeholders, and taking ownership of both the development and operational aspects of a critical data platform.

Your Responsibilities
- Design, build, and maintain scalable data pipelines and transformations that integrate multiple systems and sources.
- Create high-quality data models that cater to analytics, reporting, and operational use cases.
- Collaborate with analytics, product, and business teams to identify data needs and translate them into effective technical solutions.
- Establish and enforce strong data quality, validation, and monitoring protocols to ensure the reliability and trustworthiness of our data.
- Optimize data storage, processing, and performance within cloud data warehousing environments.
- Contribute to the ongoing improvement of our modern data platform, including tooling, standards, and best practices.
- Support the operations of the data platform by troubleshooting issues, enhancing reliability, and ensuring SLAs are consistently met.
- Engage with cross-functional partners on governance, documentation, definitions, and data stewardship.
- Implement CI/CD practices and Infrastructure-as-Code (IaC) for automated deployment, testing, and environment management.
- Participate in code reviews, design discussions, and operational on-call rotations as required.
- Mentor team members and promote data engineering best practices across the organization.

Your Skills and Experience
Essential Qualifications
- 5+ years of experience in a Data Engineering or similar technical role.
- Expertise in SQL and experience working with cloud data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Experience in building and maintaining ETL/ELT pipelines using tools such as dbt, Airflow, or similar frameworks.
- Proficiency in Python or another scripting language.
- Strong understanding of data modeling, data structures, and modern data architecture principles.
The Role
We are excited to invite applications for the position of Senior Data Engineer to join our dynamic Data team based in London. This is an office-based position located at our London headquarters.

Working at WGSN
Together, we shape the future. At WGSN, we thrive in a fast-paced and innovative environment filled with opportunities for professional growth. Our diverse team consists of consumer and design trend forecasters, content creators, designers, data analysts, advisory consultants, and more, all dedicated to a shared mission: to create tomorrow. Our reliable consumer and design forecasts empower exceptional product design, helping our clients envision a better future. Our offerings span consumer insights, beauty, consumer technology, fashion, interiors, lifestyle, food and drink forecasting, data analytics, and expert advisory services. If you're a specialist in your field, we would love to hear from you.

Role Overview
As WGSN expands its AI and Data capabilities, we aim to enhance our data architecture. As a Senior Data Engineer, you will be instrumental in designing, optimizing, and maintaining the data pipelines, models, and infrastructure that support our classification systems, AI workflows, forecasting models, TikTok insights, and consumer intelligence products. You will collaborate closely with our senior data scientists, analysts, and engineers, especially within the TikTok and Pulse pods, ensuring high-quality, well-structured, reliable data flows across Snowflake, Databricks, and downstream systems. This position is tailored for an individual with substantial technical expertise, advanced SQL and data modeling skills, and a passion for developing scalable, efficient data systems. This role is hands-on and requires a minimum of 5 years of experience in data engineering, including the ability to mentor junior engineers when necessary.

Key Responsibilities
Data Architecture & Modeling
- Design, develop, and maintain scalable data architectures...
Join Eucalyptus as a Data Engineer and play a pivotal role in transforming data into actionable insights. We are looking for innovative thinkers who are passionate about data architecture and analytics. You will work closely with cross-functional teams to design, build, and maintain scalable data pipelines that drive strategic business decisions.
About the Role
Jane Street is seeking a skilled Data Engineer to enhance our understanding, management, and dissemination of data that informs our trading strategies. At Jane Street, the ability to comprehend and manipulate data accurately is fundamental to our operations.

In this role, you will utilize a combination of proprietary and open-source tools to analyze diverse datasets, identifying anomalies, ensuring consistency in formats and symbologies, automating ETL processes, and ultimately simplifying the process for our traders to derive insightful conclusions.

We are looking for someone who is passionate about diving deep into data and articulating findings to various stakeholders, collaborating closely with traders and software engineers. While familiarity with financial data is beneficial, we do not require a financial background. We are eager to hire talented engineers and provide the necessary training.
Join Our Journey
Welcome to Zopa! Founded in 2005, Zopa pioneered peer-to-peer lending and has since evolved, launching Zopa Bank in 2020. Our mission is to transform the banking experience by prioritizing customer needs and redefining financial services. We empower our team and customers to challenge the norms and aim high. Discover our innovative offerings at Zopa.com!

We take pride in our accomplishments and our outstanding team, which has propelled us to be recognized as one of the UK’s Most Loved Workplaces. At Zopa, we welcome those who thrive on unconventional challenges and are eager to make a significant impact. Follow us on Instagram @zopalife to see our culture in action!

About the Team:
The Data Engineering team is essential in designing, constructing, and managing the intricate data pipelines vital to our bank's operations. We leverage a blend of software engineering, data analytics, and operational skills while collaborating with various teams to tackle diverse and complex challenges. Our culture values teamwork, practicality, and continuous learning over individual accomplishments.

This role offers you the opportunity to join Zopa’s Data Engineering team at a pivotal moment of growth and advancement. You will be instrumental in developing and enhancing the data platforms that drive decision-making throughout the bank, from customer behavior insights to critical applications like fraud detection, where every moment counts.
Join a Team Where Your Work Truly Matters
Advance your career with Qodea, where innovation is not just a catchphrase; it's part of our core identity. As a leading global technology group, we are dedicated to shaping the future, providing exceptional professionals with opportunities for high-impact work that shapes careers. By joining us, you are not merely undertaking projects; you are tackling challenges that have yet to be solved. Our exclusive clientele includes global leaders such as Google, Snap, Diageo, PayPal, and Jaguar Land Rover: companies that turn to us when deadlines loom, when others have faltered, and when the solutions must succeed.

Say goodbye to mundane consultancy. Here, you will work at the intersection of technology, design, and human behavior to deliver rapid and meaningful outcomes. This is work that creates an impact, work you will proudly share with your peers. Qodea is designed for the future, fostering an environment where your skills will continuously evolve at the cutting edge of innovation and AI, ensuring your growth and development.

We are in search of a Senior Data Engineer who will deliver outstanding solutions that exceed our clients' expectations by leveraging state-of-the-art tools and technologies. We seek individuals who exemplify:
- Innovation to tackle the most challenging problems.
- Accountability for every outcome.
- Integrity in all endeavors.

Your Role
The primary objective of this role is to design, develop, and manage scalable data pipelines and infrastructure to enable the effective processing and analysis of extensive, complex data sets. This position is tailored for impact, and we believe our best outcomes arise through collaboration. While we have a flexible working model, we anticipate that you will spend time on-site for collaboration sessions, client meetings, and internal workshops.

Key Responsibilities
- Develop and maintain automated data processing pipelines using Google Cloud: design, construct, and sustain pipelines for data ingestion, ETL, and storage; establish and uphold automated pipelines that monitor data quality and resolve issues.
- Implement and maintain databases and data storage solutions; stay informed about emerging trends and technologies in big data and data engineering; ensure data quality, accuracy, and completeness.
- Implement and enforce data governance policies and procedures to uphold data quality and precision.
- Collaborate with data scientists and analysts to design and optimize data models for analytics and reporting; develop and maintain data models for analytical and reporting purposes.
- Monitor and manage data infrastructure to ensure availability and performance.
Full-time | On-site | London, United Kingdom
Who are we?
Smarkets: Shaping the Future of Betting
At Smarkets, we operate one of the most advanced prediction markets globally, with a staggering £29 billion in volume processed since our inception in 2010. Our platform engages over 200,000 traders worldwide, revolutionizing betting across various sectors, including sports and political markets, by providing the most competitive prices and fairest odds.

Our tech stack is engineered for scalability, reliability, and performance, utilizing Linux, Kafka, Postgres, and Kubernetes, while Python 3, C++, Rust, and React underpin our platform. We construct infrastructure that institutions can rely on while ensuring trading remains accessible to all users. Our resilience is evident as we have thrived through every market trend and competitive landscape.

What sets us apart is our exceptional team. We foster a high-performance culture where talent flourishes, merging extensive business knowledge with a strategic approach to drive growth. If you're eager to help redefine the future of prediction markets with innovative technology and a customer-centric approach, Smarkets is your ideal workplace.

The Team
Our Data Team plays a crucial role in harnessing the vast amount of data generated at Smarkets to derive insights that propel our business forward. Given the extensive range of data we produce, from sports event metrics to payment information and user analytics, there are abundant opportunities for the team to create significant business impact.

Currently, our team's responsibilities encompass three primary domains:
- Data Engineering: developing and maintaining ETL pipelines, APIs, and data infrastructure such as Redshift or BigQuery;
- Data Science and Machine Learning: exploring data, training ML models, and implementing ML Ops to uncover fresh insights;
- Analytics and Reporting: crafting data models and dashboards while automating reporting pipelines for various teams, stakeholders, and third parties.

A typical week for a data engineer within our Data Team might involve:
- Creating a new Python ETL pipeline to segment users based on their sports interests by analyzing behavior, optimizing marketing communications for these users;
- Developing a new endpoint for a Flask API, implementing unit tests, and deploying the updated version into our production Kubernetes cluster;
- Training and assessing an ML model to identify specific user patterns, contributing to our data-driven decision-making.
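A user-segmentation step like the ETL task described above can be sketched with plain Python. The event shape, sport names, and activity threshold here are all hypothetical; a production version would read behavior events from the warehouse and write segments back for marketing tools to consume.

```python
from collections import Counter

# Sketch of segmenting users by dominant sports interest (hypothetical
# event schema and threshold). A real pipeline would pull events from
# the warehouse rather than an in-memory list.

def segment_users(events: list[dict], min_events: int = 3) -> dict[str, str]:
    """Map each user id to their most-viewed sport, if active enough."""
    per_user: dict[str, Counter] = {}
    for ev in events:
        per_user.setdefault(ev["user_id"], Counter())[ev["sport"]] += 1
    segments = {}
    for user_id, counts in per_user.items():
        # Users below the activity threshold are left unsegmented.
        if sum(counts.values()) >= min_events:
            segments[user_id] = counts.most_common(1)[0][0]
    return segments

if __name__ == "__main__":
    events = [
        {"user_id": "u1", "sport": "football"},
        {"user_id": "u1", "sport": "football"},
        {"user_id": "u1", "sport": "tennis"},
        {"user_id": "u2", "sport": "tennis"},  # too few events to segment
    ]
    print(segment_users(events))  # {'u1': 'football'}
```

Leaving low-activity users out of any segment is one reasonable policy; an alternative is a catch-all "general" segment so every user receives some communication.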
About the Role
Join Teza Technologies as a Data Engineer in our dynamic data team, where data is the heartbeat of our systematic trading and vital to every facet of our operations. This hands-on role within a growing team of data engineers offers significant potential for career advancement, as we anticipate rapid growth over the coming years. We seek candidates with exceptional technical expertise, meticulous attention to detail, and a proven track record in architecting and developing robust data platforms.

Location
Hybrid role based in London, UK, with a requirement of 3 days in-office each week.

Key Responsibilities
- Collaborate with Portfolio Managers and Quantitative Developers to translate business needs into effective technical solutions while providing insights into dataset intricacies.
- Enhance our data warehouse by designing and integrating new data sources and functionalities; boost system reliability, speed, and scalability while overseeing data access management.
- Contribute innovative data management, analytics, and technological insights to the team and leadership.
- Assess and recommend new tools and technologies for organizing, querying, and streaming extensive datasets.
- Create and implement automated systems for data cleansing, anomaly detection, monitoring, and alerting.
- Provide support for our production data warehouse as needed.
- Cultivate and maintain strong vendor partnerships aligned with our business objectives.

Essential Qualifications
- Proficiency in Python and Unix/Linux for data manipulation, scripting, and automation.
- Deep understanding of SQL and familiarity with NoSQL databases, especially Postgres and MongoDB, with skills in query optimization and performance tuning.
- Strong grasp of data modeling principles, including both normalization and denormalization techniques.
- Experience with cloud platforms, such as AWS or GCP.
- Familiarity with Git version control, collaborative workflows (e.g., GitHub), and CI/CD best practices.
- Bachelor’s degree in Computer Science, Information Technology, or a related field.

Preferred Qualifications...
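Automated anomaly detection of the kind listed in the responsibilities above often starts with a simple statistical screen before anything more sophisticated. The z-score threshold below is an arbitrary illustrative choice, and only Python's standard library is used; production systems would track rolling windows per series and route flagged points into monitoring and alerting.

```python
from statistics import mean, stdev

# Sketch of a z-score screen for anomalous data points (illustrative
# threshold). Flags values far from the sample mean in std-dev units.

def flag_anomalies(values: list[float], threshold: float = 3.0) -> list[int]:
    """Return indices of values more than `threshold` std devs from the mean."""
    if len(values) < 2:
        return []  # not enough data to estimate spread
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # constant series: nothing deviates
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

if __name__ == "__main__":
    prices = [100.1, 100.2, 99.9, 100.0, 100.1, 250.0, 100.2]
    print(flag_anomalies(prices, threshold=2.0))
```

One caveat worth noting: a large outlier inflates both the mean and the standard deviation it is judged against, so robust variants swap in the median and MAD for heavily contaminated series.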
Jul 1, 2025