Experience Level
Senior Level Manager
Qualifications
Proven experience in data engineering and web development
Strong leadership and team management skills
Expertise in data frameworks and web technologies
Ability to collaborate effectively with cross-functional teams
Excellent problem-solving skills and attention to detail
About the job
Qima seeks a Senior Manager, Data & Web Engineering to join their Budapest office. This leadership position manages a team of data engineers and web developers, focusing on the creation and improvement of data solutions and web applications. The role shapes data frameworks and helps ensure web platforms run smoothly, offering a dependable and user-friendly experience.
What you will do
Lead and mentor a group of data engineers and web developers
Oversee development and refinement of data frameworks
Maintain high standards for web platform performance and usability
Collaborate with cross-functional teams to apply best practices in data management and web engineering
Guide the team through complex technical issues
Deliver projects on time and within budget
Encourage continuous learning and foster innovation within the team
Requirements
Demonstrated experience leading teams in data engineering and web development
Strong knowledge of data frameworks and web application performance
Ability to work effectively across departments
Track record of mentoring and developing technical staff
Comfort managing several projects and shifting priorities
This position is located in Budapest.
About Qima
Qima is a global leader in quality control and supply chain management. We are dedicated to providing our clients with the highest quality services through innovative technology and a commitment to excellence. Our team is made up of passionate professionals who are driven by a shared goal of helping businesses succeed.
Join our esteemed client, a well-established IT consulting firm with over two decades of expertise in executing large-scale technology projects across various sectors. Their enduring collaborations with major organizations demonstrate their technical prowess and financial reliability, bolstered by consistently high external credit ratings. Operating with a streamlined core team, the company boasts low turnover rates and fosters a collaborative culture built on trust, long-term partnerships, and a commitment to high-quality deliverables. With engagements spanning telecommunications, healthcare, and academia, this role offers engineers the chance to work on diverse and intricate projects while enjoying a flexible and supportive work environment.

We are seeking a Senior Data Engineer to help manage extensive data environments characterized by high-volume datasets, intricate integrations, and analytical applications. This hands-on engineering position emphasizes the creation and upkeep of data pipelines, in-depth understanding of end-to-end data flows, and resolution of data quality concerns across distributed systems. You will collaborate closely with engineers, analysts, and stakeholders to ensure the delivery of reliable, scalable, and well-structured data solutions that enhance analytics and machine learning efforts.

Key Responsibilities:
Design, construct, and sustain scalable data ingestion and ETL/ELT pipelines.
Manage large datasets and interfaces that feed into enterprise-level databases.
Examine data flows and diagnose inconsistencies across systems.
Conduct root cause analyses to trace erroneous outputs back to original source systems.
Prepare datasets for analytics and machine learning applications.
Participate in data modeling and mapping activities.
Assist machine learning initiatives through data preparation, evaluation, and optimization.
Establish and implement data quality monitoring frameworks.
Create visualizations to facilitate data interpretation and insights.
Join our dynamic team as a Senior Data Engineer where you will play a pivotal role in shaping our data architecture and driving innovative solutions. You will be responsible for designing, building, and maintaining robust data pipelines that ensure our data is accessible, reliable, and ready for analysis. Your expertise will directly contribute to enhancing our data-driven decision-making processes.
Join YOVO as a Web Product Engineer

At YOVO, we're revolutionizing payment infrastructure tailored for digital entrepreneurs across Europe. Our mission is to equip creators, educators, and innovative online businesses with cutting-edge tools that keep pace with their dynamic needs. Traditional payment solutions fail to cater to this evolving landscape, prompting us to forge a groundbreaking alternative.

Founded in 2025, YOVO is at the forefront of developing a state-of-the-art, modular checkout and subscription engine, alongside world-class product hosting solutions that simplify the sale of digital products and services. Our goal is to empower creators and digital enterprises with the comprehensive financial and operational infrastructure necessary for scaling, from seamless payments to automated workflows and insightful analytics.

Operating with a remote-first approach, we also offer the flexibility to work on-site at our Budapest office, a vibrant hub for tech talent in Europe. Supported by seasoned founders, operators, and international angel investors, we are on a mission to transform the European payments landscape.
Join our team as a Senior Data Engineer, where you will play a pivotal role in building and managing scalable data ingestion and Change Data Capture (CDC) capabilities on our Azure-based Lakehouse platform. Your expertise will drive our engineering maturity as we deliver ingestion and CDC preparation through Python projects and reusable frameworks. We are seeking a professional who applies best software engineering practices, including clean architecture, rigorous testing, code reviews, effective packaging, CI/CD, and operational excellence.

Our platform emphasizes batch-first processing: streaming sources land in their raw form and are processed in batch, and we evolve toward true streaming only where necessary.

As part of the Common Data Intelligence Hub, you will collaborate closely with data architects, analytics engineers, and solution designers to create robust data products and ensure governed data flows across the enterprise. Your team is responsible for end-to-end ingestion and CDC engineering, including design, build, operation, observability, reliability, and reusable components. You will contribute to the development of platform standards, including contracts, layer semantics, and readiness criteria. While you will not primarily manage cloud infrastructure provisioning, you will work with the platform team to define requirements, review changes, and maintain deployable code for pipelines and jobs.

Platform Data Engineering & Delivery
Design and develop ingestion pipelines utilizing Azure and Databricks services, including Azure Data Factory pipelines and Databricks notebooks/jobs/workflows.
Implement and manage CDC patterns for inserts, updates, and deletes, accommodating late-arriving data and reprocessing strategies.
Structure and maintain bronze and silver Delta Lake datasets, focusing on schema enforcement, de-duplication, and performance tuning.
Create “transformation-ready” datasets and interfaces with stable schemas, contracts, and metadata expectations for analytics engineers and downstream modeling.
Adopt a batch-first approach for data ingestion, ensuring raw landing, replayability, and idempotent batch processing while progressing towards true streaming as required.

Software Engineering for Data Frameworks
Develop and maintain Python-based ingestion and CDC components as production-grade software, focusing on modules, packaging, versioning, and releases.
Implement engineering best practices such as code reviews, unit/integration tests, static analysis, formatting/linting, type hints, and comprehensive documentation.
Establish and enhance CI/CD pipelines for data engineering code and pipeline assets, covering build, testing, security checks, deployment, and rollback patterns.
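The CDC pattern described in this posting (inserts, updates, and deletes with late-arriving data and idempotent reprocessing) can be sketched in plain Python. This is an illustrative toy, not the team's actual framework; the event shape (key, op, ts, data) is an invented example:

```python
# Toy CDC merge: apply insert/update/delete change events to a target table.
# Keeping only the newest event per key handles late-arriving duplicates and
# makes replaying the same batch idempotent.

def apply_cdc_batch(target: dict, events: list[dict]) -> dict:
    # De-duplicate: retain the latest event per key by timestamp.
    latest: dict = {}
    for ev in events:
        key = ev["key"]
        if key not in latest or ev["ts"] >= latest[key]["ts"]:
            latest[key] = ev

    # Apply the surviving events to the target table.
    for ev in latest.values():
        if ev["op"] in ("insert", "update"):
            target[ev["key"]] = ev["data"]
        elif ev["op"] == "delete":
            target.pop(ev["key"], None)
    return target


table = {"1": {"name": "old"}}
batch = [
    {"key": "1", "op": "update", "ts": 2, "data": {"name": "new"}},
    {"key": "1", "op": "update", "ts": 1, "data": {"name": "stale"}},  # late arrival, ignored
    {"key": "2", "op": "insert", "ts": 1, "data": {"name": "fresh"}},
]
apply_cdc_batch(table, batch)
apply_cdc_batch(table, batch)  # replay: same result, i.e. idempotent
```

In a real Lakehouse pipeline this logic would typically run as a Delta Lake MERGE rather than an in-memory dict, but the dedup-then-merge shape is the same.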
Full-time|On-site|Budapest, Hungary; Munich, Germany; Tel Aviv, Israel
Role Overview
Tulip is hiring a Data Operations Engineer to help manage and improve data operations. This role focuses on maintaining data quality, refining workflows, and supporting the efficiency of daily processes. The position is based in Budapest, Munich, or Tel Aviv.
What You Will Do
Work closely with teams across the company to ensure data remains accurate and reliable
Identify opportunities to streamline data-related processes
Support ongoing efforts to improve the efficiency of data operations
Location
Budapest, Hungary; Munich, Germany; Tel Aviv, Israel
Join our dynamic team at mpsolutions as a Full Stack Web Engineer, where you will contribute to the creation and advancement of a cutting-edge, event-driven platform. Your role will encompass the design and delivery of features across the technology stack, with a particular focus on asynchronous architectures and event streaming. We foster an innovative environment that encourages the use of advanced coding tools (such as Cursor and Claude) to enhance productivity and code quality during development.

Your Responsibilities:
Develop and maintain backend services utilizing NestJS (Node.js/TypeScript)
Create responsive and maintainable frontend applications with Angular or React
Design and implement event-driven and asynchronous flows, including messaging patterns and stream processing
Engage with Kafka topics, producers/consumers, schemas, and delivery semantics (e.g., retries, idempotency)
Collaborate on system design, API contracts, observability, and performance optimization
Contribute to engineering best practices: clean code, code reviews, automated testing, and CI/CD
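The delivery semantics mentioned in this posting (retries, idempotency) usually boil down to making consumers tolerate redelivery: under at-least-once delivery, a retried message may arrive twice. A minimal, broker-free sketch; the message shape and handler are invented for illustration:

```python
# Toy idempotent consumer: track processed message IDs so a redelivered
# (retried) message is skipped. In production the "seen" set would live in
# durable storage keyed by message ID or topic/partition/offset.

def make_idempotent_consumer(handler):
    seen: set[str] = set()

    def consume(message: dict) -> bool:
        msg_id = message["id"]
        if msg_id in seen:
            return False  # duplicate redelivery: skip
        handler(message["payload"])
        seen.add(msg_id)  # mark done only after the handler succeeds
        return True

    return consume


processed = []
consume = make_idempotent_consumer(processed.append)
consume({"id": "a", "payload": 1})
consume({"id": "a", "payload": 1})  # retry of the same message, ignored
consume({"id": "b", "payload": 2})
# processed is [1, 2]
```

The same idea applies on the producer side (Kafka's idempotent producer) and in stream processors, where dedup state is checkpointed alongside offsets.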
Join our dynamic team at Hawkeye Innovations as a Senior Java Engineer specializing in our Data Platform Framework. In this role, you will drive the development of scalable and efficient data solutions, collaborate with cross-functional teams, and contribute to the architecture of our data platform. If you are passionate about Java and data engineering, this is your chance to make an impact in a forward-thinking company.
Join a global leader in technology dedicated to energy innovation for a sustainable future. Our client is at the forefront of transforming energy solutions with advanced gas engine technologies, digital platforms, and energy services. With operations in over 100 countries, they focus on advancing engineering while prioritizing the well-being of people and the planet.

The Senior Data Engineer plays a crucial role in collaboration with Data Analysts, BI Developers, and Requirements Engineers to lay the groundwork for all analytical projects. This position entails building and managing data pipelines that extract, transform, and load (ETL) data from various sources into a centralized repository, subsequently facilitating project-specific data delivery.

Your Responsibilities:
Architect and deploy scalable and resilient data pipelines that meet the analytics and data processing requirements.
Design and maintain robust database architectures, including data lakes and warehouses.
Ensure data integrity and consistency through meticulous data cleaning, transformation, and validation processes.
Engage with Data Analysts, BI Developers, and Requirements Engineers to gather project requirements and provide data solutions aligned with business goals.
Enhance data retrieval processes by developing pipelines and physical data models tailored for reports and various analytical projects.
Implement data security and privacy protocols to ensure compliance with legal and regulatory standards.
Document all created data pipelines comprehensively.
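The extract-transform-load flow this posting centers on can be reduced to three small functions. A deliberately minimal sketch; the source data, field names, and validation rules are all invented for illustration:

```python
# Minimal ETL sketch: extract raw rows, clean and validate them (transform),
# then load the result into a central store. Real pipelines swap these stubs
# for database/API readers and warehouse writers.

def extract() -> list[dict]:
    # Stand-in for reading from a source system.
    return [
        {"engine_id": "E1", "output_kw": "4500"},
        {"engine_id": "E2", "output_kw": None},    # missing value, dropped below
        {"engine_id": "E1", "output_kw": "4500"},  # duplicate, dropped below
    ]

def transform(rows: list[dict]) -> list[dict]:
    seen: set = set()
    out = []
    for row in rows:
        # Validation (no nulls) and de-duplication by key.
        if row["output_kw"] is None or row["engine_id"] in seen:
            continue
        seen.add(row["engine_id"])
        out.append({"engine_id": row["engine_id"], "output_kw": int(row["output_kw"])})
    return out

def load(rows: list[dict], warehouse: list) -> None:
    # Stand-in for writing to the centralized repository.
    warehouse.extend(rows)


warehouse: list = []
load(transform(extract()), warehouse)
```

The cleaning and de-duplication step in `transform` corresponds to the "data cleaning, transformation, and validation" responsibility above.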
Full-time|Hybrid|Budapest, London, Manchester, Amsterdam, Rotterdam, Dublin, Zagreb, Split
Join our team as a Lead Data Engineer and play a pivotal role in shaping data engineering strategies at DEPT®. Our hybrid work environment spans vibrant cities including Budapest, London, Manchester, Amsterdam, Rotterdam, Dublin, Zagreb, and Split.

At DEPT®, we empower the world’s most ambitious brands to accelerate their growth, blending technology and marketing through our expert team of over 4,000 specialists. We are proud to partner with industry leaders such as Google, Lufthansa, Meta, eBay, and OpenAI, and have maintained our B Corp and Climate Neutral certifications since 2021.

In our Data & AI practice, we are dedicated to producing groundbreaking work that leverages Data & AI across various sectors. As a member of our EMEA Data craft team, you will collaborate with data strategists, scientists, and analysts to tackle complex challenges faced by beloved global brands.

As a Lead Data Engineer, you will guide your team in delivering innovative, enterprise-scale data solutions, combining your technical expertise with strong business insights to meet client objectives.
Join our team at BDA_CDI HUB, where we deliver comprehensive IT solutions for T-Systems that encompass data management, analytics, visualization, and process automation.

Our expertise covers everything from user-friendly analytics platforms to enterprise-level data warehouse and analytics solutions. We are dedicated to driving data-driven initiatives and implementing advanced analytics through cloud services, transforming our traditional data warehouse architecture into a centralized T-Systems Lakehouse framework to optimize performance and efficiency. Our work environment is dynamic, innovative, and focused on continuous learning and professional development.

As a Senior Developer in Data Analytics & Data Engineering, your responsibilities will include:
Designing IT platforms, processes, and structural frameworks for our evolving Lakehouse on Azure, using cutting-edge technologies such as Databricks, Data Factory, Data Lake Storage, Event Hub, and more.
Developing sustainable architectures, prototyping solutions, and assessing new technologies to foster innovation.
Supporting engineering teams in platform architecture development and mentoring them towards independent solution design.
Ensuring the protection and security of business-critical and personal data.
Collaborating with the team to improve platform architecture, establishing principles, patterns, and best practices.
Join our dynamic team as a Senior Data Engineer specializing in the Power Platform at Deutsche Telekom IT Solutions in Budapest. In this pivotal role within our international Common Data Intelligence Hub, you will enhance our Cloud Data & Analytics Platform by serving as the crucial link between Self-Service BI teams and the Azure Lakehouse backend.

Our Cloud Data & Analytics Platform integrates Azure Lakehouse technologies (Data Factory, Databricks, dbt, CDC Framework) with Microsoft Power Platform services (Power BI, Power Apps, Power Automate, SharePoint Online). This innovative platform empowers business units to develop and manage their own analytics and reporting solutions while adhering to enterprise-level security, governance, and data management protocols.

As a Senior Data Engineer, you will ensure that self-service solutions operate efficiently, securely, and reliably in accordance with established standards. You will provide guidance and support to business and data teams, helping them navigate the technical framework, governance principles, and best practices for sustainable analytics solutions. Collaboration with platform engineers will be key to maintaining operational alignment between the Power Platform frontend and the Lakehouse backend.

Platform Operations & Maintenance:
Ensure the reliable operation and performance optimization of Power Platform components (Power BI Service, Power Apps, Power Automate, SharePoint Online) within the Cloud Data & Analytics Platform.
Monitor platform usage and performance to guarantee optimal resource allocation and cost efficiency.
Manage Power BI gateways and secure data connections to the Azure Lakehouse backend (ADF, Databricks, dbt, CDC Framework).
Oversee role-based access control (RBAC) and workspace permissions in line with corporate governance and Azure security principles.
Optimize data refresh and load processes to maintain the freshness and stability of analytical datasets.
Work closely with platform engineering teams to address incidents, coordinate upgrades, and implement technical improvements across both frontend and backend layers.
Create standardized operational dashboards and reports to enhance transparency regarding platform usage, capacity, and performance metrics.

Governance & Best Practices:
Consult and coach Self-Service BI teams on effectively utilizing Power BI, Power Apps, and Power Automate within the governance framework.
Guide new teams through onboarding and compliance processes, ensuring proper workspace configuration.
The Exciting Opportunity
This position plays a vital role in architecting and enhancing our platform to meet business demands while optimizing our systems. In this role, you will have the opportunity to develop new data pipelines, manage platforms hosted on data streams for both batch and real-time loading, and create real-time visualizations.

Key Responsibilities:
Maintain and enhance our existing data platform.
Develop processes to ingest data from Kafka, APIs, and databases using AWS MSK Connect.
Design and maintain real-time data processing applications utilizing frameworks such as Spark Structured Streaming and Kafka Streams.
Implement transformations on data streams.
Participate in data modeling adhering to standards like Inmon, Kimball, and Data Vault.
Ensure data quality by verifying consistency and accuracy.
Stay current with research and advancements in technology to improve our data platform.
Possess an investigative mindset to troubleshoot issues creatively and manage incidents effectively.
Take full ownership of assigned projects and tasks while collaborating within a team environment.
Document processes thoroughly and conduct knowledge-sharing sessions.

What We're Looking For:
Essential Qualifications:
Proven experience with modern cloud database technologies, especially Snowflake.
Expertise in orchestrating data pipelines using Airflow.
Proficient in AWS Glue.
Familiarity with Apache Iceberg.
Strong experience with SQL and Data Integration Tools.
Proficiency in programming languages such as Python or Scala.
Knowledge of AWS Services like S3, Lambda, API Gateways, DMS, and RDS.
Development experience in Microsoft and Linux/Cloud environments.
Exceptional analytical and problem-solving skills.
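The "ensure data quality by verifying consistency and accuracy" responsibility above is often implemented as rule-based checks run over each batch. A tiny sketch; the rules and record shape are assumptions for illustration, not this team's actual checks:

```python
# Toy data-quality check: run named consistency rules over a batch of records
# and collect (row index, failed rule) pairs. Real pipelines would emit these
# to monitoring/alerting instead of returning a list.

RULES = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "currency_present": lambda r: bool(r.get("currency")),
}

def check_batch(records: list[dict]) -> list[tuple]:
    failures = []
    for i, record in enumerate(records):
        for name, rule in RULES.items():
            if not rule(record):
                failures.append((i, name))
    return failures


batch = [
    {"amount": 10, "currency": "EUR"},
    {"amount": -5, "currency": "EUR"},   # fails amount_non_negative
    {"amount": 3, "currency": ""},       # fails currency_present
]
# check_batch(batch) -> [(1, "amount_non_negative"), (2, "currency_present")]
```

In a streaming stack like the one described, the same rules would run inside the Spark Structured Streaming or Kafka Streams job, with failing records routed to a quarantine topic or table.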
Join Kpler as a Senior BI Data Engineer and be a pivotal force in shaping our data architecture. In this vital role within our Business Intelligence & Insights team, you will develop scalable data pipelines, create robust data models, and establish reliable infrastructure that empowers teams across the organization with access to high-quality, accurate data. This position is ideal for someone who thrives in a fast-paced, international environment and enjoys addressing complex data challenges while designing systems that drive insights, reporting, and machine learning at scale. You will report directly to the Director of BI, collaborating closely with cross-functional teams to enhance data-driven decision-making across the company.
Company Overview:
At Zocks, we are a pioneering venture capital-backed AI startup that is revolutionizing financial services and various industries prioritizing security and privacy. Our founding team comprises industry veterans from leading companies like Twilio, IBM, Microsoft, and Hearsay Systems, who have dedicated their careers to developing real-time communication solutions and enterprise platforms. We leverage cutting-edge AI technology to seamlessly connect human interactions with enterprise systems, beginning with the financial sector.

Our mission is straightforward yet impactful: to reinvent business operations and streamline areas often hindered by inefficient processes. Our innovative platform empowers users to communicate effortlessly while we capture vital information, eliminating cumbersome data entry and allowing them to focus more on what matters most: their clients.

Location: Budapest, XI. (On-site Monday to Thursday, Remote on Fridays)

Why Join Us:
As we continue to expand, we are in search of a Senior Software Engineering Manager with a minimum of 7 years in Software Engineering and at least 3 years in a leadership role to guide and mentor our talented team of developers. We seek an individual who can introduce innovative ideas, possesses advanced technical expertise, and has a strong passion for innovation. In this pivotal role, you will drive our projects forward, enhance our product offerings, and address the increasingly complex needs of our customers using agile methodologies and state-of-the-art technologies.
SEON serves as the pivotal hub for fraud prevention and AML compliance, aiding numerous companies globally in thwarting fraud, mitigating risk, and safeguarding revenue. Leveraging over 900 real-time, first-party data signals, SEON enhances customer profiles, identifies dubious activities, and simplifies compliance workflows, all centralized in one platform. With a commitment to delivering richer data, more adaptable and transparent analysis, and a quicker time to value than competitors, SEON has successfully helped businesses reduce fraud by 95% and achieve an impressive 32x ROI. Our rapid growth is fueled by partnerships with some of the world's most innovative digital brands, including Revolut, Wise, and Bilt.

Join the Platform Engineering team at SEON, the driving force behind our technical advancement. This team is focused on constructing the fundamental infrastructure and pipelines that empower product engineers to deploy code securely, swiftly, and independently. We view our platform as a product and our developers as customers, aiming to eliminate friction and minimize cognitive load throughout the software delivery lifecycle.

We are in search of a visionary and seasoned Senior Manager of Platform Engineering to lead our talented team of Platform Engineers. In this pivotal role, you will be instrumental in designing the 'Golden Paths' that standardize and expedite our development processes, ensuring that our infrastructure evolves in tandem with our rapid business expansion. You will work closely with Software Engineering, Data, and SRE teams to translate requirements into a unified platform strategy. Your proven ability to lead high-performing teams in complex, fast-paced environments will be essential, as will your skill in organizing and inspiring a team amidst ongoing growth and transformation.

This position offers flexibility, allowing you to work from Budapest with a hybrid schedule or remotely from anywhere within the European Union, with occasional travel to our other offices.
Bosch Group seeks a Data Engineering Intern based in Budapest. This internship provides practical experience in data management, analytics, and the development of data pipelines.
What you will do
Support the team in managing and organizing data
Assist with analytics tasks and reporting
Help build and maintain data pipelines for ongoing projects
Work closely with experienced data engineers on real-world assignments
Requirements
Interest in data engineering and analytics
Willingness to learn new data technologies
Ability to collaborate with team members
Based in Budapest or able to work from this location
Role Overview
Betsson Group is hiring a Lead Data DevOps Engineer to guide the Data DevOps team in Budapest. This position focuses on team leadership, technical direction, and ensuring smooth data operations within an agile setup.
Key Responsibilities
Lead and mentor a team of Data DevOps professionals.
Work closely with cross-functional teams to align data operations with business needs.
Shape and improve data architecture and data pipelines.
Implement and promote strong data management and DevOps practices.
Streamline workflows to deliver insights that support company goals.
Location
This role is based in Budapest.
Deutsche Telekom IT Solutions offers a Web Development Intern position based in Budapest, Debrecen, Pécs, or Szeged. This internship is designed for students or recent graduates aiming to gain hands-on experience in web development.
Role overview
Interns will join a team of web developers and IT professionals, participating in ongoing projects and contributing to real-world assignments. The environment encourages collaboration and learning from more experienced colleagues.
What you will do
Work with web developers and IT experts on active projects
Take part in tasks that reflect real business needs
Develop technical and teamwork skills in a supportive group
This internship provides exposure to industry practices and the chance to build both technical knowledge and collaboration abilities.
Role overview
The Senior Data Scientist - AI & Analytics position at Instructure in Budapest centers on using data science and machine learning to enhance products and services. This role requires collaboration with team members from various departments to address business challenges through data-driven solutions.
Key responsibilities
Analyze large and complex data sets to extract actionable insights.
Build predictive models that support business decision-making.
Work closely with colleagues from different functions to apply analytical methods to real-world problems.
Location
This position is based in Budapest, Hungary.
Role overview Qima seeks a Senior Manager, Data & Web Engineering to join their Budapest office. This leadership position manages a team of data engineers and web developers, focusing on the creation and improvement of data solutions and web applications. The role shapes data frameworks and helps ensure web platforms run smoothly, offering a dependable and user-friendly experience. What you will do Lead and mentor a group of data engineers and web developers Oversee development and refinement of data frameworks Maintain high standards for web platform performance and usability Collaborate with cross-functional teams to apply best practices in data management and web engineering Guide the team through complex technical issues Deliver projects on time and within budget Encourage continuous learning and foster innovation within the team Requirements Demonstrated experience leading teams in data engineering and web development Strong knowledge of data frameworks and web application performance Ability to work effectively across departments Track record of mentoring and developing technical staff Comfort managing several projects and shifting priorities This position is located in Budapest.
Join our esteemed client, a well-established IT consulting firm with over two decades of expertise in executing large-scale technology projects across various sectors. Their enduring collaborations with major organizations demonstrate their technical prowess and financial reliability, bolstered by consistently high external credit ratings. Operating with a streamlined core team, the company boasts low turnover rates and fosters a collaborative culture built on trust, long-term partnerships, and a commitment to high-quality deliverables. With engagements spanning telecommunications, healthcare, and academia, this role offers engineers the chance to work on diverse and intricate projects while enjoying a flexible and supportive work environment.We are seeking a Senior Data Engineer to help manage extensive data environments characterized by high-volume datasets, intricate integrations, and analytical applications. This hands-on engineering position emphasizes the creation and upkeep of data pipelines, in-depth understanding of end-to-end data flows, and resolution of data quality concerns across distributed systems.You will collaborate closely with engineers, analysts, and stakeholders to ensure the delivery of reliable, scalable, and well-structured data solutions that enhance analytics and machine learning efforts.Key Responsibilities:Design, construct, and sustain scalable data ingestion and ETL/ELT pipelines.Manage large datasets and interfaces that feed into enterprise-level databases.Examine data flows and diagnose inconsistencies across systems.Conduct root cause analyses to trace erroneous outputs back to original source systems.Prepare datasets for analytics and machine learning applications.Participate in data modeling and mapping activities.Assist machine learning initiatives through data preparation, evaluation, and optimization.Establish and implement data quality monitoring frameworks.Create visualizations to facilitate data interpretation and insights.
Join our dynamic team as a Senior Data Engineer where you will play a pivotal role in shaping our data architecture and driving innovative solutions. You will be responsible for designing, building, and maintaining robust data pipelines that ensure our data is accessible, reliable, and ready for analysis. Your expertise will directly contribute to enhancing our data-driven decision-making processes.
Join YOVO as a Web Product EngineerAt YOVO, we're revolutionizing payment infrastructure tailored for digital entrepreneurs across Europe. Our mission is to equip creators, educators, and innovative online businesses with cutting-edge tools that keep pace with their dynamic needs. Traditional payment solutions fail to cater to this evolving landscape, prompting us to forge a groundbreaking alternative.Founded in 2025, YOVO is at the forefront of developing a state-of-the-art, modular checkout and subscription engine, alongside world-class product hosting solutions that simplify the sale of digital products and services. Our goal is to empower creators and digital enterprises with the comprehensive financial and operational infrastructure necessary for scaling, from seamless payments to automated workflows and insightful analytics.Operating with a remote-first approach, we also offer the flexibility to work on-site at our Budapest office, a vibrant hub for tech talent in Europe. Supported by seasoned founders, operators, and international angel investors, we are on a mission to transform the European payments landscape.
Join our team as a Senior Data Engineer, where you will play a pivotal role in building and managing scalable data ingestion and Change Data Capture (CDC) capabilities on our Azure-based Lakehouse platform. Your expertise will drive our engineering maturity as we deliver ingestion and CDC preparation through Python projects and reusable frameworks. We are seeking a professional who applies best software engineering practices, including clean architecture, rigorous testing, code reviews, effective packaging, CI/CD, and operational excellence.

Our platform emphasizes batch-first processing: streaming sources land in their raw form and are processed in batch, and we are selective in our evolution towards streaming as necessary.

As part of the Common Data Intelligence Hub, you will collaborate closely with data architects, analytics engineers, and solution designers to create robust data products and ensure governed data flows across the enterprise. Your team is responsible for end-to-end ingestion and CDC engineering, including design, build, operation, observability, reliability, and reusable components. You will contribute to the development of platform standards, including contracts, layer semantics, and readiness criteria. While you will not primarily manage cloud infrastructure provisioning, you will work with the platform team to define requirements, review changes, and maintain deployable code for pipelines and jobs.

Platform Data Engineering & Delivery
Design and develop ingestion pipelines utilizing Azure and Databricks services, including Azure Data Factory pipelines and Databricks notebooks/jobs/workflows.
Implement and manage CDC patterns for inserts, updates, and deletes, accommodating late-arriving data and reprocessing strategies.
Structure and maintain bronze and silver Delta Lake datasets, focusing on schema enforcement, de-duplication, and performance tuning.
Create "transformation-ready" datasets and interfaces with stable schemas, contracts, and metadata expectations for analytics engineers and downstream modeling.
Adopt a batch-first approach for data ingestion, ensuring raw landing, replayability, and idempotent batch processing while progressing towards true streaming as required.

Software Engineering for Data Frameworks
Develop and maintain Python-based ingestion and CDC components as production-grade software, focusing on modules, packaging, versioning, and releases.
Implement engineering best practices such as code reviews, unit/integration tests, static analysis, formatting/linting, type hints, and comprehensive documentation.
Establish and enhance CI/CD pipelines for data engineering code and pipeline assets, covering build, testing, security checks, deployment, and rollback patterns.
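The CDC responsibilities described above (inserts, updates, and deletes with late-arriving data and idempotent reprocessing) follow a common pattern. As a minimal sketch in plain Python, assuming an illustrative event shape that is not the team's actual framework:

```python
def apply_cdc_batch(state, events):
    """Apply a CDC batch to a keyed snapshot.

    Each event is a dict: {"key": str, "op": "insert"|"update"|"delete",
    "ts": int (source commit timestamp), "payload": dict of column values}.

    - Deduplicates by keeping only the latest event per key, so a
      late-arriving older event never overwrites a newer row.
    - Idempotent: replaying the same batch yields the same state.
    """
    # Keep only the newest event per key within this batch.
    latest = {}
    for ev in events:
        if ev["key"] not in latest or ev["ts"] > latest[ev["key"]]["ts"]:
            latest[ev["key"]] = ev

    new_state = dict(state)
    for key, ev in latest.items():
        current_ts = new_state.get(key, {}).get("ts", -1)
        if ev["ts"] <= current_ts:
            continue  # late-arriving event is older than the stored row
        if ev["op"] == "delete":
            new_state.pop(key, None)
        else:
            # Treat insert and update uniformly as an upsert.
            new_state[key] = {"ts": ev["ts"], **ev["payload"]}
    return new_state
```

On a Delta Lake silver table the same semantics would typically be expressed as a `MERGE INTO` in a Databricks job; this sketch only illustrates the deduplication and idempotency logic itself.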
Full-time|On-site|Budapest, Hungary; Munich, Germany; Tel Aviv, Israel
Role Overview
Tulip is hiring a Data Operations Engineer to help manage and improve data operations. This role focuses on maintaining data quality, refining workflows, and supporting the efficiency of daily processes. The position is based in Budapest, Munich, or Tel Aviv.

What You Will Do
Work closely with teams across the company to ensure data remains accurate and reliable
Identify opportunities to streamline data-related processes
Support ongoing efforts to improve the efficiency of data operations

Location
Budapest, Hungary
Munich, Germany
Tel Aviv, Israel
Join our dynamic team at mpsolutions as a Full Stack Web Engineer, where you will contribute to the creation and advancement of a cutting-edge, event-driven platform. Your role will encompass the design and delivery of features across the technology stack, with a particular focus on asynchronous architectures and event streaming.

We foster an innovative environment that encourages the use of advanced coding tools (such as Cursor and Claude) to enhance productivity and code quality during development.

Your Responsibilities:
Develop and maintain backend services utilizing NestJS (Node.js/TypeScript)
Create responsive and maintainable frontend applications with Angular or React
Design and implement event-driven and asynchronous flows, including messaging patterns and stream processing
Engage with Kafka topics, producers/consumers, schemas, and delivery semantics (e.g., retries, idempotency)
Collaborate on system design, API contracts, observability, and performance optimization
Contribute to engineering best practices: clean code, code reviews, automated testing, and CI/CD
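The delivery semantics mentioned above (retries plus idempotency) reflect a standard Kafka pattern: retries give at-least-once delivery, so consumers deduplicate by message ID to avoid double side effects. A language-agnostic sketch, shown here in Python for brevity (class and method names are illustrative, not mpsolutions' stack):

```python
class IdempotentConsumer:
    """At-least-once delivery + deduplication = effectively-once processing.

    In production the processed-ID set would live in a durable store,
    committed in the same transaction as the side effect.
    """

    def __init__(self, handler, max_retries=3):
        self.handler = handler
        self.max_retries = max_retries
        self.processed_ids = set()

    def consume(self, message_id, payload):
        if message_id in self.processed_ids:
            return "skipped"  # duplicate delivery: no double side effect
        for attempt in range(self.max_retries):
            try:
                self.handler(payload)
                self.processed_ids.add(message_id)
                return "processed"
            except Exception:
                if attempt == self.max_retries - 1:
                    # Give up; a real system would route to a dead-letter topic.
                    return "dead-lettered"
```

The same idea applies on the producer side, where Kafka's idempotent producer deduplicates broker-side retries by sequence number.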
Join our dynamic team at Hawkeye Innovations as a Senior Java Engineer specializing in our Data Platform Framework. In this role, you will drive the development of scalable and efficient data solutions, collaborate with cross-functional teams, and contribute to the architecture of our data platform. If you are passionate about Java and data engineering, this is your chance to make an impact in a forward-thinking company.
Join a global leader in technology dedicated to energy innovation for a sustainable future. Our client is at the forefront of transforming energy solutions with advanced gas engine technologies, digital platforms, and energy services. With operations in over 100 countries, they focus on advancing engineering while prioritizing the well-being of people and the planet.

The Senior Data Engineer plays a crucial role in collaboration with Data Analysts, BI Developers, and Requirements Engineers to lay the groundwork for all analytical projects. This position entails building and managing data pipelines that extract, transform, and load (ETL) data from various sources into a centralized repository, subsequently facilitating project-specific data delivery.

Your Responsibilities:
Architect and deploy scalable and resilient data pipelines that meet analytics and data processing requirements.
Design and maintain robust database architectures, including data lakes and warehouses.
Ensure data integrity and consistency through meticulous data cleaning, transformation, and validation processes.
Engage with Data Analysts, BI Developers, and Requirements Engineers to gather project requirements and provide data solutions aligned with business goals.
Enhance data retrieval processes by developing pipelines and physical data models tailored for reports and various analytical projects.
Implement data security and privacy protocols to ensure compliance with legal and regulatory standards.
Document all created data pipelines comprehensively.
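The extract-transform-load flow with validation described above can be sketched minimally in Python. The stage functions, field names, and validation rules here are illustrative assumptions, not the client's actual pipeline:

```python
def extract(rows):
    """Extract: the 'source' here is simply an in-memory list of raw records."""
    return list(rows)

def transform(rows):
    """Transform: normalize field names and types for the central repository."""
    return [{"id": int(r["id"]), "name": r["name"].strip().title()} for r in rows]

def validate(rows):
    """Validate: enforce integrity rules (unique keys, non-empty names) before loading."""
    ids = [r["id"] for r in rows]
    if len(ids) != len(set(ids)):
        raise ValueError("duplicate ids in batch")
    if not all(r["name"] for r in rows):
        raise ValueError("empty name in batch")
    return rows

def load(rows, warehouse):
    """Load: upsert into the centralized repository, keyed by id."""
    for r in rows:
        warehouse[r["id"]] = r
    return warehouse

def run_pipeline(raw, warehouse):
    # Compose the stages; a real pipeline would add logging and retries.
    return load(validate(transform(extract(raw))), warehouse)
```

In practice each stage would be a separately scheduled, documented task (per the documentation requirement above), but the composition stays the same.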
Full-time|Hybrid|Budapest, London, Manchester, Amsterdam, Rotterdam, Dublin, Zagreb, Split
Join our team as a Lead Data Engineer and play a pivotal role in shaping data engineering strategies at DEPT®. Our hybrid work environment spans vibrant cities including Budapest, London, Manchester, Amsterdam, Rotterdam, Dublin, Zagreb, and Split.

At DEPT®, we empower the world’s most ambitious brands to accelerate their growth, blending technology and marketing through our expert team of over 4,000 specialists. We are proud to partner with industry leaders such as Google, Lufthansa, Meta, eBay, and OpenAI, and have maintained our B Corp and Climate Neutral certifications since 2021.

In our Data & AI practice, we are dedicated to producing groundbreaking work that leverages Data & AI across various sectors. As a member of our EMEA Data craft team, you will collaborate with data strategists, scientists, and analysts to tackle complex challenges faced by beloved global brands.

As a Lead Data Engineer, you will guide your team in delivering innovative, enterprise-scale data solutions, combining your technical expertise with strong business insights to meet client objectives.
Join our team at BDA_CDI HUB, where we deliver comprehensive IT solutions for T-Systems that encompass data management, analytics, visualization, and process automation.

Our expertise covers everything from user-friendly analytics platforms to enterprise-level data warehouse and analytics solutions. We are dedicated to driving data-driven initiatives and implementing advanced analytics through cloud services, transforming our traditional data warehouse architecture into a centralized T-Systems Lakehouse framework to optimize performance and efficiency. Our work environment is dynamic, innovative, and focused on continuous learning and professional development.

As a Senior Developer in Data Analytics & Data Engineering, your responsibilities will include:
Designing IT platforms, processes, and structural frameworks for our evolving Lakehouse on Azure, using cutting-edge technologies such as Databricks, Data Factory, Data Lake Storage, Event Hub, and more.
Developing sustainable architectures, prototyping solutions, and assessing new technologies to foster innovation.
Supporting engineering teams in platform architecture development and mentoring them towards independent solution design.
Ensuring the protection and security of business-critical and personal data.
Collaborating with the team to improve platform architecture, establishing principles, patterns, and best practices.
Join our dynamic team as a Senior Data Engineer specializing in the Power Platform at Deutsche Telekom IT Solutions in Budapest. In this pivotal role within our international Common Data Intelligence Hub, you will enhance our Cloud Data & Analytics Platform by serving as the crucial link between Self-Service BI teams and the Azure Lakehouse backend.

Our Cloud Data & Analytics Platform integrates Azure Lakehouse technologies (Data Factory, Databricks, dbt, CDC Framework) with Microsoft Power Platform services (Power BI, Power Apps, Power Automate, SharePoint Online). This innovative platform empowers business units to develop and manage their own analytics and reporting solutions while adhering to enterprise-level security, governance, and data management protocols.

As a Senior Data Engineer, you will ensure that self-service solutions operate efficiently, securely, and reliably in accordance with established standards. You will provide guidance and support to business and data teams, helping them navigate the technical framework, governance principles, and best practices for sustainable analytics solutions. Collaboration with platform engineers will be key to maintaining operational alignment between the Power Platform frontend and the Lakehouse backend.

Platform Operations & Maintenance:
Ensure the reliable operation and performance optimization of Power Platform components (Power BI Service, Power Apps, Power Automate, SharePoint Online) within the Cloud Data & Analytics Platform.
Monitor platform usage and performance to guarantee optimal resource allocation and cost efficiency.
Manage Power BI gateways and secure data connections to the Azure Lakehouse backend (ADF, Databricks, dbt, CDC Framework).
Oversee role-based access control (RBAC) and workspace permissions in line with corporate governance and Azure security principles.
Optimize data refresh and load processes to maintain the freshness and stability of analytical datasets.
Work closely with platform engineering teams to address incidents, coordinate upgrades, and implement technical improvements across both frontend and backend layers.
Create standardized operational dashboards and reports to enhance transparency regarding platform usage, capacity, and performance metrics.

Governance & Best Practices:
Consult and coach Self-Service BI teams on effectively utilizing Power BI, Power Apps, and Power Automate within the governance framework.
Guide new teams through onboarding and compliance processes, ensuring proper workspace configuration.
The Exciting Opportunity
This position plays a vital role in architecting and enhancing our platform to meet business demands while optimizing our systems. In this role, you will have the opportunity to develop new data pipelines, manage platforms hosted on data streams for both batch and real-time loading, and create real-time visualizations.

Key Responsibilities:
Maintain and enhance our existing data platform.
Develop processes to ingest data from Kafka, APIs, and databases using AWS MSK Connect.
Design and maintain real-time data processing applications utilizing frameworks such as Spark Structured Streaming and Kafka Streams.
Implement transformations on data streams.
Participate in data modeling adhering to standards such as Inmon, Kimball, and Data Vault.
Ensure data quality by verifying consistency and accuracy.
Stay current with research and advancements in technology to improve our data platform.
Bring an investigative mindset to troubleshoot issues creatively and manage incidents effectively.
Take full ownership of assigned projects and tasks while collaborating within a team environment.
Document processes thoroughly and conduct knowledge-sharing sessions.

What We're Looking For
Essential Qualifications:
Proven experience with modern cloud database technologies, especially Snowflake.
Expertise in orchestrating data pipelines using Airflow.
Proficiency in AWS Glue.
Familiarity with Apache Iceberg.
Strong experience with SQL and data integration tools.
Proficiency in programming languages such as Python or Scala.
Knowledge of AWS services such as S3, Lambda, API Gateways, DMS, and RDS.
Development experience in Microsoft and Linux/Cloud environments.
Exceptional analytical and problem-solving skills.
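Stream transformations of the kind listed above, whether in Spark Structured Streaming or Kafka Streams, commonly aggregate events into fixed time windows. A framework-free sketch of a tumbling-window count, with an assumed (timestamp, key) event shape for illustration:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key, mirroring a stream engine's
    groupBy(window(...), key).count() aggregation.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    # Emit (window_start, key, count), sorted for deterministic output.
    return sorted(
        (w, k, c) for w, per_key in windows.items() for k, c in per_key.items()
    )
```

A real streaming job would additionally handle watermarks for late events and emit results incrementally rather than over a finished list; the windowing arithmetic is the same.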
Join Kpler as a Senior BI Data Engineer and be a pivotal force in shaping our data architecture. In this vital role within our Business Intelligence & Insights team, you will develop scalable data pipelines, create robust data models, and establish reliable infrastructure that empowers teams across the organization with access to high-quality, accurate data. This position is ideal for someone who thrives in a fast-paced, international environment and enjoys addressing complex data challenges while designing systems that drive insights, reporting, and machine learning at scale. You will report directly to the Director of BI, collaborating closely with cross-functional teams to enhance data-driven decision-making across the company.
Company Overview:
At Zocks, we are a pioneering venture capital-backed AI startup that is revolutionizing financial services and other industries that prioritize security and privacy. Our founding team comprises industry veterans from leading companies like Twilio, IBM, Microsoft, and Hearsay Systems, who have dedicated their careers to developing real-time communication solutions and enterprise platforms. We leverage cutting-edge AI technology to seamlessly connect human interactions with enterprise systems, beginning with the financial sector.

Our mission is straightforward yet impactful: to reinvent business operations and streamline areas often hindered by inefficient processes. Our innovative platform empowers users to communicate effortlessly while we capture vital information, eliminating cumbersome data entry and allowing them to focus on what matters most: their clients.

Location: Budapest, XI. (on-site Monday to Thursday, remote on Fridays)

Why Join Us:
As we continue to expand, we are searching for a Senior Software Engineering Manager with a minimum of 7 years in software engineering and at least 3 years in a leadership role to guide and mentor our talented team of developers. We seek an individual who can introduce innovative ideas, brings advanced technical expertise, and has a strong passion for innovation. In this pivotal role, you will drive our projects forward, enhance our product offerings, and address the increasingly complex needs of our customers using agile methodologies and state-of-the-art technologies.
SEON serves as the pivotal hub for fraud prevention and AML compliance, helping numerous companies globally thwart fraud, mitigate risk, and safeguard revenue. Leveraging over 900 real-time, first-party data signals, SEON enriches customer profiles, identifies suspicious activities, and simplifies compliance workflows, all centralized in one platform. With a commitment to delivering richer data, more adaptable and transparent analysis, and a quicker time to value than competitors, SEON has helped businesses reduce fraud by 95% and achieve a 32x ROI. Our rapid growth is fueled by partnerships with some of the world's most innovative digital brands, including Revolut, Wise, and Bilt.

Join the Platform Engineering team at SEON, the driving force behind our technical advancement. This team is focused on constructing the fundamental infrastructure and pipelines that empower product engineers to deploy code securely, swiftly, and independently. We view our platform as a product and our developers as customers, aiming to eliminate friction and minimize cognitive load throughout the software delivery lifecycle.

We are in search of a visionary and seasoned Senior Manager of Platform Engineering to lead our talented team of Platform Engineers. In this pivotal role, you will design the 'Golden Paths' that standardize and expedite our development processes, ensuring that our infrastructure evolves in tandem with our rapid business expansion. You will work closely with Software Engineering, Data, and SRE teams to translate requirements into a unified platform strategy. Your proven ability to lead high-performing teams in complex, fast-paced environments will be essential, as will your skill in organizing and inspiring a team amidst ongoing growth and transformation.

This position offers flexibility: work from Budapest on a hybrid schedule, or remotely from anywhere within the European Union, with occasional travel to our other offices.
Bosch Group seeks a Data Engineering Intern based in Budapest. This internship provides practical experience in data management, analytics, and the development of data pipelines.

What you will do
Support the team in managing and organizing data
Assist with analytics tasks and reporting
Help build and maintain data pipelines for ongoing projects
Work closely with experienced data engineers on real-world assignments

Requirements
Interest in data engineering and analytics
Willingness to learn new data technologies
Ability to collaborate with team members
Based in Budapest or able to work from this location
Role Overview
Betsson Group is hiring a Lead Data DevOps Engineer to guide the Data DevOps team in Budapest. This position focuses on team leadership, technical direction, and ensuring smooth data operations within an agile setup.

Key Responsibilities
Lead and mentor a team of Data DevOps professionals.
Work closely with cross-functional teams to align data operations with business needs.
Shape and improve data architecture and data pipelines.
Implement and promote strong data management and DevOps practices.
Streamline workflows to deliver insights that support company goals.

Location
This role is based in Budapest.
Deutsche Telekom IT Solutions offers a Web Development Intern position based in Budapest, Debrecen, Pécs, or Szeged. This internship is designed for students or recent graduates aiming to gain hands-on experience in web development.

Role overview
Interns will join a team of web developers and IT professionals, participating in ongoing projects and contributing to real-world assignments. The environment encourages collaboration and learning from more experienced colleagues.

What you will do
Work with web developers and IT experts on active projects
Take part in tasks that reflect real business needs
Develop technical and teamwork skills in a supportive group

This internship provides exposure to industry practices and the chance to build both technical knowledge and collaboration abilities.
Role overview
The Senior Data Scientist - AI & Analytics position at Instructure in Budapest centers on using data science and machine learning to enhance products and services. This role requires collaboration with team members from various departments to address business challenges through data-driven solutions.

Key responsibilities
Analyze large and complex data sets to extract actionable insights.
Build predictive models that support business decision-making.
Work closely with colleagues from different functions to apply analytical methods to real-world problems.

Location
This position is based in Budapest, Hungary.