Qualifications
The ideal candidate will possess a strong background in data engineering, with proficiency in SQL and ETL processes. Familiarity with BI tools and data visualization is essential. Additionally, experience with cloud technologies and programming languages such as Python or R will be highly advantageous. Excellent problem-solving skills and the ability to work collaboratively in a fast-paced environment are required.
About the job
Join Kpler as a Business Intelligence Data Engineer where you will play a crucial role in transforming data into actionable insights. You will work with various data sources and be part of a dynamic team focused on enhancing our data platforms. You will have the opportunity to leverage your analytical skills to drive strategic decision-making and contribute to our innovative solutions.
About Kpler
Kpler is a leading data intelligence platform that specializes in providing transparency for global commodity markets. With a commitment to innovation and customer satisfaction, we empower organizations to make informed decisions based on accurate, real-time data. Join us in shaping the future of data analytics!
About EveryPay
EveryPay is dedicated to revolutionizing the digital financial landscape of e-commerce in Greece. Our mission is to empower Marketplaces and Merchants, enabling them to succeed in a competitive environment. We are a vibrant team of young professionals, united by our core values of Empowering Customers, Collaborating as a Team, Managing Risks, and Delivering Results.

We take pride in having created the payment infrastructure that connects numerous Greek Marketplaces and Merchants with global payment schemes such as Visa and MasterCard. Our services extend to Greece's largest and most successful marketplace, Skroutz. Our systems interface with thousands of banks, both domestically and internationally. Our technology handles tens of thousands of transactions daily, amounting to billions of euros in e-commerce. If you have made an online purchase in Greece, you have likely interacted with our payment solutions.

EveryPay is a wholly-owned subsidiary of the Skroutz Group of Companies, functioning as both a Technology Firm and a Regulated Financial Services Institution. This unique position offers you exposure to both the Tech Payments Sector and the realm of Financial Services.

Your Role in EveryPay's Vision:
We are looking to expand our Data Platform team by hiring a skilled Data Platform Engineer. In this role, you will design, build, and maintain the core data platform that drives analytics and business intelligence at EveryPay. You will be instrumental in developing robust data ingestion pipelines, establishing scalable data infrastructure, and enabling our BI team to extract actionable insights from data. Your contributions will ensure that high-quality, reliable data is readily accessible to all stakeholders within the organization.

Key Challenges You Will Tackle:
- Data Ingestion at Scale: Design and implement scalable, reliable data ingestion pipelines that handle data from diverse internal and external sources.
- Platform Enablement: Construct, operate, and optimize our data platform to empower BI and analytics teams to easily explore, analyze, and visualize data.
- Data Quality & Governance: Establish and uphold best practices for data quality, lineage, and governance to ensure data trustworthiness and compliance.

Your Responsibilities:
- Architect, build, and maintain ETL/ELT pipelines for ingesting data from various systems (e.g., payment systems, marketplaces, SaaS tools).
- Establish and manage data platform infrastructure (cloud data warehouses, databases, orchestration tools, etc.).
- Collaborate closely with the BI team to understand data requirements and deliver efficient, reliable data models and datasets.
- Monitor pipeline performance and data quality, proactively troubleshooting and resolving any issues.
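The ingest-transform-validate cycle that roles like this revolve around can be sketched in miniature. The following is an illustrative plain-Python sketch, not any company's actual stack; the record fields and the quality rule (positive amounts only) are invented for illustration:

```python
def extract(raw_rows):
    """Parse raw CSV-like rows into records (hypothetical payment events)."""
    return [dict(zip(("txn_id", "amount_eur"), row.split(","))) for row in raw_rows]

def transform(records):
    """Cast types and drop records failing a basic quality rule (amount > 0)."""
    clean = []
    for r in records:
        try:
            amount = float(r["amount_eur"])
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine malformed rows for review
        if amount > 0:
            clean.append({"txn_id": r["txn_id"], "amount_eur": amount})
    return clean

def load(records, warehouse):
    """Append validated records to the target store (a list stands in for a warehouse table)."""
    warehouse.extend(records)
    return len(records)

warehouse = []
raw = ["t1,19.90", "t2,-5.00", "t3,notanumber", "t4,120.00"]
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 rows pass the quality gate
```

In production the same three stages are typically expressed in an orchestration tool and a warehouse SQL engine rather than in-process Python, but the separation of extract, transform, and load is the same.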
Join Elastic as a Data Engineer II specializing in Platform Analytics. In this pivotal role, you will be responsible for optimizing data flows and enhancing data strategies across the Kibana platform within the AppEx team. You will collaborate with cross-functional teams to design and implement scalable data solutions that drive business insights and analytics. This is an opportunity to work with cutting-edge technologies and contribute to data-driven decision-making processes.
Join Our Team
OpenBet stands at the forefront of the global betting and gaming entertainment industry, partnering with over 200 clients to deliver unforgettable winning moments to millions of players worldwide. From managing bets during monumental events such as the FIFA World Cup and Super Bowl to innovating next-generation products like BetBuilder, we are committed to continuously enhancing the player experience through exceptional content, state-of-the-art technology, and advanced player protection measures. For more than 25 years, our unparalleled platform has powered the most recognizable betting brands, providing peak performance with 100% uptime, unmatched scalability, and remarkable speed. With 85 licenses in our portfolio, 20 World Lottery Association operators among our clients, and a talented team of over 1,200 experts across 14 countries, we are undeniably at the heart of the industry. Step into your future with OpenBet and contribute to a global team that is redefining the future of betting entertainment.

Your Role
As a Platform Engineer at OpenBet, you will play a critical role in the design, construction, and maintenance of the cloud infrastructure that supports our leading sports betting platform. Utilizing your expertise in Microsoft Azure, Azure DevOps, and Terraform, you'll ensure our systems are scalable, secure, and resilient, fulfilling the demands of users across the globe. This position calls for a strategic thinker who excels in collaborative settings, drives automation, and advocates best practices in cloud engineering. You will collaborate with cross-functional teams to innovate and optimize our platform, enabling seamless delivery of real-time betting experiences.

Key Responsibilities
- Design, deploy, and manage scalable, secure, and cost-effective cloud solutions on Microsoft Azure.
- Optimize infrastructure for high availability, disaster recovery, and performance across global regions.

CI/CD Pipelines & DevOps Practices
- Develop and maintain robust CI/CD pipelines using Azure DevOps to facilitate rapid, reliable software delivery.
- Automate deployment processes, monitor system performance, and ensure compliance with industry best practices.
Jobgether is looking for a Platform Engineer with a focus on multicloud environments. This role is based in Greece and centers on designing and implementing cloud solutions that connect multiple platforms. The goal: keep operations smooth and systems scalable as needs grow.

What you will do
- Design and build cloud solutions that work across different providers
- Integrate and optimize cloud architecture for performance and reliability
- Work with teams from different disciplines to improve existing services

Who we're looking for
- Experience with multicloud environments
- Strong background in platform engineering
- Ability to collaborate and contribute to team innovation

This position offers the chance to shape cloud architecture and help deliver strong service offerings at Jobgether.
Are you a driven DevOps Engineer ready to tackle complex challenges and modernize web applications on a global scale? Join our dynamic team at Betsson Group!

Product Development at Betsson Group
Our Product Development division is truly global, comprising cross-functional teams across six tech hubs: Malta, Budapest, Stockholm, Tallinn, Kyiv, and Athens. With nearly 600 skilled professionals, our Product Development organization is led by our CTO-CPO, fostering collaboration among talented area teams.

What You Will Be Doing
- Engaging with cutting-edge technologies to support our cloud journey. The Platform Core team manages Infrastructure as Code for vital components of our Core Platform Layer, including Kubernetes Clusters, Message Broker backends, and infrastructure tooling, providing a distributed runtime platform for our business applications, APIs, and websites.
- Creating self-service infrastructure platforms for developers and offering guidance and tools for managing their applications across various environments.
- Continuously enhancing our platforms based on business needs, analyzing, diagnosing, and resolving performance efficiency and scalability issues in collaboration with colleagues from diverse teams.

We are currently seeking to enhance our Kafka Messaging Platform management, looking for a DevOps engineer with robust knowledge and experience in managing Kafka infrastructure.
Arista Networks is seeking a Software Engineer to join the Platform Team in Athens. This team develops and maintains the core software that underpins Arista’s products, focusing on stability, scalability, and ongoing improvement.

Role overview
This position centers on building and supporting software components that form the foundation of Arista’s platform. The work involves tackling technical challenges and ensuring the platform remains reliable as it grows.

What you will do
- Design, implement, and maintain software components for the platform
- Work closely with colleagues to address technical issues and find solutions
- Offer ideas to enhance product reliability and performance

Requirements
- Interest in developing dependable software systems
- Ability to collaborate with others to solve complex problems
- Curiosity about new technologies and methods
METRO AEBE is recognized as one of Greece's leading employers, proudly supporting over 11,000 employees. Operating under the renowned My Market brand, we manage one of the largest retail networks in the country, featuring 290 stores nationwide. Additionally, we dominate the wholesale market with 50 METRO Cash & Carry stores catering to professionals across Greece.

To fuel our ongoing expansion, we are looking for a skilled Data Engineer to join our Data Warehouse (DWH) team. In this role, you will play a pivotal part in designing, enhancing, and optimizing enterprise data products that empower data-driven decision-making.

Responsibilities:
- Design and develop robust, automated data pipelines (ETL/ELT) to efficiently ingest data from diverse sources into our Data Warehouse or Data Lake.
- Conduct data wrangling tasks, including data cleaning and transformation, to convert raw data into actionable formats for analysis, visualization, or machine learning applications.
- Ensure data quality and monitor pipeline performance to uphold data integrity and reliability.
- Implement data access controls in alignment with corporate regulations and policies.
- Contribute to machine learning and AI initiatives by preparing, validating, and serving high-quality datasets for model training and evaluation.
- Work collaboratively with Data and BI Analysts, providing technical support as required.
Role Overview
finartix is looking for an ETL/SSIS Data Engineer to join the team in Athens, Attica, Greece. This role focuses on building and maintaining data solutions for clients in the Greek market. The position works closely with IT professionals to improve data ecosystems and streamline data delivery for a range of sectors.

What You Will Do
- Develop, test, and maintain data solutions throughout the full software development lifecycle.
- Apply effective methods for collecting and analyzing data to support strategic recommendations that fit client business goals.
- Act as a technical advisor, offering insights and solutions to clients.
- Work as part of an Agile team, contributing to collaboration and new ideas.

Qualifications
- BS or MS in Computer Science, Engineering, or a related field.
- Minimum 3 years of experience in software development using MS SQL Server and ETL tools, especially SSIS.
- At least 2 years working on data migration projects.
- 2 years of experience in the Banking Industry.
- Solid understanding of software application fundamentals and how they affect user experience.
- Strong skills in testing and quality assurance.
- Proficient programming abilities and a creative, problem-solving approach.
- Good communication and time management skills.
- Comfortable working both independently and as part of a team.
- Demonstrated analytical thinking and a solution-oriented mindset.
- Proficiency with Microsoft Office Suite.
- Fluent in both English and Greek, written and spoken.
About the Role
Netcompany is looking for a Junior to Mid-level Data Engineer in Athens. This role focuses on building and improving data pipelines that support business decisions. The work involves designing, implementing, and optimizing data flows to keep information accurate and reliable.

Main Responsibilities
- Design and build data pipelines for business and analytics needs
- Optimize existing data processes for efficiency and quality
- Work with data scientists, analysts, and engineers to improve data architecture
- Help ensure data integrity across projects

Collaboration
This position works closely with cross-functional teams, including data scientists and analysts, to support analytical projects and improve how data is used throughout the company.

Who We're Looking For
- Interest in data engineering and analytics
- Willingness to learn new technologies and approaches
- Strong teamwork and communication skills
Optasia is a cutting-edge B2B2X financial technology platform specializing in scoring, financial decision-making, disbursement, and collections. Our mission is to promote financial inclusion for everyone, and we pride ourselves on transforming the world in our own unique way.

We are on the lookout for passionate and proactive professionals who are driven by results and possess a can-do attitude. Join a team of like-minded individuals dedicated to delivering innovative solutions in an exciting environment. We invite you to apply for the position of Data Engineer within our expanding Data Engineering team. In this role, you will design and implement highly scalable end-to-end batch and streaming data pipelines, contributing to the overall success of Optasia.

Your responsibilities will include:
- Enhancing the scalability, stability, accuracy, speed, and efficiency of our existing data systems.
- Designing and developing end-to-end data processing pipelines.
- Navigating a diverse technology stack, including Scala, Spark, Python3, Bash/Python scripting, Hadoop, and SQL.
- Designing, constructing, testing, and deploying new libraries, frameworks, or complete systems while adhering to the highest standards of testing and code quality.
- Developing, maintaining, and optimizing core libraries for batch processing and large-volume data ingestion into our big data infrastructure.
- Building and maintaining CI/CD orchestration.

What we expect from you:
- Bachelor's or Master's degree in Computer Science or Informatics.
- A minimum of 2 years of experience in Data Engineering.
- Proven experience in software/data engineering and/or operations/DevOps/DataOps.
- Familiarity with the Apache Hadoop ecosystem (YARN, HDFS, HBase, Spark).
- Hands-on experience with both relational and NoSQL databases.
- Proficient in systems administration with Linux.
- Experience in deploying, configuring, and maintaining distributed systems and data/software engineering tools.

Your key attributes:
- Experience with fluid virtual infrastructures such as containers (e.g., Docker, Kubernetes).
- Familiarity with data and ML flow engines and tools, such as Apache Airflow.
- A strong passion for learning new technologies and collaborating with other creative professionals.

Why you should join us:
We offer a range of benefits, including:
- Flexible hybrid working options
- Competitive remuneration package
- An extra day off on your birthday
- Performance-based bonus scheme
- Comprehensive private healthcare insurance
- All the tech gear you need to work efficiently

Experience the Optasia perks:
- Join our multicultural working environment
- Engage with a unique and promising business and industry
- Gain insights into the future market landscape
- Enjoy a solid career path within our working family.
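The batch-processing pattern that Spark-centric roles like this describe, keying records and aggregating per key, can be illustrated in miniature with plain Python. The event log and field names below are invented for illustration; a Spark job would express the same shape with groupBy/agg or reduceByKey over a distributed dataset:

```python
from itertools import groupby

# Toy event log standing in for a large batch input (hypothetical fields).
events = [
    ("checkout", 12.5), ("view", 0.0), ("checkout", 7.5),
    ("view", 0.0), ("checkout", 30.0),
]

# "Map" phase: key each record by event type (groupby requires sorted input).
keyed = sorted(events, key=lambda e: e[0])

# "Reduce" phase: aggregate per key, as Spark's reduceByKey would.
totals = {
    key: sum(amount for _, amount in group)
    for key, group in groupby(keyed, key=lambda e: e[0])
}
print(totals)  # {'checkout': 50.0, 'view': 0.0}
```

The point of Spark is that the map and reduce phases run partitioned across a cluster; the per-key aggregation logic itself is no more complicated than this.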
Join a pioneering leader in education technology, recognized globally for excellence in the assessment and certification of professional skills across more than 200 countries. PeopleCert is actively seeking a talented Data Engineer to enhance our dynamic Data & AI team. This position is crucial in architecting, developing, and sustaining the infrastructure and data solutions that empower our AI-driven projects. The ideal candidate will possess substantial practical experience with Microsoft Azure technologies and demonstrate a strong passion for data engineering practices that facilitate machine learning, advanced analytics, and large-scale data processing.

In this role, you will collaborate closely with the AI Center of Excellence, working alongside data scientists, ML engineers, software developers, analysts, and business stakeholders to enable data accessibility and drive intelligent applications.

Your responsibilities will include:
- Designing, implementing, and maintaining scalable data pipelines and workflows to facilitate AI/ML model training, evaluation, and inference.
- Building and optimizing data integration solutions utilizing Azure data tools such as Synapse Analytics, Azure Data Factory, Databricks, and Delta Lake.
- Partnering with data scientists and AI engineers to ensure data is available in the correct format and quality for modeling purposes.
- Developing and maintaining APIs and data services that power AI-driven applications and insights delivery.
- Supporting the development of data lakes and lakehouses tailored for advanced analytics and AI use cases.
- Writing efficient, reusable Python and SQL code for data processing, cleaning, and transformation.
- Participating in code reviews and knowledge-sharing sessions within the team to cultivate best practices and continuous learning.
- Keeping abreast of emerging tools, cloud services, and trends in data engineering and AI infrastructure.
Are you excited about Data & AI? At Satori Analytics, we are redefining the landscape of data and artificial intelligence. Our mission is to empower global brands by providing unparalleled clarity through innovative data solutions. We develop cloud-based ecosystems for fintech and predictive models for airlines, offering cutting-edge solutions that span the entire data lifecycle, from ingestion to AI applications.

As a rapidly growing scale-up, our dynamic team of over 100 tech professionals, including Data Engineers, Data Scientists, and more, delivers transformative analytics solutions across diverse sectors such as FMCG, retail, manufacturing, and financial services. Join us in spearheading the data revolution in South-Eastern Europe and beyond!

What Your Day Might Look Like:

Technical & Delivery Leadership
- Lead the development and enhancement of data engineering standards, best practices, and architectural principles for all Satori projects.
- Serve as a senior technical authority for complex data platforms, including cloud data stacks, pipelines, streaming, and orchestration.
- Assist project teams in solution design, risk management, and technical decision-making processes.
- Evaluate and critique designs to ensure they meet scalability, performance, security, and cost-effectiveness criteria.
- Collaborate with Tech Leads to maintain consistency and quality across projects.

People Management & Leadership
- Oversee Senior Data Engineers and Tech Leads, fostering growth, performance, and career advancement.
- Mentor engineers on technical depth, ownership, communication, and leadership skills.
- Contribute to performance evaluations, development plans, and promotion decisions in line with Satori’s competency framework.
- Exemplify Satori’s values of collaboration, transparency, and accountability.

Cross-Functional Collaboration
- Work in tandem with Product Owners to align technical solutions with client requirements and delivery constraints.
- Partner with Data Science, AI, and Cloud teams to ensure seamless end-to-end solutions.
- Support presales and discovery phases by providing technical insights, estimations, and solution framing when necessary.

Organizational Impact
- Identify skill gaps, tooling, or process improvements and recommend practical solutions.
- Engage in internal initiatives, such as guilds, playbooks, training, and knowledge sharing.
- Help scale the data engineering capabilities as Satori expands, ensuring quality and culture are preserved.
Elevate your career with us! Join our dynamic development teams in Athens or work remotely as a Data Engineer. In this vital role within our agile team, you will help design and implement cutting-edge big data solutions on a scalable cloud platform. You will analyze millions of real-time data points to extract advanced insights and enhance analytics capabilities for our end users.

Your Responsibilities:
- Develop and implement batch processing pipelines utilizing Spark (Python or Scala) and SQL
- Design and execute streaming ETL/ELT processes from a variety of data sources
- Write and maintain code for developing comprehensive big data solutions, focusing on data integration and analytics use cases
- Create and implement APIs using contemporary Python frameworks
- Collaborate effectively with our Business Analysis teams to align technical solutions with business needs
- Conduct end-to-end and functional testing using open-source tools
- Set up monitoring solutions for our data platform, including alerts and dashboards

Essential Qualifications:
- Bachelor’s degree in Computer Science or Software Engineering
- Extensive knowledge of Apache Spark
- Proficient in Python and database management
- Previous experience as a Data Engineer
- Familiarity with Azure Data Lake Storage and Delta Live Tables
- Fluency in English, both written and spoken
- Strong analytical skills and a team-oriented mindset
- A passion for learning and professional growth in data engineering

Preferred Qualifications:
- Experience with Databricks
- Proficiency in API development with FastAPI
- Familiarity with cloud platforms (AWS, Azure, GCP, etc.)
- Experience with Docker

Why Join Us?
We value talent and commitment, offering a range of benefits for our team members, including:
- Competitive full-time salary
- Comprehensive private health coverage under the company’s group program
- Flexible working hours
- Access to state-of-the-art tools
- Opportunities for professional development, including language courses and specialized training
- Career advancement opportunities with industry-leading specialists
- A dynamic work environment that encourages personal and professional growth through challenging goals and mentorship

If you're ready to embrace an exciting challenge, work with cutting-edge technologies, and enjoy your daily tasks, we invite you to apply! Please submit your detailed CV in English, referencing: (SDE/02/26). Explore all our open vacancies by visiting the career section of our website.
Are you enthusiastic about big data and eager to engage with advanced technologies? We invite you to explore a thrilling opportunity as a Data Engineer - Spark Developer with our dynamic and growing development teams. Whether you prefer the vibrant atmosphere of our Athens office or the flexibility of remote work, we are excited to welcome your expertise and passion.

Key Responsibilities:
- Architect, develop, test, deploy, maintain, and enhance data pipelines
- Implement coding solutions using Apache Spark on Azure Databricks
- Create and design big data architectures leveraging Azure Data Factory, Service Bus, BI, Databricks, and other Azure Services

Essential Qualifications:
- Bachelor's degree in Computer Science or Software Engineering
- Strong analytical mindset, team-oriented, dedicated to quality, and eager to learn
- Comprehensive understanding of Apache Spark
- Proven experience as a Data Engineer
- Advanced proficiency in Python or Scala
- Expertise in Spark query tuning and performance enhancement
- Familiarity with cloud platforms such as Azure, AWS, or GCP
- Fluent in both spoken and written English

Preferred Qualifications:
- Ability to understand and analyze Directed Acyclic Graph (DAG) operations
- Experience in providing cost estimates for big data processing
- Capability to write and review architecture documentation

Benefits:
We value talent and commitment and offer a range of benefits to our team members:
- Competitive full-time salary
- Comprehensive private health coverage under the company’s group program
- Flexible working hours
- Access to state-of-the-art tools
- Opportunities for professional development, including language courses, specialized training, and continuous learning
- Career advancement opportunities with leading specialists in the industry
- A dynamic work environment that promotes challenging goals, autonomy, and mentorship, supporting both personal and company growth

If you are looking for an exciting challenge, keen to work with innovative technologies, and enjoy your work, we would love to hear from you! Please submit your detailed CV in English, referencing: (DESD/02/26). Explore our other open positions by visiting our career section at www.eurodyn.com and follow us on Twitter (@EURODYN_Careers) and LinkedIn.

European Dynamics (www.eurodyn.com) is a prominent European company specializing in Software, Information, and Communication Technologies, with a robust international presence.
Full-time | On-site | Athens or Ioannina, Sterea Ellada, Greece
Location: Athens or Ioannina, Sterea Ellada, Greece

About Snappi Bank
Snappi Bank is building a neobank from the ground up. The team focuses on financial freedom by delivering transparent, technology-driven digital banking services. The company aims to reshape how people interact with their finances.

Role Overview
The Data Engineer will design, build, and manage data architecture and pipelines that support data acquisition, storage, processing, and analysis across the organization. This position is open in both the Athens and Ioannina offices.

Main Responsibilities
- Create and maintain data pipelines and infrastructure for efficient ingestion, processing, and storage of large datasets.
- Work with data scientists, analysts, and other stakeholders to understand data needs and translate them into technical solutions.
- Develop and optimize data models and schemas for effective storage and retrieval.
- Build and manage ETL processes to bring data from various sources into data warehouses or lakes.
- Monitor and troubleshoot pipelines to ensure data integrity, reliability, and performance.
- Evaluate and introduce new tools or technologies to improve data processing and operational efficiency.
- Document pipelines, processes, and solutions to support knowledge sharing and maintainability.
- Partner with infrastructure and DevOps teams to deploy and manage data systems in cloud environments.
- Keep up with trends and best practices in data engineering and analytics.

Qualifications
- Bachelor’s degree in Computer Science, Electronics, or equivalent experience in data roles.
- Minimum 5 years of experience in a similar position (7+ years preferred; 3-5 years considered for junior roles).
- Strong skills in SQL and Python; experience with Azure Data Factory is a plus.
- Excellent interpersonal skills, including listening, negotiation, and presentation.
- Clear verbal and written communication abilities.
- Attention to detail.
- Effective decision-making, problem analysis, and resolution skills.
- Strong organizational habits.
- Proactive approach to problem-solving.
- Comfort working in a fast-changing environment.
- Interest in agile software processes, data-driven development, reliability, and experimentation; experience with Agile product teams is a plus.

Why Work at Snappi?
Snappi Bank values innovation, trust, and ongoing growth. The team focuses on solutions and results. This is a chance to make a real impact on the future of banking and improve financial services for a broad audience.
Join our innovative team as a Semantic Data Engineer, where you'll play a crucial role in enhancing a sophisticated platform centered around RDF data models, SPARQL queries, and structured datasets. Your primary responsibilities will involve comprehending, maintaining, and advancing the semantic layer of our system, collaborating closely with backend engineers and architects. This position is ideal for a passionate specialist with a keen interest in data modeling, semantics, and knowledge representation within real-world production environments.

Key Responsibilities:
- Analyze and uphold RDF/TTL data models and vocabularies
- Design, optimize, and manage SPARQL queries
- Facilitate data ingestion, transformation, and validation processes
- Ensure the consistency and accuracy of semantic data throughout the platform
- Work alongside backend engineers to integrate semantic logic into application workflows
- Assist in documenting semantic models, assumptions, and constraints
- Engage in troubleshooting data quality and reasoning challenges
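For readers unfamiliar with the RDF/SPARQL concepts this role centers on: an RDF graph is a set of subject-predicate-object triples, and a SPARQL basic graph pattern matches triples against patterns with variables. A minimal plain-Python sketch of that matching follows; the tiny graph and the `ex:`/`rdf:` prefixed names are invented for illustration, and a real system would use a triple store or a library such as rdflib:

```python
# An RDF graph as a set of (subject, predicate, object) triples (hypothetical data).
graph = {
    ("ex:athens", "rdf:type", "ex:City"),
    ("ex:athens", "ex:country", "ex:greece"),
    ("ex:ioannina", "rdf:type", "ex:City"),
    ("ex:greece", "rdf:type", "ex:Country"),
}

def match(graph, s=None, p=None, o=None):
    """Return triples matching a pattern; None plays the role of a SPARQL variable."""
    return sorted(
        t for t in graph
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

# Rough analogue of: SELECT ?s WHERE { ?s rdf:type ex:City }
cities = [s for s, _, _ in match(graph, p="rdf:type", o="ex:City")]
print(cities)  # ['ex:athens', 'ex:ioannina']
```

Full SPARQL adds joins across multiple patterns, filters, and inference, but triple-pattern matching is the primitive everything else builds on.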
We are seeking a talented Site Reliability Data Engineer to join our dynamic team in Athens. As a Site Reliability Data Engineer, you will be at the forefront of ensuring the reliability and performance of our data systems. Your expertise will play a critical role in maintaining service uptime, optimizing system performance, and enhancing our data infrastructure.Your responsibilities will include monitoring system performance, troubleshooting issues, and implementing automation tools to streamline processes. You will collaborate with cross-functional teams to design and deploy scalable infrastructure solutions.
Join Kpler as a Data Engineer specializing in Dry Bulk Commodities. In this pivotal role, you will design, implement, and optimize data pipelines to support our dynamic analytics platform. Collaborate with cross-functional teams to enhance data accessibility and ensure high data quality, driving insights that empower our clients in the commodities market.
METRO AEBE operates a large network of retail stores in Greece and Cyprus, including My Market, My Market Local, METRO Cash & Carry, and BEST VALUE. With more than 11,000 employees, the company has been recognized as a Top Employer for both 2025 and 2026. As the company continues to expand, it remains committed to business growth alongside sustainable practices. The Data Warehouse (DWH) team in Athens is adding a Data Engineer to support these efforts.

Role overview
This Data Engineer position focuses on designing, improving, and optimizing enterprise data products. The work directly supports data-driven decision-making throughout METRO AEBE.