Mid-Level Data Engineer
Experience Level
Mid to Senior
About Capital
Capital is a premier trading platform making significant strides in global expansion. Our cutting-edge products have garnered prestigious industry accolades for excellence in technology and user experience.
Similar jobs
Software Mind
Join Our Team as a CDP Data Engineer!

At Software Mind, we are dedicated to crafting innovative analytical solutions for digital services. Our mission is to create a state-of-the-art data ecosystem utilizing a diverse array of technologies, particularly within the AWS cloud environment. We are seeking a talented individual to become an integral part of our team, focusing on the implementation of a Customer Data Platform (CDP). This role encompasses everything from installation and configuration to development, integration with our analytical and Data Governance tools, and ongoing maintenance.

Your contributions will include:
- Establishing and maintaining the CDP environment, including collectors, enrichers, and loaders.
- Integrating event data from multiple sources.
- Configuring and developing the CDP, including schema repositories.
- Building and sustaining data pipelines on AWS.
- Creating analytical models and integrating them with the data catalog.
- Collaborating on the development and implementation of user identification mechanisms (ID stitching, cross-device, cross-site, consent logic).
- Documenting processes and working closely with teams in Data Engineering, Data Management, Data Governance, IT, Privacy, and Analytics.
Sigma Software Group
Role overview
Sigma Software Group is hiring a Senior Data Engineer in Warsaw to focus on integrations and the data platform. This position involves designing and building solutions that connect data across the company’s platforms. Collaboration with teams throughout the organization is central to keeping data moving reliably and strengthening the overall data architecture.

What you will do
- Design and implement data integration solutions for various platforms
- Work closely with cross-functional teams to keep data flowing seamlessly
- Optimize and maintain data pipelines for performance and reliability
- Monitor data quality and drive improvements where needed
- Apply best practices in data management
- Support the ongoing development of the company’s data architecture

Impact
This role plays a key part in enabling data-driven decision making at Sigma Software Group. By ensuring reliable, high-quality data is available when needed, the Senior Data Engineer helps establish data as a strategic asset for the organization.
QED AI
QED AI is an innovative technology firm dedicated to enhancing public health and food security in Sub-Saharan Africa. By developing cutting-edge digital infrastructure and leveraging AI, we operate at the critical intersection of humanitarian aid and scientific research. Our initiatives include monitoring diseases such as HIV, malaria, and tuberculosis, as well as conducting nutrient analyses for crops and soils across various African nations. Our funding sources include prestigious philanthropic and governmental organizations such as the Global Fund, the Gates Foundation, and the CDC.

We are seeking a mid/senior Data Platform Engineer (Backend) to join our dynamic team in Warsaw. Ideal candidates will possess the following:
- Proven experience in designing and maintaining data pipelines utilizing ETL and/or ELT methodologies, with a critical eye on trade-offs rather than adhering to a single pattern.
- Solid understanding of data pipeline reliability, including concepts of idempotency, backfills, and the management of late or corrected data.
- Ability to structure data systems into distinct layers (e.g., raw, cleaned, curated) and articulate their varying functions and guarantees.
- Experience making informed decisions between batch, micro-batch, and streaming methods based on latency, accuracy, and operational complexity.
- Strong foundation in software engineering principles: version control, writing clean code and tests, robust design, and a grasp of basic data structures and algorithms.
- Ability to conceptualize logical software architectures and communicate clearly, both verbally and in writing.
- Eagerness to engage with a diverse range of problems and technologies.
- Willingness to participate in regular design sessions and code reviews, and to collaborate effectively within teams.
- Experience with UNIX-based or macOS (Darwin) development environments.
- Working proficiency (≥C1) in English, both spoken and written (minimum typing speed of 45 words per minute).
- Interest in collaborating with individuals from different cultural backgrounds.
- Demonstrated emotional resilience and social intelligence.
- A genuine passion for your work and a generally optimistic outlook, while recognizing the value of constructive skepticism.

Additional skills that are advantageous but not mandatory include:
- Understanding of analytical data modeling concepts, including hierarchical and dimensional modeling.
Samba TV
At Samba TV, we are at the forefront of transforming the global viewing landscape through our innovative data and technology. Our mission is to enhance the viewing experience for audiences worldwide. With our unique data capabilities, we empower media companies to connect with viewers for their latest shows and movies, while providing advertisers with tools to engage audiences and evaluate their reach across various devices. Our rich narrative is shaped by a diverse array of cultural insights derived from our extensive global data reach and AI-driven analytics.

We are looking for a talented Data Engineer to join our data platform team. This team is essential in developing and maintaining the robust data platform that serves our entire organization, facilitating everything from data ingestion to advanced analytics and reporting. You will work with valuable viewership data and contextual datasets, as well as scalable applications that support data-driven decision-making. Our hybrid work model allows you to collaborate primarily using AWS, Databricks, BigQuery, and Snowflake technologies. The successful candidate will possess significant expertise in cloud-based data engineering, distributed data processing, and data governance, ensuring our analytics, reporting, and machine learning initiatives are well-supported.
Bolt

Join Our Team as a Software Engineer!

We are seeking a talented Software Engineer to become a key member of Bolt’s Data Platform Insights team. In this role, you will be responsible for developing and enhancing the platform that empowers Bolt teams to create and manage high-quality data products. Your contributions will ensure that data is both reliable and accessible, driving impactful decisions at a massive scale.

About Bolt
With more than 200 million customers across over 50 countries, Bolt stands as one of the fastest-growing tech companies in Europe and Africa. Our success is attributed to our dedicated team members, and we are committed to fostering an inclusive environment where every individual is welcomed, irrespective of race, color, religion, gender identity, sexual orientation, age, or disability. Our vision is to transform cities into spaces designed for people rather than vehicles. We need your expertise to help us achieve this mission!

About the Role
As a Software Engineer in the Insights team, you will play an essential role in shaping the data ecosystem at Bolt. You will deliver frameworks, libraries, and automations that facilitate the seamless building, testing, and deployment of thousands of data models, metrics, and BI reports. Collaborating closely with data producers, you will tackle their toughest challenges and develop scalable, elegant platform solutions. We are committed to solving complex engineering problems at scale, ensuring that Bolt’s data modeling ecosystem remains reliable, cost-effective, and high-performing.
inetum2
Join our dynamic team at inetum2 as a Data Engineer, where you will play a pivotal role in designing, building, and optimizing our data infrastructure. We are seeking a forward-thinking individual with a passion for data and technology.
thedotcollective
About The Dot Collective
We are a forward-thinking consultancy operating across the UK and EU, driven by engineering excellence and a commitment to empowering individuals to create meaningful impact. Our projects leverage the latest technology stacks, and we adopt agile Scrum methodologies to ensure efficiency and effectiveness in our deliverables.

About You
Are you enthusiastic about the power of data and its potential to transform businesses? Do you thrive on making significant contributions in a short time frame? If so, you may find your perfect role with us.
Telemedi
Join one of Poland's foremost telemedicine platforms as an AI Data Engineer, where you will be the architect of our analytics layer. Your role involves designing and implementing intelligent workflows that transform raw operational data into dynamic, self-validating reports. You will analyze business processes to determine which tasks can be automated using AI, then build and maintain these solutions. You'll be the driving force behind initiatives aimed at streamlining reporting, ensuring it aligns with AI capabilities, and shaping the future of our analytics.
CreatorIQ
Role overview
CreatorIQ seeks a Senior Data Engineer in Warsaw with a focus on reporting. This role collaborates with teams across the company to strengthen data architecture and reporting systems. The work directly supports data-driven decisions at every level of the organization.

What you will do
- Design and build data pipelines that power reporting and analytics
- Maintain and improve data quality throughout multiple systems
- Optimize the performance of reporting tools to ensure timely and accurate insights

Impact
Efficient analytics and reliable data are central to CreatorIQ’s business. The Senior Data Engineer’s contributions shape how teams access information and make decisions, directly influencing business outcomes.
Inetum Polska
Join our dynamic team as a Junior Data Engineer, where you will play a critical role in enhancing modern, real-time data processing capabilities. You will assist in transitioning existing data and ML workflows from batch processing to scalable streaming solutions. This position involves hands-on engineering, close collaboration with data scientists, and operational oversight of production data pipelines.

Technology Environment
- Advanced real-time data streaming technologies for ML model inference.
- Distributed data processing frameworks that enable scalable, low-latency pipelines.
- Containerized workloads orchestrated in cloud-native environments.
- Monitoring and observability tools that ensure the reliability and performance of data pipelines.
- A Python-based ecosystem that supports ML model integration and lifecycle management.

Key Responsibilities
- Transform batch inference workflows into efficient streaming pipelines.
- Establish streaming semantics to replace batch windows, including micro-batching, windowing, and state management.
- Design Kafka topic structures, partitioning strategies, and consumer group patterns tailored for prediction workloads.
- Implement strategies for checkpointing, backpressure handling, and delivery guarantees.
- Package and version ML model artifacts for streaming jobs to facilitate safe rollouts and rollbacks.
- Optimize performance for throughput and latency through effective batching strategies and resource allocation.
- Deploy and manage streaming jobs with comprehensive monitoring and alerting.
- Integrate streaming outputs seamlessly into downstream ETL/BI systems.
- Collaborate with data scientists on CI/CD for streaming models while monitoring model performance and drift.

Team & Collaboration
- Engage in a distributed delivery model closely coordinated with the central AI/BI team in Germany.
- Collaborate daily through MS Teams, Jira, and Confluence.
- Work with Agile methodologies (Scrum/Kanban) within cross-functional squads.
Miratech
Role Overview Miratech is hiring a Senior Data Engineer in Warsaw. This position focuses on designing and building data architectures and pipelines to support the company’s data initiatives. The Senior Data Engineer works closely with teams across the business to maintain data quality and improve accessibility, helping drive informed decisions throughout the organization.
Strategic Leadership
- Design and execute a robust data privacy and access control framework that encompasses multi-dimensional classification, dynamic permissions, and information barriers.
- Serve as the technical leader of a specialized team dedicated to privacy-centric access controls while collaborating with cross-functional teams, including data ingestion, knowledge mapping, and automation developers.
- Establish and uphold security and privacy standards, policies, and best practices throughout the entire product development lifecycle.

Technical Implementation
- Craft a multi-tiered access control model that integrates Role-Based Access Control (RBAC), Attribute-Based Access Control (ABAC), and purpose-driven limitations.
- Supervise the deployment of intricate data classification frameworks leveraging NLP and other advanced technologies.
- Design and validate permission propagation mechanisms tailored for graph data models and derived insights.
- Set security boundaries for autonomous AI agents, ensuring adequate context isolation and privilege controls.

Cross-Team Coordination
- Collaborate closely with engineering teams to embed privacy controls into the data pipeline, knowledge graph, and AI components.
- Partner with product management to harmonize privacy requirements with usability and functionality.
- Work alongside customer success to address specific client privacy and compliance needs.
- Advocate for the adoption of privacy-by-design principles among development teams.
About the Role inetum2 is looking for a Technical Leader Data Engineer in Warsaw. This position puts technical leadership at the center of data engineering projects, shaping solutions and guiding teams in a changing data landscape.
Samba TV

At Samba TV, we are at the forefront of transforming the viewing experience worldwide through our innovative data and technology solutions. Our mission is to empower media companies by providing them with the insights they need to connect effectively with audiences while offering advertisers the tools to engage viewers across multiple devices. With our unique global footprint and AI-driven insights, we have a compelling story to tell about culture and media consumption.

In the role of Data Engineer, you will lead the development of high-performance, scalable data pipelines and infrastructure that underpin Samba TV's analytics and insights. Your expertise will be essential in designing and implementing architectural enhancements, adhering to best practices, and mentoring a talented team of engineers. You will collaborate closely with teams in Data Science, Analytics, and Product to deliver robust, production-ready data solutions that drive significant business impact.
Capital
Join our dynamic trading platform as we extend our reach globally. Our award-winning products are recognized for their innovative technology and exceptional user experience. We are dedicated to assembling a top-tier team, and we are eager to welcome talented individuals who share our vision.

In the role of Data Engineer, you will be an essential member of our data team, responsible for the design, development, and maintenance of robust data pipelines and systems. Collaborating closely with data scientists, analysts, and other stakeholders, you will ensure that our data is reliable and readily available to support informed decision-making and analytics efforts. Your key responsibilities will involve data ingestion, transformation, and delivery, alongside optimizing our data infrastructure.

As a Mid-Level Data Engineer, you are expected to enhance and sustain our data infrastructure, guaranteeing data integrity and accessibility. You should demonstrate the ability to work both independently and collaboratively, adapting to evolving data requirements and partnering with fellow data professionals to derive meaningful insights for the organization. Keeping abreast of the latest data engineering best practices and emerging technologies will be vital for your success in this position.
Kpler

Join Kpler as a Business Intelligence Data Engineer, where you will play a crucial role in transforming data into actionable insights. You will work with cutting-edge technologies to develop and maintain our data pipelines, ensuring data quality and accessibility across the organization.
About the Role
inetum2 is looking for a Data Engineer I in Warsaw to help turn data into practical insights. This role supports the data engineering team in designing, building, and maintaining scalable data pipelines that align with business needs.

What You Will Do
- Collaborate with team members to design and develop data pipelines
- Maintain and improve existing data infrastructure
- Contribute to projects that help the business make informed decisions

Location
This position is based in Warsaw.
Role Overview
adlook is hiring an Analytics & Data Engineer in Warsaw, Poland. This position plays a key part in shaping how the company uses data to inform decisions and improve operations. The role centers on building and maintaining data pipelines, making sure information stays accurate and easy to access.

What You Will Do
- Work with teams across the company to design and implement data pipelines
- Support the reliability and quality of data used for analytics and reporting
- Help optimize data workflows to improve how the organization makes decisions
Inetum Polska
Join Inetum Polska as a Data Engineer, where you will play a pivotal role in the development and optimization of our data infrastructure. Your responsibilities will include:
- Designing and maintaining efficient processes to aggregate data from diverse sources into our Data Lake.
- Creating, developing, and refining complex data pipelines to guarantee a reliable flow of information.
- Establishing frameworks that support the development of data pipelines.
- Implementing thorough testing frameworks for data pipelines to ensure data integrity and quality.
- Collaborating with analysts and data scientists to deliver superior-quality data solutions.
- Overseeing data management practices, ensuring security, compliance, and best practices in governance.
- Exploring and adopting new technologies to enhance data pipeline performance.
- Integrating and leveraging data from various source systems, including Kafka, MQ, SFTP, databases, APIs, and file shares.
Join our innovative team at Inetum Polska as a Data Engineer, where you will apply your data engineering expertise in a fast-paced environment. Your role will be pivotal in ensuring smooth data migration and optimization for cutting-edge AI and ML projects. Don't miss out on the opportunity to contribute to our groundbreaking initiatives!

Key Responsibilities

Data Pipeline Development
- Craft, develop, and implement Python-based ETL/ELT pipelines to facilitate data migration from on-premises MS SQL Server to our Databricks instance.
- Ensure effective ingestion of historical Parquet datasets into Databricks.

Data Quality & Validation
- Establish validation, reconciliation, and quality assurance protocols to guarantee the accuracy and completeness of migrated data.
- Manage schema mapping, field transformations, and metadata enrichment to standardize datasets.
- Integrate data governance, quality assurance, and compliance into all migration processes.

Performance Optimization
- Optimize pipelines for speed and efficiency, leveraging Databricks capabilities, including Delta Lake where applicable.
- Oversee resource utilization and scheduling for large dataset transfers.

Collaboration
- Coordinate closely with AI engineers, data scientists, and business stakeholders to define the data access patterns needed for upcoming AI POCs.
- Work alongside infrastructure teams to ensure secure connections between legacy systems and Databricks.

Documentation & Governance
- Maintain comprehensive technical documentation for all data pipelines.
- Adhere to best practices for data governance, compliance, and security throughout the migration process.