Data Engineer at Samba TV | Warsaw
About Samba TV
Samba TV is at the forefront of transforming the media landscape, enabling companies to effectively reach and engage audiences through our cutting-edge data technology. Our mission is to enhance the viewing experience by leveraging powerful insights drawn from a vast array of global data.
Similar jobs
Inetum Polska
Join our innovative team as a Senior Data Engineer and play a pivotal role in enhancing our product platform. We are looking for an experienced professional with a strong background in data engineering to design, implement, and maintain robust data solutions.

Main Responsibilities:
- Architect and develop data solutions that elevate our product platform.
- Define and refine data models tailored for analytical, operational, and predictive applications.
- Engage in critical data architecture and technical infrastructure decisions.
- Guarantee high quality, reliability, and maintainability through best practices in CI/CD and automation.
- Work closely with product owners, developers, and stakeholders to convert business requirements into effective technical solutions.

Technologies Utilized: Azure DevOps, Kubernetes, SQL, ETL.
Inetum Polska
Join Inetum Polska as a Data Engineer, where you will play a pivotal role in the development and optimization of our data infrastructure. Your responsibilities will include:
- Designing and maintaining efficient processes to aggregate data from diverse sources into our Data Lake.
- Creating, developing, and refining complex data pipelines to guarantee a reliable flow of information.
- Establishing frameworks that support the development of data pipelines.
- Implementing thorough testing frameworks for data pipelines to ensure data integrity and quality.
- Collaborating with analysts and data scientists to deliver superior quality data solutions.
- Overseeing data management practices, ensuring security, compliance, and best practices in governance.
- Exploring and adopting new technologies to enhance data pipeline performance.
- Integrating and leveraging data from various source systems, including Kafka, MQ, SFTP, databases, APIs, and file shares.
Join Inetum Polska as a skilled ML Ops Engineer, where you will play a critical role in shaping the future of machine learning operations. Your primary responsibility will be to architect and deploy the necessary infrastructure to support the management and orchestration of up to 1,500 ML scoring processes within a cutting-edge Databricks environment. You will focus on operationalizing ML scoring pipelines, establishing a robust, scalable, and secure platform that empowers our data science teams to seamlessly deploy their models.
Inetum Polska
The AI & Data Presales Specialist plays a crucial role in steering the complete presales lifecycle for data, analytics, and AI initiatives. This journey spans from initial qualification and discovery phases to solution design, proposal development, and seamless handover to the delivery team. This position demands exceptional client engagement abilities, a solid understanding of contemporary data and AI platforms, and the skill to articulate compelling value propositions and solution narratives. Collaboration is key, as you'll work closely with Specialized Sales, Solution Architects, and Subject Matter Experts (SMEs) to formulate winning proposals that direct clients towards tangible, measurable results.

Key Responsibilities

Opportunity Qualification & Deal Shaping
- Evaluate opportunity alignment, customer readiness, stakeholder dynamics, decision-making processes, potential risks, and actionable next steps alongside Specialized Sales.
- Design commercial strategies, set project boundaries and delivery models, and clarify risks and assumptions.

Customer Discovery & Requirements Definition
- Lead discovery workshops and interviews with both business and technical stakeholders.
- Document functional and non-functional requirements, constraints, success metrics, and KPIs.
- Transform insights into an organized use-case backlog and define the Minimum Viable Product (MVP).

Solution Narrative & Architecture Coordination
- Craft and sustain the solution storyline, including phased roadmaps, trade-offs, and alternative pathways.
- Collaborate with Solution Architects and SMEs to establish the solution approach, input for sizing, and delivery model, and maintain a risk and assumptions register.

Presales Deliverables & Proposal Ownership
- Generate high-caliber presales documentation: executive presentations, proposals/Statements of Work (SoWs), RFP responses, and compliance matrices.
- Guarantee clarity, structure, and consistency throughout all written communications.
- Facilitate internal reviews, maintain bid cadence, and conduct readiness checkpoints to ensure alignment between Sales and Architecture.

Value Engineering & Business Case Development
- Develop and revise ROI/TCO models, benefit hypotheses, and measurement strategies.
- Align the value proposition with customer KPIs and strategic objectives.
Join our dynamic team as a QA Engineer/Analyst and contribute to a variety of exciting projects at Inetum. Our Quality Assurance teams tackle multiple initiatives simultaneously, ranging from developing automated API tests to testing web and mobile applications. You will also play a key role in creating effective test strategies and analyzing intricate business processes. We welcome diverse profiles, whether you are analytical, testing-focused, or technical, at both Junior and Regular levels.

Please note: You are not applying for a specific position. We will connect with candidates about roles and projects that align closely with their expertise and interests.

Technologies you will work with:
- Testing & Automation: API Testing (REST), Gauge, nUnit, Cucumber, SpecFlow, Playwright, Selenium, WebAPI test automation, A/B Testing
- Backend / Integrations: .NET Core, messaging/queues (Azure Service Bus, Event Hub), microservices
- Platforms & Environments: Azure DevOps (pipelines, test plans, repositories), Atlassian (Jira, Bitbucket, Confluence), Xray Test Management, Jenkins
- Databases & Analytics: SQL (Postgres, Sybase, MS SQL), ELK Stack, Grafana
- Other: Git, Swagger, JetBrains tools (IntelliJ, DataGrip), basic Python skills (in selected projects)
Nearmap Ltd.
About the Role
Nearmap is looking for a Senior Data Engineer to join the Engineering & Technology group in Warsaw. This team focuses on building and improving the systems that help businesses use location data more effectively.

What You Will Do
- Design and implement data solutions that support business needs
- Maintain and improve data quality and process integrity
- Work closely with teams across the company to deliver useful insights

Who We're Looking For
This role suits someone with strong experience in data engineering, a focus on quality, and an interest in supporting projects that use location data in new ways.
Inetum Polska
Join our dynamic team as a Junior Data Engineer, where you will play a critical role in enhancing modern, real-time data processing capabilities. You will assist in transitioning existing data and ML workflows from batch processing to scalable streaming solutions. This position involves hands-on engineering, close collaboration with Data Scientists, and operational oversight for production data pipelines.

Technology Environment
- Utilization of advanced real-time data streaming technologies for ML model inference.
- Experience with distributed data processing frameworks that enable scalable, low-latency pipelines.
- Work with containerized workloads orchestrated in cloud-native environments.
- Employ monitoring and observability tools to ensure the reliability and performance of data pipelines.
- Engage with a Python-based ecosystem that supports ML model integration and lifecycle management.

Key Responsibilities
- Transform batch inference workflows into efficient streaming pipelines.
- Establish streaming semantics to replace batch windows, including micro-batching, windowing, and state management.
- Design Kafka topic structures, partitioning strategies, and consumer group patterns tailored for prediction workloads.
- Implement strategies for checkpointing, backpressure handling, and delivery guarantees.
- Package and version ML model artifacts for streaming jobs to facilitate safe rollouts and rollbacks.
- Optimize performance for throughput and latency through effective batching strategies and resource allocation.
- Deploy and manage streaming jobs with comprehensive monitoring and alerting.
- Integrate streaming outputs into downstream ETL/BI systems seamlessly.
- Collaborate with Data Scientists on CI/CD for streaming models while monitoring model performance and drift.

Team & Collaboration
- Engage in a distributed delivery model closely coordinated with the central AI/BI team in Germany.
- Experience daily collaboration through MS Teams, Jira, and Confluence.
- Utilize Agile methodologies (Scrum/Kanban) within cross-functional squads.
About the Role
inetum2 is looking for a Data Engineer I in Warsaw to help turn data into practical insights. This role supports the data engineering team in designing, building, and maintaining scalable data pipelines that align with business needs.

What You Will Do
- Collaborate with team members to design and develop data pipelines
- Maintain and improve existing data infrastructure
- Contribute to projects that help the business make informed decisions

Location
This position is based in Warsaw.
Join our innovative team at Inetum Polska as a Data Engineer, where you will utilize your data engineering expertise in a fast-paced environment. Your role will be pivotal in ensuring smooth data migration and optimization for cutting-edge AI and ML projects. Don't miss out on the opportunity to contribute to our groundbreaking initiatives!

Key Responsibilities

Data Pipeline Development:
- Craft, develop, and implement Python-based ETL/ELT pipelines to facilitate data migration from on-premises MS SQL Server to our Databricks instance.
- Ensure effective ingestion of historical parquet datasets into Databricks.

Data Quality & Validation:
- Establish validation, reconciliation, and quality assurance protocols to guarantee the accuracy and completeness of migrated data.
- Manage schema mapping, field transformations, and metadata enrichment to standardize datasets.
- Integrate data governance, quality assurance, and compliance into all migration processes.

Performance Optimization:
- Optimize pipelines for enhanced speed and efficiency, leveraging Databricks capabilities, including Delta Lake when applicable.
- Oversee resource utilization and scheduling for large dataset transfers.

Collaboration:
- Coordinate closely with AI engineers, data scientists, and business stakeholders to outline data access patterns needed for upcoming AI POCs.
- Work alongside infrastructure teams to ensure secure connections between legacy systems and Databricks.

Documentation & Governance:
- Maintain comprehensive technical documentation for all data pipelines.
- Adhere to best practices for data governance, compliance, and security throughout the migration process.
At Samba TV, we are pioneers in tracking streaming and broadcast video globally through our innovative data and technology solutions. Our mission is to revolutionize the viewing experience for everyone. Our proprietary data empowers media companies to connect with audiences for new shows and movies, while providing advertisers with the tools to engage viewers and measure their reach across all devices. Join us as we share a compelling narrative shaped by a global footprint of data and AI-driven insights.

We are currently looking for a talented Data Engineer to enhance our data platform team. This team is responsible for developing and maintaining the data infrastructure that supports our organization's operations. Your work will span from data ingestion to analytics and reporting, managing valuable viewership and contextual datasets, and creating scalable applications that facilitate data-driven decision-making. While our organization operates in a hybrid model, you will primarily work with technologies including AWS, Databricks, BigQuery, and Snowflake. The ideal candidate will possess significant experience in cloud-based data engineering, distributed data processing, as well as data governance and metadata management to support analytics, reporting, and machine learning initiatives.
SoftwareMind
SoftwareMind is looking for a Data Engineer based in Warsaw. This role centers on designing, building, and maintaining data pipelines that help drive informed decisions throughout the company.

Key responsibilities
- Create and manage data pipelines that integrate information from multiple sources
- Monitor and support data quality and reliability at every stage
- Collaborate with team members to deliver solutions aligned with business needs

Requirements
- Strong grasp of data engineering principles
- Comfort working closely with others in a collaborative setting
- Interest in building systems that support data-driven decisions
At Samba TV, we are at the forefront of transforming the viewing experience worldwide through our innovative data and technology solutions. Our mission is to empower media companies by providing them with the insights they need to connect effectively with audiences while offering advertisers the tools to engage viewers across multiple devices. With our unique global footprint and AI-driven insights, we have a compelling story to tell about culture and media consumption.

In the role of Data Engineer, you will lead the development of high-performance, scalable data pipelines and infrastructure that underpin Samba TV's analytics and insights. Your expertise will be essential in designing and implementing architectural enhancements, adhering to best practices, and mentoring a talented team of engineers. You will collaborate closely with teams in Data Science, Analytics, and Product to deliver robust, production-ready data solutions that drive significant business impact.
At Samba TV, we are redefining the viewing experience by harnessing the power of our proprietary data and cutting-edge technology to track streaming and broadcast video globally. Our mission is to empower media companies to connect with audiences and advertisers to engage viewers, delivering insights that transform how we experience entertainment.

We invite a talented Data Engineer to join our dynamic Data Technology team in Warsaw. You will play a crucial role in building and maintaining our data platform that serves the entire organization — supporting everything from data ingestion and analytics to comprehensive reporting. You will work with cutting-edge technologies like AWS, Databricks, BigQuery, and Snowflake to enhance our data infrastructure.

As a self-sufficient contributor, you will take ownership of well-defined pipeline components and features, collaborate effectively with your teammates and cross-functional stakeholders, and navigate the complete data lifecycle. We are looking for candidates with 2–4 years of hands-on experience who are proficient in writing production-quality code and are eager to grow their technical expertise.
thedotcollective
Discover the Opportunity at The Dot Collective
At The Dot Collective, we are a pioneering consultancy operating across the UK and EU, dedicated to engineering excellence and empowering individuals to create meaningful impacts. We embrace modern technology stacks and apply agile scrum methodologies to all our projects.

Who You Are
If you have a passion for data and its transformative potential, and you're eager to make significant contributions in a short timeframe, we could be the perfect fit for you.
Location: Warsaw, Poland
Company: Veeam Software

About Veeam Software
Veeam leads in data management and AI trust, helping organizations understand, secure, and strengthen their data. Headquartered in Seattle with teams in over 30 countries, Veeam protects the operations of more than 550,000 clients worldwide. The company focuses on keeping businesses running smoothly and enabling the safe, rapid adoption of AI at scale.

Role Overview
The Senior Data Security and Privacy Engineer will focus on implementing privacy-by-design principles and building strong data protection strategies for the Veeam Data Cloud (VDC) data plane. This position plays a critical role in upholding high standards for data security and privacy, helping to maintain client trust in Veeam's solutions.
Join the innovative team at Inetum as a talented Data Engineer! We are seeking individuals with a strong background in data engineering to contribute to our exciting Big Data projects. Preferably, you will have hands-on experience with Databricks and the Spark framework.
Point72 Asset Management, L.P.
Join Point72, a leading asset management firm, as a Data Engineer and play a pivotal role in transforming raw data into actionable insights. You will collaborate with cross-functional teams to develop data pipelines, ensuring that our analytical and operational needs are met efficiently. Your expertise will contribute to optimizing our data processes and enhancing the overall performance of our data systems.
thedotcollective
About The Dot Collective
We are a modern consultancy operating across the UK and EU, committed to engineering excellence and empowering individuals to drive impactful change. We utilize a wide array of contemporary technology stacks and implement agile scrum methodologies in all our projects.

About You
Are you enthusiastic about data and its ability to transform businesses? Do you thrive on making significant contributions in a short time frame? If so, you may find your ideal role with us.
Join our dynamic team at inetum2 as a Junior Data Engineer! In this pivotal role, you will be tasked with monitoring production data pipelines, troubleshooting incidents, enhancing system stability, and ensuring seamless daily data operations. If you are eager to grow your skills in data engineering and contribute to impactful projects, we want to hear from you!
Join Telemedi as an AI Data Engineer!

Telemedi is a pioneering company in the healthcare sector, committed to employing programmers, doctors, and experts across various fields to develop cutting-edge solutions that enhance patient care. Our mission is to leverage technology to provide everyone with convenient and immediate access to medical services. In the role of AI Data Engineer, you will collaborate with us to innovate within the telemedicine and insurance industries.

Your Responsibilities:
- Design and implement autonomous workflows that transform raw data into self-validating reports.
- Map business processes and determine which aspects can be automated using AI.
- Build and maintain the analytics layer of our telemedicine platform.
- Replace manual reporting systems in Excel with intelligent, repeatable pipelines.
- Construct architectures for data validation systems and monitor the quality of AI outputs.

The first three responsibilities will constitute 80% of your work time.