Deutsche Telekom IT Solutions

Senior Data Engineer - Lakehouse & Data Engineering Frameworks

On-site · Full-time

Experience Level

Mid to Senior

Qualifications

  • 3–5 years of hands-on experience building data pipelines with Databricks and Azure in production settings.
  • Strong understanding of Delta Lake patterns, including CDC, schema evolution, deduplication, partitioning, and performance optimization (see the sketch after this list).
  • Advanced Python engineering skills, with a focus on maintainable project structures (packaging, dependency management, testing, tooling).
  • Proficiency in SQL, including complex transformations, debugging, and performance tuning.
  • Demonstrated experience with CI/CD and Git-based workflows, including merge requests, branching strategies, and automated testing.
  • Ability to diagnose and resolve issues in distributed systems, particularly around Spark execution and data correctness.
  • Solid understanding of data modeling principles and their impact on ingestion and performance.
  • Practical experience implementing data governance and security controls within a Lakehouse environment.
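
For illustration, a minimal sketch of the deduplication pattern referenced above: keep only the latest CDC record per business key before a silver write. This is not code from the role; the table paths and the customer_id / _commit_ts columns are hypothetical.

```python
# Illustrative sketch only: deduplicate CDC records so the newest change
# per business key survives. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw = spark.read.format("delta").load("/mnt/bronze/customers")  # hypothetical path

# Rank records per key by change timestamp, newest first, and keep rank 1.
w = Window.partitionBy("customer_id").orderBy(F.col("_commit_ts").desc())
latest = (
    raw.withColumn("_rn", F.row_number().over(w))
       .filter(F.col("_rn") == 1)
       .drop("_rn")
)

latest.write.format("delta").mode("overwrite").save("/mnt/silver/customers")
```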

About the job

Join our team as a Senior Data Engineer, where you will play a pivotal role in building and managing scalable data ingestion and Change Data Capture (CDC) capabilities on our Azure-based Lakehouse platform. Your expertise will drive our engineering maturity as we deliver ingestion and CDC preparation through Python projects and reusable frameworks. We are seeking a professional who applies software engineering best practices, including clean architecture, rigorous testing, code reviews, effective packaging, CI/CD, and operational excellence.

Our platform emphasizes batch-first processing: streaming sources are landed in their raw form and then processed in batch, and we evolve toward streaming selectively, only where it is genuinely needed.
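
As a hedged sketch of what batch-first landing can look like on a Databricks stack, assuming Auto Loader (a plausible choice the posting does not name): streamed files land raw, then a batch-style run picks up whatever has arrived and stops.

```python
# Hedged sketch, assuming Databricks Auto Loader. All paths and the
# target table name are placeholders, not from this posting.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

(
    spark.readStream.format("cloudFiles")                      # Auto Loader source
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/mnt/_schemas/events")
        .load("/mnt/landing/events")                           # raw landed files
        .writeStream
        .option("checkpointLocation", "/mnt/_checkpoints/events")
        .trigger(availableNow=True)                            # batch semantics: process, then stop
        .toTable("bronze_events")
)
```

The checkpoint makes each run incremental and replayable, which is what gives the batch-first approach its idempotence.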

As part of the Common Data Intelligence Hub, you will collaborate closely with data architects, analytics engineers, and solution designers to create robust data products and ensure governed data flows across the enterprise.

  • Your team is responsible for end-to-end ingestion and CDC engineering, including design, build, operation, observability, reliability, and reusable components.
  • You will contribute to the development of platform standards, including contracts, layer semantics, and readiness criteria.
  • While you will not primarily manage cloud infrastructure provisioning, you will work with the platform team to define requirements, review changes, and maintain deployable code for pipelines and jobs.

Platform Data Engineering & Delivery

  • Design and develop ingestion pipelines utilizing Azure and Databricks services, including Azure Data Factory pipelines and Databricks notebooks/jobs/workflows.
  • Implement and manage CDC patterns for inserts, updates, and deletes, accommodating late-arriving data and reprocessing strategies (see the sketch after this list).
  • Structure and maintain bronze and silver Delta Lake datasets, focusing on schema enforcement, deduplication, and performance tuning.
  • Create “transformation-ready” datasets and interfaces with stable schemas, contracts, and metadata expectations for analytics engineers and downstream modeling.
  • Adopt a batch-first approach for data ingestion, ensuring raw landing, replayability, and idempotent batch processing while progressing towards true streaming as required.
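
Below is a hedged sketch of how such a CDC apply step might look with the delta-spark MERGE API; it is illustrative only, not the team's actual implementation. The _op and _commit_ts columns and all paths are invented, and the timestamp guard is one common way to skip stale, late-arriving updates.

```python
# Illustrative CDC apply: upsert inserts/updates, apply delete tombstones.
# Column names (_op, _commit_ts, order_id) and paths are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

changes = spark.read.format("delta").load("/mnt/bronze/orders_cdc")  # deduped CDC batch
target = DeltaTable.forPath(spark, "/mnt/silver/orders")

(
    target.alias("t")
    .merge(changes.alias("s"), "t.order_id = s.order_id")
    .whenMatchedDelete(condition="s._op = 'DELETE'")
    # Timestamp guard: ignore a late-arriving change older than what we have.
    .whenMatchedUpdateAll(condition="s._op != 'DELETE' AND s._commit_ts >= t._commit_ts")
    .whenNotMatchedInsertAll(condition="s._op != 'DELETE'")
    .execute()
)
```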

Software Engineering for Data Frameworks

  • Develop and maintain Python-based ingestion and CDC components as production-grade software, focusing on modules, packaging, versioning, and releases.
  • Implement engineering best practices such as code reviews, unit/integration tests (see the testing sketch after this list), static analysis, formatting/linting, type hints, and comprehensive documentation.
  • Establish and enhance CI/CD pipelines for data engineering code and pipeline assets, covering build, testing, security checks, deployment, and rollback patterns.
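
As a small, self-contained illustration of that testing practice: CDC logic factored into a pure, type-hinted function that pytest can verify without a cluster. Every name here is invented for the example.

```python
# Illustrative only: a pure-Python helper plus a pytest case, showing how
# CDC logic can be unit-tested in isolation. All names are invented.
from __future__ import annotations


def latest_per_key(records: list[dict], key: str, ts: str) -> list[dict]:
    """Keep only the most recent record per business key."""
    best: dict[object, dict] = {}
    for rec in records:
        k = rec[key]
        if k not in best or rec[ts] > best[k][ts]:
            best[k] = rec
    return list(best.values())


def test_latest_per_key_keeps_newest_change() -> None:
    rows = [
        {"id": 1, "ts": "2024-01-01", "v": "old"},
        {"id": 1, "ts": "2024-01-02", "v": "new"},
        {"id": 2, "ts": "2024-01-01", "v": "only"},
    ]
    out = {r["id"]: r["v"] for r in latest_per_key(rows, "id", "ts")}
    assert out == {1: "new", 2: "only"}
```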

About Deutsche Telekom IT Solutions

Recognized as Hungary’s most attractive employer in 2025 by Randstad’s representative survey, Deutsche Telekom IT Solutions is a vital subsidiary of the Deutsche Telekom Group. With over 5,300 employees, we offer a comprehensive range of IT and telecommunications services to numerous large clients across Germany and other European nations. Our commitment to excellence is evidenced by awards such as the Best in Educational Cooperation from HIPA in 2019 and recognition as the Most Ethical Multinational Company in the same year. We are continually expanding our four sites in Budapest, Debrecen, Pécs, and Szeged, seeking skilled IT professionals to join our dynamic team.
