Role overview
Databricks is hiring a Senior Solutions Architect in Zürich, Switzerland. This position centers on designing cloud-based data solutions that help clients get the most value from the Databricks platform.
What you will do
Meet with customers to learn about their data challenges and business objectives.
Create and recommend solutions that fit each client's needs using Databricks technology.
Help organizations shape their data strategy and architecture.
Break down complex technical topics into clear explanations for non-technical stakeholders.
Work closely with internal and client teams to support successful delivery of solutions.
Key skills
Deep knowledge of cloud-based data platforms and architectural design.
Clear communication skills, especially when presenting technical ideas to varied audiences.
Background in working directly with clients to understand and address business needs.
P-1127
At Databricks, we are driven by a passion for empowering data teams to tackle some of the most challenging problems globally, ranging from security threat detection to cancer drug development. Our mission revolves around creating and managing the premier data and AI infrastructure platform, allowing our clients to concentrate on the high-value challenges central to their missions.
Our engineering teams design and maintain highly sophisticated products that address significant global needs. We develop and operate one of the most extensive software platforms in existence, comprising millions of virtual machines that generate terabytes of logs and process exabytes of data daily. At this scale, we frequently encounter cloud hardware, network, and operating system faults, necessitating that our software effectively shields our customers from such issues.
The Delta DML team is responsible for the core write-path operations for Delta Lake, the open-source storage layer supporting the Databricks Lakehouse. Our goal is to provide industry-leading performance and a seamless user experience at an immense scale, with the majority of data written in Databricks traversing our platform. We spearhead performance innovations such as Low Shuffle Merge and Deletion Vectors and actively contribute to open-source efforts aimed at unifying Delta and Iceberg formats.
We are looking for an exceptionally talented and experienced Senior Staff Software Engineer to join our backend team. In this pivotal role, you will be key in designing, developing, and maintaining robust backend systems that power Databricks workspaces. You will architect the next-generation platform for serving workspace assets, ensuring high queries per second (QPS), low latency, reliability, and performance, while proactively addressing future growth challenges.
Furthermore, as a senior member of the team, you will provide technical leadership, mentorship, and guidance to junior engineers, thereby enhancing overall team coding practices and system designs.
The Impact you will have:
Address real business needs at scale through your software engineering expertise.
Engage in low-level systems debugging, performance measurement, and optimization on large production clusters.
Lead architectural design, influence the product roadmap, and take ownership of new projects.
Introduce tools to enhance the automation and operability of services.
Utilize your deep expertise to prevent and investigate production issues.
Plan and lead complex technical projects involving multiple teams within the company.
Contribute as a technical team lead by mentoring others, leading sprint planning, delegating tasks, and participating in project planning.
We are seeking a skilled DWH Developer (m/f/d) for our client in the Swiss health insurance sector. In the Performance/Claims department, you will play a crucial role in enhancing the current Microsoft DWH and delivering insightful analyses on performance and claims data.
Key Responsibilities:
Expand our Microsoft DWH to meet new business requirements in the performance and claims area.
Prepare and report on performance and claims data for in-depth analyses.
Execute ad-hoc requests through data-driven analyses based on business needs.
Make adjustments to the DWH as part of Syrius upgrades.
Work independently and contribute to data process design.
Participate in medium-term projects on our cloud data platform.
We are seeking an experienced DWH Developer (m/f/d) for our client in the Swiss health insurance sector, specifically for Product Management & Underwriting. In this role, you will enhance the DWH landscape, implement business requirements in the product area, and support data-driven decisions on modern analytics platforms.
Key Responsibilities:
Expand our Microsoft DWH to accommodate new business requirements in Product Management and Underwriting.
Prepare and report data for analysis and management decisions.
Execute ad-hoc data analyses based on business needs.
Adjust the DWH in line with Syrius upgrades.
Work independently and contribute to shaping data processes.
Participate in medium-term projects on our cloud data platform.
Jobgether is looking for a Senior Software Engineer with a focus on DevOps and Azure to join the team in Switzerland. This position centers on building and maintaining scalable infrastructure, with a strong emphasis on automation and system reliability.
Role overview
This role works closely with cross-functional teams to design, implement, and support infrastructure solutions. The main goal is to streamline development workflows and ensure smooth integration across Jobgether's platforms. Improving system performance and automating key processes are core aspects of the position.
What you will do
Collaborate with teams to design and maintain infrastructure using Azure technologies
Automate deployment and integration processes to improve efficiency
Enhance system performance and reliability across platforms
Requirements
Strong experience with Azure and DevOps practices
Background in designing and implementing scalable infrastructure solutions
Ability to work collaboratively with cross-functional teams
At Databricks, we are dedicated to empowering data teams to tackle the world's most pressing challenges — from revolutionizing transportation to accelerating medical innovations. Our mission is to build and maintain the premier data and AI infrastructure platform, enabling our customers to harness deep data insights for business enhancement. Founded by engineers and driven by a customer-centric ethos, we eagerly embrace technical challenges, whether it's designing cutting-edge UI/UX for data interfacing or optimizing our services and infrastructure across millions of virtual machines. The journey has just begun.
Our engineering teams are responsible for developing highly sophisticated products that address critical global needs. We operate one of the largest software platforms at scale, comprising millions of virtual machines that generate terabytes of logs and process exabytes of data daily. Given our scale, we frequently encounter cloud hardware, network, and operating system faults, and our software must effectively shield our customers from these issues.
Unity Catalog is our native platform offering unified and open governance for data and AI. It dismantles data silos, simplifies governance, and accelerates insights at scale. We are seeking a Senior Staff Software Engineer to take the lead in Unity Catalog Runtime Enforcement. You will spearhead the development and fortification of the runtime enforcement layer for Unity Catalog, ensuring secure and consistent authorization and data access across Databricks compute, engines, and clouds. This initiative will minimize incidents and simplify customer choices by standardizing enforcement semantics.
Title: Lead Data Engineer
Type: Full-time
Location: All Visium locations
Start Date: March 2026
About Us
At Visium, we empower enterprise leaders to shape their AI and data strategies, drive transformative initiatives, and seamlessly integrate AI into their operations, ensuring organizations remain agile and resilient for the future. Leveraging our expertise in strategy, architecture, cloud engineering, analytics, artificial intelligence, and machine learning, we help clients unlock and maximize the potential of their data.
Join a team that is dedicated to pioneering a future where organizations are not only innovative but also ethically responsible. Become part of our vibrant community of Visiumees – the curious, the ambitious, the doers, and those who aspire to create a world that inspires awe.
Ready to embark on this journey with us?
Role
As a Lead Data Engineer, you will be instrumental in designing, developing, and enhancing cutting-edge data platforms for our clients.
Your responsibilities will include:
Evaluating and comprehending clients' data ecosystems, including data sources, architectures, and quality metrics.
Collaborating with business and IT stakeholders to convert requirements into effective data solutions.
Designing, constructing, and maintaining robust data platforms (pipelines, data lakes, data warehouses) on modern cloud infrastructures.
Implementing data models, schemas, and governance frameworks to facilitate analytics and reporting.
Engaging in technical sales and pre-sales initiatives, including solution design and estimations.
Managing large-scale, distributed data systems, troubleshooting challenges, and keeping abreast of best practices in data engineering.
Requirements
To be considered for this role, you should possess the following qualifications:
5+ years of experience in Data Engineering with a focus on Data Platforms, Data Modeling, and production-grade pipeline development.
A proven ability to translate business needs into technological solutions.
Extensive expertise in Microsoft Fabric.
Experience in integrating SAP data.
A leadership mindset and experience in managing data engineering teams.
Proficiency in distributed data processing (e.g., Spark), data lakes/warehouses, ETL/ELT methodologies, and Python development.
Exceptional problem-solving, ownership, and consulting communication skills in English, with the capability to work autonomously in complex settings.
Benefits
What we offer:
A competitive compensation package.
An annual education budget to enhance your professional skills.
A yearly sports budget to promote a healthy lifestyle.
A flexible working culture that values work-life balance.
Jobgether is looking for a Senior Data Engineer based in Switzerland. This position centers on designing and building advanced data solutions that support the company's growing infrastructure needs.
Role overview
The Senior Data Engineer will take a central role in developing and maintaining systems that shape how data is managed and utilized across the organization. The work involves contributing to significant projects that help define the future of data operations at Jobgether.
What you will do
Build and optimize data solutions to support business objectives
Work with modern technologies as part of a collaborative team
Contribute to projects that impact the company's data infrastructure
Location
This role is based in Switzerland.
We are excited to announce a Solution Architect position aimed at strengthening our client's team located near Fribourg. This role focuses primarily on integrating Microsoft solutions within a structured and industrialized technical environment, rather than fully custom developments.
Key Responsibilities
Actively participate in defining technical architectures in collaboration with project teams.
Produce required architectural deliverables, such as system architecture diagrams, integration schemas, operational documents, and technical concepts.
Develop various architecture diagrams to formalize technical choices.
Implement and maintain continuous integration and deployment pipelines using Microsoft Azure DevOps.
Work closely with vendors to prepare development environments.
Cooperate with quality assurance teams to establish testing environments.
Ensure consistency and alignment of solutions with corporate standards.
Support projects throughout their technical lifecycle.
Join RepRisk AG as a Senior Data Engineer, where you will play a pivotal role in transforming data into actionable insights. You will design, build, and maintain scalable data pipelines that support our analytics and business intelligence efforts. Bring your expertise in data modeling and ETL processes to drive our data strategy forward.
About the Role
Join OWT, a leading technology and strategic consulting firm and a subsidiary of Swisscom, where we guide our clients through the digital transformation of their processes, the execution of innovative projects, and the implementation of cutting-edge technologies.
What You'll Love:
A vibrant, collaborative culture fueled by a shared passion for digitalization and emerging technologies, always putting our clients' needs first.
An environment that promotes the learning of new skills and the development of existing ones through a well-defined career path.
A flexible working model that allows you to collaborate with colleagues and clients from the comfort of your home.
The opportunity to work with renowned companies on diverse projects involving modern technology.
The chance to benefit from colleagues who are experts in their fields, always willing to share knowledge and provide constructive feedback.
A proven mentoring model, ensuring you have someone to guide you through your day-to-day at OWT and support your professional and personal growth.
Conveniently located offices in the heart of Geneva, Lausanne, Zurich, and Bern, easily accessible.
A competitive salary that promotes a healthy work-life balance.
Your Responsibilities:
Consult with healthcare organizations on designing and governing data architectures for openEHR Clinical Data Repositories (CDR) and FHIR Operational Data Repositories (ODR).
Model clinical data (diagnoses, medications, findings) in openEHR and operational data (appointments, bed occupancy, admissions/discharges) in FHIR.
Develop Data Governance Frameworks, defining roles, processes, maturity assessments, and compliance monitoring.
Ensure integration between openEHR CDR and FHIR ODR for complete traceability across clinical and administrative processes.
Support OWT in building internal expertise in openEHR/FHIR and contribute to strategy development.
Collaborate on Digital Health projects with leading Swiss healthcare institutions.
Role overview
Jobgether is seeking a Senior Python Data Scraping Engineer for a partner company based in Switzerland. This freelance position centers on developing web data extraction systems that serve both AI and human-driven processes. The role is fully remote, offering flexibility while requiring independence and a careful, detail-oriented approach.
What you will do
Design and build scalable Python solutions for web scraping and data extraction
Tackle complex challenges involving dynamic websites and large volumes of data
Produce structured, reliable datasets for analytics and AI-driven applications
Collaborate with AI agents to achieve high accuracy and thorough data validation
Apply both technical skill and creative thinking to adapt to changing web environments
Requirements
Advanced Python programming skills, with a focus on web scraping
Experience handling large-scale data extraction and processing tasks
Ability to address complex scraping problems, including dynamic content
Strong commitment to quality assurance and data validation
Comfortable working independently and managing remote work responsibilities
Located in Switzerland
Position details
Freelance contract
Remote work setup
Work contributes to datasets supporting advanced AI and analytics
Full-time | On-site | Bern, Canton of Bern, Switzerland
Gramian Consultancy, a boutique firm specializing in IT professional services and engineering talent solutions, is searching for a Cloud Platform Engineer (Kubernetes/Azure/DevOps) to join a leading Swiss technology company in Lausanne. This permanent, onsite position focuses on building and maintaining reliable, secure, and scalable cloud and on-premises platforms. The role supports a technology-driven organization known for its advanced infrastructure and commitment to innovation.
What you will do
Manage and optimize Kubernetes clusters, both on-premises and in Azure (AKS, RKE2/AKS).
Design and maintain CI/CD pipelines using GitLab CI, and implement GitOps workflows with Argo CD.
Operate and enhance centralized logging systems built on the ELK stack (Elasticsearch, Logstash/Filebeat, Kibana).
Administer Azure cloud infrastructure (AKS, Entra ID, NSG, Key Vault, Azure Monitor) with a strong focus on security.
Deploy and maintain infrastructure using Infrastructure as Code tools such as OpenTofu and Ansible.
Implement monitoring and alerting with Prometheus and Grafana.
Ensure deployment, availability, and performance of middleware and data components (RabbitMQ, Redis, PostgreSQL).
Participate in incident management, root cause analysis, and ongoing improvement efforts.
Maintain clear, up-to-date technical documentation, including runbooks, architectures, and procedures.
Requirements
3–5 years of experience in DevOps, SRE, or Platform Engineering roles.
Extensive hands-on experience with Kubernetes (deployments, services, Helm, troubleshooting).
Practical experience with Argo CD and GitOps deployment strategies.
Solid experience with GitLab, including CI/CD pipelines, runners, and container registries.
Proven experience managing ELK stack environments.
Strong knowledge of Azure services (AKS, Entra ID, NSG, Key Vault, Azure Monitor).
Expertise in Infrastructure as Code (OpenTofu, Ansible).
Scripting skills in Bash and/or Python for automation.
Strong problem-solving abilities and a proactive approach.
Location and contract
Location: Lausanne, Switzerland
Working model: Onsite
Contract type: Permanent
Interview process
Introductory call
Two client interviews (HR and Hiring Manager)
About Us
TetraScience is at the forefront of the Scientific Data and AI Cloud revolution. We are fundamentally changing the landscape of scientific research by creating and industrializing AI-native scientific datasets, offering an innovative suite of next-generation lab data management solutions, scientific applications, and AI-driven outcomes.
As a recognized leader in this emerging field, TetraScience has outpaced all competitors in revenue generation. In the past year, major players in computing, cloud services, data, and AI infrastructure have partnered with us, establishing TetraScience as the industry standard for co-innovation and market strategies. For more details on our latest partnerships, visit our Newsroom.
As part of your application process, we encourage you to review the Tetra Way letter, penned by our co-founder and CEO, Patrick Grady. This document is essential for understanding our values and culture, and we ask that you reflect on its contents to assess your alignment with our philosophy. It is crucial that you take this document seriously, as embodying its principles is expected from every team member.
Your Profile
You are a product-focused, outcome-driven innovator in technical scientific solutions.
A dynamic self-starter, you navigate uncertainty with ease, designing and building effective solutions.
You are hands-on, eager to prototype, demo, and deliver results swiftly for your end users.
Collaboration is your strength, as you work alongside scientists, product managers, and engineers to transform intricate scientific data into actionable insights. Your capacity to engage with both scientists and business leaders positions you as a vital contributor to harnessing the full value of scientific data.
With substantial experience in advanced data methodologies within the biopharma R&D sector, you effectively address current challenges and propose scalable solutions.
Your thirst for knowledge drives you to master new tools, techniques, and fields.
You exemplify the principles of extreme ownership and have a proven track record in constructing scalable data models and applications aimed at empowering biopharma users to maximize their data's value through AI/ML integration.
This position demands exceptional self-discipline and resolve as we pioneer a new category that will significantly impact the industry.
Upbound is revolutionizing the construction of modern infrastructure for the Agentic AI Era. As the creators and primary maintainers of Crossplane, we are developing the Intelligent Control Plane — a groundbreaking platform layer designed to make infrastructure programmable, autonomous, and composable.
Our mission is to empower AI-native enterprises with a foundational platform layer that enables teams to provision, operate, and adapt infrastructure at scale — preparing platforms for both human and AI agents. We collaborate with leading cloud providers, ISVs, and open-source communities to help organizations accelerate their operations with enhanced confidence.
Currently, Upbound supports Fortune 500 companies and platform engineers in over 100 countries. Crossplane has exceeded 100M downloads and is utilized by over 1,000 teams globally. We are a Series B company, backed by GV (formerly Google Ventures), Altimeter Capital, and Intel Capital, having raised $69M to date. Learn more at upbound.io.
As a Staff Solutions Architect at Upbound, you will be the key to our customers' technical success. You will drive customer outcomes from initial implementation through production adoption, ensuring they derive tangible business value from Crossplane and the Upbound platform. This role is deeply technical and hands-on, integrated into the post-sales motion, and involves close collaboration with customers to deliver effective solutions rather than just advisory services.
Sword Services seeks an Azure DevOps Administrator to support client operations in Givisiez, Fribourg. The focus is on managing integration and deployment platforms, especially the administration of an on-premise Azure DevOps Server. This position works closely with teams aiming to streamline software delivery and cloud-native infrastructure.
Main responsibilities
Maintain and oversee the Azure DevOps Server platform, including build servers on both Windows and Linux systems.
Identify areas for improvement in DevOps tools and suggest technical enhancements that fit project needs.
Assist project teams in building continuous integration (CI) pipelines for automated builds.
Set up and refine continuous deployment (CD) pipelines for on-premise, containerized, or cloud-based environments.
Advance infrastructure automation using Infrastructure as Code methods.
Create and manage Kubernetes manifests and Helm charts.
Provide technical support and operational oversight to help maintain platform stability.
Document solutions and collaborate with integration teams in an agile context.
Veeam is at the forefront of the Data and AI Trust domain, dedicated to empowering organizations to fully understand, secure, and build resilience around their data and AI capabilities. As the leading provider in data resilience and security posture management, Veeam seamlessly integrates identity, data, security, and AI risk management. With our headquarters in Seattle and a global presence across more than 30 countries, Veeam proudly protects over 550,000 customers worldwide, who rely on us to keep their operations running smoothly. Join our team as we boldly advance together, fostering growth, learning, and impactful contributions to some of the world's most prominent brands.
About the Role:
Following our acquisition of Securiti AI, a pioneer in AI-driven data security posture management (DSPM), Veeam is searching for a Senior Sales Engineer. This role will involve providing technical leadership within our sales team, focusing on the Securiti AI product suite. You will collaborate closely with a Securiti Sales Specialist and support a team of 3–5 Account Executives and Veeam Solution Engineers as the go-to technical expert.
Your responsibilities will include guiding customers from initial needs assessment through to solution design, delivering hands-on demonstrations and proof-of-concept sessions to illustrate the value of our offerings. Success in this position hinges on exceptional technical skills and the ability to build strong, trusting relationships with clients.
Join our dynamic team at Capco as a Data Engineer specializing in Snowflake. In this pivotal role, you will leverage your expertise in data engineering to design, build, and optimize data pipelines and architectures, enhancing our data-driven decision-making capabilities. You will collaborate with cross-functional teams to deliver high-quality data solutions that meet the needs of our clients and drive business success.
Experience Level
Mid to Senior
About the job
We are seeking a talented and experienced DWH Architect / Senior Data Engineer to join our client, a prestigious Swiss insurance company. If you are eager to confront complex challenges and contribute to the evolution of modern data platforms, we would love to hear from you.
Key Responsibilities:
Design and deploy cutting-edge data warehouse architectures.
Lead the migration and modernization of existing data warehouse systems.
Analyze and convert business and technical requirements into scalable, architecture-compliant solutions.
Work collaboratively with internal stakeholders and cross-functional teams.
Ensure data governance, security, and compliance with Swiss regulations.
Continuously enhance and optimize data platforms and reporting solutions.
Develop and implement naming conventions, self-service strategies, and tool recommendations.