Experience Level
Senior
Qualifications
The impact you will have:
- Design and manage the cross-company Data Intelligence Platform that encompasses every business and product metric essential for Databricks operations. You will significantly influence the delicate balance between data protection and shareability as we transition into a public company.
- Develop tools and infrastructure to efficiently operate Databricks across multiple clouds, geographies, and deployment types on a large scale. This includes CI/CD processes, testing frameworks for pipelines and data quality, and infrastructure-as-code tooling.
- Collaborate with our engineering teams to provide leadership in formulating the long-term vision and requirements for the Databricks product.
- Construct reliable data pipelines and resolve data challenges using Databricks, our partners' products, and various open-source tools. You will provide early feedback on the design and functionality of these products.
- Represent Databricks at academic and industry conferences and events.
About the job
At Databricks, we are dedicated to empowering data teams to address some of the most challenging problems globally—from bringing innovative transportation solutions to life to accelerating groundbreaking medical advancements. Our mission is realized through the development and operation of the world's premier data and AI infrastructure platform, enabling our customers to harness profound data insights to elevate their businesses. Founded by engineers with a relentless focus on customer success, we eagerly embrace each opportunity to tackle technical challenges, whether it’s designing next-generation UI/UX for data interactions or scaling our services across millions of virtual machines. Our journey has just begun.
As a Staff Software Engineer on the Data Platform team, you will harness state-of-the-art AI developer tools and techniques to construct Databricks' Data Intelligence Platform, which will facilitate automated decision-making throughout the organization. This role involves close collaboration with Databricks Product Teams, Data Science, and other stakeholders. You will shape the future of various tools, including logging, orchestration, data transformation, metric stores, governance platforms, and data consumption layers.
About Databricks
Databricks is at the forefront of data and AI infrastructure, passionate about enabling teams worldwide to solve pressing challenges. Our innovative platform is designed to foster deep insights from data, driving business improvement and technological advancement.
At Databricks, we are driven by our mission to empower data teams in tackling some of the most challenging issues facing the world today. From realizing the future of transportation to expediting medical innovations, we build and maintain the leading data and AI infrastructure platform, enabling our clients to harness deep data insights for business enhancement. Founded by engineers with a relentless focus on customer satisfaction, we eagerly embrace every challenge, whether it's designing next-gen UI/UX for data interaction or scaling our services across millions of virtual machines.

Our Databricks Mosaic AI utilizes a distinctive data-focused approach to develop enterprise-grade Machine Learning and Generative AI solutions, allowing organizations to securely and cost-effectively manage and deploy models trained with their proprietary data. We're excited to expand our team in Bengaluru, India, where we are in the process of launching 14 new teams from scratch!

As a Senior Software Engineer at Databricks India, you will engage with various domains, including:
- Backend
- Distributed Data Systems (DDS)
- Full Stack Development

Your Impact:
1. As part of our Backend teams, you will tackle diverse challenges across our core service platforms, including:
   - Addressing intricate issues ranging from product development to infrastructure, focusing on distributed systems, large-scale service architecture, monitoring, workflow orchestration, and enhancing developer experience.
   - Delivering dependable, high-performance services and client libraries designed for storing and accessing vast amounts of data on cloud storage solutions such as AWS S3 and Azure Blob Store.
   - Creating robust, scalable services using technologies like Scala, Kubernetes, and Apache Spark™, supporting an infrastructure that handles millions of cluster-hours daily, while developing product features that empower customers to effortlessly manage and monitor their platform usage.
2. Our DDS team encompasses:
   - Apache Spark™
   - Data Plane Storage
   - Delta Lake
   - Delta Pipelines
   - Performance Engineering
3. As a Full Stack engineer, you will collaborate closely with your team and product management to deliver an outstanding user experience.
Collaboration Fuels Innovation. Join Roku: Redefining Television

Roku stands as the leading TV streaming platform in the U.S., Canada, and Mexico, with aspirations to empower every television globally. As pioneers in streaming technology, we aim to connect consumers with their cherished content, assist content publishers in expanding and monetizing their audiences, and offer advertisers unique tools to engage effectively with viewers. From day one at Roku, you will have the opportunity to make significant contributions. As a rapidly expanding public company, we foster an environment where every team member plays a vital role. You will have the chance to delight millions of TV streamers worldwide while acquiring valuable experience across diverse fields.

About Our Team
The Ad Revenue team plays a crucial role within Roku's Advertising organization, focusing on financial automation and data solutions that facilitate informed decision-making across the advertising landscape. We thrive in a dynamic and intricate environment, collaborating closely with Finance, Accounting, Analytics, and various engineering teams to provide timely, impactful insights into business performance. As we progress, we are investing in AI-driven capabilities that enable non-technical stakeholders to turn our diverse data assets into actionable results.

Role Overview
We are searching for a highly proficient Senior Software Engineer for a hybrid role that merges software and data engineering. This position demands the capability to design, build, and maintain scalable systems for both application development and extensive data processing. You will be responsible for architecting and overseeing production-grade data products and APIs, utilizing technologies such as Java/Scala, SQL, Spark, Airflow, and Kubernetes to deliver dependable, high-performance solutions.
The ideal candidate will have a documented history of building high-scale data services and pipelines, with a strong commitment to data quality and operational excellence.
P-1348

At Databricks, we are dedicated to empowering data teams to tackle some of the most challenging problems in the world, ranging from security threat detection to the development of cancer drugs. Our mission is to create and manage the leading data and AI infrastructure platform, allowing our customers to concentrate on the critical challenges central to their missions. Our engineering teams are committed to developing innovative technical products that meet real and significant needs globally. We continuously push the limits of data and AI technology while ensuring resilience, security, and scalability to enhance our customers' success on our platform.

We operate one of the largest-scale software platforms in existence, comprising millions of virtual machines that generate terabytes of logs and process exabytes of data on a daily basis. At this scale, we encounter cloud hardware, network, and operating system issues, and our software must effectively shield our customers from these challenges.

As a Senior Software Engineer on the Data Platform team, you will contribute to building the Data Intelligence Platform for Databricks, which aims to automate decision-making across the organization. You will collaborate closely with Databricks Product Teams, Data Science, Applied AI, and more. Your role will involve developing a range of tools for logging, orchestration, data transformation, metric storage, governance platforms, and data consumption layers. You will leverage the latest and most advanced Databricks products and other tools in the data ecosystem. Our team also serves as a substantial in-house customer, using Databricks to inform the future direction of our product.

Your Impact:
- Design and manage the Databricks metrics store, enabling all business units and engineering teams to consolidate and share detailed metrics on a common platform with high quality, introspection capabilities, and query performance.
- Lead the development of the cross-company Data Intelligence Platform, which encapsulates all business and product metrics essential for running Databricks. You will play a pivotal role in balancing data protection with ease of shareability as we transition to a public company.
- Create tools and infrastructure to efficiently manage and operate Databricks on Databricks at scale across multiple clouds, geographies, and deployment types. This includes CI/CD processes, testing frameworks for pipelines and data quality, and infrastructure-as-code tooling.
- Establish the foundational ETL framework utilized by all pipelines developed within the company.
- Collaborate with our engineering teams to provide...
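The "testing frameworks for pipelines and data quality" responsibility above can be illustrated with a minimal sketch of a data-quality gate, the kind of check a pipeline might run before publishing a table. All rule names and the sample rows are invented for illustration; this is not a Databricks API.

```python
# Hypothetical data-quality gate for a pipeline; rule names and sample
# data are illustrative assumptions, not any vendor's actual framework.

def check_not_null(rows, column):
    """Fail if any row has a null in the given column."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"rule": f"not_null({column})", "passed": not bad, "failing_rows": bad}

def check_unique(rows, column):
    """Fail if the column contains duplicate values."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return {"rule": f"unique({column})", "passed": not dupes, "duplicates": sorted(dupes)}

def run_quality_gate(rows, checks):
    """Run all checks; a pipeline would block publication if any fail."""
    results = [check(rows) for check in checks]
    return all(r["passed"] for r in results), results

rows = [
    {"account_id": "a1", "spend": 10.0},
    {"account_id": "a2", "spend": None},   # null spend should trip the gate
    {"account_id": "a2", "spend": 3.5},    # duplicate account_id too
]
ok, results = run_quality_gate(rows, [
    lambda r: check_not_null(r, "spend"),
    lambda r: check_unique(r, "account_id"),
])
print(ok)  # False: both checks fail on this sample
```

In a real deployment the same pattern would typically run inside CI/CD against staging tables, failing the pipeline run rather than printing a flag.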
Join Tekion as a Senior Data Platform Engineer, where you will play a crucial role in developing and optimizing our data platform. You will collaborate with cross-functional teams to build scalable data solutions that empower our business decisions and enhance user experience.
Acceldata is seeking a highly skilled Senior Software Engineer to join our dynamic team specializing in the Open Data Platform (ODP). In this role, you will be instrumental in designing, developing, and maintaining scalable software systems that power our innovative data solutions. You will collaborate with cross-functional teams to deliver high-quality products that exceed client expectations.
Collaborate to Innovate in Streaming. Join Roku's Vision to Revolutionize Television

As the leading TV streaming platform in the U.S., Canada, and Mexico, Roku is on a mission to power every television globally. We have transformed how people enjoy their favorite shows and movies by connecting consumers to the content they love, assisting publishers in reaching vast audiences, and offering advertisers unique tools to engage effectively. At Roku, your contributions will be recognized from day one. We are a rapidly expanding public company where every team member plays a pivotal role. This is your chance to impact millions of TV streamers worldwide while gaining valuable experience across diverse fields.

About Our Team
The Ad Data Activation organization is at the forefront of building a reliable, privacy-conscious, and scalable data foundation vital to Roku's advertising growth. We develop identity systems, device graph pipelines, audience platforms, and insights tools that empower precise targeting, measurement, and reporting across Roku Ads. We are seeking a Senior Machine Learning Engineer to enhance our systems' intelligence. You will operate at the intersection of large-scale data platforms, applied machine learning, and generative AI, developing functionalities that make Roku's advertising data actionable for both internal teams and advertisers. This role involves creating core generative AI platform features for Roku Advertising, alongside applying traditional ML techniques.
Discover Okta
Okta is the world's leading identity management company, empowering individuals to securely access any technology, anytime, on any device or application. Our versatile products, including the Okta Platform and Auth0 Platform, offer secure access, authentication, and automation, placing identity at the forefront of business security and growth. We value diverse perspectives and experiences at Okta. We seek lifelong learners who can contribute uniquely to our team rather than just looking for candidates who meet every qualification. Join us in creating a future where identity is truly in your hands.

About Okta
Okta provides an enterprise-grade identity management solution, designed from the ground up in the cloud with a steadfast commitment to customer success. With Okta, you can manage access across any application, person, or device, be it employees, partners, or customers, whether applications are in the cloud or on-premises. Our solutions enhance security, increase productivity, and ensure compliance. Our service features directory services, single sign-on, robust authentication, provisioning, workflow, and built-in reporting capabilities. It operates on a secure, reliable, and extensively audited cloud platform that deeply integrates with on-premises applications, directories, and identity management systems.

About the Team
The Data Platform team provides the foundational data services, systems, and products for Okta, significantly benefiting our users. Currently, the Data Platform team addresses challenges in and enables:
- Streaming analytics
- Interactive end-user reporting
- A data and machine learning platform for Okta's scalability
- Telemetry for our products and data

Our team is fast-paced, innovative, and adaptable. We promote ownership and hold high expectations for our engineers, rewarding them with exciting projects, cutting-edge technologies, and the opportunity to acquire significant equity in a transformative company. Okta is poised to redefine the landscape of cloud computing.

About the Position
This role presents an exciting opportunity for experienced Software Engineers to join our rapidly expanding Data Platform organization. We are committed to scaling high-volume, low-latency, distributed data services and products. As part of the Data Platform team, you will collaborate with engineers across the organization to construct the foundational infrastructure that will support Okta's growth for years to come.
Full-time|Remote|Bengaluru, India; EMEA Remote; Tel Aviv, Israel
At WEKA, we are pioneering a transformative approach to the enterprise data stack, designed for the era of reasoning. Our flagship product, NeuralMesh by WEKA, exemplifies the forefront of agentic AI data infrastructure, offering a cloud and AI-native software solution that is adaptable to any environment. This innovation converts traditional data silos into dynamic data pipelines, significantly boosting GPU utilization and improving the speed, efficiency, and energy footprint of AI model training, inference, and other high-compute workloads.

As a pre-IPO, growth-stage enterprise, WEKA is experiencing remarkable growth, having secured $375 million in funding from prominent venture capital and strategic investors. We collaborate with some of the largest and most innovative organizations worldwide, including 12 of the Fortune 50, to accelerate their discoveries, insights, and sustainable business outcomes. Our team is driven by a commitment to address our customers' most intricate data challenges and to foster intelligent innovation and business value. If this resonates with you, we welcome you to embark on this exciting journey with us.
Harness is a pioneering AI Software Delivery Platform, founded by the esteemed technologist and entrepreneur Jyoti Bansal, known for founding AppDynamics, which was acquired by Cisco for $3.7 billion. With approximately $570 million raised in funding, Harness is currently valued at $5.5 billion and is supported by prominent investors such as Goldman Sachs, Menlo Ventures, IVP, Unusual Ventures, and Citi Ventures.

As artificial intelligence accelerates code creation, the primary challenges have shifted to subsequent phases, including testing, deployments, application security, reliability, compliance, and cost optimization. Harness integrates AI and automation into this "outer loop," empowering teams to deliver software with enhanced speed while ensuring security and governance throughout the software delivery lifecycle. The Harness Platform, propelled by Harness AI and the Software Delivery Knowledge Graph, leverages deep contextual insights and intelligent automation across the software delivery lifecycle, incorporating governance and policy-driven controls seamlessly throughout the platform.

In the last year, Harness has facilitated over 185 million deployments, 82 million builds, 18 trillion flag evaluations, 8 million security scans, and 9.1 billion optimized tests, managing $2.8 billion in cloud expenditure. This has enabled esteemed clients such as United Airlines, Morningstar, and Choice Hotels to accelerate their release cycles by up to 75%, lower cloud costs by up to 60%, and achieve a tenfold increase in DevOps efficiency. With a diverse global team across 14 offices in 25 countries, Harness is at the forefront of revolutionizing AI software delivery, and we are looking for exceptional talent to help us advance even more rapidly.

Position Summary
The Unified Data Platform (UDP) is a robust data infrastructure solution that provides a common storage, ingestion, processing, and query layer for all product modules within the Harness ecosystem. The platform supports real-time and batch data processing, unified querying across diverse data sources, and offers a semantic layer for consistent data modeling across modules. It will also underpin several core Harness AI initiatives, including the Knowledge Graph and AI-powered dashboarding.

About the Role
- Engage in the design, architecture, and development of this platform alongside principal engineers and architects, tackling complex challenges such as declarative ingestion and processing frameworks and DSL-based query expressions, all while ensuring sub-second read latency and a sub-minute freshness SLA with integrated security and compliance.
- Mentor both senior and junior engineers through peer reviews (code and design) and intricate debugging sessions.
- Draft high-level and low-level design documents.
Join our innovative team at Tekion as a Staff Software Engineer – Data Platform Engineer, where your expertise will help shape the future of our data platform. You will be responsible for designing and implementing robust data solutions that drive business insights and enhance operational efficiency.
Join Acceldata as we transform data observability, enabling enterprises to manage and monitor their data effectively through innovative solutions tailored to meet distinct organizational needs. Our Open Data Platform (ODP) seamlessly integrates cutting-edge technologies to provide unparalleled data observability for modern enterprises.

About the Role:
We are seeking a talented Software Engineer to enhance and scale the Acceldata Open Data Platform (ODP), a robust, enterprise-grade, open-source platform designed for cloud, hybrid, and on-premises environments. You will tackle significant technological challenges, ranging from distributed systems to data observability, assisting global clients in modernizing their platforms without vendor lock-in. This position provides an exciting opportunity to develop state-of-the-art data solutions, influence the open-source landscape, and collaborate with industry leaders. Your contributions will leave a lasting impact on our data platform and the wider open-source community.
Teamwork makes the stream work. Join Roku and Transform the Future of TV Streaming!

As the leading TV streaming platform in the U.S., Canada, and Mexico, Roku is at the forefront of revolutionizing how audiences engage with television. Our goal is to power every TV worldwide, connecting viewers to their favorite content while empowering publishers and advertisers with innovative solutions. From day one, your contributions at Roku will be recognized and valued. We are a dynamic, growing public company where every team member plays a crucial role in delighting millions of viewers around the globe while acquiring invaluable experience across diverse fields.

About Our Big Data Team
Roku operates one of the largest data lakes globally, managing over 70 PB of data and executing more than 10 million queries each month. Our Big Data team is responsible for developing and maintaining the platform that makes this possible. We offer tools to acquire, generate, process, monitor, validate, and access data for both streaming and batch processing. Our technologies include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and more. The team actively contributes to the Open Source community and aims to expand its involvement.

Your Role
We are modernizing our Big Data Platform and need your expertise to redefine our architecture to enhance user experience, reduce costs, and boost efficiency. If you are passionate about Big Data technologies and eager to explore Open Source, this position is tailored for you!

Key Responsibilities
Optimize and fine-tune existing Big Data systems and pipelines, while also developing new ones to ensure they operate efficiently and cost-effectively.
About the Role
As a Software Engineer focused on Platform and Data Infrastructure, you will be pivotal in designing and maintaining the foundational elements that drive Galileo's platform. Your expertise will be vital in tackling complex systems challenges at scale, ensuring our infrastructure remains robust, efficient, and responsive. We are looking for a skilled engineer with hands-on experience building large-scale real-time infrastructure, crafting services and APIs capable of processing millions of queries, and addressing the unique challenges posed by high-scale systems. Familiarity with optimizing high-volume traffic across SQL and NoSQL databases, time-series databases, and object stores is essential.

What You'll Be Doing
- Design and scale core infrastructure by creating and optimizing distributed systems and APIs that can manage millions of real-time queries while maintaining low latency and high reliability.
- Develop data-rich systems by working with SQL, NoSQL, time-series, and object storage solutions, ensuring that data pipelines and retrieval processes are optimized for maximum throughput and efficiency.
- Enhance performance at scale by profiling and tuning systems for latency, throughput, and cost, ensuring the platform grows in alignment with customer demand.
- Build real-time serving systems by designing high-throughput caching layers and efficient data lookup services that provide swift, dependable access to extensive datasets.
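The high-throughput caching layer mentioned above can be sketched with a minimal read-through LRU cache in front of a slow keyed store. The store, capacity, and field names are assumptions for illustration; this is not Galileo's actual implementation.

```python
# Minimal read-through LRU cache sketch; backend, capacity, and keys are
# illustrative assumptions, not a production serving system.
from collections import OrderedDict

class ReadThroughCache:
    def __init__(self, backend_fetch, capacity=2):
        self._fetch = backend_fetch      # called only on a cache miss
        self._capacity = capacity
        self._entries = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self._entries:
            self._entries.move_to_end(key)   # mark as most recently used
            self.hits += 1
            return self._entries[key]
        self.misses += 1
        value = self._fetch(key)
        self._entries[key] = value
        if len(self._entries) > self._capacity:
            self._entries.popitem(last=False)  # evict least recently used
        return value

# Pretend backend: a "slow" keyed store.
store = {"u1": {"plan": "pro"}, "u2": {"plan": "free"}, "u3": {"plan": "pro"}}
cache = ReadThroughCache(store.__getitem__, capacity=2)

cache.get("u1"); cache.get("u2")   # two misses fill the cache
cache.get("u1")                    # hit
cache.get("u3")                    # miss; evicts u2 (least recently used)
print(cache.hits, cache.misses)    # 1 3
```

A production lookup service would add TTLs, concurrency control, and metrics, but the hit/miss/evict cycle shown here is the core of the pattern.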
At WEKA, we are redefining the enterprise data stack for the reasoning age. Our innovative solution, NeuralMesh by WEKA, stands at the forefront of agentic AI data infrastructure, providing a cloud and AI-native software solution deployable anywhere. We convert traditional data silos into dynamic data pipelines that significantly enhance GPU utilization, accelerating AI model training, inference, machine learning, and other resource-intensive tasks while being energy efficient.

As a pre-IPO, growth-stage company on a rapid growth path, WEKA has raised $375 million in funding from world-class venture capitalists and strategic investors. We partner with the world's most innovative enterprises and research organizations, including 12 of the Fortune 50, to facilitate faster and more sustainable discoveries, insights, and business outcomes. Our commitment is to tackle our customers' most intricate data challenges to foster intelligent innovation and drive business value. If you share our enthusiasm, we welcome you to embark on this exciting journey with us.
Join the Okta Family!

At Okta, we are revolutionizing identity management by empowering individuals and organizations to securely access any technology, anywhere, on any device. Our innovative platforms, including the Okta and Auth0 Platforms, are designed to enhance security, streamline authentication, and facilitate automation, placing identity at the forefront of business growth and security. We value diverse perspectives and backgrounds, seeking lifelong learners who can contribute their unique experiences to our mission. Be part of a team that is building a future where identity is truly yours.

About Okta
Okta is committed to making the world a more secure and interconnected place by enabling organizations to embrace any technology. As the leading independent identity provider for enterprises, we collaborate with a diverse range of clients, from major corporations to innovative startups, to ensure secure connections between people and technology. Our robust AI and data capabilities are pivotal to our growth and success, powering secure and scalable products for both our customer base and employees.

The Opportunity
We are looking for a Senior Data Engineer to join our Enterprise Data Platform team. Reporting to the Sr. Manager of Enterprise Data Platform, you will play a crucial role in developing the data infrastructure that enables our internal AI initiatives through AI-ready data. Your responsibilities will include transforming our data product vision into a secure, scalable, and sophisticated technical framework. The ideal candidate has a strong passion for creating high-quality data solutions. You will contribute to establishing engineering excellence within the team and ensuring our core data platform is reliable and ready to support AI-driven decision-making at Okta.

What You'll Do
- Design and maintain efficient data pipelines and models on our AI data platform, leveraging tools such as Snowflake, AWS, and dbt.
- Implement security and governance components to uphold compliance and security standards.
- Develop automated solutions for data classification and access control, and support vulnerability management processes.
- Use infrastructure as code (Terraform) and CI/CD (GitHub/GitLab) to build, test, and deploy data infrastructure with security and reliability.
- Promote and adhere to best practices in code quality, system design, and operational readiness.
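The automated data classification mentioned above can be sketched as a simple pass that tags columns whose names match known sensitive patterns, so access policies can be applied downstream. The pattern list and tag names are invented for illustration; a real deployment would map such tags to warehouse masking or access policies.

```python
# Hedged sketch of automated data classification; patterns and tag names
# are illustrative assumptions, not Okta's actual rules.
import re

SENSITIVE_PATTERNS = {
    "PII.EMAIL": re.compile(r"e[-_]?mail", re.I),
    "PII.PHONE": re.compile(r"phone|mobile", re.I),
    "PII.NAME":  re.compile(r"(first|last|full)[-_]?name", re.I),
}

def classify_columns(columns):
    """Return {column: [tags]} for columns matching a sensitive pattern."""
    tags = {}
    for col in columns:
        matched = [tag for tag, pat in SENSITIVE_PATTERNS.items() if pat.search(col)]
        if matched:
            tags[col] = matched
    return tags

schema = ["user_id", "email_address", "first_name", "login_count"]
tags = classify_columns(schema)
print(tags)
# {'email_address': ['PII.EMAIL'], 'first_name': ['PII.NAME']}
```

Name-based matching is only a first pass; production systems usually combine it with content sampling and human review before enforcing access control.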
Harness is an innovative AI Software Delivery Platform company founded by the visionary technologist and entrepreneur Jyoti Bansal, who previously established AppDynamics, acquired by Cisco for $3.7 billion. With approximately $570 million in funding, Harness is currently valued at $5.5 billion and supported by prestigious investors such as Goldman Sachs, Menlo Ventures, IVP, Unusual Ventures, and Citi Ventures.

As AI-driven code creation accelerates software development, the focus has shifted to the broader aspects of software delivery, including testing, deployments, application security, reliability, compliance, and cost optimization. Harness leverages AI and automation to streamline these processes, enabling teams to deliver software more swiftly while ensuring security and governance throughout the software delivery lifecycle.

In the past year, Harness has facilitated over 185 million deployments, 82 million builds, 18 trillion flag evaluations, 8 million security scans, 9.1 billion optimized tests, 3 trillion protected API calls, and managed $2.8 billion in cloud expenditure. Our solutions have empowered clients such as United Airlines, Morningstar, and Choice Hotels to amplify their release speeds by as much as 75%, cut cloud costs by up to 60%, and achieve a remarkable 10x increase in DevOps efficiency. With a diverse and global team across 14 offices spanning 25 countries, Harness is shaping the future of AI software delivery, and we are looking for exceptional talent to join our mission and propel us forward.

Position Summary
We are developing a cohesive data platform that will serve over 20 product modules, facilitating the ingestion, processing, and retrieval of data to power analytics and systems at scale. This position is integral to the Data Activation & Analytics initiative under the Unified Data Platform framework, concentrating on creating a self-service analytics platform equipped with a semantic modeling layer, a domain-specific language (DSL) query layer, and an analytics materialization layer. The platform will enable product teams to build custom dashboards independently and provide readily embeddable analytics modules.

About the Role
You will spearhead the design, architecture, and development of this platform alongside senior architects, tackling complex challenges related to DSL-based query expressions, sub-second read latency, multi-tenancy, high concurrency, and more.
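The semantic modeling and DSL query layers described above can be illustrated with a tiny translator that compiles a metric request into SQL against a semantic model. The grammar, model, and table names here are entirely invented to show the idea; the posting does not describe Harness's actual DSL.

```python
# Toy semantic layer: the model, dataset, and column names are hypothetical.
SEMANTIC_MODEL = {
    "deployments": {
        "table": "fact_deployments",
        "measures": {"count": "COUNT(*)"},
        "dimensions": {"service": "service_name", "day": "deploy_date"},
    },
}

def compile_query(dataset, measure, group_by):
    """Compile a (dataset, measure, dimension) request into SQL."""
    model = SEMANTIC_MODEL[dataset]
    dim_col = model["dimensions"][group_by]       # logical -> physical column
    measure_expr = model["measures"][measure]     # logical -> SQL aggregate
    return (f"SELECT {dim_col} AS {group_by}, {measure_expr} AS {measure} "
            f"FROM {model['table']} GROUP BY {dim_col}")

sql = compile_query("deployments", "count", group_by="service")
print(sql)
# SELECT service_name AS service, COUNT(*) AS count FROM fact_deployments GROUP BY service_name
```

The point of such a layer is that every module issues queries in logical terms, so physical schema changes are absorbed in one place rather than in 20 product modules.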
P-1348

Join Databricks, where our mission is to empower data teams to tackle the world's most challenging problems, from detecting security threats to advancing cancer drug development. We build and maintain the premier data and AI infrastructure platform, enabling our customers to focus on their critical missions. Our engineering teams are dedicated to creating innovative technical products that address significant needs while pushing the limits of data and AI technology. We operate with the resilience, security, and scalability necessary to ensure our customers succeed on our platform. Our platform operates at an unparalleled scale, comprising millions of virtual machines, generating terabytes of logs, and processing exabytes of data daily. We encounter various cloud hardware, network, and operating system faults, and our software must adeptly protect our customers from these challenges.

As a Staff Software Engineer on the Data Platform team, you will contribute to the development of the Data Intelligence Platform at Databricks, automating decision-making processes across the organization. Collaborating with Product Teams, Data Science, Applied AI, and more, you will create tools for logging, orchestration, data transformation, metric storage, governance platforms, and data consumption layers. Leveraging cutting-edge Databricks products and tools within the data ecosystem, your team will serve as a significant in-house customer, providing insights that shape our product's future.

Your Impact:
- Design and manage the Databricks metrics store, facilitating shared access to detailed metrics across business units and engineering teams with high quality and performance.
- Develop the cross-company Data Intelligence Platform, encompassing all business and product metrics necessary for running Databricks, balancing data protection with ease of sharing as we transition to a public entity.
- Create tools and infrastructure for efficiently managing Databricks operations at scale across multiple clouds and geographies, including CI/CD processes, testing frameworks for pipelines and data quality, and infrastructure-as-code tools.
- Establish the foundational ETL framework utilized by all company-developed pipelines.
- Collaborate with engineering teams to enhance...
About Us
Acceldata stands at the forefront of Enterprise Data Observability, having established itself as a leader since its inception in 2018. Based in Silicon Valley, we have pioneered the first Enterprise Data Observability Platform designed to facilitate the development and management of exceptional data products. Our approach to Enterprise Data Observability integrates cutting-edge technologies such as AI, LLMs, Analytics, and DataOps. Acceldata empowers organizations with vital capabilities that ensure the delivery of reliable and trustworthy data to fuel enterprise data products. As a SaaS solution, Acceldata's platform is trusted by a diverse range of global clients, including industry giants like HPE, HSBC, Visa, Freddie Mac, Manulife, Workday, Oracle, and many more. We are a Series-C funded company backed by top-tier investors including Insight Partners, March Capital, Lightspeed, and others.

About the Role:
We are looking for a highly skilled Senior Software Development Engineer in Test (SDET) to join our Open Data Platform (ODP) team, focusing on quality assurance and performance enhancement of large-scale data systems. In this position, you will collaborate closely with both development and operations teams to design and implement comprehensive testing strategies for the Open Source Data Platform (ODP), which encompasses technologies such as Hadoop, Spark, Hive, and Kafka. Your expertise will be vital in automating tests, fine-tuning performance, and pinpointing bottlenecks within distributed data systems. Key responsibilities include drafting test plans, developing automated test scripts, and executing functional, regression, and performance testing. You will play a critical role in identifying and rectifying defects, safeguarding data integrity, and optimizing testing methodologies. Strong teamwork and collaboration skills are essential, as you will engage with cross-functional teams and spearhead quality improvement initiatives. Your contributions will significantly impact the reliability and quality standards of big data solutions.

https://www.acceldata.io/open-data-platform
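The automated functional and regression testing described above can be sketched with a small example: asserting that a data transform behaves correctly on known inputs. The transform and test data are invented for illustration; a real SDET suite would run similar checks against Spark or Hive jobs, typically under pytest.

```python
# Hypothetical transform under test: collapse duplicate keys to the newest
# record. The function and sample events are illustrative assumptions.

def dedupe_latest(events):
    """Keep only the latest event per key, sorted by key for stable output."""
    latest = {}
    for e in events:
        k = e["key"]
        if k not in latest or e["ts"] > latest[k]["ts"]:
            latest[k] = e
    return sorted(latest.values(), key=lambda e: e["key"])

# Functional test: duplicates collapse to the newest record.
events = [
    {"key": "a", "ts": 1, "v": "old"},
    {"key": "a", "ts": 5, "v": "new"},
    {"key": "b", "ts": 2, "v": "only"},
]
result = dedupe_latest(events)
assert result == [{"key": "a", "ts": 5, "v": "new"},
                  {"key": "b", "ts": 2, "v": "only"}]

# Regression test: empty input must yield empty output, not an error.
assert dedupe_latest([]) == []
print("all checks passed")
```

The same shape scales up: generate known inputs, run the job, and assert on row counts, keys, and values, so data-integrity regressions surface before deployment.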
Role overview
The Senior Platform Engineer at Adyen will work on the core payment platform that supports the company's payment services. The position centers on both the design and ongoing development of this essential infrastructure.

What you will do
- Design and develop new features for Adyen's payment systems platform
- Maintain and enhance existing infrastructure to support reliability and scalability
- Work closely with fellow engineers to deliver secure, efficient solutions

Location
This role is located in Bengaluru.
Apr 28, 2026