Experience Level
Senior
Qualifications
Proven experience in software development with a strong understanding of data platforms. Extensive knowledge in programming languages such as Java, Python, or Scala. Experience with cloud technologies and data processing frameworks. Strong analytical and problem-solving skills. Excellent communication and teamwork abilities.
About the job
Acceldata is seeking a highly skilled Senior Software Engineer to join our dynamic team specializing in the Open Data Platform (ODP). In this role, you will be instrumental in designing, developing, and maintaining scalable software systems that power our innovative data solutions. You will collaborate with cross-functional teams to deliver high-quality products that exceed client expectations.
About Acceldata
Acceldata is at the forefront of data intelligence, helping organizations unlock the full potential of their data. With a commitment to innovation and excellence, we provide cutting-edge solutions that empower businesses to make data-driven decisions.
Similar jobs
Join Acceldata as we transform data observability, enabling enterprises to manage and monitor their data effectively through innovative solutions tailored to distinct organizational needs. Our Open Data Platform (ODP) seamlessly integrates cutting-edge technologies to provide unparalleled data observability for modern enterprises.

About the Role
We are seeking a talented Software Engineer to enhance and scale the Acceldata Open Data Platform (ODP), a robust, enterprise-grade, open-source platform designed for cloud, hybrid, and on-premises environments. You will tackle significant technological challenges, ranging from distributed systems to data observability, helping global clients modernize their platforms without vendor lock-in.

This position provides an exciting opportunity to develop state-of-the-art data solutions, influence the open-source landscape, and collaborate with industry leaders. Your contributions will leave a lasting impact on our data platform and the wider open-source community.
About Us
Acceldata stands at the forefront of Enterprise Data Observability, having established itself as a leader since its inception in 2018. Based in Silicon Valley, we pioneered the first Enterprise Data Observability Platform designed to facilitate the development and management of exceptional data products. Our approach integrates cutting-edge technologies such as AI, LLMs, Analytics, and DataOps, empowering organizations with the capabilities needed to deliver reliable and trustworthy data for enterprise data products. As a SaaS solution, Acceldata's platform is trusted by a diverse range of global clients, including industry giants like HPE, HSBC, Visa, Freddie Mac, Manulife, Workday, and Oracle. We are a Series-C funded company backed by top-tier investors including Insight Partners, March Capital, and Lightspeed.

About the Role
We are looking for a highly skilled Senior Software Development Engineer in Test (SDET) to join our Open Data Platform (ODP) team, focusing on the quality assurance and performance of large-scale data systems. In this position, you will collaborate closely with both development and operations teams to design and implement comprehensive testing strategies for ODP, which encompasses technologies such as Hadoop, Spark, Hive, and Kafka. Your expertise will be vital in automating tests, tuning performance, and pinpointing bottlenecks within distributed data systems. Key responsibilities include drafting test plans, developing automated test scripts, and executing functional, regression, and performance testing. You will play a critical role in identifying and fixing defects, safeguarding data integrity, and improving testing methodologies.

Strong teamwork and collaboration skills are essential, as you will engage with cross-functional teams and lead quality improvement initiatives. Your contributions will significantly impact the reliability and quality of big data solutions.

https://www.acceldata.io/open-data-platform
At Databricks, we are dedicated to empowering data teams to address some of the most challenging problems globally, from bringing innovative transportation solutions to life to accelerating groundbreaking medical advancements. Our mission is realized through the development and operation of the world's premier data and AI infrastructure platform, enabling our customers to harness profound data insights to elevate their businesses. Founded by engineers with a relentless focus on customer success, we eagerly embrace each opportunity to tackle technical challenges, whether it's designing next-generation UI/UX for data interactions or scaling our services across millions of virtual machines. Our journey has just begun.

As a Staff Software Engineer on the Data Platform team, you will harness state-of-the-art AI developer tools and techniques to build Databricks' Data Intelligence Platform, which will facilitate automated decision-making throughout the organization. This role involves close collaboration with Databricks Product Teams, Data Science, and other stakeholders. You will shape the future of tools including logging, orchestration, data transformation, metric stores, governance platforms, and data consumption layers.
Discover Okta
Okta is the world's leading identity management company, empowering individuals to securely access any technology, anytime, on any device or application. Our versatile products, including the Okta Platform and Auth0 Platform, offer secure access, authentication, and automation, placing identity at the forefront of business security and growth. We value diverse perspectives and experiences at Okta, and we seek lifelong learners who can contribute uniquely to our team rather than candidates who merely meet every qualification. Join us in creating a future where identity is truly in your hands.

About Okta
Okta provides an enterprise-grade identity management solution, designed from the ground up in the cloud with a steadfast commitment to customer success. With Okta, you can manage access across any application, person, or device, whether for employees, partners, or customers, and whether applications are in the cloud or on-premises. Our solutions enhance security, increase productivity, and ensure compliance. Our service features directory services, single sign-on, robust authentication, provisioning, workflow, and built-in reporting. It operates on a secure, reliable, and extensively audited cloud platform that deeply integrates with on-premises applications, directories, and identity management systems.

About the Team
The Data Platform team provides the foundational data services, systems, and products for Okta. Today, the team enables:
Streaming analytics
Interactive end-user reporting
A data and machine learning platform for Okta's scalability
Telemetry for our products and data

Our team is fast-paced, innovative, and adaptable. We promote ownership and hold high expectations for our engineers, rewarding them with exciting projects, cutting-edge technologies, and the opportunity to acquire significant equity in a transformative company. Okta is poised to redefine the landscape of cloud computing.

About the Position
This role presents an exciting opportunity for experienced Software Engineers to join our rapidly expanding Data Platform organization. We are committed to scaling high-volume, low-latency, distributed data services and products. As part of the Data Platform team, you will collaborate with engineers across the organization to build the foundational infrastructure that will support Okta's growth for years to come.
At Databricks, we are driven by our mission to empower data teams to tackle some of the most challenging issues facing the world today. From realizing the future of transportation to expediting medical innovations, we build and maintain the leading data and AI infrastructure platform, enabling our clients to harness deep data insights for business improvement. Founded by engineers with a relentless focus on customer satisfaction, we eagerly embrace every challenge, whether it's designing next-gen UI/UX for data interaction or scaling our services across millions of virtual machines.

Databricks Mosaic AI takes a distinctive data-focused approach to enterprise-grade Machine Learning and Generative AI, allowing organizations to securely and cost-effectively manage and deploy models trained with their proprietary data. We're excited to expand our team in Bengaluru, India, where we are launching 14 new teams from scratch.

As a Senior Software Engineer at Databricks India, you will work in one of several domains:
Backend
Distributed Data Systems (DDS)
Full Stack Development

Your Impact:
1. As part of our Backend teams, you will tackle diverse challenges across our core service platforms, including:
Addressing intricate issues ranging from product development to infrastructure, focusing on distributed systems, large-scale service architecture, monitoring, workflow orchestration, and developer experience.
Delivering dependable, high-performance services and client libraries for storing and accessing vast amounts of data on cloud storage such as AWS S3 and Azure Blob Store.
Creating robust, scalable services using technologies like Scala, Kubernetes, and Apache Spark™, supporting an infrastructure that handles millions of cluster-hours daily, while developing product features that let customers manage and monitor their platform usage.
2. Our DDS team encompasses:
Apache Spark™
Data Plane Storage
Delta Lake
Delta Pipelines
Performance Engineering
3. As a Full Stack engineer, you will collaborate closely with your team and product management to deliver an outstanding user experience.
Join our innovative team at Tekion as a Staff Software Engineer – Data Platform Engineer, where your expertise will help shape the future of our data platform. You will be responsible for designing and implementing robust data solutions that drive business insights and enhance operational efficiency.
About the Role
As a Software Engineer focused on Platform and Data Infrastructure, you will be pivotal in designing and maintaining the foundational elements that drive Galileo's platform. Your expertise will be vital in tackling complex systems challenges at scale, ensuring our infrastructure remains robust, efficient, and responsive. We are looking for an engineer with hands-on experience building large-scale real-time infrastructure, crafting services and APIs capable of processing millions of queries, and addressing the unique challenges posed by high-scale systems. Familiarity with optimizing high-volume traffic across SQL and NoSQL databases, time-series databases, and object stores is essential.

What You'll Be Doing
Design and scale core infrastructure by creating and optimizing distributed systems and APIs that can manage millions of real-time queries while maintaining low latency and high reliability.
Develop data-rich systems by working with SQL, NoSQL, time-series, and object storage solutions, ensuring that data pipelines and retrieval processes are optimized for throughput and efficiency.
Enhance performance at scale by profiling and tuning systems for latency, throughput, and cost, ensuring the platform grows with customer demand.
Build real-time serving systems by designing high-throughput caching layers and efficient data lookup services that provide swift, dependable access to extensive datasets.
Collaboration Fuels Innovation. Join Roku: Redefining Television
Roku is the leading TV streaming platform in the U.S., Canada, and Mexico, with aspirations to power every television globally. As pioneers in streaming technology, we aim to connect consumers with the content they love, help content publishers expand and monetize their audiences, and offer advertisers unique tools to engage effectively with viewers. From day one at Roku, you will have the opportunity to make significant contributions. As a rapidly expanding public company, we foster an environment where every team member plays a vital role. You will delight millions of TV streamers worldwide while gaining valuable experience across diverse fields.

About Our Team
The Ad Revenue team plays a crucial role within Roku's Advertising organization, focusing on financial automation and data solutions that support informed decision-making across the advertising landscape. We thrive in a dynamic, intricate environment, collaborating closely with Finance, Accounting, Analytics, and various engineering teams to provide timely, impactful insights into business performance. As we progress, we are investing in AI-driven capabilities that let non-technical stakeholders turn our diverse data assets into actionable results.

Role Overview
We are seeking a highly proficient Senior Software Engineer for a hybrid role that merges software and data engineering. This position demands the ability to design, build, and maintain scalable systems for both application development and large-scale data processing. You will architect and maintain production-grade data products and APIs, using technologies such as Java/Scala, SQL, Spark, Airflow, and Kubernetes to deliver dependable, high-performance solutions.
The ideal candidate will have a documented history of building high-scale data services and pipelines, with a strong commitment to data quality and operational excellence.
Harness is a pioneering AI Software Delivery Platform, founded by technologist and entrepreneur Jyoti Bansal, who previously founded AppDynamics, acquired by Cisco for $3.7 billion. With approximately $570 million raised in funding, Harness is currently valued at $5.5 billion and backed by investors including Goldman Sachs, Menlo Ventures, IVP, Unusual Ventures, and Citi Ventures. As artificial intelligence accelerates code creation, the primary challenges have shifted to subsequent phases: testing, deployments, application security, reliability, compliance, and cost optimization. Harness integrates AI and automation into this "outer loop," empowering teams to deliver software faster while ensuring security and governance throughout the software delivery lifecycle.

The Harness Platform, powered by Harness AI and the Software Delivery Knowledge Graph, applies deep contextual insights and intelligent automation across the software delivery lifecycle, with governance and policy-driven controls woven throughout. In the last year, Harness has facilitated over 185 million deployments, 82 million builds, 18 trillion flag evaluations, 8 million security scans, and 9.1 billion optimized tests, managing $2.8 billion in cloud expenditure. This has enabled clients such as United Airlines, Morningstar, and Choice Hotels to accelerate their release cycles by up to 75%, lower cloud costs by up to 60%, and achieve a tenfold increase in DevOps efficiency. With a diverse global team across 14 offices in 25 countries, Harness is at the forefront of AI software delivery, and we are looking for exceptional talent to help us move even faster.

Position Summary
The Unified Data Platform (UDP) is a data infrastructure solution that provides a common storage, ingestion, processing, and query layer for all product modules in the Harness ecosystem. The platform supports real-time and batch data processing and unified querying across diverse data sources, and offers a semantic layer for consistent data modeling across modules. It will also underpin several core Harness AI initiatives, including the Knowledge Graph and AI-powered dashboarding.

About the Role
Engage in the design, architecture, and development of this platform alongside principal engineers and architects, tackling complex challenges such as declarative ingestion and processing frameworks and DSL-based query expressions, while ensuring sub-second read latency and sub-minute freshness SLAs with integrated security and compliance.
Mentor senior and junior engineers through peer reviews (code and design) and intricate debugging.
Draft high-level and low-level design documents.
Harness is an innovative AI Software Delivery Platform company founded by technologist and entrepreneur Jyoti Bansal, who previously established AppDynamics, acquired by Cisco for $3.7 billion. With approximately $570 million in funding, Harness is currently valued at $5.5 billion and backed by investors such as Goldman Sachs, Menlo Ventures, IVP, Unusual Ventures, and Citi Ventures. As AI-driven code creation accelerates, the focus has shifted to the broader aspects of software delivery, including testing, deployments, application security, reliability, compliance, and cost optimization. Harness leverages AI and automation to streamline these processes, enabling teams to deliver software more swiftly while ensuring security and governance throughout the software delivery lifecycle.

In the past year, Harness has facilitated over 185 million deployments, 82 million builds, 18 trillion flag evaluations, 8 million security scans, 9.1 billion optimized tests, and 3 trillion protected API calls, and managed $2.8 billion in cloud expenditure. Our solutions have empowered clients such as United Airlines, Morningstar, and Choice Hotels to increase their release speeds by as much as 75%, cut cloud costs by up to 60%, and achieve a 10x increase in DevOps efficiency. With a diverse, global team across 14 offices spanning 25 countries, Harness is shaping the future of AI software delivery, and we are looking for exceptional talent to join our mission.

Position Summary
We are developing a cohesive data platform that will serve over 20 product modules, handling the ingestion, processing, and retrieval of data to power analytics and systems at scale. This position is part of the Data Activation & Analytics initiative under the Unified Data Platform, focused on building a self-service analytics platform with a semantic modeling layer, a domain-specific language (DSL) query layer, and an analytics materialization layer. The platform will let product teams build custom dashboards independently and provide readily embeddable analytics modules.

About the Role
You will lead the design, architecture, and development of this platform alongside senior architects, tackling complex challenges related to DSL-based query expressions, sub-second read latency, multi-tenancy, high concurrency, and more.
P-1348
Join Databricks, where our mission is to empower data teams to tackle the world's most challenging problems, from detecting security threats to advancing cancer drug development. We build and maintain the premier data and AI infrastructure platform, enabling our customers to focus on their critical missions. Our engineering teams create innovative technical products that address significant needs while pushing the limits of data and AI technology, with the resilience, security, and scalability necessary for our customers to succeed on our platform. Our platform operates at an unparalleled scale, comprising millions of virtual machines, generating terabytes of logs, and processing exabytes of data daily. We encounter cloud hardware, network, and operating system faults, and our software must adeptly protect our customers from these failures.

As a Staff Software Engineer on the Data Platform team, you will contribute to the development of the Data Intelligence Platform at Databricks, automating decision-making processes across the organization. Collaborating with Product Teams, Data Science, Applied AI, and more, you will create tools for logging, orchestration, data transformation, metric storage, governance platforms, and data consumption layers. Leveraging cutting-edge Databricks products and tools within the data ecosystem, your team will serve as a significant in-house customer, providing insights that shape our product's future.

Your Impact:
Design and manage the Databricks metrics store, facilitating shared access to detailed metrics across business units and engineering teams with high quality and performance.
Develop the cross-company Data Intelligence Platform, encompassing all business and product metrics necessary for running Databricks, balancing data protection with ease of sharing as we transition to a public entity.
Create tools and infrastructure for efficiently managing Databricks operations at scale across multiple clouds and geographies, including CI/CD processes, testing frameworks for pipelines and data quality, and infrastructure-as-code tools.
Establish the foundational ETL framework used by all company-developed pipelines.
Collaborate with engineering teams to enhance...
P-1348
At Databricks, we are dedicated to empowering data teams to tackle some of the most challenging problems in the world, ranging from security threat detection to the development of cancer drugs. Our mission is to create and manage the leading data and AI infrastructure platform, allowing our customers to concentrate on the critical challenges central to their missions. Our engineering teams develop innovative technical products that meet real and significant needs, continuously pushing the limits of data and AI technology while ensuring resilience, security, and scalability. We operate one of the largest-scale software platforms in existence, comprising millions of virtual machines that generate terabytes of logs and process exabytes of data daily. At this scale, we encounter cloud hardware, network, and operating system issues, and our software must effectively shield our customers from these failures.

As a Senior Software Engineer on the Data Platform team, you will help build the Data Intelligence Platform for Databricks, which aims to automate decision-making across the organization. You will collaborate closely with Databricks Product Teams, Data Science, Applied AI, and more, developing tools for logging, orchestration, data transformation, metric storage, governance platforms, and data consumption layers. You will leverage the latest and most advanced Databricks products and other tools in the data ecosystem. Our team also serves as a substantial in-house customer, using Databricks to inform the future direction of our product.

Your Impact:
Design and manage the Databricks metrics store, enabling all business units and engineering teams to consolidate and share detailed metrics on a common platform with high quality, introspection capabilities, and query performance.
Lead the development of the cross-company Data Intelligence Platform, which encapsulates all business and product metrics essential for running Databricks, balancing data protection with ease of shareability as we transition to a public company.
Create tools and infrastructure to efficiently manage and operate Databricks on Databricks at scale across multiple clouds, geographies, and deployment types, including CI/CD processes, testing frameworks for pipelines and data quality, and infrastructure-as-code tooling.
Establish the foundational ETL framework used by all pipelines developed within the company.
Collaborate with our engineering teams to provide...
About Us
At Rox, our mission is to empower individuals to excel in their work. Our platform equips sellers with autonomous revenue agents, allowing them to concentrate on what they do best: selling. Just as coding agents transformed engineering, our revenue agents transform customer interactions. We are pioneering the revenue stack by developing the world's first revenue operating system, covering everything from the application layer to the context system. With Rox, humans move into orchestration roles while agents handle the complete customer lifecycle. Rox serves Global 2000 leaders across banking, hardware, construction, and advanced AI sectors, while also supporting top-tier AI innovators such as Ramp and Cognition. Our success is grounded in a shared belief in our mission, paired with an unwavering dedication to making it a reality.

The Team
Our world-class team is the driving force behind our mission to redefine business operations. Our team members have:
Founded and successfully exited companies
Held top positions at Google, AWS, Confluent, and New Relic
Achieved gold medals in the IMO and IOI
Published groundbreaking research papers

We have raised $50M from investors including Sequoia (Alfred Lin), General Catalyst (Hemant Taneja), Google Ventures, Elad Gil, and Chris Ré.

Core Principles
Taste: Craft beautiful experiences. We strive for perfection in every detail, ensuring that every interaction aids sellers in their tasks. We are committed to continuous improvement and innovation.
Obsession: Commit unreasonably. Our commitment to our craft is unwavering. We proactively drive value and are dedicated to learning and improving daily.
Action: Get it done. Execution is key. We prioritize thoughtful yet prompt decision-making and swift delivery, building trust through our actions.
Join Tekion as a Senior Data Platform Engineer, where you will play a crucial role in developing and optimizing our data platform. You will collaborate with cross-functional teams to build scalable data solutions that empower our business decisions and enhance user experience.
Collaborate to Innovate in Streaming. Join Roku's Vision to Revolutionize Television
As the leading TV streaming platform in the U.S., Canada, and Mexico, Roku is on a mission to power every television globally. We have transformed how people enjoy their favorite shows and movies by connecting consumers to the content they love, helping publishers reach vast audiences, and offering advertisers unique tools to engage effectively. At Roku, your contributions will be recognized from day one. We are a rapidly expanding public company where every team member plays a pivotal role. This is your chance to impact millions of TV streamers worldwide while gaining valuable experience across diverse fields.

About Our Team
The Ad Data Activation organization builds the reliable, privacy-conscious, and scalable data foundation vital to Roku's advertising growth. We develop identity systems, device-graph pipelines, audience platforms, and insights tools that enable precise targeting, measurement, and reporting across Roku Ads. We are seeking a Senior Machine Learning Engineer to enhance our systems' intelligence. You will operate at the intersection of large-scale data platforms, applied machine learning, and generative AI, developing functionality that makes Roku's advertising data actionable for both internal teams and advertisers. This role involves creating core generative AI platform features for Roku Advertising, alongside applying traditional ML techniques.
Harness is revolutionizing the software delivery landscape with its AI-driven platform, led by technologist and entrepreneur Jyoti Bansal, the founder of AppDynamics, which was acquired by Cisco for $3.7 billion. With approximately $570 million raised in funding and a valuation of $5.5 billion, Harness is backed by investors including Goldman Sachs, Menlo Ventures, IVP, Unusual Ventures, and Citi Ventures. As AI accelerates code generation, the challenges have moved beyond coding to testing, deployments, application security, reliability, compliance, and cost optimization. Harness integrates AI and automation into this "outer loop," empowering teams to deliver software more rapidly while ensuring security and governance throughout the software delivery lifecycle.

Using Harness AI and the Software Delivery Knowledge Graph, the Harness Platform embeds intelligent automation and contextual insights across the software delivery lifecycle, with governance and policy-driven controls integral to the platform. In the past year alone, Harness has facilitated over 185 million deployments, 82 million builds, 18 trillion flag evaluations, 8 million security scans, 9.1 billion optimized tests, and 3 trillion protected API calls, and managed $2.8 billion in cloud expenditure. Our solutions have enabled clients such as United Airlines, Morningstar, and Choice Hotels to accelerate their release cycles by up to 75%, decrease cloud costs by up to 60%, and achieve a tenfold increase in DevOps efficiency. With a global presence spanning 14 offices and 25 countries, Harness is seeking exceptional talent to join our mission.
Harness, a leader in AI-driven software delivery, was founded by technologist Jyoti Bansal, who also founded AppDynamics, acquired by Cisco for $3.7B. With approximately $570M raised in funding and a valuation of $5.5B, Harness is backed by top-tier investors such as Goldman Sachs, Menlo Ventures, IVP, Unusual Ventures, and Citi Ventures. As AI transforms code creation, the focus has shifted to optimizing the processes beyond code, including testing, deployments, application security, reliability, compliance, and cost efficiency. Harness leverages AI and automation to enhance this "outer loop," enabling teams to accelerate software delivery without compromising security or governance throughout the software lifecycle. Using Harness AI and the Software Delivery Knowledge Graph, our platform integrates intelligent automation and governance controls throughout the software delivery process.

In the past year alone, Harness facilitated over 185M deployments and 82M builds, conducted 8M security scans, and managed $2.8B in cloud expenditure, empowering clients like United Airlines and Choice Hotels to boost release speeds by up to 75% and reduce cloud costs by up to 60% while achieving tenfold improvements in DevOps efficiency. With a global footprint across 14 offices in 25 countries, Harness is redefining the future of AI software delivery, and we are eager to welcome exceptional talent to our journey.

Position Summary
As an Engineering Manager under the AppSec Platform charter, you will design and develop a multi-cloud, portable, open-source, streaming-first data platform that powers our API observability and security products. Your team will also manage common services in the authentication and authorization domains. This platform is the core pillar for customer-facing analytics and consumption layers for product modules including Catalog, Runtime Protection, and Application/AI Security.
You will lead and cultivate a team of 8-10 engineers, driving execution, fostering innovation, and delivering high-impact platform capabilities. Additionally, you will enhance operational excellence through monitoring, reliability improvements, and cost optimization of large-scale data systems, all while contributing to our mission of becoming a global leader in the AppSec domain.
Full-time | Remote | Bengaluru, India; EMEA Remote; Tel Aviv, Israel
At WEKA, we are pioneering a transformative approach to the enterprise data stack, designed for the era of reasoning. Our flagship product, NeuralMesh by WEKA, is at the forefront of agentic AI data infrastructure: a cloud- and AI-native software solution adaptable to any environment. It converts traditional data silos into dynamic data pipelines, significantly boosting GPU utilization and improving the speed, efficiency, and energy consumption of AI model training, inference, and other high-compute workloads.

As a pre-IPO, growth-stage enterprise, WEKA is growing rapidly, having secured $375 million in funding from prominent venture capital and strategic investors. We work with some of the largest and most innovative organizations worldwide, including 12 of the Fortune 50, to accelerate their discoveries, insights, and sustainable business outcomes. Our team is driven by a commitment to solve our customers' most intricate data challenges and to foster intelligent innovation and business value. If this resonates with you, we welcome you to join us on this journey.
Role overview
Quince seeks a Staff Data Engineer in Bengaluru, Karnataka, India. This position centers on building and maintaining the core infrastructure behind the company's data platform. The work directly supports major data initiatives and helps drive informed decisions throughout the business.

What you will do
Design and develop infrastructure that powers the data platform
Maintain and improve systems supporting data needs across the organization
Collaborate with other teams to strengthen the broader data ecosystem

Requirements
Solid background in data engineering
Experience architecting and developing data infrastructure
Comfort working collaboratively to address challenges
Motivation to use data for meaningful solutions
Appreciation for innovation and ongoing improvement
Apr 27, 2026