Experience Level
Mid to Senior
Qualifications
- Proven experience in Python development with a focus on Azure services.
- Strong background in data management and analytics.
- Excellent leadership and team collaboration skills.
- Ability to design scalable and efficient data architectures.
About the job
Join the Bosch Group as a Lead Python Developer specializing in Azure and Data Management. In this pivotal role, you will lead a team of talented developers, architecting and implementing innovative data solutions on the Azure platform. Your expertise will drive data management strategies that enhance business operations and deliver impactful insights.
About Bosch Group
The Bosch Group is a leading global supplier of technology and services, committed to creating innovative solutions that enhance the quality of life. Our teams are driven by a passion for technology and a dedication to sustainability. Join us in shaping the future of intelligent solutions and making a positive impact.
Join our dynamic team as an Azure Big Data Engineer at Bosch Group in Bengaluru. In this role, you will be responsible for designing, implementing, and maintaining big data solutions using Azure technologies. You will work closely with data scientists and analysts to ensure data is accessible, reliable, and secure. Your contribution will aid in driving data-driven decisions across the organization.
Join our dynamic team at Betsol as an Azure Data Engineer, where you will play a crucial role in designing, implementing, and maintaining data solutions on the Azure cloud platform. Your expertise will help drive our data strategy and ensure that our data architecture supports our business goals.
T-Systems Information and Communication Technology India Private Limited
Full-time|On-site|Bengaluru
We are seeking an experienced Senior Data Engineer with a strong background in DevOps and Data Engineering technologies. The ideal candidate will have 7 to 12 years of relevant experience and will be responsible for designing and implementing scalable data solutions.

Key Responsibilities:
- Develop, maintain, and optimize data pipelines and architecture.
- Leverage DevOps principles to automate processes and enhance data workflows.
- Collaborate with cross-functional teams to deliver high-quality solutions.
Join the team as an Azure Data Engineer for one of Weekday's esteemed clients! With a minimum of 3 years of experience, you will play a pivotal role in designing, constructing, and managing scalable data solutions on Microsoft Azure.

As an Azure Data Engineer, you will develop and optimize data pipelines and workflows to bolster business intelligence, analytics, and data science initiatives. Your expertise in Azure-native tools and services will ensure the delivery of robust, secure, and high-performance data solutions.

Key responsibilities include building and maintaining ETL/ELT pipelines using Azure Data Factory, enabling efficient data ingestion from diverse structured and unstructured sources. You will utilize Azure Data Lake Storage and Azure Blob Storage for managing extensive datasets while ensuring data organization and governance. Furthermore, you will leverage Azure Synapse Analytics or Azure SQL Database to transform and model data for downstream applications, collaborating closely with data analysts, scientists, and cross-functional teams to address business requirements and deliver dependable data solutions.

In this role, maintaining data quality, integrity, and security through best practices and governance policies will be essential. Monitoring and troubleshooting data pipelines, optimizing performance, and ensuring high availability will be integral to your tasks.
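The ingest, transform, and load pattern this role revolves around can be sketched in plain Python. This is an illustrative stand-in, not the Azure Data Factory or Synapse API; every name below is hypothetical.

```python
# Illustrative ETL sketch of the ingest -> transform -> load pattern.
# Plain-Python stand-ins are used instead of Azure SDK calls.

def extract(rows):
    """Ingest raw records from a source, dropping empty ones."""
    return [r for r in rows if r]

def transform(rows):
    """Normalize and model records for downstream analytics."""
    return [
        {"user": r["user"].strip().lower(), "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows, sink):
    """Write modeled records to an analytical store (a list, in this sketch)."""
    sink.extend(rows)
    return len(rows)

raw = [{"user": " Alice ", "amount": "10.5"}, {}, {"user": "BOB", "amount": "3"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)                  # -> 2 (empty record was dropped)
print(warehouse[0]["user"])    # -> alice
```

In a real Azure pipeline, each stage would be a pipeline activity with retry and monitoring around it; the shape of the data flow is the same.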
Chief Big Data Architect
Bengaluru, India

Position Overview
Huawei is a recognized leader in both fixed and mobile network technologies. As the industry shifts toward a unified CT & IT networking paradigm, we are enhancing our R&D initiatives to develop a scalable, multi-tenant computing and communication infrastructure platform, specifically designed to meet the diverse requirements of our CT & IT clientele.

We are seeking talented architects to join our Big Data team at the Huawei India R&D center, where you will play a pivotal role in advancing our big data platform solutions. If you have a passion for big data storage technologies, such as distributed file systems and NoSQL data stores, alongside experience with open-source projects like HDFS and HBase, this is an opportunity to lead architectural design, tackle complex technical challenges, and contribute to innovative solutions at Huawei.

Key Responsibilities:
Collaborate closely with the Product/Platform team in China and the R&D team in the USA to:
- Define short-term and long-term business roadmaps for Big Data, addressing customer pain points and delivering competitive products.
- Architect and design the next generation of storage engines that are secure, reliable, and scalable.
- Analyze challenges in big data storage and innovate solutions for distributed storage engines through POCs and partnerships.
- Rapidly iterate on new designs and technologies, delivering prototypes and releases.

Qualifications:
- 15+ years of relevant experience in big data and distributed systems.
- Strong analytical and problem-solving skills.
- Bachelor's degree or higher in Computer Science or a related field.
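The storage engines mentioned above build on log-structured designs like the one underlying HBase. As a rough intuition (a toy sketch, not Huawei's or HBase's actual code), writes land in an in-memory sorted structure and are flushed to immutable sorted runs:

```python
# Toy sketch of a log-structured (LSM-style) store: writes go to an
# in-memory "memtable" and are flushed to immutable sorted runs, the core
# idea behind stores like HBase. Illustrative only, not production code.
import bisect

class ToyLSM:
    def __init__(self, flush_threshold=2):
        self.memtable = {}                 # recent writes
        self.runs = []                     # flushed, immutable sorted runs
        self.flush_threshold = flush_threshold

    def put(self, key, value):
        self.memtable[key] = value
        if len(self.memtable) >= self.flush_threshold:
            self.runs.append(sorted(self.memtable.items()))  # flush to a run
            self.memtable = {}

    def get(self, key):
        if key in self.memtable:           # newest data wins
            return self.memtable[key]
        for run in reversed(self.runs):    # then search newest run first
            i = bisect.bisect_left(run, (key,))
            if i < len(run) and run[i][0] == key:
                return run[i][1]
        return None

db = ToyLSM()
db.put("a", 1); db.put("b", 2)             # second put triggers a flush
db.put("a", 3)                             # newer value stays in the memtable
print(db.get("a"))                         # -> 3
print(db.get("b"))                         # -> 2
```

Real engines add write-ahead logging, bloom filters, and background compaction of runs; the read path shown (memtable first, then runs newest to oldest) is the part that makes updates cheap.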
SanDisk is seeking a Data Engineer in Bengaluru to work with the Azure Cloud Platform. This role centers on building and maintaining scalable data pipelines and data architecture that enable the company's broader data efforts. The focus is on transforming raw information into insights that support business decisions.

Key responsibilities:
- Design, develop, and maintain data pipelines using Azure services
- Build and support data architecture for analytics and reporting needs
- Collaborate with team members to turn raw data into actionable information

Location: This position is based in Bengaluru.
Teamwork makes the stream work. Join Roku and Transform the Future of TV Streaming!

As the leading TV streaming platform in the U.S., Canada, and Mexico, Roku is at the forefront of revolutionizing how audiences engage with television. Our goal is to power every TV worldwide, connecting viewers to their favorite content while empowering publishers and advertisers with innovative solutions. From day one, your contributions at Roku will be recognized and valued. We are a dynamic, growing public company where every team member plays a crucial role in delighting millions of viewers around the globe while acquiring invaluable experience across diverse fields.

About Our Big Data Team
Roku operates one of the largest data lakes globally, managing over 70 PB of data and executing more than 10 million queries each month. Our Big Data team is responsible for developing and maintaining the platform that makes this possible. We offer tools to acquire, generate, process, monitor, validate, and access data for both streaming and batch processing. Our technologies include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and more. The team actively contributes to the Open Source community and aims to expand its involvement.

Your Role
We are modernizing our Big Data Platform and need your expertise to redefine our architecture to enhance user experience, reduce costs, and boost efficiency. If you are passionate about Big Data technologies and eager to explore Open Source, this position is tailored for you!

Key Responsibilities
Optimize and fine-tune existing Big Data systems and pipelines, while also developing new ones to ensure they operate efficiently and cost-effectively.
Full-time|₹500K/yr - ₹2M/yr|On-site|Bengaluru, Karnataka, India
Role overview
Weekday's Client seeks a Big Data Developer in Bengaluru to improve and maintain data pipelines and processing systems that drive business intelligence and analytics. The position works with large volumes of both structured and unstructured data, spanning cloud and on-premise environments. Collaboration with data engineers, analysts, and product teams is central to delivering reliable, high-performance solutions that support business decisions.

What you will do
- Design, develop, and maintain scalable data pipelines and big data processing systems.
- Build and optimize data architectures using AWS services to increase availability and performance.
- Use PostgreSQL for data storage, querying, and performance tuning.
- Process and analyze large datasets to enable analytics and reporting.
- Work with cross-functional teams to gather requirements and deliver data solutions.
- Maintain data quality, integrity, and security across all systems.
- Optimize data workflows for better performance, scalability, and cost efficiency.
- Implement ETL and ELT processes for smooth data ingestion and transformation.
- Monitor and troubleshoot data pipelines to ensure reliability and uptime.
- Integrate cloud and on-premise data systems to support hybrid environments.
- Document data architecture, workflows, and best practices for future growth.

Location
This role is based in Bengaluru, Karnataka, India.
CSQ127R179

Mission
The Manager of Technical Solutions in Data & AI will spearhead the development and expansion of a specialized team in India, dedicated to ensuring the resilience and seamless operation of customer production workloads. This leadership position involves overseeing support operations during APJ and EMEA business hours while closely collaborating with global teams to guarantee 24/7 support coverage. The team will tackle intricate and protracted data engineering challenges presented by Databricks clients, facilitating the success of real-time use cases. Responsibilities include performance optimization, ensuring production job reliability, and assisting customers in stabilizing workloads on innovative products and features. You will deeply understand the genuine business challenges our clients face with data and be committed to enhancing the reliability and performance of their systems to achieve their objectives.

The Impact You Will Have:
- Lead as the Technical Solutions Manager for an elite team of Data & AI professionals, providing support across EMEA and APJ business hours.
- Enhance the technical proficiency of the team to ensure successful adoption of new Databricks platform products and features for customer production workloads.
- Engage with top-tier clients to comprehend their business needs pertaining to Data & AI strategy, collaborating with field engineering and sales as necessary.
- Partner with internal product engineering teams to improve the supportability and performance of Databricks products.
- Maintain high reliability of the Databricks platform to help customers achieve their business goals.
Position: Big Data Lead & Developer
Experience: 8-10 years
Location: Bengaluru
Industry: Telecommunications
Employment Type: Full-time
Visa Sponsorship: Available for on-site client work

Core Competency: In-depth knowledge of the data management landscape with a strong grasp of core Big Data design patterns and challenges in data analytics, modeling, quality improvement, and data management implementation. Proficient in technologies such as Hadoop, Spark, Amazon Web Services, Google Cloud Platform, and Microsoft Azure.

Position Overview: The Big Data Lead will spearhead the design and development of scalable and distributed applications utilizing the Hadoop Technology Stack, including Apache Pig, Apache Hive, and HDFS.

Key Responsibilities:
- Develop cutting-edge ICT big data solutions that integrate existing systems with new data analytics strategies.
- Design, test, deploy, and document Big Data platforms and analytical procedures.
- Deliver scalable and distributed applications utilizing Hadoop technologies.
- Provide efficient solutions based on project requirements.
- Implement Map/Reduce jobs and UDFs, and tune the performance of Hadoop jobs.
- Collaborate with data scientists, analysts, and product management to summarize data analysis results for business decision-making.
- Document functional processes within data quality applications.
- Research and assess the suitability of various technologies to advise on optimal project solutions.
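The Map/Reduce work this role involves follows the classic two-phase pattern, which can be illustrated with a Hadoop-streaming-style word count simulated locally in plain Python (an illustrative sketch; in real Hadoop the framework performs the shuffle and sort between the phases):

```python
# Hadoop-streaming-style Map/Reduce word count, simulated locally.
# A plain sort() stands in for Hadoop's shuffle/sort phase.
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Map phase: emit (word, 1) for every word in a line of input."""
    for word in line.lower().split():
        yield (word, 1)

def reducer(pairs):
    """Reduce phase: sum counts per key from shuffle-sorted pairs."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

lines = ["big data big plans", "data pipelines"]
pairs = [kv for line in lines for kv in mapper(line)]
counts = dict(reducer(pairs))
print(counts["big"])   # -> 2
print(counts["data"])  # -> 2
```

Tuning the real thing, as the responsibilities above note, is mostly about the parts this sketch hides: partitioning, combiner use, and the cost of the shuffle.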
Join Huawei Technologies as a Big Data Automation Tester!
Location: Bengaluru
Employment Type: 2-Year Contract (Under Third-Party Payroll); may be renewable after two years based on performance
Experience Required: 3+ Years
Technical Skills Required: C, Python, Linux Kernel, Scripting, Big Data

Job Summary:
We are seeking a dedicated Automation Tester passionate about fixing and debugging software issues. This full-time contractor position will involve testing our advanced suite of solutions for the ARM64 platform, a cutting-edge technology in the industry.

Your Responsibilities:
- Conduct comprehensive testing of the ARM64-based solution, ensuring the identification and verification of various use cases for data center applications.
- Develop automated test cases and scripts, assist in building test environments, and consolidate reporting for test results.
- Explore and apply various testing methodologies; depending on performance, there may be opportunities to engage with open-source projects.

What We're Looking For:
- Experience in testing or benchmarking server applications, including databases, distributed file systems, Big Data, and virtualization.
- A strong understanding of use case testing for server and data center applications.
- Proficient automation skills in scripting languages such as C, Python, or other testing programming languages.
- Experience with ARM or hardware-based application testing is an added advantage.
Teamwork makes the stream work. Roku is revolutionizing the way the world experiences television.

As the leading TV streaming platform in the U.S., Canada, and Mexico, Roku aims to empower every television globally. With a pioneering spirit, we connect users to their favorite content, support content creators in growing their audiences, and provide advertisers with innovative ways to engage viewers. From your first day at Roku, you'll be an integral part of our journey. As a fast-growing public company, everyone here plays a crucial role. Join us in delighting millions of TV streamers worldwide while gaining invaluable experience across various disciplines.

About the Team
The Roku Data Engineering team is dedicated to building a state-of-the-art big data platform that empowers both internal and external stakeholders to leverage data for business growth. Our team collaborates closely with business partners and engineering teams to gather metrics on essential initiatives for success. As a Senior Data Engineer focused on device metrics, you will design data models and develop scalable data pipelines to capture critical business metrics across our diverse range of Roku devices.

About the Role
At Roku, we connect users to the streaming content they love and enable content publishers to monetize large audiences while providing advertisers with unique capabilities to engage consumers. Our Roku streaming players and Roku TV™ models are available worldwide through direct retail sales and partnerships with TV brands and pay-TV operators. With millions of devices sold in numerous countries, thousands of streaming channels, and billions of hours of content consumed, the development of a scalable, highly available, fault-tolerant big data platform is crucial to our success. This role is based in Bengaluru, India, and requires hybrid work, with three days in the office.
What You'll Be Doing
- Develop and maintain highly scalable, fault-tolerant distributed data processing systems (both batch and streaming) handling terabytes of data ingested daily and managing a petabyte-sized data warehouse.
- Design and implement efficient data models and pipelines that support business growth and decision-making.
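The streaming side of the batch-and-streaming work above typically buckets events into fixed time windows before aggregating. A minimal sketch of a tumbling-window count (illustrative only; systems like Spark or Flink implement this pattern at scale, with watermarks and late-data handling this sketch omits):

```python
# Minimal tumbling-window aggregation sketch: events carry an epoch
# timestamp and are bucketed into fixed-size windows, then counted per key.
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per (window_start, key) bucket."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "play"), (30, "play"), (61, "pause"), (65, "play")]
print(tumbling_window_counts(events))
# -> {(0, 'play'): 2, (60, 'pause'): 1, (60, 'play'): 1}
```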
Bosch Global Software Technologies Private Limited
Full-time|On-site|Bengaluru
Job Title: Senior AI/ML Data Engineer for ADAS Backend Development with Azure

We are seeking a skilled Python Lead with extensive experience in Azure Cloud to oversee data-in-the-loop operations, emphasizing data traceability and sequencing.

Key Responsibilities:
- Prioritize project requirements in alignment with stakeholder expectations.
- Design, develop, and maintain high-quality Python code, with a primary focus on FastAPI for API development.
- Manage API deployment and operations on Azure Cloud, utilizing Docker and Kubernetes for effective containerization and orchestration.
- Implement Infrastructure as Code (IaC) using Terraform to ensure scalable and maintainable cloud infrastructure.
- Collaborate with cross-functional teams to define project requirements and architecture.
- Conduct thorough code reviews, offer technical guidance, and mentor team members to promote continuous improvement.
- Uphold best practices in software development, including rigorous testing, comprehensive documentation, and effective version control.
- Troubleshoot and resolve complex technical challenges to ensure system reliability and performance.
Join our dynamic team at kitspvtltd as a Software Engineer, Senior Software Engineer, or Technical Consultant specializing in Azure technologies. This role offers an exciting opportunity to work with cutting-edge cloud solutions and contribute to impactful data management projects.
We are seeking a talented and experienced Big Data Architect to join our dynamic team at Squircle IT Consulting Services. In this role, you will be pivotal in designing and implementing robust data architectures that support our clients' needs in the ERP and Business Intelligence domains. Your expertise in big data technologies will enable us to deliver innovative solutions that drive efficiency and scalability.
Teamwork Makes the Stream Work. Roku is Revolutionizing Television Viewing

Roku stands at the forefront as the leading TV streaming platform across the U.S., Canada, and Mexico, with an ambitious goal to power every television worldwide. We initiated the streaming journey for TVs and aim to be the central platform connecting the entire TV ecosystem. Our mission is to connect viewers with their favorite content, empower publishers to grow and monetize large audiences, and provide advertisers with innovative tools to engage effectively. From your first day at Roku, your contributions will be valued and impactful. We are a rapidly expanding public company where every team member plays a crucial role. Join us in delighting millions of viewers globally while gaining significant experience across diverse disciplines.

About the Team
The Data Insights team is integral to Roku's Advertising organization, spearheading measurement and analytics efforts that drive strategic decisions within the advertising landscape. We craft and oversee products that yield actionable insights for advertisers while fulfilling the operational and analytical requirements of internal teams. Collaboration is key as we partner with Product Managers, Data Scientists, Ad Sales, Ads Operations, and various groups within Advertising Engineering to deliver high-impact solutions. Looking ahead, we are investigating AI-driven measurement capabilities to enhance advertising campaign effectiveness and bolster internal analytics.

About the Role
We are seeking a talented Senior Software Engineer with extensive expertise in big data technologies such as Apache Spark and Apache Airflow. This hybrid role merges software engineering and data engineering, requiring skills in designing, building, and maintaining scalable systems for application development and large-scale data processing.
In this position, you will collaborate with cross-functional teams to architect and manage robust, production-grade data products that fuel essential analytics and measurement capabilities. You will engage with technologies including Apache Spark, Apache Airflow, Trino, Druid, Spring Boot, and StarRocks.
About Us
At Coupang, our mission is to astonish our customers. We thrive on the feedback that echoes, "How did we ever live without Coupang?" Driven by a passion to simplify shopping, dining, and everyday living, we are reshaping the multi-billion-dollar e-commerce landscape from the ground up. Recognized as one of the fastest-growing e-commerce companies, we have built an unmatched reputation as a dominant and dependable player in South Korean commerce. We combine the dynamic spirit of a startup with the extensive resources of a large global public company. This unique blend empowers our continuous growth and the rapid launch of innovative services. We are a community of entrepreneurs, all dedicated to driving new initiatives and engendering transformative innovations. At Coupang, you will witness personal and professional growth for yourself, your colleagues, your team, and the company as a whole. Our commitment to redefining commerce is unwavering. We continually push the limits of what's possible to address challenges and transcend traditional barriers. Join Coupang today and help us craft an extraordinary experience in this interconnected and high-tech world.

Role Overview
As a Senior Staff Data Engineer on the Core Data and Ingestion team, you will lead the architecture, design, and development of robust data ingestion systems, data lakes, data warehouses, and data marts. Your work will enable data-driven decision-making across various departments within Coupang. In this pivotal role, you will collaborate with diverse business domains to identify their data needs and leverage Big Data infrastructure to design and deliver reliable, scalable data products that process terabytes of data daily, ultimately enhancing our customers' experience.
At Branch, we empower every interaction with reliable links and valuable insights that drive measurable growth. Our advanced attribution, powered by AI-driven linking, is trusted to facilitate seamless experiences that enhance ROI, reduce wasted expenditure, and eliminate fragmented attribution.

We uphold the same commitment in building our team, enabling our members to act swiftly, take ownership of results, and create impactful solutions. We take pride in investing in our team's health, wealth, and professional development so each individual can flourish as we expand. Our culture values intelligent, humble, and collaborative colleagues who embrace accountability and produce results in an environment where their contributions significantly advance the business. We are innovative, intentionally scaling, and led by experienced leaders who understand the nuances of building lasting companies. Trusted by notable brands like Instacart, Western Union, NBCUniversal, ZocDoc, and Sephora, we strike the perfect balance of being substantial enough to make an impact while remaining small enough for you to create a meaningful difference. If you are energized by the challenge of building, rapid learning, and shaping the future of customer growth, you will find your place here.

In your role as a Senior Data Engineer, you will design, build, and manage components of our robust real-time and batch data pipelines, processing petabytes of data. Our platform is engineered to ingest and analyze billions of events daily, making the resulting aggregations and insights available within minutes in our analytical data stores. Data analytics is central to our operations, and we continuously innovate to enhance our systems' performance, timeliness, cost-effectiveness, and reliability.
You will enhance our core data infrastructure and pipelines using state-of-the-art technologies such as Flink, Spark, Kafka, Iceberg, and Druid while operating within the AWS cloud environment. If you are eager to develop systems that process billions of data points a day, to explore petabytes of data, and to push the boundaries of what is feasible with data, this is the place for you!
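Pipelines ingesting billions of events over brokers like Kafka usually see at-least-once delivery, so a core chore is deduplicating before aggregating. A sketch of that idea (illustrative, with hypothetical names; not Branch's actual pipeline code):

```python
# Sketch of idempotent event ingestion: message streams typically deliver
# at-least-once, so the consumer deduplicates by event id before counting.
def ingest(events, seen=None, totals=None):
    """Aggregate clicks per link key, skipping duplicate deliveries."""
    seen = set() if seen is None else seen
    totals = {} if totals is None else totals
    for event_id, link in events:
        if event_id in seen:              # duplicate delivery: skip it
            continue
        seen.add(event_id)
        totals[link] = totals.get(link, 0) + 1
    return totals

batch = [(1, "promo"), (2, "promo"), (1, "promo"), (3, "signup")]
print(ingest(batch))  # -> {'promo': 2, 'signup': 1}
```

At scale the `seen` set becomes a bounded structure (a TTL cache or a bloom filter) since ids cannot be kept in memory forever.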
Collaboration Fuels Innovation. Join Roku in Revolutionizing Television Viewing

As the leading TV streaming platform in the U.S., Canada, and Mexico, Roku is on a mission to enhance how audiences experience television globally. We pioneered streaming technology and aim to connect consumers with the content they cherish, empower content publishers to grow and monetize their audiences, and offer advertisers unique tools to engage effectively with consumers. From day one at Roku, your contributions will be meaningful and recognized. As a rapidly expanding public company, we foster an environment where everyone plays a vital role. You'll have the chance to delight millions of TV streamers worldwide while gaining invaluable experience across diverse disciplines.

Team Overview
The Data Management Platform (DMP) team is pivotal within Roku's Advertising division, spearheading audience management initiatives that drive decision-making across the advertising landscape. Our team develops and oversees products that facilitate advanced audience segmentation and management for advertisers, aligning with internal operational requirements. We collaborate closely with Product Managers, Machine Learning experts, Ad Sales, Ads Operations, and various teams within Advertising Engineering to deliver impactful solutions. Looking ahead, we are investigating AI-driven capabilities to further optimize advertising campaigns and enhance our platform's operational efficiency.

Role Overview
We are seeking a talented Senior Software Engineer skilled in big data technologies such as Apache Spark and Apache Airflow. This hybrid role will bridge software engineering expertise with data management, focusing on developing innovative solutions that enhance our advertising capabilities.
Mar 5, 2026