About the job
Join Cermaticom as a Senior Data Engineer where you will leverage your Java expertise to design, develop, and maintain robust data systems. Collaborate with cross-functional teams to drive data strategies and improve data quality.
Cermaticom is a leading technology firm specializing in innovative data solutions. We are committed to providing cutting-edge technology and fostering a dynamic work environment where creativity and collaboration thrive.
Search: Data Engineer with expertise in PySpark / Informatica BDM
2,419 results
Join our dynamic Group Risk team as a Data Engineer, where you will play a pivotal role in constructing and managing comprehensive data pipelines essential for IFRS9 reporting. This position requires close collaboration with business stakeholders to identify data requirements, perform impact assessments, and deliver exceptional data solutions utilizing cutting-edge technologies like PySpark and Informatica BDM.

Key Responsibilities
- Work alongside the Group Risk Team to gather and comprehend business and data needs.
- Conduct impact assessments and technical data mapping for both new and existing data sources.
- Perform data profiling to guarantee data quality, consistency, and completeness.
- Design, develop, and sustain ETL pipelines utilizing PySpark and Informatica BDM.
- Create scalable data transformation workflows in alignment with IFRS9 data models.
- Ensure precise data extraction, transformation, and loading (ETL) into reporting systems.
- Engage in unit testing, validation, and deployment of data pipelines.
- Enhance data processing performance and resolve production issues.
- Utilize modern tools (e.g., AI-assisted tools like Claude) to boost productivity, minimize errors, and refine development workflows.
- Maintain thorough documentation for data flows, mappings, and processes.

Required Skills & Qualifications
- Proven experience in PySpark for extensive data processing.
- Hands-on experience with Informatica BDM (Big Data Management).
- Strong grasp of ETL concepts, data warehousing, and data modeling.
- Experience in data profiling, data mapping, and impact analysis.
- Familiarity with IFRS9 or the Risk/Banking domain is highly advantageous.
- Knowledge of distributed data processing frameworks and big data ecosystems.
- Excellent SQL skills and experience with relational databases.
- Sound understanding of data quality and governance principles.

Preferred Skills
- Experience with cloud platforms (AWS / Azure / GCP).
- Familiarity with AI-assisted development tools (e.g., Claude, GitHub Copilot).
- Understanding of CI/CD pipelines in data engineering workflows.

Soft Skills
- Strong analytical and problem-solving abilities.
- Exceptional communication and stakeholder management skills.
- Capacity to thrive in a fast-paced, collaborative environment.
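The data-profiling responsibility above (checking quality, consistency, and completeness of source columns) can be illustrated with a minimal sketch. This example uses plain Python over a list of dict rows for self-containment; in the roles described it would typically run as PySpark over the full dataset, and the column and field names here are illustrative only, not from any real system.

```python
def profile(rows, columns):
    """Return per-column null rate and distinct count for a list of dict rows."""
    stats = {}
    n = len(rows)
    for col in columns:
        values = [r.get(col) for r in rows]
        nulls = sum(1 for v in values if v is None)
        # Distinct count ignores missing values, as most profilers do.
        distinct = len({v for v in values if v is not None})
        stats[col] = {"null_rate": nulls / n if n else 0.0, "distinct": distinct}
    return stats

# Hypothetical sample records for illustration.
rows = [
    {"account_id": "A1", "balance": 100.0},
    {"account_id": "A2", "balance": None},
    {"account_id": "A1", "balance": 250.0},
]
print(profile(rows, ["account_id", "balance"]))
```

A null-rate threshold or distinct-count check on the resulting stats is one simple way such profiling feeds the impact assessments mentioned above.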
Join us at gsstech-group as a Data Engineer, where your expertise in PySpark and Informatica BDM will play a crucial role in enhancing our Risk & Compliance platforms within a transformative Trade Transformation initiative. Collaborate with a dedicated Data Engineering team to deliver scalable, high-quality data solutions that meet our strategic objectives.

Key Responsibilities
- Engage with the Data Engineering team to evaluate the impacts of transformation initiatives on Risk & Compliance platforms.
- Convert business requirements and impact assessments into actionable data engineering tasks.
- Architect, develop, and implement robust data solutions that adhere to enterprise architecture standards.
- Build and optimize data pipelines leveraging PySpark and Informatica BDM.
- Ensure prompt delivery of development tasks while maintaining exceptional quality and compliance with SLAs.
- Collaborate with cross-functional teams, including business, risk, and compliance stakeholders.
- Utilize cutting-edge tools and technologies, including AI-assisted development tools, to drive productivity.
- Participate in code reviews, testing, and deployment activities.
gsstech-group
Join our innovative team as a Data Engineer within the Data Engineering Chapter, supporting the Group Operations Team. As a vital contributor, you will partner with both business and technical stakeholders to define data needs, execute impact assessments, and create robust, scalable data pipelines utilizing cutting-edge technologies such as PySpark.

Key Responsibilities
- Engage with the Group Operations Team to capture and evaluate data requirements
- Conduct impact assessments, technical data mapping, and data profiling
- Architect and implement data extraction, transformation, and loading (ETL) pipelines
- Enhance and fine-tune data pipelines using PySpark as part of the bank's modern technological ecosystem
- Craft data solutions that align with AECB application data models
- Guarantee data quality, integrity, and consistency across all systems
- Participate in unit testing, deployment, and production support
- Utilize advanced AI tools (e.g., Claude) to boost development efficiency and minimize operational errors
- Collaborate in an agile environment to foster continuous improvement initiatives
Join our dynamic team at gsstech-group as a talented Data Engineer within the Data Engineering Chapter, working closely with the Group Operations team at ENBD. In this role, you will be instrumental in developing scalable data pipelines, conducting data analyses, and providing top-tier data solutions that align with our enterprise data models.

Key Responsibilities
- Engage with the Group Operations Team daily to clarify business and data requirements.
- Execute impact assessments for both new and existing data modifications.
- Carry out technical data mapping and data profiling.
- Design, implement, and sustain ETL pipelines for efficient data extraction, transformation, and loading.
- Create data solutions that integrate with the AECB application following established data models.
- Enhance and maintain data pipelines utilizing PySpark on contemporary data platforms.
- Ensure the quality, consistency, and integrity of data across various systems.
- Conduct unit testing, debugging, and deployment of data solutions.
- Utilize modern tools and AI technologies (e.g., Claude) to boost development efficiency and minimize operational errors.
- Collaborate effectively with cross-functional teams including business analysts, architects, and QA.

Required Skills & Qualifications
- Proven expertise in PySpark and distributed data processing.
- Experience with Informatica BDM (Big Data Management) development.
- Deep understanding of ETL/ELT concepts and data pipeline architecture.
- Proficiency in data mapping, data profiling, and impact analysis.
- Experience with large-scale data systems and cloud/data platforms.
- Strong SQL skills and a solid grasp of data warehousing principles.
- Familiarity with the banking/financial domain is a plus.
- Knowledge of AI-assisted development tools (e.g., Claude) is advantageous.
- Excellent problem-solving and analytical abilities.

Preferred Qualifications
- Experience with AECB data/reporting systems.
- Exposure to big data ecosystems (Hadoop/Spark clusters).
- Understanding of data governance and compliance standards.
Job Title: Data Engineer (PySpark)

About the Role
We invite you to join our dynamic data engineering team as a proficient Data Engineer specializing in PySpark and the Cloudera Data Platform (CDP). In this pivotal role, you will be tasked with architecting, developing, and sustaining robust data pipelines that guarantee exceptional data quality and accessibility throughout the organization. Your expertise in big data ecosystems, cloud-native technologies, and sophisticated data processing methodologies is essential.

The ideal candidate will possess extensive hands-on experience in data ingestion, transformation, and optimization on the Cloudera Data Platform, complemented by a strong history of applying data engineering best practices. You will collaborate closely with fellow data engineers to devise solutions that foster significant business insights.

Key Responsibilities
- Design and develop scalable ETL pipelines using PySpark on CDP, ensuring data integrity.
- Manage data ingestion processes from diverse sources (e.g., relational databases, APIs, file systems) to the data lake or warehouse on CDP.
- Utilize PySpark for processing, cleansing, and transforming vast datasets to meet analytical and business needs.
- Optimize performance by fine-tuning PySpark code and Cloudera components to enhance resource utilization.
- Establish data quality checks and validation routines to maintain data accuracy throughout the pipeline.
- Automate workflows using orchestration tools like Apache Oozie or Airflow within the Cloudera ecosystem.
- Monitor pipeline performance, troubleshoot issues, and maintain the Cloudera Data Platform and associated processes.
- Collaborate with data engineers, analysts, product managers, and other stakeholders to understand data requirements.
- Document data engineering processes, code, and pipeline configurations thoroughly.

Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related discipline.
- 3+ years of experience as a Data Engineer, focusing on PySpark and the Cloudera Data Platform.
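The data quality checks and validation routines that this role calls for usually sit between the transform and load stages of a pipeline. Below is a minimal stdlib Python sketch of that pattern; the field names, thresholds, and record shapes are hypothetical, and a production pipeline on CDP would express the same logic over PySpark DataFrames rather than Python dicts.

```python
def transform(record):
    # Normalize a raw record; field names are illustrative only.
    return {"id": record["id"].strip(), "amount": round(float(record["amount"]), 2)}

def validate(record):
    # Reject records that would corrupt downstream reporting.
    if not record["id"]:
        raise ValueError("empty id")
    if record["amount"] < 0:
        raise ValueError("negative amount")
    return record

# Hypothetical raw input from an upstream source.
raw = [{"id": " 42 ", "amount": "19.999"}]

# Validation runs record-by-record between transform and load.
clean = [validate(transform(r)) for r in raw]
print(clean)
```

Running validation as its own stage, rather than folding it into the transform, keeps quality rules auditable and lets failures be quarantined instead of silently loaded.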
Job Title: Data Analyst (Python + PySpark)

About Us
Capco, a Wipro company, is a leading global technology and management consulting firm. Recognized as Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount, we operate in over 32 cities worldwide, supporting more than 100 clients in the banking, finance, and energy sectors. We pride ourselves on our capability to execute and deliver transformative solutions.

Why Join Capco?
Engage in dynamic projects with leading international and local banks, insurance companies, payment service providers, and other pivotal industry players. Contribute to projects that are set to revolutionize the financial services landscape.

Make an Impact
Bring innovative thinking and delivery excellence to help our clients transform their businesses. Together with our clients and industry partners, we create disruptive advancements that are reshaping the energy and financial services sectors.

#BeYourselfAtWork
Capco fosters an inclusive and open culture that values diversity and creativity.

Career Advancement
At Capco, we promote a flat organizational culture that empowers everyone to take charge of their career development.

Diversity & Inclusion
We believe our competitive edge stems from a diverse workforce and varied perspectives.

Job Description
Role: Data Analyst / Senior Data Analyst
Location: Bangalore/Pune
Responsibilities include defining and sourcing the data necessary for delivering insights and use cases, determining data mapping across multiple datasets, and creating processes to highlight key trends.
Join our innovative team as a Software Developer where you will leverage your expertise in Python, PySpark, and AWS to build cutting-edge solutions. You will be working on exciting projects that challenge your skills and foster your professional growth.
Squircle IT Consulting Services Pvt Ltd
We are seeking a skilled Informatica Administrator to join our dynamic team in Bengaluru. In this role, you will manage and optimize our Informatica environment, ensuring the seamless integration and transformation of data across various platforms. Your expertise will play a crucial role in supporting our data operations and enhancing our data management capabilities.
Squircle IT Consulting Services Pvt Ltd
Join our dynamic team as an Informatica Developer, where you will play a crucial role in designing, developing, and maintaining data integration solutions. You will work closely with data architects, business analysts, and stakeholders to ensure seamless data flow and integration across various platforms. If you have a passion for data and a knack for problem-solving, this is the perfect opportunity for you!
Squircle IT Consulting Services Pvt Ltd
Join our dynamic team as an Informatica Administrator at Squircle IT Consulting Services Pvt Ltd, located in the heart of Bengaluru. We are seeking a skilled individual to oversee and manage our Informatica environment, ensuring data integration processes are optimized and running smoothly.
Squircle IT Consulting Services Pvt Ltd
Join our dynamic team at Squircle IT Consulting Services Pvt Ltd as an Informatica Administrator. In this role, you will be responsible for managing and optimizing our Informatica platform, ensuring data integrity and performance. You will work closely with cross-functional teams to support data integration and transformation initiatives.
Squircle IT Consulting Services Pvt Ltd
Join our dynamic team as a Senior Informatica Administrator where you will be responsible for overseeing the Informatica environment and ensuring optimal performance of data integration processes. You will manage the installation, configuration, and maintenance of Informatica tools while collaborating with cross-functional teams to support data-driven decision-making.
Eram Talent is on the lookout for an accomplished Informatica (MDM) Developer with a focus on Data Governance to become a vital part of our team in Saudi Arabia. As a premier Talent Acquisition Company, we pride ourselves on connecting our clients with exceptional talent to enhance their data governance strategies and initiatives.

In this pivotal role, you will design, develop, and maintain Master Data Management (MDM) solutions utilizing Informatica tools. You will work closely with business stakeholders and technical teams to ensure that data governance standards and processes are effectively implemented and maintained.

Key Responsibilities
- Collaborate with the Product Manager in defining product roadmaps, prioritizing features, and managing product development processes.
- Assist in crafting project plans, timelines, and resource allocation under the guidance of the Product Manager.
- Carry out assigned tasks and activities within the project scope, ensuring compliance with project schedules and milestones.
- Identify project risks and contribute to the formulation of mitigation strategies.
- Support the Product Manager in communicating project progress and updates to stakeholders and senior management.
- Empower product teams in their decision-making processes through quantitative data analysis.
- Utilize performance metrics to assess product development effectiveness and alignment with development plans.
- Conduct market research to track competitor activities and pricing trends, providing recommendations to inform strategic decisions.
Role Overview
Tietoevry is looking for a Senior PySpark Developer to join the Tieto Tech Consulting team in Bengaluru. This position focuses on designing and building scalable data solutions for clients, using PySpark as a core technology. Projects often involve complex data analytics and processing challenges.

What You Will Do
- Design and implement data solutions using PySpark
- Work closely with team members to solve data analytics problems
- Support clients in making the most of their data assets
- Contribute to collaborative project work in a consulting setting

Location
This role is based in Bengaluru.
Squircle IT Consulting Services Pvt Ltd
Join our dynamic team at Squircle IT Consulting Services as an Informatica Developer. We are looking for a skilled professional to design and implement data integration solutions, utilizing Informatica tools to optimize data workflows and enhance business intelligence.
Squircle IT Consulting Services Pvt Ltd
We are seeking a talented and motivated Informatica Developer to join our dynamic team at Squircle IT Consulting Services. In this role, you will leverage your expertise in data integration and ETL processes to deliver high-quality data solutions that meet our clients’ needs.
Squircle IT Consulting Services Pvt Ltd
We are seeking a skilled Informatica Administrator to become a vital part of our dynamic team at Squircle IT Consulting Services Pvt Ltd. As an Informatica Administrator, you will be responsible for managing and optimizing data integration processes using Informatica tools to support our clients' needs. Your expertise will ensure the smooth operation of data flows and the management of ETL processes, directly contributing to the success of our projects.
Squircle IT Consulting Services Pvt Ltd
Join our dynamic team at Squircle IT Consulting Services Pvt Ltd as an Informatica Developer. We are looking for a skilled professional who can contribute to the development and enhancement of our data integration solutions using Informatica. You will be responsible for designing, building, and maintaining ETL processes to ensure optimal data flow and integrity.
Join us as a Senior Software Engineer specializing in IAM and AI for one of Weekday's esteemed clients!

Salary range: ₹20,00,000 - ₹33,00,000 per annum

We are seeking a dedicated Senior Software Engineer – Information Security to architect, develop, and sustain cutting-edge identity and access management (IAM) solutions across a variety of enterprise systems. This role will emphasize the creation of robust authentication and authorization frameworks while managing identities and ensuring adherence to stringent security protocols.

In collaboration with engineering, security, and infrastructure teams, you will implement scalable IAM systems designed to protect applications, services, and critical data. A solid background in software development, coupled with practical experience in authentication systems, access control, and secure system design, is essential for success in this position.
