
Data Engineer with PySpark and Informatica BDM Expertise

gsstech-group · Bengaluru, Karnataka, India
On-site · Full-time


Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 3+ years of experience in data engineering or related roles.

About the job

Join our dynamic Group Risk team as a Data Engineer, where you will play a pivotal role in constructing and managing comprehensive data pipelines essential for IFRS9 reporting. This position requires close collaboration with business stakeholders to identify data requirements, perform impact assessments, and deliver exceptional data solutions utilizing cutting-edge technologies like PySpark and Informatica BDM.

Key Responsibilities

  • Work alongside the Group Risk Team to gather and comprehend business and data needs.
  • Conduct impact assessments and technical data mapping for both new and existing data sources.
  • Perform data profiling to guarantee data quality, consistency, and completeness.
  • Design, develop, and maintain ETL pipelines using PySpark and Informatica BDM.
  • Create scalable data transformation workflows in alignment with IFRS9 data models.
  • Ensure precise data extraction, transformation, and loading (ETL) into reporting systems.
  • Engage in unit testing, validation, and deployment of data pipelines.
  • Enhance data processing performance and resolve production issues.
  • Utilize modern tools (e.g., AI-assisted tools like Claude) to boost productivity, minimize errors, and refine development workflows.
  • Maintain thorough documentation for data flows, mappings, and processes.

Required Skills & Qualifications

  • Proven experience in PySpark for large-scale data processing.
  • Hands-on experience with Informatica BDM (Big Data Management).
  • Strong grasp of ETL concepts, data warehousing, and data modeling.
  • Experience in data profiling, data mapping, and impact analysis.
  • Familiarity with IFRS9 or the Risk/Banking domain is highly advantageous.
  • Knowledge of distributed data processing frameworks and big data ecosystems.
  • Excellent SQL skills and experience with relational databases.
  • Sound understanding of data quality and governance principles.

Preferred Skills

  • Experience with cloud platforms (AWS / Azure / GCP).
  • Familiarity with AI-assisted development tools (e.g., Claude, GitHub Copilot).
  • Understanding of CI/CD pipelines in data engineering workflows.

Soft Skills

  • Strong analytical and problem-solving abilities.
  • Exceptional communication and stakeholder management skills.
  • Capacity to thrive in a fast-paced, collaborative environment.

About gsstech-group

gsstech-group is a leading technology solutions provider specializing in data management and analytics, dedicated to empowering businesses with innovative data-driven insights and solutions.
