Experience Level
Senior
Qualifications
The ideal candidate will possess:
- Proven experience in Java development with a strong focus on big data technologies.
- Familiarity with frameworks such as Hadoop, Spark, or Kafka.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and teamwork skills.
About the job
Join our dynamic team at dev2 as a Senior Java Engineer specializing in Big Data. In this role, you will leverage your expertise in Java and big data technologies to design, develop, and maintain scalable data solutions. You will collaborate with cross-functional teams to deliver high-quality products that meet our clients' needs.
About dev2
dev2 is a leading technology company based in Dubai, specializing in innovative solutions for the data-driven world. We pride ourselves on our commitment to excellence and our dynamic work environment that encourages growth and creativity.
Similar jobs
Full-time|On-site|Dubai, Dubai, United Arab Emirates
Join gsstech-group as a Data Engineer specializing in PySpark and Data Modeling. In this critical role, you will leverage your expertise to design, construct, and maintain robust data pipelines that facilitate data analysis and reporting.

As a Data Engineer, you will collaborate closely with data scientists and analysts to ensure that our data architecture supports the needs of the business. Your work will directly contribute to the optimization of our data management processes and the strategic use of data across the organization.
Full-time|On-site|Dubai, Dubai, United Arab Emirates
Join gsstech-group as a talented Data Engineer specializing in PySpark and the Cloudera Data Platform (CDP). In this pivotal role, you will be responsible for designing, developing, and maintaining high-quality data pipelines that are both scalable and efficient. Your contributions will ensure optimal data performance and availability across our organization.

This position requires extensive hands-on experience in big data technologies, cloud-native environments, and advanced data processing frameworks. You will collaborate with diverse teams to create reliable data solutions that facilitate actionable business insights.

Key Responsibilities
1. Data Pipeline Development
- Craft and sustain scalable ETL/ELT data pipelines utilizing PySpark on CDP.
- Guarantee data integrity, reliability, and performance optimization.
2. Data Ingestion
- Build ingestion frameworks to gather data from various sources, including relational databases, APIs, streaming platforms, and file systems.
- Load both structured and unstructured data into Data Lake and Data Warehouse environments.
3. Data Transformation & Processing
- Process, cleanse, and transform extensive datasets using PySpark.
- Create reusable data processing components.
4. Performance Optimization
- Optimize Spark jobs and Cloudera components for peak performance.
- Enhance memory, partitioning, and execution strategies.
- Minimize ETL runtime and elevate cluster efficiency.
5. Data Quality & Validation
- Establish data validation checks and monitoring systems.
- Maintain comprehensive data quality and governance standards.
6. Automation & Orchestration
- Automate workflows using Apache Oozie, Apache Airflow, or similar orchestration tools.
- Support CI/CD integration for data pipelines.
7. Monitoring & Support
- Oversee pipeline health and troubleshoot any issues that arise.
- Provide ongoing production support and drive continuous improvement initiatives.
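The data validation checks this listing mentions can take many forms. As a hedged, framework-free sketch of what a row-level check might look like (the function and rule names below are invented for illustration and are not from any specific tool):

```python
# A minimal row-level data validation sketch; names are illustrative,
# not from Soda, Great Expectations, or any other real framework.
def validate_rows(rows, rules):
    """Split rows into (valid, rejected); each rejected row carries
    the names of the rules it failed."""
    valid, rejected = [], []
    for row in rows:
        failures = [name for name, rule in rules.items() if not rule(row)]
        if failures:
            rejected.append((row, failures))
        else:
            valid.append(row)
    return valid, rejected

# Example rules: every record needs an id and a non-negative amount.
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
rows = [{"id": 1, "amount": 10.0}, {"id": None, "amount": -5.0}]
good, bad = validate_rows(rows, rules)
```

In a real pipeline, the rejected rows would typically be quarantined and surfaced through monitoring rather than silently dropped.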
Full-time|On-site|Dubai, Dubai, United Arab Emirates
Job Title: Data Engineer (PySpark)

About the Role
We are in search of a talented Data Engineer with significant expertise in PySpark and the Cloudera Data Platform (CDP) to bolster our data engineering team. In this role, you will be tasked with the design, development, and maintenance of scalable data pipelines that guarantee high data quality and availability across the organization. A robust background in big data ecosystems, cloud-native tools, and advanced data processing techniques is essential.

The right candidate will possess hands-on experience in data ingestion, transformation, and optimization within the Cloudera Data Platform, along with a solid history of implementing data engineering best practices. Collaboration with fellow data engineers will be key to creating solutions that yield impactful business insights.

Responsibilities
- Design, develop, and sustain highly scalable and optimized ETL pipelines utilizing PySpark on the Cloudera Data Platform, ensuring data integrity and precision.
- Oversee data ingestion processes from diverse sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP.
- Employ PySpark to process, cleanse, and transform extensive datasets into actionable formats that fulfill analytical needs and business objectives.
- Optimize performance by tuning PySpark code and Cloudera components to enhance resource utilization and minimize ETL runtimes.
- Establish data quality checks, monitoring, and validation protocols to uphold data accuracy and reliability throughout the pipeline.
- Automate data workflows utilizing tools such as Apache Oozie, Airflow, or comparable orchestration tools within the Cloudera ecosystem.
- Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and related data processes.
- Collaborate closely with other data engineers, analysts, product managers, and other stakeholders to comprehend data requirements and support various data-driven initiatives.
- Document data engineering processes, code, and pipeline configurations thoroughly.
Join Rackspace Technology, a premier provider of managed services and expertise across leading public and private cloud technologies. We pride ourselves on delivering a Fanatical Experience™, guiding customers from initial consultation through to daily operations. Our dedicated team combines proactive service with top-tier tools and automation to provide technology solutions tailored to our clients' needs.

As a Data Modeler, you'll be at the forefront of designing and implementing both logical and physical data models in Azure Databricks (Delta Lake), employing the Medallion Architecture (Bronze, Silver, Gold). Your role will empower Data Engineers to kickstart development with precise schema definitions and clear SQL structures, ensuring clarity and precision in data management.
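The Medallion Architecture this listing references stages data through Bronze (raw), Silver (cleansed), and Gold (business-level) layers. As a rough illustration of that flow, here is a plain-Python sketch with invented records; in Databricks these layers would be Delta Lake tables, not Python lists:

```python
# Plain-Python illustration of the Medallion flow (Bronze -> Silver -> Gold);
# the records and field names are invented for this sketch.
bronze = [  # Bronze: raw records as landed, untyped and possibly duplicated
    {"order_id": "1", "amount": "10.5", "region": "EU"},
    {"order_id": "1", "amount": "10.5", "region": "EU"},  # duplicate row
    {"order_id": "2", "amount": "bad", "region": "US"},   # unparseable amount
]

def to_silver(records):
    """Silver: cleanse and de-duplicate (typed amount, one row per order_id)."""
    seen, silver = set(), []
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # drop rows whose amount cannot be parsed
        if r["order_id"] in seen:
            continue  # drop duplicate order ids
        seen.add(r["order_id"])
        silver.append({"order_id": r["order_id"], "amount": amount,
                       "region": r["region"]})
    return silver

def to_gold(records):
    """Gold: aggregate to a business-level view (revenue per region)."""
    gold = {}
    for r in records:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
```

The point of the pattern is that each layer has a stricter contract than the one before it, so downstream consumers can rely on the Silver and Gold schemas.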
Full-time|On-site|Dubai, Dubai, United Arab Emirates
We are looking for a talented and experienced Senior Data Engineer to join our dynamic Data Engineering team in Dubai. The successful candidate will play a pivotal role in designing, constructing, and enhancing scalable data pipelines, while also facilitating advanced analytics and machine learning applications. This position demands a deep understanding of big data technologies, data processing frameworks, and feature engineering to effectively support machine learning models in a live production setting.

Key Responsibilities
- Collect and assess business and technical requirements for effective data solutions.
- Conduct Exploratory Data Analysis (EDA) to reveal data patterns and assess data quality.
- Design, develop, and maintain high-performance, scalable data pipelines.
- Ingest both structured and unstructured data from various sources.
- Transform and process extensive datasets using PySpark and Python.
- Apply feature engineering techniques to enhance machine learning models.
- Optimize Spark jobs for peak performance and cost efficiency.
- Ensure the quality, integrity, and security of data across all pipelines.
- Collaborate closely with Data Scientists, Analytics Teams, and Delivery Leads.
- Participate in an Agile environment alongside cross-functional teams.
- Effectively communicate with stakeholders, providing valuable technical insights.
- Utilize Git for version control and manage CI/CD workflows.

Required Skills & Expertise
- Proven experience in Python and PySpark.
- In-depth knowledge of Apache Spark (performance tuning and optimization).
- Comprehensive understanding of the Hadoop ecosystem.
- Advanced skills in SQL.
- Experience in building end-to-end data and ML pipelines.
- Strong grasp of data modeling and warehousing concepts.
- Experience with feature engineering for ML applications.
- Familiarity with Git and version control systems.
- Exposure to cloud platforms (AWS/Azure/GCP) is advantageous.

Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 8–10+ years of experience in Data Engineering / Big Data.
- Experience in the banking or financial services sector is preferred.

Soft Skills
- Exceptional problem-solving and analytical skills.
- Strong communication and stakeholder management abilities.
- Capability to work both independently and collaboratively within a team.
- Adaptability to fast-paced Agile workflows.
Join our innovative team at mexdigital as an AI Data Engineer. In this role, you will be instrumental in shaping our data-driven solutions, leveraging cutting-edge AI technologies to enhance business processes and deliver impactful insights. Collaborate with cross-functional teams to design, implement, and maintain scalable data pipelines and machine learning models that drive our projects forward.
Full-time|On-site|Dubai, Dubai, United Arab Emirates
Welcome to SFORS! As a prominent player in the global financial markets, we pride ourselves on being at the forefront of proprietary trading and pre-market strategies. With over 20 years of experience, our success is driven by a relentless commitment to talent development, cutting-edge trading technologies, sophisticated risk management models, and strategic trading methodologies, all aimed at ensuring our traders achieve remarkable success.

We are seeking a talented Data Engineer to join our dynamic team in Dubai. In this role, you will be instrumental in designing, developing, and maintaining robust data pipelines and infrastructure that support our analytics and machine learning initiatives. Your expertise will guarantee the seamless flow of data across various systems, uphold data modeling best practices, ensure stringent data security measures, and empower our teams with high-quality, structured data.
Full-time|On-site|Dubai, Dubai, United Arab Emirates
Join our dynamic team at gsstech-group as a Senior Data Engineer. In this pivotal role, you will design, build, and enhance scalable data pipelines that empower advanced analytics and machine learning initiatives. Collaborating closely with Data Scientists, Analytics Delivery Leads, and interdisciplinary teams, you will transform raw data into impactful insights within a fast-paced Agile environment.

Key Responsibilities
- Partner with stakeholders to collect and evaluate data requirements.
- Conduct Exploratory Data Analysis (EDA) to discern data patterns and quality.
- Architect and implement robust, scalable, and high-performance data pipelines.
- Ingest, process, and convert extensive structured and unstructured datasets.
- Apply feature engineering techniques that bolster machine learning models.
- Enhance Spark jobs for optimal performance, scalability, and cost-effectiveness.
- Maintain data quality, integrity, and security across all pipelines.
- Work collaboratively with Data Scientists and Analytics teams to deploy ML pipelines.
- Engage in Agile ceremonies and contribute to ongoing improvements.
- Effectively communicate technical solutions to both technical and non-technical stakeholders.

Required Skills & Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related discipline.
- 8–10 years of experience in Data Engineering and Big Data ecosystems.
- Proficient programming skills in Python.
- Practical experience with PySpark / Apache Spark, including performance optimization.
- Strong knowledge of the Hadoop ecosystem.
- Advanced skills in SQL.
- Experience with data pipeline development and ETL frameworks.
- Familiarity with Machine Learning pipelines and feature engineering.
- Knowledge of version control systems (Git).
- Excellent problem-solving and analytical abilities.

Good to Have
- Experience with cloud platforms (AWS / Azure / GCP).
- Understanding of data warehousing solutions.
- Familiarity with workflow orchestration tools (e.g., Airflow).
- Experience in the banking or financial services sector.

Key Competencies
- Strong collaboration and communication abilities.
- Capability to thrive in a fast-paced Agile setting.
- Detail-oriented with an ownership mindset.
- Proficient in stakeholder management and coordination.
Full-time|On-site|Dubai, Dubai, United Arab Emirates
DeepLight AI is a premier consultancy specializing in artificial intelligence and data solutions, renowned for our extensive expertise in deploying intelligent enterprise systems across diverse industries, particularly in financial services and banking. Our team merges profound knowledge in data science, statistical modeling, AI/ML technologies, workflow automation, and systems integration with a pragmatic grasp of intricate business operations.

The Data Engineer will play a pivotal role in designing, implementing, and optimizing data pipelines and infrastructure to bolster our innovative AI systems. This role involves close collaboration with our cross-functional team to ensure the effective collection, storage, processing, and analysis of large-scale data, unlocking valuable insights and fostering innovation across various sectors.
Full-time|Hybrid|Dubai, Dubai, United Arab Emirates
Envision Employment Solutions is actively seeking talented Senior Data Management Engineers to join our esteemed partner, a prominent global leader in IT Services and Consulting.

Job Overview:
We are in search of a Senior Data Management Engineer who possesses extensive knowledge in data architecture, integration, and transformation, with a particular focus on leveraging Informatica tools. The successful candidate will be responsible for designing and implementing comprehensive enterprise data solutions that guarantee data accessibility, quality, governance, and scalability across various business domains. Your role will involve collaboration with cross-functional stakeholders to effectively translate business requirements into secure, reliable, and high-performance data systems.

Key Responsibilities:
- Design, develop, and maintain robust ETL processes and data pipelines utilizing Informatica (PowerCenter, IDQ, or Cloud Data Integration).
- Optimize data integration workflows for both structured and unstructured data sources.
- Ensure the highest standards of data quality, integrity, governance, and security across enterprise platforms.
- Engage in data modeling and architectural discussions.
- Work closely with data architects, business analysts, and application teams to align data solutions with business objectives.
- Document data flows, transformations, and system architecture comprehensively.
- Identify and resolve performance bottlenecks and complex data-related challenges.
- Support initiatives related to enterprise data governance and metadata management.

Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- 3 to 5 years of professional experience in data engineering or management roles.
- Proficient hands-on experience with Informatica (PowerCenter, IDQ, or Cloud Data Integration).
- Strong SQL skills and familiarity with relational databases (e.g., Oracle, SQL Server).
- Solid understanding of data warehousing concepts and enterprise data architecture.
- Experience in data governance and metadata management.
- Knowledge of cloud data platforms (AWS, Azure, or GCP) is advantageous.
- Excellent analytical and troubleshooting abilities.
- Fluent communication skills in English.

Benefits and Work Environment:
- Competitive basic salary.
- Social insurance coverage.
- Comprehensive family medical insurance (AXA).
- Location: Dubai.
- Work Model: Hybrid – 2 days in the office and 3 days remote. Depending on project requirements, you may be required to work from the client’s premises more frequently.
- Working Hours: 9 AM to 6 PM.
- Days Off: Saturday and Sunday.
Job Title: Lead Data Scientist / Expert in Data Insights

Roles and Responsibilities:
- Demonstrated proficiency in Business Intelligence tools, particularly Tableau and MicroStrategy, ensuring a continuous flow of actionable data.
- Strong understanding of the transformative impact of data analytics on business operations, highlighting its advantages.
- Innovate by developing decision analytics solutions utilizing advanced methodologies and techniques.
- Design user-friendly processes and reporting models that empower business units to access and manage their data in real time.
- Oversee engagement methodologies to ensure compliance, mitigate risks, and maintain the highest quality standards, collaborating with teams to implement necessary adjustments.
- Oversee and lead a team of proficient data analysts.
- Develop data mappings essential for creating data marts and lakes.
- Design and implement user-friendly interfaces and dashboards for effective data interaction and visualization.
- Conduct thorough testing and validation of software and data products to guarantee accuracy and dependability.
- Engage in collaboration with fellow developers, data analysts, and stakeholders to ensure software alignment with business objectives.
- Provide analytical support to Business Performance and Strategic Analytics for departments and decision-makers.

Core Skills:
- Expertise in Data Product Development and Data Mart design.
- Knowledge of Machine Learning, Deep Learning, Time-Series Analysis, and Optimization techniques.
- Understanding of Retail, Business, and Wholesale Banking products.
- Strong communication and interpersonal skills.
- Proficient in people management with a focus on multicultural awareness.

Technical Skills:
- Proficient in Python (along with R, SAS, and Spark), SQL, and DAX.
- Strong coding abilities in languages such as Python, R, and Java.
- Familiarity with cloud data technologies.

Competencies:
- Advanced proficiency in SQL and Python; experience with SAS/R/Spark is an advantage.
- Experience in deployment across various databases and server/cloud environments, including AWS and Azure.
- Thorough understanding of banking functionalities is a significant advantage.
- Excellent communication and documentation skills to effectively convey information to stakeholders.
- Familiarity with tools such as Power BI, Tableau, or Qlik is beneficial.
Data Management
- Identify and evaluate valuable data sources while automating data collection processes.
- Prepare datasets for modeling applications.
- Conduct preprocessing of both structured and unstructured data.
- Analyze extensive datasets to uncover trends and insights.
- Utilize data visualization techniques to effectively present information.

Business Acumen
- Strong understanding of retail banking products, particularly in the credit card sector.

Model Development and Management
- Create and refine credit risk models tailored for retail portfolios (e.g., credit decisioning, PD, LGD, EAD, IFRS9), ensuring adherence to internal and regulatory standards.
- Validate that model outputs are suitable for daily business operations, underwriting decisions, and risk appetite strategies.
- Continually enhance models based on user feedback, regulatory requirements, and ongoing performance assessments.
- Collaborate closely with the Enterprise Risk Management (ERM) team for stress testing and ICAAP submissions.

Model Monitoring and Implementation
- Ensure models remain precise, dependable, and compliant with regulatory standards.
- Conduct user acceptance testing to guarantee proper implementation of models within the operational systems.
- Address ad-hoc portfolio analysis requests promptly.
- Consistently realign models to monitor performance, offering ongoing guidance for retail portfolio lending activities.
- Develop, implement, and oversee risk models for the retail lending portfolio.
- Support the team in achieving efficient delivery of requirements set forth by management and external regulators.
Join our dynamic team at ghobashgroup as a Junior Data & AI Engineer, where you will be instrumental in leveraging data-driven solutions to enhance our AI capabilities. This role is tailored for Emirati nationals who are eager to kickstart their careers in the exciting fields of data science and artificial intelligence.
Full-time|On-site|Dubai, Dubai, United Arab Emirates
DeepLight AI is a leading consultancy specializing in artificial intelligence and data solutions, renowned for our in-depth experience in deploying intelligent enterprise systems across various sectors, particularly in financial services and banking. Our expert team blends comprehensive knowledge of data science, statistical modeling, AI/ML technologies, workflow automation, and systems integration with a nuanced understanding of intricate business operations.

We are currently seeking a Data Assurance Lead to act as the principal steward of data integrity within our clients' multifaceted Lakehouse environments. In this pivotal consultancy position, you will not only oversee data but also architect, execute, and operationalize a comprehensive Data Assurance Framework from the ground up. By leveraging Soda Core within the Medallion architecture (Bronze, Silver, Gold), you will establish stringent quality standards, automate data contracts, and incorporate scorecards into OpenMetadata.

In this consultative role, you will transcend the role of a mere technical specialist; you will become a strategic champion for data reliability. Exceptional communication skills are essential to effectively convey the significance of data assurance to stakeholders, guide cross-functional teams through ambitious migration objectives, and foster a culture of quality within AWS Glue and Data Factory workflows. If you have over 8 years of experience, preferably within the Financial Services or Banking industries, and flourish in dynamic, Agile environments where you can simplify complexity and engage senior stakeholders, we invite you to take on this exciting challenge.
Modeling Expertise:
- Proven experience in designing and implementing propensity models focused on cross-selling and upselling opportunities.
- In-depth knowledge of Next Best Action / Next Best Offer systems.
- Solid understanding of supervised learning methodologies, uplift modeling, and causal inference techniques.

Programming & Engineering:
- Proficient in Python (required); knowledge of R or Scala is advantageous.
- Experience with distributed computing frameworks such as Spark, Dask, or Ray.
- Skilled in developing production-level ML pipelines using tools like Airflow, MLflow, or Kubeflow.
- Strong familiarity with software engineering best practices, including version control, CI/CD, and thorough testing.

Data Infrastructure:
- Experience with cloud platforms (AWS/GCP/Azure) and data warehouses (Snowflake, BigQuery, Redshift).
- Excellent SQL skills; adept at optimizing queries and handling large datasets.

MLOps & Deployment:
- Proven ability to deploy models to production using APIs or streaming technologies (Kafka, Flink).
- Experienced in model versioning, experiment management, and deployment practices with MLflow, SageMaker, or Vertex AI.

Monitoring & Observability:
- Capable of establishing model monitoring, drift detection, and alerting systems utilizing Prometheus, Grafana, Evidently, or custom dashboards.
- Familiarity with logging frameworks and performance profiling for ML services.

GenAI (Preferred):
- Experience with Large Language Models, embeddings, prompt engineering, and vector databases (e.g., FAISS, Pinecone).
- Ability to incorporate GenAI into decision-making systems or customer-facing applications.

Leadership & Collaboration:
- Proven leadership skills to guide senior data scientists while maintaining a hands-on approach.
- Adept at collaborating with product managers, engineers, and business stakeholders.
Full-time|On-site|Dubai, Dubai, United Arab Emirates
Join our dynamic team at gsstech-group as a meticulous and analytical Data Analyst specializing in the insurance sector. We are looking for a candidate with robust experience in data migration projects and Master Data Management (MDM). In this pivotal role, you will contribute to transformative projects related to policy, claims, underwriting, and customer data, ensuring exceptional data quality, integrity, and adherence to governance standards.

The successful candidate will possess strong analytical capabilities, comprehensive knowledge of the insurance business domain, and practical experience in data mapping, reconciliation, validation, and MDM processes.

Key Responsibilities
1. Insurance Domain Data Analysis
- Analyze insurance data across Policy Administration, Claims, Underwriting, Billing, and Reinsurance systems.
- Collaborate with business stakeholders to identify data requirements and regulatory obligations.
- Interpret complex insurance data models and business rules.
2. Data Migration
- Execute source-to-target data mapping and document transformation logic.
- Carry out data profiling, cleansing, validation, and reconciliation activities.
- Identify data gaps, inconsistencies, and quality issues during the migration process.
- Assist in migrating from legacy systems to contemporary core insurance platforms.
- Validate migrated data through User Acceptance Testing (UAT) and post-migration audits.
3. Master Data Management (MDM)
- Support the implementation and ongoing maintenance of MDM frameworks.
- Manage various data sets and data sources effectively.
- Define and monitor data quality Key Performance Indicators (KPIs).
- Ensure standardization, de-duplication, and creation of golden records.
- Collaborate with governance teams to enforce data standards and policies.
4. Reporting & Insights
- Develop comprehensive dashboards and reports utilizing tools such as Microsoft Power BI and Tableau.
- Conduct ad-hoc analyses to support underwriting and claims performance.
- Present actionable data insights to business and leadership teams.
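The source-to-target mapping and reconciliation work this listing describes can be pictured with a minimal sketch; the legacy field names below (`POL_NO`, `PREM_AMT`) are invented stand-ins for a real mapping document, not from any actual insurance system:

```python
# Hypothetical source-to-target field mapping plus a reconciliation check;
# all field names are invented for illustration.
FIELD_MAP = {"POL_NO": "policy_number", "PREM_AMT": "premium_amount"}

def migrate(legacy_rows):
    """Apply the field mapping and cast premiums to numeric values."""
    return [
        {"policy_number": row["POL_NO"],
         "premium_amount": float(row["PREM_AMT"])}
        for row in legacy_rows
    ]

def reconcile(source, target):
    """Basic row-count and premium-total reconciliation between systems."""
    src_total = sum(float(r["PREM_AMT"]) for r in source)
    tgt_total = sum(r["premium_amount"] for r in target)
    return len(source) == len(target) and abs(src_total - tgt_total) < 1e-9

legacy = [{"POL_NO": "A1", "PREM_AMT": "100.0"},
          {"POL_NO": "A2", "PREM_AMT": "250.5"}]
migrated = migrate(legacy)
```

Real migrations reconcile far more than totals (null rates, key uniqueness, referential integrity), but the pattern of comparing independent aggregates across source and target is the same.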
Full-time|On-site|Dubai, Dubai, United Arab Emirates
ABOUT US | Whiteshield is a premier global advisory firm strategically positioned at the crossroads of strategy, public policy, and digital transformation. We collaborate with governments and leading organizations to craft and execute initiatives aimed at enhancing economic competitiveness, fostering digital enablement, and ensuring sustainable growth.

Our approach uniquely blends strategic advisory services with pragmatic execution. We empower senior decision-makers to navigate intricate regulatory, economic, and technological landscapes, converting aspirations into well-structured programs, delivery frameworks, and quantifiable outcomes. Unlike conventional consultancies that focus merely on recommendations, we remain actively engaged throughout the implementation process, ensuring that transformation plans are not only conceptualized but also effectively realized.

WHERE YOU FIT IN | We are developing robust data platforms and architectures that serve as the backbone of national and enterprise transformation initiatives, facilitating decision-making, AI systems, and large-scale digital services. At Whiteshield, data is not an afterthought; it is a core component. From constructing interoperable data ecosystems across various institutions to enabling AI-ready infrastructures, our work operates at the intersection of engineering, analytics, and strategic foresight.

We seek a Data Architect capable of bridging both vision and execution. You will delineate how data is structured, governed, and utilized, ensuring it is reliable, scalable, and poised to produce tangible impacts. You thrive in complex, multi-stakeholder environments, adeptly translating ambiguous requirements into clear data models, architectures, and standards. You possess a deep understanding of how contemporary data platforms bolster AI, analytics, and operational systems, and how to design them effectively. Your role will involve close collaboration with engineers, analysts, policymakers, and client stakeholders to build data ecosystems that are not only technically robust but also user-friendly, trustworthy, and aligned with strategic objectives.

WHAT YOU WILL DO |
- Design and implement scalable, secure, and high-performing data architectures across both cloud and hybrid environments.
- Establish enterprise data models, standards, and governance frameworks to ensure consistency and interoperability.
- Architect modern data platforms (data lakes, warehouses, lakehouses) to support analytics, AI, and operational use cases.
- Lead client engagements as a trusted advisor on data strategy, architecture, and platform design.
- Translate business and policy objectives into structured data solutions and pipelines.
- Establish data governance practices, including data quality, lineage, cataloguing, and more.
Join our dynamic team at mexdigital as a Data Analyst. We are looking for a detail-oriented individual who is passionate about transforming data into actionable insights. You will be responsible for analyzing complex data sets, generating reports, and presenting findings to stakeholders. This role is ideal for someone who thrives in a fast-paced environment and is eager to contribute to data-driven decision-making.
Mar 31, 2026