Join 9fin as our Data Engineering Lead

At 9fin, we are transforming the global debt markets with our AI-powered platform, covering the world's largest asset class, worth over $145 trillion. Despite their size and critical importance, debt markets still run on fragmented data sources, PDFs, and manual processes. Our platform brings together proprietary credit data, in-depth analysis, and powerful workflows to streamline operations across the globe.

Today, 9fin serves over 300 institutions worldwide, including major banks, asset managers, private equity firms, law firms, and advisors. We are growing rapidly, particularly in the US, with exceptional retention driven by our deep workflow integration.

We stand at a pivotal moment in our journey. With proven product-market fit and increasing global demand, 9fin is on track to become the leading platform for debt markets internationally.

Your Role and Responsibilities

As our Data Engineering Lead, you will guide the technical roadmap of our data platform, raising the bar on reliability, scalability, and engineering excellence. This is a hands-on role in which you will both build and mentor.

- Define the Data Platform Roadmap: Set the technical direction and deliver impactful improvements to our platform, focusing on reliability, cost-efficiency, developer experience, and scalability. You will prioritize and execute the strategic platform investments for the year ahead.
- Develop Customer-Facing APIs & Feeds: Collaborate on the design and implementation of API-driven feeds and enrichment pipelines that grow into product features and revenue opportunities (see the API sketch after this list).
- Enhance Orchestration in Dagster: Build and optimize asset-based pipelines, sensors, schedules, backfills, IO managers, and monitoring patterns that are resilient, idempotent, and easy to operate. Establish standard practices for incremental jobs, full refreshes, and reverse ETL (see the Dagster sketch below).
- Streamline Data Ingestion: Refine and scale ingestion built on Airbyte OSS and DLT, handling schema drift, connector health, rate limits, retries, and checkpointing so that pipelines run reliably end to end (see the dlt sketch below).
- Strengthen BigQuery Infrastructure: Lead the evolution of our best practices, partitioning and clustering strategies, slot and cost management, and query performance standards (see the partitioning sketch below).
- Enhance Data Quality & Observability: Implement service-level objectives (SLOs) for data freshness, automated validation checks, data provenance, and proactive alerting to keep the platform dependable for the business and our clients (see the freshness check sketch below).
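To make the APIs & feeds bullet concrete, here is a minimal sketch of a change-feed endpoint, assuming FastAPI. The path, the CreditUpdate model, and the fetch_updates() helper are hypothetical illustrations, not 9fin's actual API.

```python
# A minimal sketch of a customer-facing feed endpoint, assuming FastAPI.
# The endpoint path, model fields, and fetch_updates() are hypothetical.
from datetime import datetime

from fastapi import FastAPI, Query
from pydantic import BaseModel

app = FastAPI()


class CreditUpdate(BaseModel):
    issuer_id: str
    field: str
    value: str
    updated_at: datetime


def fetch_updates(since: datetime) -> list[CreditUpdate]:
    # Placeholder: in practice this would query the warehouse or a serving store.
    return []


@app.get("/v1/credit-updates", response_model=list[CreditUpdate])
def credit_updates(since: datetime = Query(...)) -> list[CreditUpdate]:
    """Return enrichment records changed since the given timestamp."""
    return fetch_updates(since)
```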
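For the orchestration bullet, a minimal Dagster sketch of the patterns described: a pair of date-partitioned assets wired to a daily schedule, idempotent by construction because each run rewrites exactly one partition. Asset names and the cron expression are illustrative.

```python
# A minimal Dagster sketch: partitioned assets plus a daily schedule.
# Asset names, the upstream source, and the cron are hypothetical.
from dagster import (
    DailyPartitionsDefinition,
    Definitions,
    ScheduleDefinition,
    asset,
    define_asset_job,
)

daily = DailyPartitionsDefinition(start_date="2024-01-01")


@asset(partitions_def=daily)
def raw_filings(context) -> None:
    # Each run (re)writes exactly one date partition, so backfills
    # and retries are safe to repeat.
    context.log.info(f"Ingesting filings for {context.partition_key}")


@asset(partitions_def=daily, deps=[raw_filings])
def enriched_filings(context) -> None:
    context.log.info(f"Enriching partition {context.partition_key}")


filings_job = define_asset_job("filings_job", selection=[raw_filings, enriched_filings])

defs = Definitions(
    assets=[raw_filings, enriched_filings],
    schedules=[ScheduleDefinition(job=filings_job, cron_schedule="0 6 * * *")],
)
```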
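For the ingestion bullet, a sketch of how checkpointed, retry-safe loading might look with dlt: an incremental cursor gives resumable checkpoints, merge semantics make reruns safe, and dlt's schema evolution absorbs drift by adding new columns at the destination. The source endpoint and field names are hypothetical.

```python
# A dlt sketch for resilient ingestion: incremental loading on a cursor
# column with merge semantics. The API endpoint and fields are hypothetical.
import dlt
from dlt.sources.helpers import requests  # dlt's requests wrapper with built-in retries


@dlt.resource(primary_key="id", write_disposition="merge")
def issuer_events(
    updated_at=dlt.sources.incremental("updated_at", initial_value="2024-01-01T00:00:00Z"),
):
    # dlt persists updated_at.last_value in pipeline state, so reruns resume
    # from the last checkpoint instead of re-pulling everything.
    resp = requests.get(
        "https://api.example.com/events",
        params={"since": updated_at.last_value},
    )
    yield resp.json()["items"]


pipeline = dlt.pipeline(
    pipeline_name="issuer_events",
    destination="bigquery",
    dataset_name="raw",
)
# Schema drift is handled by dlt's schema evolution: new columns are
# added to the destination table automatically.
load_info = pipeline.run(issuer_events())
print(load_info)
```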
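For the BigQuery bullet, a sketch of partitioning and clustering using the google-cloud-bigquery client. Partitioning on a date column plus clustering on common filter columns lets BigQuery prune blocks at query time, cutting both slot cost and latency; the project, dataset, and column names are illustrative.

```python
# A sketch of partitioning + clustering with the google-cloud-bigquery
# client. Table and column names are illustrative.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_date", "DATE"),
    bigquery.SchemaField("issuer_id", "STRING"),
    bigquery.SchemaField("metric", "STRING"),
    bigquery.SchemaField("value", "FLOAT64"),
]

table = bigquery.Table("my-project.analytics.issuer_metrics", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
table.clustering_fields = ["issuer_id", "metric"]
# Refuse queries that would scan the whole table without a partition filter.
table.require_partition_filter = True

client.create_table(table)
```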
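For the data quality bullet, a freshness SLO expressed as a Dagster asset check, reusing the hypothetical enriched_filings asset from the orchestration sketch above. The 6-hour SLO and the watermark lookup are assumptions for illustration.

```python
# A freshness SLO as a Dagster asset check. The 6-hour threshold and the
# watermark query are illustrative; register the check via Definitions'
# asset_checks argument alongside the assets it guards.
from datetime import datetime, timedelta, timezone

from dagster import AssetCheckResult, asset_check

FRESHNESS_SLO = timedelta(hours=6)


def latest_watermark() -> datetime:
    # Placeholder: in practice, query MAX(updated_at) from the target table.
    return datetime.now(timezone.utc)


@asset_check(asset="enriched_filings")
def enriched_filings_freshness() -> AssetCheckResult:
    lag = datetime.now(timezone.utc) - latest_watermark()
    return AssetCheckResult(
        passed=lag <= FRESHNESS_SLO,
        metadata={"lag_minutes": lag.total_seconds() / 60},
    )
```

A failing check surfaces in the Dagster UI and can drive alerting, turning "is the data fresh?" from an ad-hoc question into a monitored SLO.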
Feb 5, 2026