About the job
About Ouro:
Ouro is a pioneering global financial services and technology firm committed to delivering innovative financial empowerment solutions for consumers worldwide. Our financial products and services span prepaid and debit accounts, cross-border payments, and loyalty solutions for both consumers and enterprise partners.
Ouro's flagship offering, Netspend, provides prepaid and debit account solutions, giving customers secure and convenient access to extensive global payment networks for effective money management and everyday purchases. With an expansive retail network across the U.S., customers can easily purchase and reload their Netspend products at over 130,000 locations.
Founded by industry experts in 1999, Ouro has processed billions in transaction volume and served millions of customers globally. Headquartered in Austin, Texas, we proudly employ a diverse team of professionals around the world.
Role Overview:
We are seeking a skilled Senior Software Engineer to architect, develop, and enhance our data ecosystem. Your primary focus will be migrating and managing complex data pipelines from legacy Oracle systems to Snowflake, using Python as your main engine for automation and transformation. The ideal candidate is a strategic thinker who ensures data integrity at scale while staying current with the latest in AI/ML integration.
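To give a concrete flavor of this work, below is a minimal sketch of one extract-and-load step such a pipeline might contain. It assumes the oracledb and snowflake-connector-python drivers; all connection parameters, credentials, and table names are hypothetical placeholders, not Ouro's actual systems.

    import oracledb
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    # Hypothetical credentials and object names, for illustration only.
    oracle_conn = oracledb.connect(user="etl_user", password="...",
                                   dsn="legacy-host:1521/ORCLPDB")
    sf_conn = snowflake.connector.connect(account="example_account", user="etl_user",
                                          password="...", warehouse="ETL_WH",
                                          database="ANALYTICS", schema="STAGING")

    # Extract a daily batch from the legacy Oracle system into a DataFrame.
    df = pd.read_sql("SELECT * FROM transactions WHERE load_date = TRUNC(SYSDATE)",
                     oracle_conn)

    # Bulk-load the batch into Snowflake; write_pandas stages and copies the data.
    write_pandas(sf_conn, df, table_name="TRANSACTIONS_STAGE")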
Key Responsibilities:
Legacy Integration: Maintain and optimize current Oracle databases while leading strategic data migration initiatives to the cloud.
Automation: Use Python to automate repetitive data tasks, build custom API integrations, and manage workflow orchestration (see the orchestration sketch after this list).
Performance Tuning: Perform advanced SQL optimization and tuning in both Snowflake and Oracle to ensure low-latency data availability.
Collaboration: Engage closely with Data Science and Analytics teams to deliver clean, structured, and efficient datasets for business intelligence.
AI Readiness (Preferred): Investigate and implement AI-driven data processing techniques, such as LLM-based data cleaning or predictive pipeline monitoring.
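As one possible shape for the orchestration responsibility above, here is a minimal sketch of a two-step daily pipeline. Apache Airflow is our assumption here (the posting does not name an orchestrator), and the DAG, task names, and placeholder callables are hypothetical.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_from_oracle():
        # Placeholder: pull the daily batch out of the legacy Oracle system.
        print("extracting...")

    def load_to_snowflake():
        # Placeholder: stage and load the batch into Snowflake.
        print("loading...")

    # Hypothetical DAG: run the extract, then the load, once per day.
    with DAG(dag_id="oracle_to_snowflake_daily",
             start_date=datetime(2024, 1, 1),
             schedule="@daily",
             catchup=False) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_from_oracle)
        load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
        extract >> load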
Technical Requirements (Must-Have):
Python: Expert-level proficiency in data manipulation (e.g., Pandas, PySpark) and scripting.
Snowflake: Demonstrated expertise with Snowflake architecture (Snowpipe, Tasks, Streams), illustrated in the sketch below.
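To make the Streams and Tasks expectation concrete, here is a minimal sketch that wires a change-capture stream to a scheduled merge task through the Python connector. All object names and connection parameters are hypothetical placeholders.

    import snowflake.connector

    # Hypothetical connection parameters, for illustration only.
    conn = snowflake.connector.connect(account="example_account", user="etl_user",
                                       password="...", warehouse="ETL_WH",
                                       database="ANALYTICS", schema="STAGING")
    cur = conn.cursor()

    # A stream records row-level changes (CDC) on the staging table.
    cur.execute("CREATE OR REPLACE STREAM transactions_stream ON TABLE transactions_stage")

    # A task runs every five minutes, but only when the stream has new rows;
    # consuming the stream in DML advances its offset.
    cur.execute("""
        CREATE OR REPLACE TASK merge_transactions
          WAREHOUSE = ETL_WH
          SCHEDULE = '5 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('TRANSACTIONS_STREAM')
        AS
          INSERT INTO transactions_final SELECT * FROM transactions_stream
    """)

    # Tasks are created suspended; resume to start the schedule.
    cur.execute("ALTER TASK merge_transactions RESUME")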

