About GameChanger:
At GameChanger, we understand the transformative influence of youth sports, which foster essential life skills such as leadership, teamwork, responsibility, and confidence. These experiences benefit not only players but the entire community, underscoring the vital roles that coaches, parents, and volunteers play. Our goal is to build the premier platform for capturing and cherishing these invaluable moments, enabling families to uplift the next generation through sports.
If you are passionate about sports and their capacity to unite communities, or if you thrive on building innovative products, consider joining our remote-first, agile tech company based in New York City. We are dedicated to tackling significant challenges in youth sports.
The Position:
We are looking for a Senior Analytics Engineer to be a key player in our Analytics Hub team. Your mission will be to develop user-friendly, extensible data models and self-service capabilities that empower various stakeholders across the organization to quickly and confidently address data-related inquiries. You will collaborate closely with teams in Analytics, Data Engineering, Finance, Product, and Engineering to enhance the platforms that drive decision-making, automation, and performance measurement.
In this role, you will work with both first-party and third-party data to build and maintain the data infrastructure required for reporting, analysis, and experimentation. Using Python, SQL, and dbt, you will transform warehouse data into scalable, self-service data models and artifacts (such as metrics and dashboards) that surface insights and empower teams across Finance, Product, and Engineering.
What You’ll Do:
Lead the architecture, optimization, and transformation of finance- and product-centric data models in dbt, enabling flexible analysis for Data Analysts/Scientists and self-service options for business stakeholders.
Design and implement data validations to maintain data integrity across our pipelines and ensure the accuracy of our reporting.
Foster transparency throughout the entire data pipeline by establishing robust processes with upstream data producers and downstream data-consuming tools (BI, Reverse ETL, experimentation).
Advance agentic tooling and automation initiatives that reduce time spent investigating data issues, freeing the team to focus on analysis.
Implement software engineering best practices, including version control and continuous integration, to enhance the quality of analytics code.
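For candidates less familiar with the data-validation work described above: in dbt, much of it is expressed as generic tests declared alongside the models. A minimal sketch, using invented model and column names (not GameChanger's actual schema), might look like:

```yaml
# models/schema.yml — hypothetical dbt schema file; model and column
# names are illustrative only.
version: 2

models:
  - name: fct_subscriptions          # example fact model
    description: "One row per subscription event"
    columns:
      - name: subscription_id
        tests:
          - unique                   # no duplicate subscription rows
          - not_null
      - name: plan_type
        tests:
          - accepted_values:
              values: ['monthly', 'annual']
      - name: user_id
        tests:
          - relationships:           # referential integrity check
              to: ref('dim_users')
              field: user_id
```

Running `dbt test` compiles each declaration into a SQL query that fails on violating rows, which is one common way to enforce the pipeline integrity and reporting accuracy this role is responsible for.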

