About the job
WHO WE ARE
Box (NYSE:BOX) stands at the forefront of Intelligent Content Management. Our innovative platform empowers organizations to enhance collaboration, manage content throughout its lifecycle, safeguard crucial information, and revolutionize business workflows through enterprise AI. Established in 2005, Box simplifies operations for prominent global entities such as JLL, Morgan Stanley, and Nationwide. With our headquarters in Redwood City, CA, and a presence in the US, Europe, and Asia, we embrace the AI-driven future of business.
By joining Box, you will play a pivotal role in advancing our platform. Content fuels our work, encompassing the vast array of files and information exchanged daily: contracts, invoices, employee records, financial data, product specifications, marketing materials, and beyond. Our mission is to infuse intelligence into content management, enabling our clients to completely transform their workflows. With the integration of AI and enterprise content, the potential to reshape collaboration has never been more significant, and at Box, you will be at the forefront of this transformative journey.
YOUR ROLE AT BOX
Box is the Content Cloud, equipping enterprises to manage their content lifecycle with security and scale. Our Cloud Operations team is expanding, and we are developing a modern Operations Management Platform designed to enable smarter decision-making, enhance reliability, and achieve cost efficiency across our cloud ecosystem.
As a Data Engineer within Cloud Operations, you will create and manage the data infrastructure that drives insights for FinOps and Site Reliability Engineering (SRE). Collaborating with data scientists and FinOps specialists, you will integrate various data sources, maintain dependable and timely data pipelines, and develop user-friendly interfaces that inform decisions around cloud cost optimization and operational excellence.
YOUR RESPONSIBILITIES
- Collaborate with data science, SRE, and FinOps teams to translate business requirements into scalable data solutions.
- Design and maintain ETL/ELT pipelines that ingest, cleanse, transform, and aggregate data from diverse systems using GCP services.
- Establish and oversee data quality checks, testing, and monitoring to ensure trustworthy, timely datasets and dashboards.

