About the job
The AI Platform at Datadog plays a pivotal role in managing all AI infrastructure within the company. Our objective is to deliver cutting-edge tools and platforms that let data scientists and engineers run large-scale training and inference workloads with ease. We support a range of innovative products, including Bits AI, LLM Observability, and all of our AI research initiatives.
As the Manager of the Evaluation & Annotation team, you will be part of a dynamic and rapidly growing organization. You will build and scale the team, define its technical vision, and shape its strategic roadmap. Your team will tackle several essential technical challenges, including offline and online evaluation of AI models, designing tools and processes for human annotation, and establishing standards for synthetic and AI-generated datasets.
Collaboration is key: you will work closely with other teams within the AI Platform organization to ensure a seamless AI development cycle, and partner with the Applied AI organization and Datadog's infrastructure and tooling teams to build robust systems from the ground up.
At Datadog, we value our office culture, which fosters relationships, creativity, and collaboration. We embrace a hybrid work model that allows our employees to achieve the work-life balance that suits them.

