About the role
Empower Every Identity, from AI to Human
At Okta, we believe identity is the cornerstone of harnessing AI's potential. We secure AI by building a trusted, neutral infrastructure that lets organizations embrace this transformative era with confidence. This mission calls for a passionate commitment to tackling complex challenges with real-world implications. We seek builders and owners who operate with speed and urgency and execute with excellence.
This is your chance to engage in work that could define your career. If you share our commitment to this mission, we would love to connect.
The Opportunity
As an AI and Automation Development Engineer at Okta, you will play a pivotal role on our team as we build the platforms and services that drive our internal AI transformation. We are advancing from basic automation to sophisticated, "agentic" workflows, leveraging LangGraph and Gemini Enterprise to build stateful, reasoning-capable systems that empower our global workforce.
This hands-on position is designed for an engineer who writes production-grade Python and is eager to bridge AI research and enterprise-level applications. You will own the implementation of core features, orchestrating intricate LLM interactions while upholding performance, security, and reliability standards.
Your Responsibilities
- Create Agentic Workflows: Design and maintain stateful multi-agent systems utilizing LangGraph to tackle complex, non-linear business challenges.
- Gemini Integration: Implement and refine LLM interactions using Gemini Enterprise, with a focus on prompt engineering, function calling, and structured outputs.
- State & Memory Management: Architect and oversee agent state, persistence, and "human-in-the-loop" checkpoints to guarantee dependable automated decision-making.
- Backend Development: Develop high-quality, production-ready Python code to establish the APIs and services that fuel our internal AI ecosystem.
- RAG & Grounding: Build and maintain Retrieval-Augmented Generation (RAG) pipelines to ensure agents have access to accurate, real-time enterprise context.
- Infrastructure & Deployment: Deploy and manage services on GCP using containers (Docker/Kubernetes) and Infrastructure as Code (Terraform).

