LangGraph OpenAI Serve

Welcome to the documentation for langgraph-openai-serve - a package that provides an OpenAI-compatible API for LangGraph instances.

Overview

LangGraph OpenAI Serve allows you to expose your LangGraph workflows and agents through an OpenAI-compatible API interface. This enables seamless integration with any client library or tool that works with the OpenAI API, providing a standardized way to interact with your custom LangGraph solutions.

Features

  • Expose your LangGraph instances through an OpenAI-compatible API
  • Register multiple graphs and map them to different model names
  • Use with any FastAPI application
  • Support for both streaming and non-streaming completions
  • Docker support for easy deployment
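Because the server speaks the OpenAI wire format, any OpenAI-compatible client can call it. The sketch below uses only the Python standard library; the base URL, port, and the model name `my-graph` are assumptions for illustration (the model name corresponds to whatever graph you registered):

```python
import json
from urllib import request

# Assumed local deployment; adjust to wherever your server runs.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model: str, content: str, stream: bool = False) -> dict:
    """Build an OpenAI-style chat completion payload.

    The "model" field selects which registered LangGraph graph handles
    the request; "stream" toggles streaming vs. non-streaming mode.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": content}],
        "stream": stream,
    }

def chat(model: str, content: str) -> str:
    """POST a non-streaming chat completion and return the reply text."""
    payload = build_chat_request(model, content)
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI response shape: first choice, assistant message.
    return body["choices"][0]["message"]["content"]
```

The same request could equally be sent with the official `openai` client library by pointing its `base_url` at the server.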

Table Of Contents

The documentation follows the Diátaxis documentation framework by Daniele Procida and is organized into four separate parts:

  1. Tutorials - Step-by-step instructions to get you started
  2. How-To Guides - Practical guides for specific tasks
  3. Reference - Technical documentation of the API
  4. Explanation - Conceptual explanations of the architecture

Installation

# Using uv
uv add langgraph-openai-serve

# Using pip
pip install langgraph-openai-serve
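With the package installed and a server running, streamed completions arrive as OpenAI-style server-sent events, one `data:` line per delta chunk, terminated by `data: [DONE]`. A minimal sketch of assembling such a stream into the full reply text (the chunk shape shown is the standard OpenAI streaming format; no server-specific behavior is assumed):

```python
import json

def assemble_stream(raw: str) -> str:
    """Join the content deltas from OpenAI-style SSE stream lines.

    Each event line looks like:  data: {"choices":[{"delta":{"content":"..."}}]}
    and the stream ends with:    data: [DONE]
    """
    parts = []
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines between events
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        delta = json.loads(data)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)
```

In practice you would read these lines incrementally from the HTTP response body rather than from a string, emitting each delta as it arrives.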