A Python library that converts OpenAPI specifications into Large Language Model (LLM) tool/function definitions, enabling OpenAPI invocations through LLM-generated tool calls.
- Features
- Installation
- Library Scope
- Quick Start
- Requirements
- Development Setup
- Testing
- License
- Security
- Contributing
- Converts OpenAPI specifications into LLM-compatible tool/function definitions (see the sketch after this list)
- Supports multiple LLM providers (OpenAI, Anthropic, Cohere)
- Handles complex request bodies and parameter types
- Supports multiple authentication mechanisms
- Supports OpenAPI 3.0.x and 3.1.x specifications
- Handles both YAML and JSON OpenAPI specifications
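As a preview, the core conversion step looks roughly like this. This is a minimal sketch that reuses the `ClientConfig` and `OpenAPISpecification` calls from the Quick Start below; the SerperDev spec URL and API-key environment variable are just the Quick Start's placeholders:

```python
import os

from openapi_llm.client.config import ClientConfig
from openapi_llm.core.spec import OpenAPISpecification

# Load an OpenAPI spec (YAML or JSON) and wrap it in a client config
config = ClientConfig(
    openapi_spec=OpenAPISpecification.from_url("https://bit.ly/serperdev_openapi"),
    credentials=os.getenv("SERPERDEV_API_KEY"),
)

# Provider-ready tool/function definitions, ready to pass to an LLM request
tools = config.get_tool_definitions()
```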
pip install openapi-llm
- Python >= 3.8
This library focuses on OpenAPI-to-LLM conversion and doesn't include LLM provider libraries by default. Install the ones you need:
# For OpenAI
pip install openai
# For Anthropic
pip install anthropic
# For Cohere
pip install cohere
OpenAPI-LLM provides the core functionality for converting OpenAPI specifications into LLM-compatible tool/function definitions. It intentionally does not offer an opinionated, high-level interface for OpenAPI-LLM interactions; instead, users are encouraged to build a thin application layer on top of this library that suits their specific needs and preferences for OpenAPI-LLM integration.
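For instance, such a thin layer might be a single helper that wires the tool definitions into a provider call and executes whichever tool the model selects. The sketch below is only illustrative: the helper name is made up, and it assumes the `ClientConfig`/`OpenAPIClient` API shown in the Quick Start together with an OpenAI client:

```python
from openai import OpenAI

from openapi_llm.client.config import ClientConfig
from openapi_llm.client.openapi import OpenAPIClient


def ask_service(config: ClientConfig, question: str, model: str = "gpt-4o"):
    """Hypothetical thin wrapper: let the LLM pick a tool, then invoke it."""
    llm = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = llm.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": question}],
        tools=config.get_tool_definitions(),
    )
    # Execute the OpenAPI operation selected by the model
    return OpenAPIClient(config).invoke(response)
```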
This library does not perform OpenAPI specification validation. It is the user's responsibility to ensure that the provided OpenAPI specifications are valid. We recommend using an established validation tool such as openapi-spec-validator (used in the example below).
Example of validating a spec before using it with openapi-llm:
from openapi_spec_validator import validate_spec
import yaml
# Load and validate your OpenAPI spec
with open('your_spec.yaml', 'r') as f:
    spec_dict = yaml.safe_load(f)
validate_spec(spec_dict)
Here's a practical example that uses OpenAI function calling to perform a Google search via the SerperDev API:
import os
from openai import OpenAI
from openapi_llm.client.config import ClientConfig
from openapi_llm.client.openapi import OpenAPIClient
from openapi_llm.core.spec import OpenAPISpecification
# Configure the OpenAPI client with SerperDev API spec and credentials
config = ClientConfig(
    openapi_spec=OpenAPISpecification.from_url("https://bit.ly/serperdev_openapi"),
    credentials=os.getenv("SERPERDEV_API_KEY")
)
# Initialize OpenAI client
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
# Create a chat completion with tool definitions
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Do a serperdev google search: Who was Nikola Tesla?"}],
    tools=config.get_tool_definitions(),
)
# Execute the API call based on the LLM's response
service_api = OpenAPIClient(config)
service_response = service_api.invoke(response)
This example demonstrates:
- Loading an OpenAPI specification from a URL
- Integrating with OpenAI's function calling
- Handling API authentication
- Converting and executing OpenAPI calls based on LLM responses
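Building on this, one might feed the tool result back to the model to obtain a final natural-language answer. The snippet below is only a sketch continuing the Quick Start code above (it reuses `client`, `response`, and `service_response`); it assumes the model produced exactly one tool call and that `service_response` is JSON-serializable (e.g., a dict):

```python
import json

# Continue the conversation with the tool result so the model can answer in prose
assistant_message = response.choices[0].message
followup = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "Do a serperdev google search: Who was Nikola Tesla?"},
        assistant_message,  # the assistant turn that contains the tool call
        {
            "role": "tool",
            "tool_call_id": assistant_message.tool_calls[0].id,
            "content": json.dumps(service_response),
        },
    ],
)
print(followup.choices[0].message.content)
```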
- Python >= 3.8
- Dependencies:
  - jsonref
  - requests
  - PyYAML
- Clone the repository
git clone https://github.com/vblagoje/openapi-llm.git
- Install hatch if you haven't already
pip install hatch
- Install pre-commit hooks
pre-commit install
- Install desired LLM provider dependencies (as needed)
pip install openai anthropic cohere
Run tests using hatch:
# Unit tests
hatch run test:unit
# Integration tests
hatch run test:integration
# Type checking
hatch run test:typing
# Linting
hatch run test:lint
MIT License - See LICENSE for details.
For security concerns, please see our Security Policy.
Contributions are welcome! Please feel free to submit a Pull Request.
Vladimir Blagojevic ([email protected])
Reviews and guidance by Madeesh Kannan