# Ollama
## Install
To use Ollama, you need to either install `pydantic-ai`, or install `pydantic-ai-slim` with the `openai` optional group (as Ollama exposes an OpenAI-compatible API):
```bash
pip install 'pydantic-ai-slim[openai]'
```

```bash
uv add 'pydantic-ai-slim[openai]'
```
## Configuration
Pydantic AI supports both self-hosted Ollama servers (running locally or remotely) and Ollama Cloud.
For servers running locally, use the `http://localhost:11434/v1` base URL. For Ollama Cloud, use `https://ollama.com/v1` and ensure an API key is set.
You can set the `OLLAMA_BASE_URL` and (optionally) `OLLAMA_API_KEY` environment variables and use `OllamaProvider` by name:
```python
from pydantic_ai import Agent

agent = Agent('ollama:gpt-oss:20b')
...
```
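If you'd rather configure the base URL from within Python than in your shell, a minimal sketch (the model name and prompt are illustrative; any model you have pulled on your Ollama server works):

```python
import os

from pydantic_ai import Agent

# Illustrative setup: point pydantic-ai at a self-hosted Ollama server.
# In practice you'd usually export OLLAMA_BASE_URL in your shell instead;
# it just needs to be set before the agent's provider is created.
os.environ['OLLAMA_BASE_URL'] = 'http://localhost:11434/v1'

agent = Agent('ollama:gpt-oss:20b')
result = agent.run_sync('What is the capital of France?')
print(result.output)
```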
Or initialise the model and provider directly:
```python
from pydantic import BaseModel

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.ollama import OllamaProvider


class CityLocation(BaseModel):
    city: str
    country: str


ollama_model = OpenAIChatModel(
    model_name='gpt-oss:20b',
    provider=OllamaProvider(base_url='http://localhost:11434/v1'),  # (1)!
)
agent = Agent(ollama_model, output_type=CityLocation)

result = agent.run_sync('Where were the olympics held in 2012?')
print(result.output)
#> city='London' country='United Kingdom'
print(result.usage())
#> RunUsage(input_tokens=57, output_tokens=8, requests=1)
```
1. For Ollama Cloud, use `base_url='https://ollama.com/v1'` and set the `OLLAMA_API_KEY` environment variable.
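For completeness, a minimal sketch of the Ollama Cloud setup, assuming `OllamaProvider` accepts an `api_key` argument like the other providers (the model name and key are illustrative):

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.ollama import OllamaProvider

# Assumption: the API key can be passed directly; alternatively set the
# OLLAMA_API_KEY environment variable and omit the argument.
ollama_cloud_model = OpenAIChatModel(
    model_name='gpt-oss:20b',
    provider=OllamaProvider(
        base_url='https://ollama.com/v1',
        api_key='your-api-key',
    ),
)
agent = Agent(ollama_cloud_model)
```

Beyond the base URL and API key, everything else, including structured output and usage reporting, works the same as against a self-hosted server.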