ollama-python/examples/README.md

# Running Examples

Run the examples in this directory with:

```sh
# Run example
python3 examples/<example>.py

# or with uv
uv run examples/<example>.py
```

See ollama/docs/api.md for full API documentation.

## Chat - Chat with a model
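
A minimal sketch of what the chat examples do (the model name is a placeholder for any model you have pulled):

```python
from ollama import chat

# Single-turn chat; 'llama3.2' is just an example model name.
response = chat(
    model='llama3.2',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
)
print(response.message.content)
```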

## Generate - Generate text with a model
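
A minimal sketch of a one-shot completion (model name is a placeholder):

```python
from ollama import generate

# One-shot text generation; 'llama3.2' is just an example model name.
response = generate(model='llama3.2', prompt='Why is the sky blue?')
print(response.response)
```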

## Tools/Function Calling - Call a function with a model
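
A hedged sketch of the idea: a plain Python function is passed as a tool, and any tool calls the model requests are executed locally (the model name and the function are illustrative, not part of the library):

```python
from ollama import chat


def add_two_numbers(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


response = chat(
    model='llama3.2',
    messages=[{'role': 'user', 'content': 'What is 3 + 4?'}],
    tools=[add_two_numbers],
)

# The model may request zero or more tool calls; run the matching function.
for call in response.message.tool_calls or []:
    if call.function.name == 'add_two_numbers':
        print(add_two_numbers(**call.function.arguments))
```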

## gpt-oss

An API key from Ollama's cloud service is required; you can create one on ollama.com.

```sh
export OLLAMA_API_KEY="your_api_key_here"
```

## MCP server

The MCP server can be used with an MCP client like Cursor, Cline, Codex, Open WebUI, Goose, and more.

Run the server with:

```sh
uv run examples/web-search-mcp.py
```

Configuration to use with an MCP client:

```json
{
  "mcpServers": {
    "web_search": {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "path/to/ollama-python/examples/web-search-mcp.py"],
      "env": { "OLLAMA_API_KEY": "api-key" }
    }
  }
}
```

## Multimodal with Images - Chat with a multimodal (image chat) model
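
A minimal sketch, assuming a vision-capable model and a local image path (both are placeholders):

```python
from ollama import chat

# 'gemma3' and the image path are placeholders; any vision-capable model works.
response = chat(
    model='gemma3',
    messages=[{
        'role': 'user',
        'content': 'What is in this image?',
        'images': ['path/to/image.png'],
    }],
)
print(response.message.content)
```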

## Structured Outputs - Generate structured outputs with a model
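
A sketch of the approach: a Pydantic model supplies the JSON schema, and the response is validated against it (the schema and model name are illustrative):

```python
from ollama import chat
from pydantic import BaseModel


class Pet(BaseModel):
    name: str
    animal: str
    age: int


response = chat(
    model='llama3.2',
    messages=[{'role': 'user', 'content': 'I have a 3 year old cat named Luna.'}],
    format=Pet.model_json_schema(),  # constrain the output to this schema
)
print(Pet.model_validate_json(response.message.content))
```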

## Ollama List - List all downloaded models and their properties
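
A minimal sketch of listing models and a couple of their properties:

```python
from ollama import list as list_models

# Print each downloaded model's name and size in bytes.
for model in list_models().models:
    print(model.model, model.size)
```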

## Ollama Show - Display model properties and capabilities
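
A minimal sketch (the model name is a placeholder; the capabilities field is only present on recent client/server versions):

```python
from ollama import show

response = show('llama3.2')
print(response.details)       # family, parameter size, quantization, ...
print(response.capabilities)  # e.g. completion, tools, vision (recent versions)
```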

## Ollama ps - Show model status with CPU/GPU usage
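
A minimal sketch showing the running models and how much of each is loaded into VRAM:

```python
from ollama import ps

# size is the total loaded size; size_vram is the portion resident on the GPU.
for model in ps().models:
    print(model.model, model.size, model.size_vram)
```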

## Ollama Pull - Pull a model from Ollama

Requirement: `pip install tqdm`
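
A stripped-down sketch of a streamed pull; the full example uses tqdm for progress bars (the model name is a placeholder):

```python
from ollama import pull

# Stream progress updates; completed/total may be empty for some statuses.
for progress in pull('llama3.2', stream=True):
    print(progress.status, progress.completed, progress.total)
```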

## Ollama Create - Create a model from a Modelfile
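
A sketch using the keyword-argument form of create supported by recent clients; older versions take a Modelfile string instead (the model names and system prompt are illustrative):

```python
from ollama import create

# Derive a new model from an existing one with a custom system prompt.
create(
    model='mario',
    from_='llama3.2',
    system='You are Mario from Super Mario Bros.',
)
```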

## Ollama Embed - Generate embeddings with a model
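
A minimal sketch (the embedding model name is a placeholder; input may be a single string or a list of strings):

```python
from ollama import embed

response = embed(model='all-minilm', input='Why is the sky blue?')
print(len(response.embeddings), len(response.embeddings[0]))  # vectors x dimensions
```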

## Thinking - Enable thinking mode for a model
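
A minimal sketch, assuming a thinking-capable model (the model name is a placeholder):

```python
from ollama import chat

# think=True returns the model's reasoning separately from the final answer.
response = chat(
    model='deepseek-r1',
    messages=[{'role': 'user', 'content': 'How many r are in strawberry?'}],
    think=True,
)
print('Thinking:', response.message.thinking)
print('Answer:', response.message.content)
```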

## Thinking (generate) - Enable thinking mode when generating text with a model
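
The same idea with generate, as a minimal sketch (model name is a placeholder):

```python
from ollama import generate

response = generate(
    model='deepseek-r1',
    prompt='How many r are in strawberry?',
    think=True,
)
print('Thinking:', response.thinking)
print('Answer:', response.response)
```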

## Thinking (levels) - Choose the thinking level
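
On recent clients, models such as gpt-oss accept a thinking level instead of a boolean; a hedged sketch:

```python
from ollama import chat

response = chat(
    model='gpt-oss:20b',
    messages=[{'role': 'user', 'content': 'What is 17 * 42?'}],
    think='low',  # 'low', 'medium', or 'high'
)
print(response.message.content)
```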