# Running Examples
Run the examples in this directory with:
```sh
# Run example
python3 examples/<example>.py
```
See [ollama/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md) for full API documentation.
### Chat - Chat with a model
- chat.py
- async-chat.py
- chat-stream.py - Streamed outputs
- chat-with-history.py - Chat with model and maintain history of the conversation
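
A minimal sketch of a chat call, assuming a model such as `llama3.2` has already been pulled locally:

```python
from ollama import chat

# 'llama3.2' is only an example; use any model available locally
response = chat(
    model='llama3.2',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
)
print(response.message.content)
```

To keep a conversation going, append the returned message and the next user turn to the `messages` list before calling `chat` again.
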
### Generate - Generate text with a model
- generate.py
- async-generate.py
- generate-stream.py - Streamed outputs
- fill-in-middle.py - Given a prefix and suffix, fill in the middle
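
A rough sketch of the generate API; `llama3.2` and `codellama:7b-code` are example model names, and the `suffix` parameter used for fill-in-the-middle is only honored by code-completion models:

```python
from ollama import generate

# Plain completion
response = generate(model='llama3.2', prompt='Why is the sky blue?')
print(response.response)

# Fill in the middle: provide the text before and after the gap
fim = generate(
    model='codellama:7b-code',
    prompt='def fib(n):\n    """Return the nth Fibonacci number."""\n',
    suffix='\n    return result\n',
)
print(fim.response)
```
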
### Tools/Function Calling - Call a function with a model
- tools.py - Simple example of Tools/Function Calling
- async-tools.py
- multi-tool.py - Using multiple tools, with thinking enabled
#### gpt-oss
- gpt-oss-tools.py - Using tools with gpt-oss
- gpt-oss-tools-stream.py - Using tools with gpt-oss, with streaming enabled
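
A minimal tool-calling sketch, assuming a tool-capable model (`llama3.1` here is just an example). Plain Python functions can be passed as tools; the model only requests calls, it does not execute them:

```python
from ollama import chat


def add_two_numbers(a: int, b: int) -> int:
    """Add two numbers.

    Args:
      a: The first number
      b: The second number
    """
    return a + b


response = chat(
    model='llama3.1',  # example model; any tool-capable model works
    messages=[{'role': 'user', 'content': 'What is 3 plus 4?'}],
    tools=[add_two_numbers],  # schema is derived from the signature and docstring
)

# Run the requested calls yourself and use the results as you see fit
for call in response.message.tool_calls or []:
    if call.function.name == 'add_two_numbers':
        print('result:', add_two_numbers(**call.function.arguments))
```
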
### Multimodal with Images - Chat with a multimodal (image chat) model
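
A minimal sketch, assuming a vision-capable model (`llava` is an example) and a local image path:

```python
from ollama import chat

response = chat(
    model='llava',  # any multimodal model works
    messages=[{
        'role': 'user',
        'content': 'Describe this image in one sentence.',
        'images': ['./example.jpg'],  # file path, raw bytes, or base64 data
    }],
)
print(response.message.content)
```
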
### Structured Outputs - Generate structured outputs with a model
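
Structured outputs work by passing a JSON schema through the `format` parameter; a sketch using Pydantic (the schema and model name are illustrative):

```python
from pydantic import BaseModel

from ollama import chat


class Country(BaseModel):
    name: str
    capital: str
    languages: list[str]


response = chat(
    model='llama3.2',
    messages=[{'role': 'user', 'content': 'Tell me about Canada.'}],
    format=Country.model_json_schema(),  # constrain the reply to this schema
)
country = Country.model_validate_json(response.message.content)
print(country)
```
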
### Ollama List - List all downloaded models and their properties
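
A quick sketch of listing local models; the field access shown follows the typed responses in recent library versions:

```python
from ollama import list as list_models

for model in list_models().models:
    print(model.model, model.details.parameter_size, model.details.quantization_level)
```
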
### Ollama Show - Display model properties and capabilities
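
A sketch of inspecting one model (the model name is an example, and the fields shown assume a recent library version with the typed `ShowResponse`):

```python
from ollama import show

info = show('llama3.2')
print('capabilities:', info.capabilities)
print('family:', info.details.family)
print('parameters:', info.details.parameter_size)
```
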
### Ollama ps - Show model status with CPU/GPU usage
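
A sketch of checking which models are currently loaded and where they sit:

```python
from ollama import ps

for model in ps().models:
    # size_vram > 0 means the model is (at least partly) loaded on the GPU
    print(model.model, 'vram:', model.size_vram, 'total:', model.size)
```
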
### Ollama Pull - Pull a model from Ollama
Requirement: `pip install tqdm`
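
A rough sketch of streaming a pull with one tqdm progress bar per layer; the model name is an example, and the progress fields follow the streamed responses returned by `pull(..., stream=True)`:

```python
from tqdm import tqdm

from ollama import pull

bars = {}

# stream=True yields incremental progress updates while layers download
for progress in pull('llama3.2', stream=True):
    digest = progress.get('digest', '')
    if not digest:
        print(progress.get('status'))
        continue
    total, completed = progress.get('total'), progress.get('completed')
    if digest not in bars and total:
        bars[digest] = tqdm(total=total, desc=f'pulling {digest[7:19]}',
                            unit='B', unit_scale=True)
    if completed and digest in bars:
        bars[digest].update(completed - bars[digest].n)
```
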