## Running Examples

Run the examples in this directory with:

```shell
python3 examples/<example>.py
```
### Chat - Chat with a model
- chat.py
- async-chat.py
- chat-stream.py - Streamed outputs
- chat-with-history.py - Chat with a model while maintaining a history of the conversation
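The history-maintenance pattern in chat-with-history.py comes down to appending every turn, user and assistant alike, to one running messages list. A minimal sketch of that pattern, with a stub standing in for the real `ollama.chat(model=..., messages=...)` call (the stub and its echo reply are illustrative, not the library's behavior):

```python
# Sketch of the chat-with-history pattern; fake_chat is a stand-in for
# ollama.chat, which would return the assistant's reply from the server.
def fake_chat(messages):
    # Illustrative stub: echo the latest user message back.
    return {"role": "assistant", "content": f"echo: {messages[-1]['content']}"}

messages = []  # full conversation, oldest turn first

def ask(prompt):
    messages.append({"role": "user", "content": prompt})
    reply = fake_chat(messages)
    messages.append(reply)  # keep the assistant turn so context persists
    return reply["content"]

print(ask("Why is the sky blue?"))
```

Because the assistant turns are stored too, each subsequent call sends the whole conversation, which is what lets the model answer follow-up questions in context.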
### Generate - Generate text with a model
- generate.py
- async-generate.py
- generate-stream.py - Streamed outputs
- fill-in-middle.py - Given a prefix and suffix, fill in the middle
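The streamed variants consume the response a chunk at a time instead of waiting for the full completion. A sketch of that consumption loop, with a generator standing in for the stream that `ollama.generate(..., stream=True)` would return (the chunk contents below are made up):

```python
# Sketch of consuming a streamed generate response; fake_stream is a
# stand-in for the iterator of chunk dicts a streaming call yields.
def fake_stream():
    for part in ["The sky ", "is blue ", "because..."]:
        yield {"response": part, "done": False}
    yield {"response": "", "done": True}  # final chunk marks completion

pieces = []
for chunk in fake_stream():
    print(chunk["response"], end="", flush=True)  # display incrementally
    pieces.append(chunk["response"])

full = "".join(pieces)  # the complete response, assembled from chunks
```

Printing with `end=""` and `flush=True` is what produces the familiar token-by-token terminal output.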
### Tools/Function Calling - Call a function with a model
- tools.py - Simple example of tools/function calling
- async-tools.py
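The core of the tool-calling flow is: expose a Python function to the model, then look up and execute whatever tool call the model's reply requests. A sketch of the dispatch step, where the `tool_call` dict is a hand-written stand-in for a tool call found in a model response (its exact shape here is illustrative):

```python
# Sketch of dispatching a model-requested tool call to a local function.
def add_two_numbers(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# Registry mapping tool names to the callables they refer to.
available_functions = {"add_two_numbers": add_two_numbers}

# Illustrative stand-in for a tool call taken from a chat response.
tool_call = {"function": {"name": "add_two_numbers",
                          "arguments": {"a": 3, "b": 4}}}

fn = available_functions[tool_call["function"]["name"]]
result = fn(**tool_call["function"]["arguments"])  # -> 7
```

Keeping an explicit name-to-function registry avoids executing arbitrary names the model might return.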
### Multimodal with Images - Chat with a multimodal (image chat) model
- multimodal-chat.py
- multimodal-generate.py
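For multimodal chat, the message carries an `images` list alongside the text content. A minimal sketch of such a message (the file path is hypothetical, and a real request would pass this message to the chat call):

```python
# Sketch of a multimodal chat message: images travel in an "images" list
# next to the text content. The path below is a hypothetical example.
from pathlib import Path

image_path = Path("image.png")  # hypothetical image file

message = {
    "role": "user",
    "content": "What is in this image?",
    "images": [image_path],
}
```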
### Ollama Embed - Generate embeddings with a model
- embed.py

### Ollama List - List all downloaded models and their properties
- list.py
### Ollama ps - Show model status with CPU/GPU usage
- ps.py
### Ollama Pull - Pull a model from Ollama
- pull.py

Requirement: `pip install tqdm`
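A pull streams status updates that report download progress, which is what the tqdm requirement is for: feeding byte counts into a progress bar. A sketch of extracting progress from such updates, with stubbed status dicts standing in for the streamed responses (their exact fields here are illustrative):

```python
# Sketch of tracking pull progress from streamed status updates; the
# updates list is a stub for what a streaming pull would yield.
updates = [
    {"status": "pulling manifest"},                          # no byte counts yet
    {"status": "downloading", "total": 100, "completed": 25},
    {"status": "downloading", "total": 100, "completed": 100},
]

percents = []
for u in updates:
    if u.get("total"):  # only layer-download updates carry byte counts
        percents.append(100 * u["completed"] // u["total"])
```

In the real example these counts would drive a tqdm bar's `total` and update increments rather than a plain list.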
### Ollama Create - Create a model from a Modelfile
- create.py

```shell
python create.py <model> <modelfile>
```

See ollama/docs/modelfile.md for more information on the Modelfile format.
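For reference, a minimal Modelfile might look like the following (the base model name, parameter value, and system prompt are illustrative; see the Modelfile documentation above for the full directive list):

```
FROM llama3.2
PARAMETER temperature 0.7
SYSTEM You are a concise assistant.
```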