# Running Examples

Run the examples in this directory with:

```sh
# Run example
python3 examples/<example>.py
```
See [ollama/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md) for full API documentation

### Chat - Chat with a model

- [chat.py](chat.py)
- [async-chat.py](async-chat.py)
- [chat-stream.py](chat-stream.py) - Streamed outputs
- [chat-with-history.py](chat-with-history.py) - Chat with a model while maintaining conversation history
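
The chat scripts boil down to a single client call; the sketch below shows the shape, assuming a running local Ollama server, the `ollama` package, and an illustrative model name (`llama3.2`). See the linked scripts for the canonical versions.

```python
# Sketch of a chat call that maintains conversation history.
# Assumes `pip install ollama` and a local `ollama serve` (model name is illustrative).

def append_exchange(history: list[dict], user_text: str, reply: str) -> list[dict]:
    """Maintain history by appending the user turn and the model's reply."""
    return history + [
        {'role': 'user', 'content': user_text},
        {'role': 'assistant', 'content': reply},
    ]

def main() -> None:
    import ollama  # requires a running Ollama server
    history: list[dict] = []
    question = 'Why is the sky blue?'
    response = ollama.chat(
        model='llama3.2',
        messages=history + [{'role': 'user', 'content': question}],
    )
    # Keep the exchange so follow-up questions have context.
    history = append_exchange(history, question, response.message.content)
    print(response.message.content)

if __name__ == '__main__':
    main()
```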
### Generate - Generate text with a model

- [generate.py](generate.py)
- [async-generate.py](async-generate.py)
- [generate-stream.py](generate-stream.py) - Streamed outputs
- [fill-in-middle.py](fill-in-middle.py) - Given a prefix and suffix, fill in the middle
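
For streamed generation, the client yields chunks whose text fragments concatenate into the full reply. A minimal sketch, assuming a local Ollama server and an illustrative model name:

```python
# Sketch of streamed text generation, assuming `pip install ollama`
# and a local Ollama server (model name is illustrative).

def assemble(chunks: list[str]) -> str:
    """Streamed responses arrive in pieces; the full text is their concatenation."""
    return ''.join(chunks)

def main() -> None:
    import ollama
    pieces = []
    for part in ollama.generate(model='llama3.2', prompt='Name three colors.', stream=True):
        print(part.response, end='', flush=True)  # each chunk carries a text fragment
        pieces.append(part.response)
    text = assemble(pieces)
    print(f'\n({len(text)} characters total)')

if __name__ == '__main__':
    main()
```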
### Tools/Function Calling - Call a function with a model

- [tools.py](tools.py) - Simple example of Tools/Function Calling
- [async-tools.py](async-tools.py)
- [multi-tool.py](multi-tool.py) - Using multiple tools, with thinking enabled
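
The client can take plain Python functions as tools; the model decides which to call and the caller executes them. A minimal sketch, assuming a tool-capable model and a local server:

```python
# Sketch of function calling, assuming `pip install ollama`, a local
# Ollama server, and a tool-capable model (name is illustrative).

def add_two_numbers(a: int, b: int) -> int:
    """Add two numbers (the docstring and type hints describe the tool to the model)."""
    return a + b

def main() -> None:
    import ollama
    response = ollama.chat(
        model='llama3.2',
        messages=[{'role': 'user', 'content': 'What is 3 + 4?'}],
        tools=[add_two_numbers],
    )
    # Execute any tool calls the model requested.
    for call in response.message.tool_calls or []:
        if call.function.name == 'add_two_numbers':
            print(add_two_numbers(**call.function.arguments))

if __name__ == '__main__':
    main()
```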
#### gpt-oss

- [gpt-oss-tools.py](gpt-oss-tools.py)
- [gpt-oss-tools-stream.py](gpt-oss-tools-stream.py)
- [gpt-oss-tools-browser.py](gpt-oss-tools-browser.py) - Using browser research tools with gpt-oss
- [gpt-oss-tools-browser-stream.py](gpt-oss-tools-browser-stream.py) - Using browser research tools with gpt-oss, with streaming enabled
### Multimodal with Images - Chat with a multimodal (image chat) model

- [multimodal-chat.py](multimodal-chat.py)
- [multimodal-generate.py](multimodal-generate.py)
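
Image chat attaches image paths (or raw bytes) to a message via its `images` field. A minimal sketch; the model name and image path are illustrative:

```python
# Sketch of an image chat, assuming `pip install ollama`, a local
# Ollama server, and a vision-capable model (name and path are illustrative).

def image_message(prompt: str, image_path: str) -> dict:
    """Build a user message carrying an image alongside the text prompt."""
    return {'role': 'user', 'content': prompt, 'images': [image_path]}

def main() -> None:
    import ollama
    msg = image_message("What's in this picture?", 'photo.jpg')  # hypothetical file
    response = ollama.chat(model='gemma3', messages=[msg])
    print(response.message.content)

if __name__ == '__main__':
    main()
```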
### Structured Outputs - Generate structured outputs with a model

- [structured-outputs.py](structured-outputs.py)
- [async-structured-outputs.py](async-structured-outputs.py)
- [structured-outputs-image.py](structured-outputs-image.py)
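
Structured outputs work by passing a JSON schema via the `format` parameter, constraining the model's reply to matching JSON. The schema and model name below are illustrative; the example scripts build the schema from Pydantic models instead of writing it by hand.

```python
# Sketch of structured outputs, assuming `pip install ollama` and a local
# Ollama server. The hand-written schema here stands in for a Pydantic model.
import json

PET_SCHEMA = {
    'type': 'object',
    'properties': {'name': {'type': 'string'}, 'age': {'type': 'integer'}},
    'required': ['name', 'age'],
}

def parse_pet(raw: str) -> dict:
    """The model's reply is a JSON string conforming to the schema; parse it."""
    return json.loads(raw)

def main() -> None:
    import ollama
    response = ollama.chat(
        model='llama3.2',
        messages=[{'role': 'user', 'content': 'Describe a pet as JSON.'}],
        format=PET_SCHEMA,  # constrains output to this schema
    )
    print(parse_pet(response.message.content))

if __name__ == '__main__':
    main()
```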
### Ollama List - List all downloaded models and their properties

- [list.py](list.py)
### Ollama Show - Display model properties and capabilities

- [show.py](show.py)
### Ollama ps - Show model status with CPU/GPU usage

- [ps.py](ps.py)
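
The three management calls above (`list`, `show`, `ps`) share one shape: call the client, iterate the returned models. A combined sketch, assuming a local Ollama server and an illustrative model name; sizes are reported in bytes:

```python
# Sketch of the model-management calls, assuming `pip install ollama`
# and a local Ollama server (model name is illustrative).

def human_size(num_bytes: int) -> str:
    """Render a byte count as gigabytes for display."""
    return f'{num_bytes / 1e9:.1f} GB'

def main() -> None:
    import ollama
    for m in ollama.list().models:       # all downloaded models
        print(m.model, human_size(m.size))
    info = ollama.show('llama3.2')       # one model's properties
    print(info.capabilities)
    for m in ollama.ps().models:         # currently loaded models
        print(m.model, human_size(m.size_vram))

if __name__ == '__main__':
    main()
```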
### Ollama Pull - Pull a model from Ollama

Requirement: `pip install tqdm`

- [pull.py](pull.py)
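
Streamed pulls emit progress events with `completed`/`total` byte counts, which is why the example wants tqdm. A simplified sketch (the real script tracks one bar per layer digest; model name is illustrative):

```python
# Simplified sketch of pulling a model with a tqdm progress bar, assuming
# `pip install ollama tqdm` and a local Ollama server (model name is illustrative).

def percent(completed: int, total: int) -> int:
    """Progress events report completed/total bytes; turn them into a percentage."""
    return int(completed * 100 / total) if total else 0

def main() -> None:
    import ollama
    from tqdm import tqdm
    bar = None
    for event in ollama.pull('llama3.2', stream=True):
        if event.total:  # download events carry byte counts; status-only events do not
            if bar is None:
                bar = tqdm(total=event.total, unit='B', unit_scale=True)
            bar.n = event.completed or 0
            bar.refresh()
    if bar is not None:
        bar.close()

if __name__ == '__main__':
    main()
```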
### Ollama Create - Create a model from a Modelfile

- [create.py](create.py)
### Ollama Embed - Generate embeddings with a model

- [embed.py](embed.py)
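
`embed` accepts a single string or a list and returns one vector per input. A minimal sketch, assuming a local Ollama server and an illustrative embedding model, with a cosine-similarity helper for comparing the results:

```python
# Sketch of generating and comparing embeddings, assuming `pip install ollama`
# and a local Ollama server (model name is illustrative).

def cosine(a: list[float], b: list[float]) -> float:
    """Embeddings are typically compared with cosine similarity."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

def main() -> None:
    import ollama
    response = ollama.embed(model='all-minilm', input=['cats are cute', 'dogs are loyal'])
    v1, v2 = response.embeddings  # one vector per input string
    print(f'similarity: {cosine(v1, v2):.3f}')

if __name__ == '__main__':
    main()
```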
### Thinking - Enable thinking mode for a model

- [thinking.py](thinking.py)
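
With thinking enabled, the reply separates the model's reasoning trace from the final answer. A minimal sketch, assuming a thinking-capable model (name is illustrative):

```python
# Sketch of thinking mode, assuming `pip install ollama`, a local Ollama
# server, and a thinking-capable model (name is illustrative).

def split_reply(thinking, content):
    """Normalize a reply into (reasoning, answer) strings; thinking may be absent."""
    return (thinking or '', content)

def main() -> None:
    import ollama
    response = ollama.chat(
        model='qwen3',
        messages=[{'role': 'user', 'content': 'How many Rs are in strawberry?'}],
        think=True,  # ask the model to emit its reasoning separately
    )
    reasoning, answer = split_reply(response.message.thinking, response.message.content)
    print('Thinking:\n', reasoning)
    print('Answer:\n', answer)

if __name__ == '__main__':
    main()
```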
### Thinking (generate) - Enable thinking mode for a model

- [thinking-generate.py](thinking-generate.py)
### Thinking (levels) - Choose the thinking level

- [thinking-levels.py](thinking-levels.py)
### Web search and fetch MCP server

Requires: `pip install mcp`

- [mcp_web_search_crawl_server.py](mcp_web_search_crawl_server.py)

Run via stdio (for Cursor/Claude MCP):

```sh
uv run examples/mcp-web-search-and-fetch.py
```

`OLLAMA_API_KEY` is required. You can get one from [ollama.com/settings/keys](https://ollama.com/settings/keys).