Mirror of https://github.com/ollama/ollama-python.git (synced 2026-01-13 21:57:16 +08:00)
# async-chat-stream
This example demonstrates how to maintain a conversation history using an asynchronous Ollama client and the chat endpoint. The streaming response is written to stdout and, if enabled with `--speak`, is also spoken through a text-to-speech (TTS) engine when one is available. Supported TTS engines are `say` on macOS and `espeak` on Linux.
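The core of the example can be sketched as follows. This is a minimal illustration, not the example's exact code: it uses the `ollama` package's `AsyncClient.chat` with `stream=True` (a real API), while the model name `llama3.2` and the `with_message` helper are assumptions chosen for the sketch, and the `--speak`/TTS path is omitted for brevity.

```python
import asyncio


def with_message(history, role, content):
    """Hypothetical helper: return a new history list with one message appended."""
    return history + [{'role': role, 'content': content}]


async def chat_loop(model='llama3.2'):
    # Requires the `ollama` package and a running Ollama server.
    from ollama import AsyncClient

    client = AsyncClient()
    messages = []
    while True:
        try:
            prompt = input('>>> ')
        except EOFError:
            break
        if not prompt:
            break
        messages = with_message(messages, 'user', prompt)

        # Stream the assistant's reply chunk by chunk to stdout.
        content = ''
        async for part in await client.chat(model=model, messages=messages, stream=True):
            chunk = part['message']['content']
            print(chunk, end='', flush=True)
            content += chunk
        print()

        # Keep the assistant reply so later turns see the full conversation.
        messages = with_message(messages, 'assistant', content)


if __name__ == '__main__':
    import sys
    # Only start the interactive loop on a real terminal.
    if sys.stdin.isatty():
        asyncio.run(chat_loop())
```

Appending the assistant's own reply back into `messages` is what makes this a conversation rather than a series of independent prompts: each `chat` call receives the full history so far.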