
async-chat-stream

This example demonstrates how to maintain a conversation history using an asynchronous Ollama client and the chat endpoint. The streaming response is written to stdout and, if enabled with --speak and available, also spoken via text-to-speech (TTS). Supported TTS engines are say on macOS and espeak on Linux.