feat(Chat): The .append_message() method now automatically streams in generators
cpsievert committed Dec 17, 2024
1 parent 7ae7725 commit 10797fd
Showing 1 changed file with 11 additions and 3 deletions.
14 changes: 11 additions & 3 deletions shiny/ui/_chat.py
@@ -515,13 +515,21 @@ async def append_message(self, message: Any) -> None:
The message to append. A variety of message formats are supported including
a string, a dictionary with `content` and `role` keys, or a relevant chat
completion object from platforms like OpenAI, Anthropic, Ollama, and others.
When the message is a generator or async generator, it is automatically
treated as a stream of message chunks (i.e., uses
`.append_message_stream()`).

Note
----
Use `.append_message_stream()` instead of this method when `stream=True` (or
similar) is specified in the model's completion method.
Although this method tries its best to handle various message formats, it's
not always possible to handle every message format. If you encounter an error
or no response when appending a message, try extracting the message content
as a string and passing it to this method.
"""
-        await self._append_message(message)
+        if inspect.isasyncgen(message) or inspect.isgenerator(message):
+            await self.append_message_stream(message)
+        else:
+            await self._append_message(message)

async def _append_message(
self, message: Any, *, chunk: ChunkOption = False, stream_id: str | None = None
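For context, here is a minimal usage sketch of the new behavior. The Shiny Express setup and the token_chunks() generator below are illustrative assumptions, not part of this commit; the point is that a generator passed to .append_message() is now detected and routed through .append_message_stream() automatically.

from shiny.express import ui

chat = ui.Chat(id="chat")
chat.ui()


def token_chunks():
    # Hypothetical generator yielding message chunks.
    yield "Hello, "
    yield "world!"


@chat.on_user_submit
async def _():
    # With this commit, the generator is detected (via inspect.isgenerator()
    # or inspect.isasyncgen()) and streamed as if .append_message_stream()
    # had been called directly.
    await chat.append_message(token_chunks())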
