
Update OpenAiClient::chat_completions to stream ChatCompletionChunk messages #257

Open
3 tasks
declark1 opened this issue Nov 27, 2024 · 0 comments · May be fixed by #268
declark1 commented Nov 27, 2024

Description

Currently, the OpenAiClient::chat_completions method returns a stream of sse::Event messages when stream: true, rather than deserializing the events into ChatCompletionChunks, since the events are passed through to the client directly. Because we will need to apply detections to the chunks, this should be changed to the latter.

Acceptance Criteria

  • ChatCompletionsResponse::Streaming variant receiver type is updated to mpsc::Receiver<Result<ChatCompletionChunk, Error>>
  • OpenAiClient::chat_completions deserializes SSE event data to ChatCompletionChunk and maps errors to client::Error
  • Orchestrator::handle_chat_completions_detection is updated accordingly
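The shape the acceptance criteria describe can be sketched as below. This is a minimal illustration with hypothetical stand-in types: the real `ChatCompletionChunk` and `client::Error` live in the project crate, the real client would deserialize with serde_json, and the project presumably uses tokio's async `mpsc` rather than the std channel used here for a self-contained example.

```rust
use std::sync::mpsc;

// Hypothetical, simplified stand-ins for the project's real types.
#[derive(Debug, PartialEq)]
struct ChatCompletionChunk {
    data: String,
}

#[derive(Debug, PartialEq)]
enum Error {
    Deserialization(String),
}

// Map a raw SSE event's data payload into a ChatCompletionChunk,
// turning parse failures into the client error type — the mapping the
// acceptance criteria ask for. A real implementation would call
// serde_json::from_str::<ChatCompletionChunk>(data) here; this sketch
// only checks that the payload looks like a JSON object.
fn map_event(data: &str) -> Result<ChatCompletionChunk, Error> {
    if data.trim_start().starts_with('{') {
        Ok(ChatCompletionChunk {
            data: data.to_string(),
        })
    } else {
        Err(Error::Deserialization(data.to_string()))
    }
}

fn main() {
    // The client forwards each mapped result over an mpsc channel, so the
    // orchestrator receives a Receiver<Result<ChatCompletionChunk, Error>>
    // instead of raw sse::Event messages.
    let (tx, rx) = mpsc::channel();
    for event_data in ["{\"choices\":[]}", "not json"] {
        tx.send(map_event(event_data)).unwrap();
    }
    drop(tx);

    let results: Vec<Result<ChatCompletionChunk, Error>> = rx.iter().collect();
    assert!(results[0].is_ok());
    assert!(results[1].is_err());
    println!("ok");
}
```

With this in place, `Orchestrator::handle_chat_completions_detection` can apply detections to each `ChatCompletionChunk` as it arrives, rather than re-parsing SSE events itself.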