OpenAI Streaming #25
Conversation
Resolved review threads:
- foundation-models/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClient.java
- ...dels/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiStreamingHandler.java
- e2e-test-app/src/main/java/com/sap/ai/sdk/app/controllers/OpenAiController.java
- ...i/src/main/java/com/sap/ai/sdk/foundationmodels/openai/model/OpenAiChatCompletionOutput.java
- ...i/src/main/java/com/sap/ai/sdk/foundationmodels/openai/model/OpenAiChatCompletionStream.java
- ...odels/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClientException.java
- ...odels/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiResponseHandler.java
      .peek(totalOutput::addDelta)
      .forEach(delta -> send(emitter, delta.getDeltaContent()));
  } finally {
    send(emitter, "\n\n-----Total Output-----\n\n" + objectToJson(totalOutput));
(Comment/Minor)
It could be worthwhile not to just dump the object as a string, but instead to extract the relevant information and write it piece by piece. That would also demonstrate the actual benefit of the feature. Right now an interested reader would wonder: "what does total output even mean? what does it contain?"
There are potentially 30 relevant pieces of information. Which one in particular?
Also, you could say the regular OpenAiClient.chatCompletion() is confusing, because it returns the total output too.
Maybe you have a name suggestion for this variable?
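
To make the reviewer's suggestion concrete, here is a minimal sketch of writing selected fields of the aggregated output piece by piece instead of serializing the whole object. The send(...) helper stands in for the one in the controller snippet above; the accessors on OpenAiChatCompletionOutput (getChoices(), getFinishReason(), getUsage(), getTotalTokens()) are illustrative assumptions and may not match the actual model class.

import org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitter;

import com.sap.ai.sdk.foundationmodels.openai.model.OpenAiChatCompletionOutput;

class StreamingSummaryExample {

  // Write selected fields piece by piece instead of dumping the whole object as JSON.
  // The accessors used below are assumed, not the confirmed API.
  static void sendSummary(ResponseBodyEmitter emitter, OpenAiChatCompletionOutput totalOutput) {
    send(emitter, "\n\n-----Summary-----\n");
    send(emitter, "Finish reason: " + totalOutput.getChoices().get(0).getFinishReason() + "\n");
    send(emitter, "Total tokens: " + totalOutput.getUsage().getTotalTokens() + "\n");
  }

  // Stand-in for the controller's existing send(...) helper.
  static void send(ResponseBodyEmitter emitter, String chunk) {
    try {
      emitter.send(chunk);
    } catch (Exception e) {
      emitter.completeWithError(e);
    }
  }
}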
Further resolved review threads:
- ...odels/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiResponseHandler.java
- ...ion-models/openai/src/test/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClientTest.java
* Reduce sample code
* Format
Context
AI/ai-sdk-java-backlog#59
Streaming allows the user to start reading the LLM response as soon as the first tokens arrive, without waiting for the whole response.
Feature scope:
* Add streamChatCompletion() to the OpenAiClient (usage sketch below)
Definition of Done
Release notes updated
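
As a rough illustration, here is a minimal sketch of how the new streamChatCompletion() might be consumed, assuming it returns a closeable Java Stream of delta objects exposing getDeltaContent() (as in the controller snippet above). The factory method, model name, and signatures are assumptions for illustration, not the final API.

import com.sap.ai.sdk.foundationmodels.openai.OpenAiClient;

class StreamChatCompletionExample {

  public static void main(String[] args) {
    // Assumed factory and method signature, for illustration only.
    var client = OpenAiClient.forModel("gpt-35-turbo");

    // Each delta is printed as soon as it arrives, so the user sees output
    // without waiting for the complete response; the stream is closed when done.
    try (var stream = client.streamChatCompletion("Tell me a short story.")) {
      stream.forEach(delta -> System.out.print(delta.getDeltaContent()));
    }
  }
}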