Breaking changes for version 3.0 #63
Closed
awaescher announced in Announcements
Replies: 2 comments
- Changes are about to be merged at #64
- Changes are merged and documented in the release info of version 3.0.
Thanks everyone for using and improving OllamaSharp. I learned to love this little project, especially because we have some really engaged contributors around.
I want OllamaSharp to stay the best Ollama API bindings, which is why I suggest the following changes for an upcoming release. As these are breaking changes, I will bump the major version to 3 in the near future.
Drop streamer callback support

Right now, OllamaSharp supports multiple ways to react to incoming streams from Ollama. First, I started with an `IResponseStreamer<T>` callback that could be passed as an argument. Later, @JerrettDavis added `IAsyncEnumerable` syntax support. I prefer the `IAsyncEnumerable` syntax as it is more flexible and easier to read. Providing both syntax options confronts new developers with choices that are partly obsolete, and it bloats the code base unnecessarily.

That's why I decided to drop support for the streamer callback syntax starting with version 3.0.
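To illustrate the difference, here is a minimal, self-contained sketch of the two styles. The method names and the canned token array are hypothetical stand-ins, not the actual OllamaSharp API:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class StreamingDemo
{
    // Callback style: the consumer passes a handler that is invoked per chunk.
    public static async Task GetCompletionAsync(string prompt, Action<string> onToken)
    {
        foreach (var token in new[] { "Hello", " ", "world" })
        {
            await Task.Yield(); // simulate asynchronous arrival of chunks
            onToken(token);
        }
    }

    // IAsyncEnumerable style: the consumer pulls chunks with await foreach.
    public static async IAsyncEnumerable<string> StreamCompletionAsync(string prompt)
    {
        foreach (var token in new[] { "Hello", " ", "world" })
        {
            await Task.Yield();
            yield return token;
        }
    }

    public static async Task Main()
    {
        // Old callback syntax (to be dropped in 3.0):
        await GetCompletionAsync("Why is the sky blue?", token => Console.Write(token));
        Console.WriteLine();

        // Preferred IAsyncEnumerable syntax:
        await foreach (var token in StreamCompletionAsync("Why is the sky blue?"))
            Console.Write(token);
        Console.WriteLine();
    }
}
```

The `await foreach` form composes naturally with LINQ-style operators and cancellation, which is part of why it reads better than threading a callback through every call site.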
`IAsyncEnumerable` will be the way to go.

Improving chats
Also, it's not so clear how to implement a chat. There is a very helpful `Chat` class that acts as a wrapper around the Ollama API and automatically collects and transmits the chat history. It can be instantiated directly, but there's also an extension method `Chat()` on the `IOllamaApiClient` that returns an instance. So far so good, but the `IOllamaApiClient` also provides some chat-related methods that are more or less internal. That leaves us with:

- `Chat()` (the extension method): starts a new chat and returns a `Chat` class instance (preferred way)
- `Chat()`: sends a chat request message to the Ollama API (sync, no streaming)
- `SendChat()`: sends a chat request message to the Ollama API (async, streaming)
- `StreamChat()`: sends a chat request message to the Ollama API (async, `IAsyncEnumerable`)

By dropping the streamer callback support, `SendChat()` should be gone too. I should drop the pretty useless sync `Chat()` as well. I might also drop the extension method in favor of instantiating the `Chat` class directly.

As always, feedback is welcome. Contributions even more so.
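To make the "instantiate `Chat` directly" idea concrete, here is a hedged sketch of what such a wrapper does. The types below are simplified stand-ins, not the actual OllamaSharp classes; the backend delegate takes the place of the real API client:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Simplified stand-in for a chat message with a role ("user"/"assistant").
public record Message(string Role, string Content);

// Minimal sketch of a Chat wrapper: it keeps the message history and
// streams each answer back as an IAsyncEnumerable of tokens.
public class Chat
{
    private readonly Func<IReadOnlyList<Message>, IAsyncEnumerable<string>> _backend;
    public List<Message> History { get; } = new();

    public Chat(Func<IReadOnlyList<Message>, IAsyncEnumerable<string>> backend)
        => _backend = backend;

    // Send one user message; stream the assistant's answer token by token
    // and record both sides of the exchange in the history.
    public async IAsyncEnumerable<string> SendAsync(string message)
    {
        History.Add(new Message("user", message));
        var answer = "";
        await foreach (var token in _backend(History))
        {
            answer += token;
            yield return token;
        }
        History.Add(new Message("assistant", answer));
    }
}
```

With the real library, the backend would be the API client's streaming chat call; the sketch only shows how history collection and `IAsyncEnumerable` streaming compose, which is why a single directly instantiated class can replace the three overlapping chat methods.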