Hello. I'm working on a project that uses the Ollama service to run the Mistral 8x7B model. I'm trying to make it run a simple kernel function that returns the current date and time, but I get this exception:
System.Exception: The LLM is not compatible with this approach.
at JC.SemanticKernel.Planners.UniversalLLMFunctionCaller.UniversalLLMFunctionCaller.RunAsync(String task)
at JC.SemanticKernel.Planners.UniversalLLMFunctionCaller.UniversalLLMFunctionCaller.RunAsync(ChatHistory askHistory)
at SemanticKernelApp.SemanticKernelApp.Main(String[] args) in C:\Users\pgimeno\source\repos\SemanticKernelApp\Program.cs:line 44
This is the code of the project I'm working on:
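For context, a minimal sketch of this kind of setup — Ollama behind Semantic Kernel's OpenAI connector, a small date/time plugin, and the UniversalLLMFunctionCaller planner — might look like the following. The model id, endpoint, and plugin here are assumptions for illustration, not the project's actual code:

```csharp
#pragma warning disable SKEXP0010 // the custom-endpoint OpenAI overload may be experimental
using System.ComponentModel;
using JC.SemanticKernel.Planners.UniversalLLMFunctionCaller;
using Microsoft.SemanticKernel;

// Minimal plugin exposing the current date and time as a kernel function.
public class TimePlugin
{
    [KernelFunction, Description("Returns the current date and time.")]
    public string GetCurrentDateTime() => DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss");
}

public static class Program
{
    public static async Task Main(string[] args)
    {
        var builder = Kernel.CreateBuilder();

        // Assumption: Ollama exposes an OpenAI-compatible endpoint, so the
        // OpenAI connector is pointed at it; Ollama ignores the API key.
        builder.AddOpenAIChatCompletion(
            modelId: "mixtral:8x7b",
            endpoint: new Uri("http://localhost:11434/v1"),
            apiKey: "unused");

        builder.Plugins.AddFromType<TimePlugin>();
        Kernel kernel = builder.Build();

        // The planner asks the model which kernel function to call for the task.
        var caller = new UniversalLLMFunctionCaller(kernel);
        string result = await caller.RunAsync("What is the current date and time?");
        Console.WriteLine(result);
    }
}
```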
It would be very helpful to know whether the issue is in my code or whether the function caller is simply not compatible with Ollama. Thank you in advance.

Hey there,
I have not tested it with Ollama, but looking at your code I see that you are using the OpenAI connector. OpenAI and Mistral share a lot of similarities in their API, but they differ in some details. Please try the Mistral connector instead. Microsoft has now added an official Mistral connector to SK, so please don't use mine: theirs is actively maintained, mine is abandoned.
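If it helps, wiring up the official connector looks roughly like this — a sketch assuming the Microsoft.SemanticKernel.Connectors.MistralAI package and the hosted Mistral API; the model id and key handling are illustrative:

```csharp
// NuGet: Microsoft.SemanticKernel.Connectors.MistralAI
#pragma warning disable SKEXP0070 // the Mistral connector may be experimental in some SK versions
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

// Assumption: "open-mixtral-8x7b" as the hosted model id, key from the environment.
builder.AddMistralChatCompletion(
    modelId: "open-mixtral-8x7b",
    apiKey: Environment.GetEnvironmentVariable("MISTRAL_API_KEY")!);

Kernel kernel = builder.Build();
```

Note that this targets Mistral's hosted endpoint rather than a local Ollama instance.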