Compatibility with Ollama #1

Open
frikimanHD opened this issue Jul 3, 2024 · 2 comments
@frikimanHD

Hello. I'm working on a project that uses the Ollama service to run the Mistral 8x7B model. I try to make it run a simple kernel function that returns the current date and time, but I get this exception:

System.Exception: The LLM is not compatible with this approach.
   at JC.SemanticKernel.Planners.UniversalLLMFunctionCaller.UniversalLLMFunctionCaller.RunAsync(String task)
   at JC.SemanticKernel.Planners.UniversalLLMFunctionCaller.UniversalLLMFunctionCaller.RunAsync(ChatHistory askHistory)
   at SemanticKernelApp.SemanticKernelApp.Main(String[] args) in C:\Users\pgimeno\source\repos\SemanticKernelApp\Program.cs:line 44

This is the code of the project I'm working on:

using JC.SemanticKernel.Planners.UniversalLLMFunctionCaller;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

namespace SemanticKernelApp
{
    class SemanticKernelApp
    {
        static async Task Main(string[] args)
        {
#pragma warning disable SKEXP0010
#pragma warning disable SKEXP0060
            // Point the OpenAI connector at the local Ollama endpoint
            var endpoint = new Uri("http://192.168.1.18:42069");
            var modelId = "mistral:latest";
            bool acabat = false; // loop flag ("finished")
            HttpClient client = new HttpClient();
            client.Timeout = TimeSpan.FromDays(5); // generous timeout for slow local inference
            var kernelBuilder = Kernel.CreateBuilder()
                .AddOpenAIChatCompletion(modelId: modelId, apiKey: null, endpoint: endpoint, httpClient: client);

            var kernel = kernelBuilder.Build();
            kernel.Plugins.AddFromType<CustomPlugin>("CustomPlugin");
            Console.WriteLine("Type \"$leave\" to leave");

            // Note: chatCompletion and settings are created but never used below
            var chatCompletion = kernel.GetRequiredService<IChatCompletionService>();
            var chat = new ChatHistory();
            UniversalLLMFunctionCaller planner = new(kernel);

            OpenAIPromptExecutionSettings settings = new() { ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions };

            while (!acabat)
            {
                Console.Write("\nUser: ");
                var userInput = Console.ReadLine();

                if (userInput != "$leave")
                {
                    try
                    {
                        chat.AddUserMessage(userInput);
                        // The planner drives the function-calling loop; the exception is thrown here
                        var bot_answer = await planner.RunAsync(chat);
                        Console.Write($"\nAI: {bot_answer}\n");
                        chat.AddAssistantMessage(bot_answer.ToString());
                    }
                    catch (Exception e)
                    {
                        Console.WriteLine(e.ToString());
                        Console.ReadLine();
                    }
                }
                else
                {
                    acabat = true;
                }
            }
        }
    }
}
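
CustomPlugin itself is not shown here; a minimal version matching the description (a single kernel function that returns the current date and time) would look roughly like this:

using System.ComponentModel;
using Microsoft.SemanticKernel;

public class CustomPlugin
{
    // Hypothetical reconstruction of the plugin described above
    [KernelFunction, Description("Returns the current date and time")]
    public string GetCurrentDateTime() => DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss");
}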

It would be very helpful to know whether the issue is in my code or whether the function caller is simply not compatible with Ollama. Thank you in advance.

@Jenscaasen
Owner

Hey there,
I have not tested it with Ollama, but looking at your code I see that you are using the OpenAI connector. OpenAI and Mistral share a lot of similarities in their APIs, but there are some differences in the details. Please try the Mistral connector instead. Microsoft has now added an official Mistral connector to SK, so please use theirs rather than mine: theirs is under active development, while mine is abandoned.
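
Roughly, the swap would look like this. This is an untested sketch: it assumes the Microsoft.SemanticKernel.Connectors.MistralAI package and its AddMistralChatCompletion extension (experimental, hence SKEXP0070); the endpoint and model name are copied from your code, and the API key is a placeholder since Ollama does not check it.

using Microsoft.SemanticKernel;

#pragma warning disable SKEXP0070 // Mistral connector is experimental
var endpoint = new Uri("http://192.168.1.18:42069");
var kernelBuilder = Kernel.CreateBuilder()
    .AddMistralChatCompletion(
        modelId: "mistral:latest",
        apiKey: "unused",      // assumed placeholder; Ollama ignores API keys
        endpoint: endpoint);
var kernel = kernelBuilder.Build();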

@d3-eugene-titov

It's working perfectly with the latest Ollama.
