Error Encountered After Configuring LLM with Ollama Provider in Config File #1903
Comments
@vrajpatel04 Are you using the config as a file or as a dictionary?
I created a separate config.yaml file.
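(For context: mem0's `Memory.from_config` takes a Python dictionary, so a standalone config.yaml has to be parsed first. A minimal sketch, assuming a config shaped like the one in the mem0 LLM docs; the file name and model are illustrative, not taken from this thread:)

```python
# Sketch: load a YAML config file and hand it to mem0 as a dictionary.
# Assumes a config.yaml shaped per https://docs.mem0.ai/components/llms/config, e.g.:
#   llm:
#     provider: ollama
#     config:
#       model: llama3.1:latest
import yaml
from mem0 import Memory

with open("config.yaml") as f:
    config = yaml.safe_load(f)  # Memory.from_config expects a dict, not a file path

memory = Memory.from_config(config)
```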
Hi @vrajpatel04, use:
Hi @ketangangal, still giving the same error:
I can take a look at this tomorrow, @Dev-Khant.
@vrajpatel04 See https://docs.mem0.ai/components/llms/config. Also, could you please upgrade your Mem0 version?
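(For reference, the dictionary form that page describes looks roughly like this. A sketch only: the model name, temperature, and base URL below are illustrative assumptions, not values from this thread:)

```python
from mem0 import Memory

# Sketch of the dictionary-style config from the mem0 LLM docs;
# all values below are illustrative assumptions.
config = {
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3.1:latest",                   # any locally pulled Ollama model
            "temperature": 0.1,
            "ollama_base_url": "http://localhost:11434",  # Ollama's default endpoint
        },
    }
}

memory = Memory.from_config(config)
```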
Issue with current documentation:
I've configured the LLM in my config file as follows:
However, I'm encountering an error (see screenshot). Could you help me resolve this issue?