Update prompts for locally installed models #226
Comments
Would this also solve cases where the model is intended for code auto-completion? For example, the current magics prompt, which asks for output as code in Markdown, doesn't work with all models, since it appends natural-language instructions after the code. See the sketch below.
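For illustration, here is a minimal Python sketch contrasting the two prompt shapes (the prompt wording is made up for this example, not Jupyter AI's actual template):

```python
# Illustration only: a chat-style magics prompt vs. a raw completion prompt.
# Completion models expect the code itself as the entire prompt and simply
# continue it; trailing natural-language instructions confuse them.
code_so_far = "def fibonacci(n):\n    "

# Chat-style prompt: code followed by instructions about Markdown output.
chat_style_prompt = (
    f"{code_so_far}\n"
    "Complete the code above. Return the output as code in Markdown."
)

# Completion-style prompt: the code alone, nothing appended.
completion_prompt = code_so_far
```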
Fixed by #309.
@JasonWeill
OK, reopened.
Code Llama instruction models require a different prompt format. This also calls for the ability to customize the prompt used for chat generation. Can this be supported?
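For reference, a minimal Python sketch of the Llama 2 / Code Llama instruct format (the helper name and prompt text are illustrative, not from Jupyter AI):

```python
# The Llama 2 / Code Llama instruct format wraps the system and user
# messages in special tokens that the model was fine-tuned on.
CODE_LLAMA_TEMPLATE = (
    "<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_prompt} [/INST]"
)

def build_code_llama_prompt(system_prompt: str, user_prompt: str) -> str:
    """Render a single-turn prompt in the Code Llama instruct format."""
    return CODE_LLAMA_TEMPLATE.format(
        system_prompt=system_prompt, user_prompt=user_prompt
    )

print(build_code_llama_prompt(
    "You are a helpful coding assistant. Reply with code in Markdown.",
    "Write a function that reverses a string in Python.",
))
```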
Problem
Current prompts in Jupyter AI work well with remote providers, but they are not optimized for locally installed models provided by GPT4All. See the discussion on #190 for examples where the responses do not honor the guardrails in the prompt.
Proposed Solution
Update the prompts so they behave consistently across all models. A second option is to provide custom prompts for local providers, sketched below.
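A rough sketch of what the second option could look like, using LangChain's `PromptTemplate` (the `CHAT_TEMPLATES` mapping, the `get_chat_prompt` helper, and the prompt wording are all hypothetical, not Jupyter AI's actual API):

```python
from langchain.prompts import PromptTemplate

# Hypothetical per-provider templates: remote providers keep a
# guardrail-heavy prompt, while local GPT4All models get a simpler
# instruction format that small models are more likely to follow.
CHAT_TEMPLATES = {
    "openai": PromptTemplate.from_template(
        "You are a helpful coding assistant.\n"
        "Answer in Markdown and wrap code in fenced blocks.\n\n"
        "Question: {question}"
    ),
    "gpt4all": PromptTemplate.from_template(
        "### Instruction:\n{question}\n\n### Response:"
    ),
}

def get_chat_prompt(provider_id: str, question: str) -> str:
    """Render the chat prompt using the provider-specific template."""
    template = CHAT_TEMPLATES.get(provider_id, CHAT_TEMPLATES["openai"])
    return template.format(question=question)
```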