
Update prompts for locally installed models #226

Open
3coins opened this issue Jun 16, 2023 · 6 comments
Labels
enhancement New feature or request

Comments

@3coins
Collaborator

3coins commented Jun 16, 2023

Problem

Current prompts in Jupyter AI work well with remote providers, but are not optimized for locally installed models provided by GPT4All. See the discussion in #190 for some examples where the responses do not honor the guardrails in the prompt.

Proposed Solution

Update the prompt so that it behaves consistently across all models. A second option is to provide a custom prompt for local providers.

@3coins 3coins added the enhancement New feature or request label Jun 16, 2023
@vidartf
Member

vidartf commented Jun 28, 2023

Would this also solve cases where your model is intended for auto-completing code? E.g., the current magics prompts that request code-formatted responses don't work with all models, since they append natural-language instructions after the code (asking for the output to be code in Markdown, etc.).
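Concretely, a completion-oriented template would need every instruction to come *before* the code, so the prompt ends exactly where the model should continue. A minimal sketch (not the current magics template; names are illustrative):

```python
# Sketch of a completion-friendly template: all instructions precede the
# code, and the prompt ends at the point the model should continue from.
COMPLETION_TEMPLATE = (
    "# Complete the following Python code. Output only code.\n"
    "{code}"
)

def build_completion_prompt(code: str) -> str:
    """Render a prompt that ends with the code to be completed."""
    return COMPLETION_TEMPLATE.format(code=code)
```

Because nothing follows the code, a pure completion model can pick up directly from the last token of the user's snippet.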

@3coins
Collaborator Author

3coins commented Jun 29, 2023

@vidartf
We are looking into applying a different prompt template for each provider, so that prompts are specific to the provider. It's not captured fully here, but this issue, along with #225, should help tackle the peculiarities of different providers.
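A per-provider template registry could look roughly like this (a minimal sketch under the assumption of a simple lookup; the names here are illustrative, not Jupyter AI's actual API):

```python
# Sketch: per-provider prompt templates with a fallback default.
# Names are illustrative, not Jupyter AI's actual API.

DEFAULT_TEMPLATE = (
    "You are Jupyter AI, a helpful assistant.\n"
    "{prompt}"
)

# Local models often ignore long guardrail preambles, so a local
# provider's template can be kept short and direct.
PROVIDER_TEMPLATES = {
    "gpt4all": "{prompt}\nAnswer concisely:",
}

def render_prompt(provider_id: str, prompt: str) -> str:
    """Pick the template registered for a provider, falling back to the default."""
    template = PROVIDER_TEMPLATES.get(provider_id, DEFAULT_TEMPLATE)
    return template.format(prompt=prompt)
```

The fallback keeps remote providers on the existing prompt while letting individual local providers override it.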

@JasonWeill
Collaborator

Fixed by #309.

@3coins
Collaborator Author

3coins commented Sep 6, 2023

@JasonWeill
This issue is only partially solved by #309. There is pending work to add these templates and use them with the chat UI.

@JasonWeill JasonWeill reopened this Sep 6, 2023
@JasonWeill
Collaborator

OK, reopened.

@sundaraa-deshaw

Code Llama instruct models require a different prompting specification. This requires the ability to customize the prompt for chat generation as well. Can this be supported?
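For context, Code Llama's instruct variants expect the Llama-2-style `[INST]` wrapping rather than a plain completion prompt. A hedged sketch of that formatting (the helper name is illustrative; verify the exact tokens against the model card):

```python
def format_codellama_instruct(user_prompt: str, system_prompt: str = "") -> str:
    """Wrap a prompt in the Llama-2-style [INST] format used by Code Llama
    instruct models (a sketch; check the model card for the exact spec)."""
    if system_prompt:
        # The optional system prompt goes inside <<SYS>> markers,
        # ahead of the user message.
        user_prompt = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_prompt}"
    return f"[INST] {user_prompt} [/INST]"
```

This is exactly the kind of provider-specific formatting that a customizable chat prompt would need to accommodate.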
