Support Ollama Context Size Configuration #409

Open
homjay opened this issue Nov 6, 2024 · 0 comments
Labels
bug Something isn't working

homjay commented Nov 6, 2024

Routine checks

  • [x] I have confirmed that no similar issue already exists
  • [x] I have confirmed that I have upgraded to the latest version
  • [x] I have read the project README in full, especially the FAQ section
  • [x] I understand and am willing to follow up on this issue, helping with testing and providing feedback
  • [x] I understand and accept the above, and I understand that the maintainers have limited time; issues that do not follow the rules may be ignored or closed directly

Problem description

In Ollama, the default context size is 2048 tokens, so long inputs are truncated and the model fails to follow instructions contained in long texts.

one-api has added the relevant setting to its requests, but it only takes effect for non-OpenAI-API requests. For example:
songquanpeng/one-api#1694
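For reference, Ollama's native API exposes the context size through an `options` object on the request body; a minimal sketch of such a request (the model name `llama3` and the value 8192 are just examples):

```
POST /api/generate
{
  "model": "llama3",
  "prompt": "Summarize the following document ...",
  "options": {
    "num_ctx": 8192
  }
}
```

Requests that go through an OpenAI-compatible `/v1/chat/completions` endpoint have no standard field carrying this option, which is why it gets lost for OpenAI-API clients.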

For applications that only support the OpenAI API, this is very inconvenient.

Expected result

It would be great if Ollama's native num_ctx parameter could be supported, and ideally it should be possible to override the existing default directly in the settings.
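Until such a setting exists, one workaround is to bake the larger context window into a custom model via an Ollama Modelfile, so every request (including those arriving through the OpenAI-compatible endpoint) uses it. A sketch, assuming a base model named `llama3`:

```
FROM llama3
PARAMETER num_ctx 8192
```

Then `ollama create llama3-8k -f Modelfile` registers the variant, and the OpenAI-API client can simply request the model `llama3-8k`.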

@homjay homjay added the bug Something isn't working label Nov 6, 2024