
Error in local deployment model. #3394

Open
gaomeng6470 opened this issue Dec 16, 2024 · 2 comments
Assignees: sestinj
Labels: area:autocomplete, kind:bug, needs-triage

Comments

@gaomeng6470

Before submitting your bug report

Relevant environment info

- OS:
- Continue version:
- IDE version:
- Model:
- config.json:
  
 "tabAutocompleteModel": {
    "title": "deepseek-coder-v2",
    "provider": "ollama",
    "model": "deepseek-coder-v2:16b-lite-instruct-q8_0",
    "apiBase": "http://114.55.254.105:11434"
  },
  "tabAutocompleteOptions": {
    "debounceDelay": 500,
    "maxPromptTokens": 4000,
    "disableInFiles": [
      "*.md"
    ],
    "multilineCompletions": "always"
  },

Description

Unable to connect; the logs are:

Code: undefined
Error number: undefined
Syscall: undefined
Type: undefined

Error: HTTP 404 Not Found from http://114.55.254.105:11434/api/generate

This may mean that you forgot to add '/v1' to the end of your 'apiBase' in config.json.
at customFetch (C:\snapshot\continue\binary\out\index.js:531387:17)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async withExponentialBackoff (C:\snapshot\continue\binary\out\index.js:531128:22)
at async Ollama._streamComplete (C:\snapshot\continue\binary\out\index.js:536634:22)
at async Ollama.streamComplete (C:\snapshot\continue\binary\out\index.js:531520:24)
at async ListenableGenerator._start (C:\snapshot\continue\binary\out\index.js:534349:24)
[2024-12-16T07:00:36] Error generating autocompletion: Error: HTTP 404 Not Found from http://114.55.254.105:11434/api/generate

This may mean that you forgot to add '/v1' to the end of your 'apiBase' in config.json.
[2024-12-16T07:00:38] Codebase indexing already in progress, skipping indexing of files

To reproduce

No response

Log output

No response
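
For anyone hitting the same 404, it can help to probe the Ollama server behind "apiBase" directly before changing the Continue config. A minimal sketch (Python standard library only; the apiBase and model tag are copied from the config.json above, and /api/version and /api/tags are documented Ollama endpoints):

    import json
    import urllib.request

    # Values copied from the config.json in this issue; adjust for your setup.
    API_BASE = "http://114.55.254.105:11434"
    MODEL = "deepseek-coder-v2:16b-lite-instruct-q8_0"

    def get(path):
        # /api/version reports the server version, /api/tags lists pulled models.
        with urllib.request.urlopen(API_BASE + path, timeout=10) as resp:
            return json.loads(resp.read())

    print("server version:", get("/api/version").get("version"))
    names = [m["name"] for m in get("/api/tags").get("models", [])]
    print("models on server:", names)
    print("model pulled?", MODEL in names)

If the model tag does not appear in /api/tags, a 404 from /api/generate would be expected regardless of the Continue settings.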

@sestinj sestinj self-assigned this Dec 16, 2024
@dosubot dosubot bot added area:autocomplete Relates to the auto complete feature kind:bug Indicates an unexpected problem or unintended behavior labels Dec 16, 2024
@sestinj (Contributor) commented Dec 16, 2024

@gaomeng6470 would you be able to share what version of Ollama you are using?

@gaomeng6470 (Author) commented

ollama version is 0.2.1
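
A direct POST to /api/generate (the endpoint named in the error) can also separate a server-side problem from a Continue configuration problem. A rough sketch, again assuming the apiBase and model tag from the config above and only the documented Ollama request fields (model, prompt, stream):

    import json
    import urllib.error
    import urllib.request

    API_BASE = "http://114.55.254.105:11434"  # from the config.json above

    payload = json.dumps({
        "model": "deepseek-coder-v2:16b-lite-instruct-q8_0",
        "prompt": "def add(a, b):",  # arbitrary test prompt
        "stream": False,
    }).encode()

    req = urllib.request.Request(
        API_BASE + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            print(json.loads(resp.read())["response"])
    except urllib.error.HTTPError as e:
        # A 404 here points at the Ollama server or model tag, not at Continue.
        print("HTTP", e.code, e.read().decode())

If this request also returns 404 on Ollama 0.2.1, the problem is on the server side (for example a missing or differently named model tag) rather than in config.json.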
