store model to vscode ml mappings #596

Merged
merged 8 commits on Jul 31, 2024
Changes from 1 commit
2 changes: 1 addition & 1 deletion docs/src/content/docs/reference/token.md
@@ -10,7 +10,7 @@ GenAIScript will try to find the connection token from various sources:

- a `.env` file in the root of your project (VSCode and CLI)
- environment variables, typically within your CI/CD environment (CLI only)
- Visual Studio Language Models (VSCode only)
- Visual Studio Language Chat Models (VSCode only)

The phrase "Visual Studio Language Chat Models" should be corrected to "Visual Studio Code Language Models" for consistency and accuracy.

generated by pr-docs-review-commit typo
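As an illustration of the two CLI sources named in the list above, a minimal hypothetical sketch (not GenAIScript's actual resolution code; the variable name and the dotenv usage are assumptions):

```ts
import { config as loadDotEnv } from "dotenv"

// Hypothetical sketch of the CLI token sources listed above, not GenAIScript's
// implementation. dotenv does not override variables that are already set, so
// CI/CD environment variables take effect even when a .env file is present.
function resolveConnectionToken(name = "OPENAI_API_KEY"): string | undefined {
    loadDotEnv({ path: ".env" }) // returns an error result (does not throw) if the file is missing
    return process.env[name]
}
```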


## .env file or process environment

1 change: 0 additions & 1 deletion packages/cli/src/nodehost.ts
@@ -157,7 +157,6 @@ export class NodeHost implements RuntimeHost {
tok.token = "Bearer " + this._azureToken

The variable tok is assigned but its value is never used. Consider removing it if it's not needed. 🧹

generated by pr-review-commit unused_variable

}
if (!tok && this.clientLanguageModel) {
logVerbose(`model: using client language model`)
return <LanguageModelConfiguration>{
model: modelId,
provider: this.clientLanguageModel.id,

The log statement logVerbose('model: using client language model') has been removed. This could lead to lack of debugging information when troubleshooting issues related to the client language model. Consider adding it back or replacing it with a more appropriate log statement. 😊

generated by pr-review-commit missing_log
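For context on the hunk above, a minimal sketch of the client language model fallback with the verbose log kept in place, as the review comment asks; `logVerbose` and the returned fields come from the diff, while the function shape and parameter types are assumptions:

```ts
// Sketch only, not NodeHost's full logic: when no API token was resolved but a
// client-side language model is registered (the VS Code extension acting as the
// client), fall back to it and log the decision for troubleshooting.
function clientModelFallback(
    modelId: string,
    tok: { token: string } | undefined,
    clientLanguageModel: { id: string } | undefined,
    logVerbose: (msg: string) => void
): { model: string; provider: string } | undefined {
    if (!tok && clientLanguageModel) {
        // The token is supplied by the connected client at request time (assumption).
        logVerbose(`model: using client language model`)
        return { model: modelId, provider: clientLanguageModel.id }
    }
    return undefined
}
```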

5 changes: 3 additions & 2 deletions packages/cli/src/server.ts
@@ -107,8 +107,9 @@ export async function startServer(options: { port: string }) {
// add handler
const chatId = randomHex(6)
chats[chatId] = async (chunk) => {
if (!responseSoFar) {
trace.itemValue("model", chunk.model)
if (!responseSoFar && chunk.model) {
logVerbose(`visual studio: chat model ${chunk.model}`)
trace.itemValue("chat model", chunk.model)
trace.appendContent("\n\n")
}
trace.appendToken(chunk.chunk)
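A self-contained sketch of the chunk-handler registry this hunk modifies; the `ChatChunk` shape, the `randomHex` stand-in, the `responseSoFar` accumulation, and the trace interface are assumptions inferred from the visible lines, not the PR's definitions:

```ts
// Sketch of the handler-registry pattern: the first chunk that carries a model id
// is logged and traced once, then every chunk's text is appended to the trace.
type ChatChunk = { model?: string; chunk: string }

interface Trace {
    itemValue(key: string, value: string): void
    appendContent(text: string): void
    appendToken(token: string): void
}

const chats: Record<string, (chunk: ChatChunk) => Promise<void>> = {}

function registerChat(trace: Trace, logVerbose: (msg: string) => void): string {
    const chatId = Math.random().toString(16).slice(2, 8) // stand-in for randomHex(6)
    let responseSoFar = ""
    chats[chatId] = async (chunk) => {
        if (!responseSoFar && chunk.model) {
            logVerbose(`visual studio: chat model ${chunk.model}`)
            trace.itemValue("chat model", chunk.model)
            trace.appendContent("\n\n")
        }
        trace.appendToken(chunk.chunk)
        responseSoFar += chunk.chunk
    }
    return chatId
}
```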
5 changes: 4 additions & 1 deletion packages/sample/.vscode/settings.json
@@ -4,5 +4,8 @@
"openai",
"outputfilename"
],
"genaiscript.cli.path": "../cli/built/genaiscript.cjs"
"genaiscript.cli.path": "../cli/built/genaiscript.cjs",
"genaiscript.languageChatModels": {
"openai:gpt-4": "github.copilot-chat/3/gpt-4-0125-preview"
}
}
4 changes: 4 additions & 0 deletions packages/vscode/package.json
@@ -234,6 +234,10 @@
{
"title": "GenAIScript",
"properties": {
"genaiscript.languageChatModels": {
"type": "object",
"description": "Mapping from GenAIScript model (openai:gpt-4) to Visual Studio Code Language Chat Model (github...)"
},
"genaiscript.diagnostics": {
"type": "boolean",
"default": false,
15 changes: 8 additions & 7 deletions packages/vscode/src/lmaccess.ts
@@ -34,7 +34,8 @@ async function generateLanguageModelConfiguration(
return { provider }
}

if (Object.keys(state.languageChatModels).length)
const languageChatModels = await state.languageChatModels()
if (Object.keys(languageChatModels).length)
return { provider: MODEL_PROVIDER_CLIENT, model: "*" }

const items: (vscode.QuickPickItem & {
Expand All @@ -46,8 +47,8 @@ async function generateLanguageModelConfiguration(
const models = await vscode.lm.selectChatModels()
if (models.length)
items.push({
label: "Visual Studio Language Models",
detail: `Use a registered Language Model (e.g. GitHub Copilot).`,
label: "Visual Studio Language Chat Models",
detail: `Use a registered LLM such as GitHub Copilot.`,
model: "*",
provider: MODEL_PROVIDER_CLIENT,
})
@@ -104,8 +105,8 @@ async function pickChatModel(
model: string
): Promise<vscode.LanguageModelChat> {
const chatModels = await vscode.lm.selectChatModels()

const chatModelId = state.languageChatModels[model]
const languageChatModels = await state.languageChatModels()
const chatModelId = languageChatModels[model]
let chatModel = chatModelId && chatModels.find((m) => m.id === chatModelId)
if (!chatModel) {
const items: (vscode.QuickPickItem & {
@@ -117,10 +118,10 @@
chatModel,
}))
const res = await vscode.window.showQuickPick(items, {
title: `Pick a Chat Model for ${model}`,
title: `Pick a Language Chat Model for ${model}`,
})
chatModel = res?.chatModel
if (chatModel) state.languageChatModels[model] = chatModel.id
if (chatModel) await state.updateLanguageChatModels(model, chatModel.id)
}
return chatModel
}
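Downstream of `pickChatModel`, the selected `vscode.LanguageModelChat` would be driven through the VS Code Language Model API roughly like this (a sketch; the prompt handling and function name are placeholders, not code from this PR):

```ts
import * as vscode from "vscode"

// Sketch: send a prompt to the chat model the user picked and collect the
// streamed response fragments into a single string.
async function completeWithChatModel(
    chatModel: vscode.LanguageModelChat,
    prompt: string,
    token: vscode.CancellationToken
): Promise<string> {
    const messages = [vscode.LanguageModelChatMessage.User(prompt)]
    const response = await chatModel.sendRequest(messages, {}, token)
    let text = ""
    for await (const fragment of response.text) text += fragment
    return text
}
```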
2 changes: 1 addition & 1 deletion packages/vscode/src/servermanager.ts
@@ -39,7 +39,7 @@ export class TerminalServerManager implements ServerManager {
)
subscriptions.push(
vscode.workspace.onDidChangeConfiguration((e) => {
if (e.affectsConfiguration(TOOL_ID)) this.close()
if (e.affectsConfiguration(TOOL_ID + ".cli")) this.close()
})
)
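A sketch of what the narrowed check buys (assuming `TOOL_ID` is `"genaiscript"`): only changes under the CLI settings restart the server, so programmatic updates to `genaiscript.languageChatModels` presumably no longer tear down the terminal:

```ts
import * as vscode from "vscode"

const TOOL_ID = "genaiscript" // assumption: the extension's configuration root

// affectsConfiguration matches the given section and everything nested under it,
// e.g. "genaiscript.cli.path", but not sibling sections such as
// "genaiscript.languageChatModels".
function watchCliSettings(close: () => void): vscode.Disposable {
    return vscode.workspace.onDidChangeConfiguration((e) => {
        if (e.affectsConfiguration(TOOL_ID + ".cli")) close()
    })
}
```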

29 changes: 18 additions & 11 deletions packages/vscode/src/state.ts
@@ -107,8 +107,6 @@ export class ExtensionState extends EventTarget {
AIRequestSnapshot
> = undefined
readonly output: vscode.LogOutputChannel
// modelid -> vscode language mode id
languageChatModels: Record<string, string> = {}

constructor(public readonly context: ExtensionContext) {
super()
@@ -138,15 +136,24 @@
subscriptions
)
)
if (
typeof vscode.lm !== "undefined" &&
typeof vscode.lm.onDidChangeChatModels === "function"
)
subscriptions.push(
vscode.lm.onDidChangeChatModels(
() => (this.languageChatModels = {})
)
)
}

async updateLanguageChatModels(model: string, chatModel: string) {
const config = vscode.workspace.getConfiguration(TOOL_ID)
const res = await this.languageChatModels()
if (chatModel === undefined) delete res[model]
else res[model] = chatModel
await config.update("languageChatModels", res)
}

async languageChatModels() {
const config = vscode.workspace.getConfiguration(TOOL_ID)
const res =
((await config.get("languageChatModels")) as Record<
string,
string
>) || {}
return res
}

private async saveScripts() {
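To summarize the state.ts change above: the model mapping now lives in workspace configuration rather than an in-memory field, so it survives window reloads and can be edited by hand in `.vscode/settings.json`, as the sample settings shown earlier in this PR demonstrate. A minimal sketch of reading it with a typed default (assuming `TOOL_ID` is `"genaiscript"`; this is not the PR's code):

```ts
import * as vscode from "vscode"

// Workspace configuration reads are synchronous; get() with a default value
// replaces the `|| {}` fallback used in the diff.
function readLanguageChatModels(): Record<string, string> {
    const config = vscode.workspace.getConfiguration("genaiscript")
    return config.get<Record<string, string>>("languageChatModels", {})
}
```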