Add anthropic models #788
Conversation
@microsoft-github-policy-service agree
```typescript
    prompt_tokens: usage.input_tokens,
    completion_tokens: usage.output_tokens,
    total_tokens: usage.input_tokens + usage.output_tokens,
} satisfies OpenAI.ChatCompletionUsage
```
nice, i should start using this one
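For context, `satisfies` (TypeScript 4.9+) checks a value against a type without widening the value's own inferred type. A minimal sketch, using an illustrative `Usage` interface rather than the actual chattypes definition:

```typescript
// Illustrative stand-in for the OpenAI usage shape (not the real chattypes type)
interface Usage {
    prompt_tokens: number
    completion_tokens: number
    total_tokens: number
}

const inputTokens = 10
const outputTokens = 20

// `satisfies` validates the literal against Usage at compile time
// (typos or missing fields become errors) while keeping the narrow
// literal type of the object, unlike a `: Usage` annotation or a cast.
const usage = {
    prompt_tokens: inputTokens,
    completion_tokens: outputTokens,
    total_tokens: inputTokens + outputTokens,
} satisfies Usage

console.log(usage.total_tokens) // 30
```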
```typescript
import * as OpenAI from "./chattypes"
import { logError } from "./util"
import { ChatCompletionMessageToolCall } from "openai/resources/index.mjs"
```
All the openai types are redefined in "chattypes.ts".
Historical reasons: genaiscript used to cross-build for vscode, and the openai sdk was node-only, so we `import type` their interfaces.
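The pattern in miniature, with the type declared locally for illustration (not the real chattypes file): type-only imports are erased during compilation, so no runtime dependency on the node-only SDK is created.

```typescript
// chattypes.ts pattern, sketched: declare/re-export the shapes as *types*
// only, so consumers never pull the node-only SDK into a runtime bundle.
export interface ChatCompletionUsage {
    prompt_tokens: number
    completion_tokens: number
    total_tokens: number
}

// Elsewhere, a type-only import compiles away entirely:
//   import type { ChatCompletionUsage } from "./chattypes"
const u: ChatCompletionUsage = {
    prompt_tokens: 1,
    completion_tokens: 2,
    total_tokens: 3,
}
console.log(u.total_tokens)
```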
```typescript
}

const convertToolCallMessage = (
    msg: OpenAI.ChatCompletionMessageParam & {
```
you want ChatCompletionAssistantMessageParam
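A sketch of the suggested narrowing, with illustrative local types standing in for the chattypes definitions: narrowing the parameter to the assistant variant means `tool_calls` is known to the type checker without intersecting the broad union.

```typescript
// Illustrative stand-ins for the chattypes definitions (assumptions, not
// the real types).
interface ToolCall {
    id: string
    function: { name: string; arguments: string }
}
interface ChatCompletionAssistantMessageParam {
    role: "assistant"
    content?: string
    tool_calls?: ToolCall[]
}

// Taking the assistant-specific param (with tool_calls required) avoids
// casting from the general ChatCompletionMessageParam union.
const convertToolCallMessage = (
    msg: ChatCompletionAssistantMessageParam & { tool_calls: ToolCall[] }
) => ({
    role: "assistant" as const,
    content: msg.tool_calls.map((tool) => ({
        type: "tool_use" as const,
        id: tool.id,
        // arguments kept as a string here; the codebase parses it downstream
        input: tool.function.arguments,
    })),
})

const out = convertToolCallMessage({
    role: "assistant",
    tool_calls: [{ id: "t1", function: { name: "f", arguments: "{}" } }],
})
console.log(out.content[0].id)
```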
```typescript
    content: msg.tool_calls.map((tool) => ({
        type: "tool_use",
        id: tool.id,
        input: JSON.parse(tool.function.arguments),
```
`ChatCompletionToolCall.arguments` is a string. The parsing happens later and handles malformed JSON.
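The kind of deferred, malformed-JSON-tolerant parse the comment refers to might look like this. A hedged sketch: `parseToolArguments` is a hypothetical helper for illustration, not the actual genaiscript function.

```typescript
// Hypothetical helper: parse tool-call arguments lazily, degrading to an
// empty object when the model emitted malformed or truncated JSON.
function parseToolArguments(args: string): Record<string, unknown> {
    try {
        const parsed = JSON.parse(args)
        return typeof parsed === "object" && parsed !== null ? parsed : {}
    } catch {
        return {} // malformed JSON from the model; fail soft, not hard
    }
}

const ok = parseToolArguments('{"city":"Paris"}')
const bad = parseToolArguments('{"city":') // truncated model output
console.log(ok, bad)
```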
```typescript
const version = env.ANTHROPIC_API_VERSION || undefined
const source = "env: ANTHROPIC_API_..."
const modelKey = "ANTHROPIC_API_KEY"
const type = "anthropic"
```
MODEL_PROVIDER_ANTHROPIC ?
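The suggestion is to reference a shared constant rather than repeat the string literal. A sketch, assuming `MODEL_PROVIDER_ANTHROPIC` would live in a constants module (the name and value placement here are assumptions):

```typescript
// Hypothetical constants-module entry; `as const` keeps the narrow literal
// type so unions and exhaustive checks over providers still work.
const MODEL_PROVIDER_ANTHROPIC = "anthropic" as const

const modelKey = "ANTHROPIC_API_KEY"
const type = MODEL_PROVIDER_ANTHROPIC
console.log(type)
```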
```typescript
    | "azure"
    | "localai"
    | "azure_serverless"
    | "anthropic"
```
The apiType is really only used for variants of openai; since we have a specific provider here, I think we don't need this one.
Looks good to me. There are a few minor comments, and if you could add a section in Configuration.mdx (run
I'll patch up the types
Integrate Anthropic Models
- Adds `@anthropic-ai/sdk` as a dependency.
- Updates `prompt_template.d.ts`.