
Add basic support for Google AI models #868

Merged 4 commits on Nov 17, 2024
70 changes: 62 additions & 8 deletions docs/src/content/docs/getting-started/configuration.mdx
Original file line number Diff line number Diff line change
@@ -20,7 +20,6 @@ import lmSelectAlt from "../../../assets/vscode-language-models-select.png.txt?r
import oaiModelsSrc from "../../../assets/openai-model-names.png"
import oaiModelsAlt from "../../../assets/openai-model-names.png.txt?raw"


You will need to configure the LLM connection and authorization secrets.

:::tip
@@ -148,26 +147,25 @@ envFile: ~/.env.genaiscript

### No .env file

If you do not want to use a `.env` file, make sure to populate the environment variables
of the genaiscript process with the configuration values.

Here are some common examples:

- Using bash syntax

```sh
OPENAI_API_KEY="value" npx --yes genaiscript run ...
```

- GitHub Action configuration

```yaml title=".github/workflows/genaiscript.yml"
run: npx --yes genaiscript run ...
env:
  OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
```


## OpenAI

This provider, `openai`, is the OpenAI chat model provider.
@@ -685,6 +683,62 @@ model3=key3
"
```

## Google AI <a href="" id="google" />

The `google` provider allows you to use Google AI models and gives you access to the Gemini family of models.

:::note

GenAIScript uses the [OpenAI compatibility](https://ai.google.dev/gemini-api/docs/openai) layer of Google AI,
so some [limitations](https://ai.google.dev/gemini-api/docs/openai#current-limitations) apply.

:::

<Steps>

<ol>

<li>

Open [Google AI Studio](https://aistudio.google.com/app/apikey) and create a new API key.

</li>

<li>

Update the `.env` file with the API key.

```txt title=".env"
GOOGLE_API_KEY=...
```

</li>

<li>

Find the model identifier in the [Gemini documentation](https://ai.google.dev/gemini-api/docs/models/gemini)
and use it in your script or the CLI with the `google` provider.

```js "gemini-1.5-pro-002"
...
const model = genAI.getGenerativeModel({
model: "gemini-1.5-pro-002",
});
...
```

Then use the model identifier in your script:

```js "gemini-1.5-pro-002"
script({ model: "google:gemini-1.5-pro-002" })
```
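The same model can also be selected from the command line, mirroring the earlier bash example. This is a sketch assuming the `run` command's `--model` option; the script name is a placeholder:

```sh
GOOGLE_API_KEY="value" npx --yes genaiscript run myscript --model google:gemini-1.5-pro-002
```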

</li>

</ol>

</Steps>

## GitHub Copilot Chat Models <a id="github-copilot" href=""></a>

If you have access to **GitHub Copilot Chat in Visual Studio Code**,
1 change: 1 addition & 0 deletions docs/src/content/docs/reference/scripts/system.mdx
@@ -3401,6 +3401,7 @@ defTool(
},
{
model: "vision",
cache: "vision_ask_image",
system: [
"system",
"system.assistant",
2 changes: 1 addition & 1 deletion packages/core/src/chat.ts
@@ -818,7 +818,7 @@
topLogprobs,
} = genOptions
const top_logprobs = genOptions.topLogprobs > 0 ? topLogprobs : undefined
const logprobs = genOptions.logprobs || top_logprobs > 0
const logprobs = genOptions.logprobs || top_logprobs > 0 ? true : undefined

Check failure on line 821 in packages/core/src/chat.ts (GitHub Actions / build): The expression `genOptions.logprobs || top_logprobs > 0 ? true : undefined` may not work as intended due to operator precedence. Consider using parentheses to clarify the logic.
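The annotation concerns `?:` binding more loosely than `||` and `>`: the expression parses as `(genOptions.logprobs || (top_logprobs > 0)) ? true : undefined`, which here happens to match the intent but reads ambiguously. A standalone sketch with illustrative values:

```javascript
// `a || b > 0 ? x : y` parses as `(a || (b > 0)) ? x : y`:
// both `||` and `>` bind tighter than the conditional operator.
const logprobs = false;
const topLogprobs = 3;

const implicit = logprobs || topLogprobs > 0 ? true : undefined;
const explicit = (logprobs || topLogprobs > 0) ? true : undefined;

console.log(implicit, explicit); // true true
```

Adding the parentheses changes nothing at runtime; it only makes the intended grouping explicit.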
traceLanguageModelConnection(trace, genOptions, connectionToken)
const tools: ChatCompletionTool[] = toolDefinitions?.length
? toolDefinitions.map(
34 changes: 27 additions & 7 deletions packages/core/src/connection.ts
@@ -23,6 +23,8 @@
HUGGINGFACE_API_BASE,
OLLAMA_API_BASE,
OLLAMA_DEFAUT_PORT,
MODEL_PROVIDER_GOOGLE,
GOOGLE_API_BASE,
} from "./constants"
import { fileExists, readText, writeText } from "./fs"
import {
@@ -129,7 +131,7 @@
token,
source: "env: OPENAI_API_...",
version,
}
} satisfies LanguageModelConfiguration
}

if (provider === MODEL_PROVIDER_GITHUB) {
@@ -148,7 +150,7 @@
type,
token,
source: `env: ${tokenVar}`,
}
} satisfies LanguageModelConfiguration
}

if (provider === MODEL_PROVIDER_AZURE_OPENAI) {
Expand Down Expand Up @@ -194,7 +196,7 @@
: "env: AZURE_OPENAI_API_... + Entra ID",
version,
azureCredentialsType,
}
} satisfies LanguageModelConfiguration
}

if (provider === MODEL_PROVIDER_AZURE_SERVERLESS_OPENAI) {
Expand Down Expand Up @@ -239,7 +241,7 @@
: "env: AZURE_SERVERLESS_OPENAI_API_... + Entra ID",
version,
azureCredentialsType,
}
} satisfies LanguageModelConfiguration
}

if (provider === MODEL_PROVIDER_AZURE_SERVERLESS_MODELS) {
Expand Down Expand Up @@ -281,7 +283,25 @@
? "env: AZURE_SERVERLESS_MODELS_API_..."
: "env: AZURE_SERVERLESS_MODELS_API_... + Entra ID",
version,
}
} satisfies LanguageModelConfiguration
}

if (provider === MODEL_PROVIDER_GOOGLE) {
const token = env.GOOGLE_API_KEY
if (!token) return undefined
if (token === PLACEHOLDER_API_KEY)
throw new Error("GOOGLE_API_KEY not configured")
const base = env.GOOGLE_API_BASE || GOOGLE_API_BASE

Check failure on line 294 in packages/core/src/connection.ts (GitHub Actions / build, generated by pr-review-commit hardcoded_url): The base URL for the Google API is hardcoded. Consider making it configurable to enhance flexibility and adaptability.
if (base === PLACEHOLDER_API_BASE)
throw new Error("GOOGLE_API_BASE not configured")
return {
provider,
model,
base,
token,
type: "openai",
source: "env: GOOGLE_API_...",
} satisfies LanguageModelConfiguration
}

if (provider === MODEL_PROVIDER_ANTHROPIC) {
@@ -301,7 +321,7 @@
base,
version,
source,
}
} satisfies LanguageModelConfiguration
}

if (provider === MODEL_PROVIDER_OLLAMA) {
@@ -314,7 +334,7 @@
token: "ollama",
type: "openai",
source: "env: OLLAMA_HOST",
}
} satisfies LanguageModelConfiguration
}

if (provider === MODEL_PROVIDER_HUGGINGFACE) {
13 changes: 13 additions & 0 deletions packages/core/src/constants.ts
@@ -69,6 +69,7 @@ export const DEFAULT_VISION_MODEL_CANDIDATES = [
"azure_serverless:gpt-4o",
DEFAULT_MODEL,
"anthropic:claude-2",
"google:gemini-1.5-pro-002",
"github:gpt-4o",
]
export const DEFAULT_SMALL_MODEL = "openai:gpt-4o-mini"
@@ -78,6 +79,7 @@ export const DEFAULT_SMALL_MODEL_CANDIDATES = [
DEFAULT_SMALL_MODEL,
"anthropic:claude-instant-1",
"github:gpt-4o-mini",
"google:gemini-1.5-flash-002",
"client:gpt-4-mini",
]
export const DEFAULT_EMBEDDINGS_MODEL_CANDIDATES = [
@@ -160,6 +162,7 @@ export const EMOJI_UNDEFINED = "?"
export const MODEL_PROVIDER_OPENAI = "openai"
export const MODEL_PROVIDER_GITHUB = "github"
export const MODEL_PROVIDER_AZURE_OPENAI = "azure"
export const MODEL_PROVIDER_GOOGLE = "google"
export const MODEL_PROVIDER_AZURE_SERVERLESS_OPENAI = "azure_serverless"
export const MODEL_PROVIDER_AZURE_SERVERLESS_MODELS = "azure_serverless_models"
export const MODEL_PROVIDER_OLLAMA = "ollama"
@@ -203,6 +206,8 @@ export const DOCS_CONFIGURATION_AICI_URL =
"https://microsoft.github.io/genaiscript/reference/scripts/aici/"
export const DOCS_CONFIGURATION_ANTHROPIC_URL =
"https://microsoft.github.io/genaiscript/getting-started/configuration/#anthropic"
export const DOCS_CONFIGURATION_GOOGLE_URL =
"https://microsoft.github.io/genaiscript/getting-started/configuration/#google"
export const DOCS_CONFIGURATION_HUGGINGFACE_URL =
"https://microsoft.github.io/genaiscript/getting-started/configuration/#huggingface"
export const DOCS_CONFIGURATION_CONTENT_SAFETY_URL =
@@ -247,6 +252,11 @@ export const MODEL_PROVIDERS = Object.freeze([
detail: "Anthropic models",
url: DOCS_CONFIGURATION_ANTHROPIC_URL,
},
{
id: MODEL_PROVIDER_GOOGLE,
detail: "Google AI",
url: DOCS_CONFIGURATION_GOOGLE_URL,
},
{
id: MODEL_PROVIDER_HUGGINGFACE,
detail: "Hugging Face models",
@@ -357,3 +367,6 @@ export const CHOICE_LOGIT_BIAS = 5

export const SANITIZED_PROMPT_INJECTION =
"...prompt injection detected, content removed..."

export const GOOGLE_API_BASE =
"https://generativelanguage.googleapis.com/v1beta/openai/"
6 changes: 3 additions & 3 deletions packages/core/src/fetch.ts
@@ -163,12 +163,12 @@
? "Bearer ***" // Mask Bearer tokens
: "***") // Mask other authorization headers
)
const cmd = `curl ${url} \\
const cmd = `curl "${url}" \\

Check failure on line 166 in packages/core/src/fetch.ts (GitHub Actions / build, generated by pr-review-commit missing_quotes): The curl command is missing quotes around the URL, which could lead to issues if the URL contains special characters.

--no-buffer \\
${Object.entries(headers)
.map(([k, v]) => `-H "${k}: ${v}"`)
.join("\\\n")} \\
.join(" \\\n")} \\
-d '${JSON.stringify(body, null, 2).replace(/'/g, "'\\''")}'
--no-buffer
`
if (trace) trace.detailsFenced(`βœ‰οΈ fetch`, cmd, "bash")
else logVerbose(cmd)
@@ -40,6 +40,7 @@ defTool(
},
{
model: "vision",
cache: "vision_ask_image",
system: [
"system",
"system.assistant",
2 changes: 1 addition & 1 deletion packages/core/src/ollama.ts
@@ -77,7 +77,7 @@ async function listModels(
cfg: LanguageModelConfiguration
): Promise<LanguageModelInfo[]> {
// Create a fetch instance to make HTTP requests
const fetch = await createFetch()
const fetch = await createFetch({ retries: 1 })
// Fetch the list of models from the remote API
const res = await fetch(cfg.base.replace("/v1", "/api/tags"), {
method: "GET",
8 changes: 5 additions & 3 deletions packages/core/src/openai.ts
@@ -203,8 +203,8 @@ export const OpenAIChatCompletion: ChatCompletionHandler = async (
trace.dispatchChange()

const fetchHeaders: HeadersInit = {
...getConfigHeaders(cfg),
"Content-Type": "application/json",
...getConfigHeaders(cfg),
...(headers || {}),
}
traceFetchPost(trace, url, fetchHeaders as any, postReq)
@@ -282,7 +282,9 @@ export const OpenAIChatCompletion: ChatCompletionHandler = async (
numTokens += estimateTokens(delta.content, encoder)
chatResp += delta.content
tokens.push(
...serializeChunkChoiceToLogProbs(choice as ChatCompletionChunkChoice)
...serializeChunkChoiceToLogProbs(
choice as ChatCompletionChunkChoice
)
)
trace.appendToken(delta.content)
} else if (Array.isArray(delta.tool_calls)) {
@@ -409,7 +411,7 @@ export const OpenAIChatCompletion: ChatCompletionHandler = async (
async function listModels(
cfg: LanguageModelConfiguration
): Promise<LanguageModelInfo[]> {
const fetch = await createFetch()
const fetch = await createFetch({ retries: 1 })
const res = await fetch(cfg.base + "/models", {
method: "GET",
headers: {
45 changes: 45 additions & 0 deletions packages/core/src/pricing.json
@@ -270,5 +270,50 @@
"azure_serverless_models:ministral-3b": {
"price_per_million_input_tokens": 0.04,
"price_per_million_output_tokens": 0.04
},
"google:gemini-1.5-flash": {
"price_per_million_input_tokens": 0.075,
"price_per_million_output_tokens": 0.3
},
"google:gemini-1.5-flash-002": {
"price_per_million_input_tokens": 0.075,
"price_per_million_output_tokens": 0.3
},
"google:gemini-1.5-flash-8b": {
"price_per_million_input_tokens": 0.0375,
"price_per_million_output_tokens": 0.15,
"tiers": [
{
"context_size": 128000,
"price_per_million_input_tokens": 0.075,
"price_per_million_output_tokens": 0.3
}
]
},
"google:gemini-1.5-pro": {
"price_per_million_input_tokens": 1.25,
"price_per_million_output_tokens": 5,
"tiers": [
{
"context_size": 128000,
"price_per_million_input_tokens": 2.5,
"price_per_million_output_tokens": 10
}
]
},
"google:gemini-1.5-pro-002": {
"price_per_million_input_tokens": 1.25,
"price_per_million_output_tokens": 5,
"tiers": [
{
"context_size": 128000,
"price_per_million_input_tokens": 2.5,
"price_per_million_output_tokens": 10
}
]
},
"google:gemini-1-pro": {
"price_per_million_input_tokens": 0.5,
"price_per_million_output_tokens": 1.5
}
}
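The `tiers` entries added above encode context-length price breaks: once the prompt crosses a tier's `context_size`, that tier's (higher) rates apply. A minimal sketch of how such an entry could be evaluated; the `costUSD` helper is illustrative and not a function in this codebase, though the field names match the JSON:

```javascript
// Illustrative cost calculator for tiered entries like those in pricing.json.
function costUSD(entry, inputTokens, outputTokens) {
  let rateIn = entry.price_per_million_input_tokens;
  let rateOut = entry.price_per_million_output_tokens;
  for (const tier of entry.tiers || []) {
    // Once the prompt exceeds the tier's context_size, its rates take over.
    if (inputTokens > tier.context_size) {
      rateIn = tier.price_per_million_input_tokens;
      rateOut = tier.price_per_million_output_tokens;
    }
  }
  return (inputTokens * rateIn + outputTokens * rateOut) / 1e6;
}

// Values copied from the google:gemini-1.5-pro-002 entry above.
const gemini15Pro = {
  price_per_million_input_tokens: 1.25,
  price_per_million_output_tokens: 5,
  tiers: [
    {
      context_size: 128000,
      price_per_million_input_tokens: 2.5,
      price_per_million_output_tokens: 10,
    },
  ],
};

console.log(costUSD(gemini15Pro, 100_000, 1_000)); // 0.13 — below the 128k tier
console.log(costUSD(gemini15Pro, 200_000, 1_000)); // 0.51 — above the tier
```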
4 changes: 4 additions & 0 deletions packages/core/src/tools.ts
@@ -2,6 +2,7 @@ import {
MODEL_PROVIDER_AZURE_OPENAI,
MODEL_PROVIDER_AZURE_SERVERLESS_MODELS,
MODEL_PROVIDER_GITHUB,
MODEL_PROVIDER_GOOGLE,
MODEL_PROVIDER_OLLAMA,
MODEL_PROVIDER_OPENAI,
} from "./constants"
@@ -38,6 +39,9 @@ export function isToolsSupported(modelId: string): boolean | undefined {
[MODEL_PROVIDER_OPENAI]: oai,
[MODEL_PROVIDER_AZURE_OPENAI]: oai,
[MODEL_PROVIDER_AZURE_SERVERLESS_MODELS]: oai,
[MODEL_PROVIDER_GOOGLE]: {
// all supported
},
[MODEL_PROVIDER_GITHUB]: {
"Phi-3.5-mini-instruct": false,
},