
Alibaba #888

Merged
merged 2 commits on Nov 22, 2024
60 changes: 60 additions & 0 deletions docs/src/content/docs/getting-started/configuration.mdx
@@ -927,6 +927,66 @@ OPENAI_API_TYPE=localai

</Steps>

## Alibaba Cloud

[Alibaba Cloud](https://www.alibabacloud.com/) provides a range of AI services, including the Qwen family of large language models.

```js "alibaba:"
script({
    model: "alibaba:qwen-max"
})
```

<Steps>

<ol>

<li>

Sign up for an [Alibaba Cloud account](https://www.alibabacloud.com/help/en/model-studio/developer-reference/get-api-key) and obtain an API key from the [Model Studio console](https://bailian.console.alibabacloud.com/).

</li>

<li>

Add your Alibaba API key to the `.env` file:

```txt title=".env"
ALIBABA_API_KEY=sk_...
```

</li>

<li>

Find the model that best suits your needs by browsing the [Alibaba models documentation](https://www.alibabacloud.com/help/en/model-studio/developer-reference/use-qwen-by-calling-api).

</li>

<li>

Update your script to use the `model` you choose.

```js
script({
    ...
    model: "alibaba:qwen-max",
})
```

</li>

</ol>

</Steps>
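The API key and base URL can also be supplied through DashScope-named variables. The resolution order implemented by this PR's `parseTokenFromEnv` change (see the `packages/core/src/connection.ts` diff in this PR) can be sketched as follows; this is a simplified, standalone mirror of that logic, not the actual export:

```typescript
// Simplified sketch of the Alibaba branch of parseTokenFromEnv
// (mirrors packages/core/src/connection.ts in this PR).
const ALIBABA_BASE = "https://dashscope-intl.aliyuncs.com/compatible-mode/v1"

interface AlibabaConfig {
    base: string
    token: string
}

function resolveAlibabaConfig(
    env: Record<string, string | undefined>
): AlibabaConfig {
    // Base URL: explicit ALIBABA_API_BASE wins, then the DashScope aliases,
    // then the built-in international compatible-mode endpoint.
    const base =
        env.ALIBABA_API_BASE ||
        env.DASHSCOPE_API_BASE ||
        env.DASHSCOPE_HTTP_BASE_URL ||
        ALIBABA_BASE
    // Key: ALIBABA_API_KEY, falling back to DASHSCOPE_API_KEY.
    const token = env.ALIBABA_API_KEY || env.DASHSCOPE_API_KEY
    if (!token) throw new Error("ALIBABA_API_KEY not configured")
    return { base, token }
}

// With only a key set, the default international endpoint is used.
const cfg = resolveAlibabaConfig({ ALIBABA_API_KEY: "sk_test" })
console.log(cfg.base) // the built-in ALIBABA_BASE
```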

:::note

GenAIScript uses the [OpenAI compatibility](https://www.alibabacloud.com/help/en/model-studio/developer-reference/compatibility-of-openai-with-dashscope) layer
to access Alibaba Cloud models.

:::
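Because of that compatibility layer, requests are plain OpenAI-style chat completions sent to the compatible-mode base URL. A minimal sketch of how the request URL is derived (mirroring the `trimTrailingSlash(cfg.base) + "/chat/completions"` construction in this PR's `packages/core/src/openai.ts` change):

```typescript
// Sketch: chat completions URL for an OpenAI-compatible provider,
// mirroring the url construction in packages/core/src/openai.ts.
function trimTrailingSlash(s: string): string {
    return s.replace(/\/+$/, "")
}

function chatCompletionsUrl(base: string): string {
    return trimTrailingSlash(base) + "/chat/completions"
}

console.log(
    chatCompletionsUrl("https://dashscope-intl.aliyuncs.com/compatible-mode/v1/")
)
// https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions
```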

## Ollama

[Ollama](https://ollama.ai/) is a desktop application that lets you download and run models locally.
24 changes: 24 additions & 0 deletions packages/core/src/connection.ts
@@ -26,6 +26,8 @@ import {
    MODEL_PROVIDER_GOOGLE,
    GOOGLE_API_BASE,
    MODEL_PROVIDER_TRANSFORMERS,
    MODEL_PROVIDER_ALIBABA,
    ALIBABA_BASE,
} from "./constants"
import { fileExists, readText, writeText } from "./fs"
import {
@@ -325,6 +327,28 @@ export async function parseTokenFromEnv(
        } satisfies LanguageModelConfiguration
    }

    if (provider === MODEL_PROVIDER_ALIBABA) {
        const base =
            env.ALIBABA_API_BASE ||
            env.DASHSCOPE_API_BASE ||
            env.DASHSCOPE_HTTP_BASE_URL ||
            ALIBABA_BASE
        if (base === PLACEHOLDER_API_BASE)
            throw new Error("ALIBABA_API_BASE not configured")
        if (!URL.canParse(base)) throw new Error(`${base} must be a valid URL`)
        const token = env.ALIBABA_API_KEY || env.DASHSCOPE_API_KEY
        if (token === undefined || token === PLACEHOLDER_API_KEY)
            throw new Error("ALIBABA_API_KEY not configured")
        return {
            provider,
            model,
            base,
            token,
            type: "alibaba",
            source: "env: ALIBABA_API_...",
        }
    }

    if (provider === MODEL_PROVIDER_OLLAMA) {
        const host = ollamaParseHostVariable(env)
        const base = cleanApiBase(host)
10 changes: 10 additions & 0 deletions packages/core/src/constants.ts
@@ -147,6 +147,8 @@ export const LOCALAI_API_BASE = "http://localhost:8080/v1"
export const LITELLM_API_BASE = "http://localhost:4000"
export const ANTHROPIC_API_BASE = "https://api.anthropic.com"
export const HUGGINGFACE_API_BASE = "https://api-inference.huggingface.co/v1"
export const ALIBABA_BASE =
    "https://dashscope-intl.aliyuncs.com/compatible-mode/v1"

export const PROMPTFOO_CACHE_PATH = ".genaiscript/cache/tests"
export const PROMPTFOO_CONFIG_DIR = ".genaiscript/config/tests"
@@ -173,6 +175,7 @@ export const MODEL_PROVIDER_CLIENT = "client"
export const MODEL_PROVIDER_ANTHROPIC = "anthropic"
export const MODEL_PROVIDER_HUGGINGFACE = "huggingface"
export const MODEL_PROVIDER_TRANSFORMERS = "transformers"
export const MODEL_PROVIDER_ALIBABA = "alibaba"

export const TRACE_FILE_PREVIEW_MAX_LENGTH = 240

@@ -213,6 +216,8 @@ export const DOCS_CONFIGURATION_HUGGINGFACE_URL =
"https://microsoft.github.io/genaiscript/getting-started/configuration/#huggingface"
export const DOCS_CONFIGURATION_HUGGINGFACE_TRANSFORMERS_URL =
"https://microsoft.github.io/genaiscript/getting-started/configuration/#transformers"
export const DOCS_CONFIGURATION_ALIBABA_URL =
"https://microsoft.github.io/genaiscript/getting-started/configuration/#alibaba"
export const DOCS_CONFIGURATION_CONTENT_SAFETY_URL =
"https://microsoft.github.io/genaiscript/reference/scripts/content-safety"
export const DOCS_DEF_FILES_IS_EMPTY_URL =
@@ -275,6 +280,11 @@ export const MODEL_PROVIDERS = Object.freeze([
detail: "Ollama local model",
url: DOCS_CONFIGURATION_OLLAMA_URL,
},
{
id: MODEL_PROVIDER_ALIBABA,
detail: "Alibaba models",
url: DOCS_CONFIGURATION_ALIBABA_URL,
},
{
id: MODEL_PROVIDER_LLAMAFILE,
detail: "llamafile.ai local model",
1 change: 1 addition & 0 deletions packages/core/src/host.ts
@@ -32,6 +32,7 @@ export type OpenAIAPIType =
| "localai"
| "azure_serverless"
| "azure_serverless_models"
| "alibaba"

export type AzureCredentialsType =
| "default"
10 changes: 8 additions & 2 deletions packages/core/src/openai.ts
@@ -59,7 +59,9 @@ export function getConfigHeaders(cfg: LanguageModelConfiguration) {
"api-key":
token &&
!isBearer &&
(type === "azure" || type === "azure_serverless")
(type === "azure" ||
type === "azure_serverless" ||
type === "alibaba")
? token
: undefined,
"User-Agent": TOOL_ID,
@@ -154,7 +156,11 @@ export const OpenAIChatCompletion: ChatCompletionHandler = async (
    let url = ""
    const toolCalls: ChatCompletionToolCall[] = []

    if (
        cfg.type === "openai" ||
        cfg.type === "localai" ||
        cfg.type === "alibaba"
    ) {
        url = trimTrailingSlash(cfg.base) + "/chat/completions"
        if (url === OPENROUTER_API_CHAT_URL) {
            ;(headers as any)[OPENROUTER_SITE_URL_HEADER] =
12 changes: 12 additions & 0 deletions packages/core/src/pricing.json
@@ -315,5 +315,17 @@
"google:gemini-1-pro": {
"price_per_million_input_tokens": 0.5,
"price_per_million_output_tokens": 1.5
},
"alibaba:qwen-max": {
"price_per_million_input_tokens": 10,
"price_per_million_output_tokens": 30
},
"alibaba:qwen-plus": {
"price_per_million_input_tokens": 3,
"price_per_million_output_tokens": 9
},
"alibaba:qwen-turbo": {
"price_per_million_input_tokens": 0.4,
"price_per_million_output_tokens": 1.2
}
}
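These entries translate directly into a per-request cost estimate, since prices are expressed per million tokens. A minimal sketch using the three prices added in this diff (the `estimateCost` helper is illustrative, not part of the PR):

```typescript
// Cost estimate from the pricing.json entries added in this PR
// (list prices per million tokens).
const pricing: Record<string, { input: number; output: number }> = {
    "alibaba:qwen-max": { input: 10, output: 30 },
    "alibaba:qwen-plus": { input: 3, output: 9 },
    "alibaba:qwen-turbo": { input: 0.4, output: 1.2 },
}

function estimateCost(
    model: string,
    inputTokens: number,
    outputTokens: number
): number {
    const p = pricing[model]
    if (!p) throw new Error(`no pricing for ${model}`)
    return (inputTokens / 1e6) * p.input + (outputTokens / 1e6) * p.output
}

// 100k prompt tokens + 10k completion tokens on qwen-turbo:
console.log(estimateCost("alibaba:qwen-turbo", 100_000, 10_000)) // ≈ 0.052
```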
7 changes: 7 additions & 0 deletions packages/core/src/tools.ts
@@ -1,10 +1,12 @@
import {
    MODEL_PROVIDER_ALIBABA,
    MODEL_PROVIDER_AZURE_OPENAI,
    MODEL_PROVIDER_AZURE_SERVERLESS_MODELS,
    MODEL_PROVIDER_GITHUB,
    MODEL_PROVIDER_GOOGLE,
    MODEL_PROVIDER_OLLAMA,
    MODEL_PROVIDER_OPENAI,
    MODEL_PROVIDER_TRANSFORMERS,
} from "./constants"
import { parseModelIdentifier } from "./models"

@@ -16,6 +18,8 @@ export function isToolsSupported(modelId: string): boolean | undefined {
        return false
    }

    if (provider === MODEL_PROVIDER_TRANSFORMERS) return false

    const oai = {
        "o1-preview": false,
        "o1-mini": false,
@@ -45,6 +49,9 @@ export function isToolsSupported(modelId: string): boolean | undefined {
        [MODEL_PROVIDER_GITHUB]: {
            "Phi-3.5-mini-instruct": false,
        },
        [MODEL_PROVIDER_ALIBABA]: {
            // all supported
        },
    }

    return data[provider]?.[model]
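The lookup above returns `false` for models known not to support tools and `undefined` when there is no data, which callers treat as "assume supported". Since the Alibaba entry is empty, all Alibaba models report tool support. A simplified, standalone sketch of that behavior (the real function takes a single `provider:model` id and parses it; here the two parts are passed separately for brevity):

```typescript
// Simplified sketch of the isToolsSupported lookup from
// packages/core/src/tools.ts: false = known-unsupported,
// undefined = no data (caller assumes tools are available).
const data: Record<string, Record<string, boolean>> = {
    github: { "Phi-3.5-mini-instruct": false },
    alibaba: {}, // no exclusions: Alibaba models are assumed to support tools
}

function isToolsSupported(
    provider: string,
    model: string
): boolean | undefined {
    // Local transformers.js models never support tools.
    if (provider === "transformers") return false
    return data[provider]?.[model]
}

console.log(isToolsSupported("alibaba", "qwen-max")) // undefined (assumed supported)
console.log(isToolsSupported("github", "Phi-3.5-mini-instruct")) // false
```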
5 changes: 5 additions & 0 deletions packages/core/src/types/prompt_template.d.ts
@@ -152,6 +152,11 @@ type ModelType = OptionsOrString<
| "google:gemini-1.5-pro"
| "google:gemini-1.5-pro-002"
| "google:gemini-1-pro"
| "alibaba:qwen-turbo"
| "alibaba:qwen-max"
| "alibaba:qwen-plus"
| "alibaba:qwen2-72b-instruct"
| "alibaba:qwen2-57b-a14b-instruct"
| "transformers:onnx-community/Qwen2.5-0.5B-Instruct:q4"
>

7 changes: 7 additions & 0 deletions packages/vscode/src/lmaccess.ts
@@ -15,6 +15,7 @@ import {
    MODEL_PROVIDER_AZURE_SERVERLESS_OPENAI,
    DOCS_CONFIGURATION_URL,
    MODEL_PROVIDER_GOOGLE,
    MODEL_PROVIDER_ALIBABA,
} from "../../core/src/constants"
import { OpenAIAPIType } from "../../core/src/host"
import { parseModelIdentifier } from "../../core/src/models"
@@ -39,6 +40,7 @@ async function generateLanguageModelConfiguration(
        MODEL_PROVIDER_AZURE_SERVERLESS_MODELS,
        MODEL_PROVIDER_LITELLM,
        MODEL_PROVIDER_GOOGLE,
        MODEL_PROVIDER_ALIBABA,
    ]
    if (supportedProviders.includes(provider)) {
        return { provider }
@@ -92,6 +94,11 @@ async function generateLanguageModelConfiguration(
            detail: `Use a GitHub Models with a GitHub subscription.`,
            provider: MODEL_PROVIDER_GITHUB,
        },
        {
            label: "Alibaba Cloud",
            detail: "Use Alibaba Cloud models.",
            provider: MODEL_PROVIDER_ALIBABA,
        },
        {
            label: "LocalAI",
            description: "https://localai.io/",