feat: Add AI SDK for Server-Side JavaScript. #619

Merged
merged 62 commits into from
Nov 6, 2024
1e700c0
basic model
InTheCloudDan Sep 2, 2024
76dd721
add package that was dropped from rebase
InTheCloudDan Sep 3, 2024
6f393f7
change to map, switch type to Record
InTheCloudDan Sep 3, 2024
92ff33e
fix type in other location
InTheCloudDan Sep 3, 2024
d4c4fd9
change Record to string string everywhere
InTheCloudDan Sep 3, 2024
97365a0
switch back to unknown
InTheCloudDan Sep 3, 2024
488bc09
update interpolate function signature and escape within function call
InTheCloudDan Sep 3, 2024
679fb26
new package
InTheCloudDan Sep 6, 2024
5b1336e
add tracker
InTheCloudDan Sep 10, 2024
0a2586c
fix inadvertantly removed comment
InTheCloudDan Sep 10, 2024
4abed57
switch to variation ID
InTheCloudDan Sep 11, 2024
48b9a0f
fix package name
InTheCloudDan Oct 11, 2024
d34cb66
update naming
InTheCloudDan Oct 11, 2024
8ed5d9d
remove types file
InTheCloudDan Oct 16, 2024
76f25cc
exports
InTheCloudDan Oct 16, 2024
cf7582c
update example
InTheCloudDan Oct 16, 2024
c88ad2d
update how variationId is passed into tracker
InTheCloudDan Oct 16, 2024
9f78679
Update tracking methods
InTheCloudDan Oct 25, 2024
da3024f
whitespace changes
InTheCloudDan Oct 25, 2024
9ce1049
fix file organization
InTheCloudDan Oct 25, 2024
9c4e69f
openai and bedrock examples
InTheCloudDan Oct 25, 2024
f95775b
Merge branch 'main' into dob/modelConfig
kinyoklion Nov 4, 2024
5ea8a83
feat: AI Tracking to AI SDK (#652)
InTheCloudDan Nov 4, 2024
fcad822
Merge branch 'dob/modelConfig' of github.com:launchdarkly/js-server-s…
kinyoklion Nov 4, 2024
59d0c27
Merge branch 'dob/modelConfig' into feat/dob/REL-3130/aiExamples
kinyoklion Nov 4, 2024
95fa2bc
Build updates.
kinyoklion Nov 4, 2024
817d4f0
Fix bedrock example.
kinyoklion Nov 4, 2024
7d91ea8
Commenting improvements.
kinyoklion Nov 4, 2024
f87f30d
Refine typing.
kinyoklion Nov 4, 2024
b6804f0
Building bedrock example.
kinyoklion Nov 4, 2024
6106c82
Refactoring and cleanup.
kinyoklion Nov 4, 2024
c8d3c05
Renaming.
kinyoklion Nov 4, 2024
96185fb
Add more comments.
kinyoklion Nov 4, 2024
8eb4ad3
Convert openai example to typescript.
kinyoklion Nov 4, 2024
dddd5e7
CI and docs.
kinyoklion Nov 4, 2024
318da7d
Linting.
kinyoklion Nov 4, 2024
3c7aa41
Renames.
kinyoklion Nov 4, 2024
c1b1f3e
Lint
kinyoklion Nov 4, 2024
ef4c8ac
Example build changes.
kinyoklion Nov 4, 2024
f361b9b
Move AI package.
kinyoklion Nov 4, 2024
7043d26
Rename AIClient file to LDAIClient.
kinyoklion Nov 4, 2024
3d8697e
Merge branch 'main' into dob/modelConfig
kinyoklion Nov 4, 2024
e79b713
lint fix
InTheCloudDan Nov 5, 2024
c75ba9d
update examples to emit
InTheCloudDan Nov 5, 2024
fb05b7e
Better OpenAI types.
kinyoklion Nov 5, 2024
a4ce22e
Plurals
kinyoklion Nov 5, 2024
43e9a5d
Broaden server compatibility
kinyoklion Nov 5, 2024
1626a9d
Lint
kinyoklion Nov 5, 2024
4186201
Discussion feedback.
kinyoklion Nov 5, 2024
d0e487b
Change versionId to versionKey.
kinyoklion Nov 5, 2024
29024c1
Lint rules.
kinyoklion Nov 5, 2024
a0c9eca
Trivago lint.
kinyoklion Nov 5, 2024
3cb1b89
Add release configuration.
kinyoklion Nov 5, 2024
e78a10d
Merge branch 'main' into dob/modelConfig
kinyoklion Nov 5, 2024
3c5780c
Readme improvements. Example improvements.
kinyoklion Nov 5, 2024
e36bc35
Update example readme files.
kinyoklion Nov 5, 2024
69086da
Add beta notice.
kinyoklion Nov 5, 2024
205e600
Add markdown type annotation.
kinyoklion Nov 5, 2024
6379a78
Add release configuration to update examples.
kinyoklion Nov 5, 2024
e20a498
Make sure examples are not published to NPM.
kinyoklion Nov 5, 2024
807fa19
Node version.
kinyoklion Nov 5, 2024
4ff029b
Build example deps.
kinyoklion Nov 5, 2024
3 changes: 2 additions & 1 deletion package.json
@@ -23,7 +23,8 @@
"packages/store/node-server-sdk-dynamodb",
"packages/telemetry/node-server-sdk-otel",
"packages/tooling/jest",
"packages/sdk/browser"
"packages/sdk/browser",
"packages/sdk/ai"
],
"private": true,
"scripts": {
5 changes: 5 additions & 0 deletions packages/sdk/ai/README.md
@@ -0,0 +1,5 @@
# LaunchDarkly AI SDK for Node.js

This package provides the LaunchDarkly AI SDK for Node.js.

## Installation
36 changes: 36 additions & 0 deletions packages/sdk/ai/package.json
@@ -0,0 +1,36 @@
{
"name": "@launchdarkly/ai",
"version": "0.1.0",
"description": "LaunchDarkly AI SDK for Node.js",
"main": "dist/index.js",
"types": "dist/index.d.ts",
"scripts": {
"build": "tsc",
"test": "jest",
"lint": "eslint . --ext .ts"
},
"keywords": [
"launchdarkly",
"ai",
"llm"
],
"author": "LaunchDarkly",
"license": "Apache-2.0",
"dependencies": {
"@launchdarkly/node-server-sdk": "^9.5.2",
"mustache": "^4.2.0"
},
"devDependencies": {
"@types/jest": "^29.5.3",
"@types/mustache": "^4.2.5",
"@typescript-eslint/eslint-plugin": "^6.20.0",
"@typescript-eslint/parser": "^6.20.0",
"eslint": "^8.45.0",
"jest": "^29.6.1",
"ts-jest": "^29.1.1",
"typescript": "5.1.6"
},
"peerDependencies": {
"@launchdarkly/node-server-sdk": ">=9.4.3"
}
}
77 changes: 77 additions & 0 deletions packages/sdk/ai/src/LDAIConfigTracker.ts
@@ -0,0 +1,77 @@
import { LDClient, LDContext } from '@launchdarkly/node-server-sdk';

import { BedrockTokenUsage, FeedbackKind, TokenUsage, UnderscoreTokenUsage } from './api/metrics';
import { usageToTokenMetrics } from './trackUtils';

export class LDAIConfigTracker {
private ldClient: LDClient;
private configKey: string;
private variationId: string;
private context: LDContext;

constructor(ldClient: LDClient, configKey: string, variationId: string, context: LDContext) {
this.ldClient = ldClient;
this.configKey = configKey;
this.variationId = variationId;
this.context = context;
}

getTrackData() {
return {
configKey: this.configKey,
variationId: this.variationId,
};
}

trackDuration(duration: number): void {
this.ldClient.track('$ld:ai:duration:total', this.context, this.variationId, duration);
}

trackTokens(tokens: TokenUsage | UnderscoreTokenUsage | BedrockTokenUsage) {
const tokenMetrics = usageToTokenMetrics(tokens);
if (tokenMetrics.total > 0) {
this.ldClient.track(
'$ld:ai:tokens:total',
this.context,
this.getTrackData(),
tokenMetrics.total,
);
}
if (tokenMetrics.input > 0) {
this.ldClient.track(
'$ld:ai:tokens:input',
this.context,
this.getTrackData(),
tokenMetrics.input,
);
}
if (tokenMetrics.output > 0) {
this.ldClient.track(
'$ld:ai:tokens:output',
this.context,
this.getTrackData(),
tokenMetrics.output,
);
}
}

trackError(error: number) {
this.ldClient.track('$ld:ai:error', this.context, this.getTrackData(), error);
}

trackGeneration(generation: number) {
this.ldClient.track('$ld:ai:generation', this.context, this.getTrackData(), generation);
}

trackFeedback(feedback: { kind: FeedbackKind }) {
if (feedback.kind === FeedbackKind.Positive) {
this.ldClient.track('$ld:ai:feedback:user:positive', this.context, this.getTrackData(), 1);
} else if (feedback.kind === FeedbackKind.Negative) {
this.ldClient.track('$ld:ai:feedback:user:negative', this.context, this.getTrackData(), 1);
}
}
}
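The tracker's feedback methods each emit a single LaunchDarkly event with a metric value of 1. The behavior can be sketched against a stub client — the stub and the `trackFeedback` helper below are illustrations only (the real client comes from `@launchdarkly/node-server-sdk`):

```typescript
// Minimal stand-in for the LDClient `track` surface, for illustration only.
type TrackFn = (event: string, context: unknown, data: unknown, metric: number) => void;

class StubClient {
  events: { event: string; metric: number }[] = [];
  track: TrackFn = (event, _context, _data, metric) => {
    this.events.push({ event, metric });
  };
}

// Mirrors the tracker's trackFeedback logic: positive and negative feedback
// map to distinct event keys, each tracked with a metric value of 1.
function trackFeedback(client: StubClient, kind: 'positive' | 'negative'): void {
  const event =
    kind === 'positive' ? '$ld:ai:feedback:user:positive' : '$ld:ai:feedback:user:negative';
  client.track(event, { kind: 'user', key: 'example' }, { configKey: 'my-config' }, 1);
}

const client = new StubClient();
trackFeedback(client, 'positive');
console.log(client.events[0].event); // $ld:ai:feedback:user:positive
```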
16 changes: 16 additions & 0 deletions packages/sdk/ai/src/api/config/LDAIConfig.ts
@@ -0,0 +1,16 @@
import { LDAIConfigTracker } from './LDAIConfigTracker';

/**
* AI Config value and tracker.
*/
export interface LDAIConfig {
/**
* The result of the AI Config evaluation.
*/
config: unknown;

/**
* A tracker which can be used to generate analytics for the AI Config.
*/
tracker: LDAIConfigTracker;
}
9 changes: 9 additions & 0 deletions packages/sdk/ai/src/api/config/LDAIConfigTracker.ts
@@ -0,0 +1,9 @@
import { BedrockTokenUsage, FeedbackKind, TokenUsage, UnderscoreTokenUsage } from '../metrics';

export interface LDAIConfigTracker {
trackDuration: (duration: number) => void;
trackTokens: (tokens: TokenUsage | UnderscoreTokenUsage | BedrockTokenUsage) => void;
trackError: (error: number) => void;
trackGeneration: (generation: number) => void;
trackFeedback: (feedback: { kind: FeedbackKind }) => void;
}
2 changes: 2 additions & 0 deletions packages/sdk/ai/src/api/config/index.ts
@@ -0,0 +1,2 @@
export * from './LDAIConfig';
export * from './LDAIConfigTracker';
5 changes: 5 additions & 0 deletions packages/sdk/ai/src/api/metrics/BedrockTokenUsage.ts
@@ -0,0 +1,5 @@
export interface BedrockTokenUsage {
inputTokens: number;
outputTokens: number;
totalTokens: number;
}
4 changes: 4 additions & 0 deletions packages/sdk/ai/src/api/metrics/FeedbackKind.ts
@@ -0,0 +1,4 @@
export enum FeedbackKind {
Positive = 'positive',
Negative = 'negative',
}
5 changes: 5 additions & 0 deletions packages/sdk/ai/src/api/metrics/TokenMetrics.ts
@@ -0,0 +1,5 @@
export interface TokenMetrics {
total: number;
input: number;
output: number;
}
5 changes: 5 additions & 0 deletions packages/sdk/ai/src/api/metrics/TokenUsage.ts
@@ -0,0 +1,5 @@
export interface TokenUsage {
completionTokens?: number;
promptTokens?: number;
totalTokens?: number;
}
5 changes: 5 additions & 0 deletions packages/sdk/ai/src/api/metrics/UnderscoreTokenUsage.ts
@@ -0,0 +1,5 @@
export interface UnderscoreTokenUsage {
completion_tokens?: number;
prompt_tokens?: number;
total_tokens?: number;
}
5 changes: 5 additions & 0 deletions packages/sdk/ai/src/api/metrics/index.ts
@@ -0,0 +1,5 @@
export * from './BedrockTokenUsage';
export * from './FeedbackKind';
export * from './TokenMetrics';
export * from './TokenUsage';
export * from './UnderscoreTokenUsage';
95 changes: 95 additions & 0 deletions packages/sdk/ai/src/index.ts
@@ -0,0 +1,95 @@
import Mustache from 'mustache';

import { LDClient, LDContext } from '@launchdarkly/node-server-sdk';

import { LDAIConfigTracker } from './LDAIConfigTracker';
import { LDAIConfig } from './api/config';

export class AIClient {
private ldClient: LDClient;

constructor(ldClient: LDClient) {
this.ldClient = ldClient;
}

/**
* Parses and interpolates a template string with the provided variables.
*
* @param template - The template string to be parsed and interpolated.
* @param variables - An object containing the variables to be used for interpolation.
* @returns The interpolated string.
*/
interpolateTemplate(template: string, variables: Record<string, unknown>): string {
return Mustache.render(template, variables, undefined, { escape: (item: any) => item });
}

/**
* Retrieves and processes a prompt template based on the provided key, LaunchDarkly context, and variables.
*
* @param key - A unique identifier for the prompt template. This key is used to fetch the correct prompt from storage or configuration.
* @param context - The LaunchDarkly context object that contains relevant information about the current environment, user, or session. This context may influence how the prompt is processed or personalized.
* @param variables - A map of key-value pairs representing dynamic variables to be injected into the prompt template. The keys correspond to placeholders within the template, and the values are the corresponding replacements.
* @param defaultValue - A fallback value to be used if the prompt template associated with the key is not found or if any errors occur during processing.
*
* @returns The processed prompt after all variables have been substituted in the stored prompt template. If the prompt cannot be retrieved or processed, the `defaultValue` is returned.
*
* @example
* ```
* const key = "welcome_prompt";
* const context = { kind: 'user', key: 'example-user' };
* const variables = { username: 'John' };
* const defaultValue = {};
*
* const result = await aiClient.modelConfig(key, context, defaultValue, variables);
* console.log(result.config);
* // Output:
* {
* modelId: "gpt-4o",
* temperature: 0.2,
* maxTokens: 4096,
* userDefinedKey: "myValue",
* prompt: [
* {
* role: "system",
* content: "You are an amazing GPT."
* },
* {
* role: "user",
* content: "Explain how you're an amazing GPT."
* }
* ]
* }
* ```
*/
async modelConfig(
key: string,
context: LDContext,
defaultValue: unknown,
variables?: Record<string, unknown>,
): Promise<LDAIConfig> {
const detail = await this.ldClient.variationDetail(key, context, defaultValue);

const allVariables = { ldctx: context, ...variables };

detail.value.prompt = detail.value.prompt.map((entry: any) => ({
...entry,
content: this.interpolateTemplate(entry.content, allVariables),
}));

return {
config: detail.value,
tracker: new LDAIConfigTracker(
this.ldClient,
key,
detail.value["_ldMeta"]["variationId"],
context,
),
};
}
}

export function init(ldClient: LDClient): AIClient {
return new AIClient(ldClient);
}

export { LDAIConfigTracker } from './LDAIConfigTracker';
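`interpolateTemplate` renders `{{variable}}` placeholders with Mustache, passing a custom `escape` that returns its input unchanged so values are substituted verbatim rather than HTML-escaped. A tiny regex-based sketch of that behavior (a stand-in for `Mustache.render`, not the real library; it does not resolve dotted names like `ldctx.key`):

```typescript
// Toy {{variable}} interpolation mirroring Mustache rendering with escaping
// disabled. Unknown variables render as the empty string, as Mustache does.
function interpolate(template: string, variables: Record<string, unknown>): string {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (_match: string, name: string) =>
    name in variables ? String(variables[name]) : '',
  );
}

const prompt = 'Hello {{username}}, welcome to {{product}}.';
console.log(interpolate(prompt, { username: 'John', product: 'LaunchDarkly' }));
// Hello John, welcome to LaunchDarkly.
```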
25 changes: 25 additions & 0 deletions packages/sdk/ai/src/trackUtils.ts
@@ -0,0 +1,25 @@
import { BedrockTokenUsage, TokenMetrics, TokenUsage, UnderscoreTokenUsage } from './api/metrics';

export function usageToTokenMetrics(
usage: TokenUsage | UnderscoreTokenUsage | BedrockTokenUsage,
): TokenMetrics {
if ('inputTokens' in usage && 'outputTokens' in usage) {
// Bedrock usage
return {
total: usage.totalTokens,
input: usage.inputTokens,
output: usage.outputTokens,
};
}

// OpenAI usage (both camelCase and snake_case)
return {
total: 'total_tokens' in usage ? usage.total_tokens! : (usage as TokenUsage).totalTokens ?? 0,
input:
'prompt_tokens' in usage ? usage.prompt_tokens! : (usage as TokenUsage).promptTokens ?? 0,
output:
'completion_tokens' in usage
? usage.completion_tokens!
: (usage as TokenUsage).completionTokens ?? 0,
};
}
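`usageToTokenMetrics` normalizes three provider usage shapes into one metrics record: Bedrock's `inputTokens`/`outputTokens`, OpenAI's camelCase `promptTokens`/`completionTokens`, and the snake_case `prompt_tokens`/`completion_tokens` variant. A self-contained sketch mirroring that logic (loosely typed for illustration; the SDK version uses the typed interfaces above):

```typescript
interface TokenMetrics {
  total: number;
  input: number;
  output: number;
}

// Mirrors usageToTokenMetrics: Bedrock is detected by its inputTokens/outputTokens
// keys; otherwise the snake_case OpenAI fields are preferred, falling back to
// camelCase, and finally to 0 when a field is absent.
function normalizeUsage(usage: Record<string, number | undefined>): TokenMetrics {
  if ('inputTokens' in usage && 'outputTokens' in usage) {
    return {
      total: usage.totalTokens ?? 0,
      input: usage.inputTokens ?? 0,
      output: usage.outputTokens ?? 0,
    };
  }
  return {
    total: usage.total_tokens ?? usage.totalTokens ?? 0,
    input: usage.prompt_tokens ?? usage.promptTokens ?? 0,
    output: usage.completion_tokens ?? usage.completionTokens ?? 0,
  };
}

console.log(normalizeUsage({ prompt_tokens: 10, completion_tokens: 5, total_tokens: 15 }));
// { total: 15, input: 10, output: 5 }
```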
12 changes: 12 additions & 0 deletions packages/sdk/ai/tsconfig.json
@@ -0,0 +1,12 @@
{
"compilerOptions": {
"target": "ES2017",
"module": "commonjs",
"declaration": true,
"outDir": "./dist",
"strict": true,
"esModuleInterop": true
},
"include": ["src"],
"exclude": ["node_modules", "**/*.test.ts"]
}