
Add MCP server support and refactor related logic #905

Merged
merged 10 commits into from
Dec 3, 2024
14 changes: 7 additions & 7 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -4,9 +4,9 @@

🚀 **JavaScript-ish environment with convenient tooling for file ingestion, prompt development, and structured data extraction.**

- 📄 **Read the ONLINE DOCUMENTATION at [microsoft.github.io/genaiscript](https://microsoft.github.io/genaiscript/)**
- 📺 Watch an [interview on YouTube with nickyt](https://www.youtube.com/watch?v=aeXQ2MJ0Ye0)
- 🎙️ **Listen to the (cringy) podcast** (generated by NotebookLM).

https://github.com/user-attachments/assets/ce181cc0-47d5-41cd-bc03-f220407d4dd0

Expand All @@ -18,9 +18,9 @@ https://github.com/user-attachments/assets/ce181cc0-47d5-41cd-bc03-f220407d4dd0

Programmatically assemble prompts for LLMs using JavaScript. Orchestrate LLMs, tools, and data in a single script.

- JavaScript toolbox to work with prompts
- Abstraction to make it easy and productive
- Seamless Visual Studio Code integration

## Hello world

Expand Down Expand Up @@ -150,7 +150,7 @@ const { files } = await workspace.grep(/[a-z][a-z0-9]+/, { globs: "*.md" })
### LLM Tools

Register JavaScript functions as [tools](https://microsoft.github.io/genaiscript/reference/scripts/tools)
(with fallback for models that don't support tools). [Model Context Protocol (MCP) tools](https://microsoft.github.io/genaiscript/reference/scripts/mcp-tools) are also supported.

```js
defTool(
Expand Down
Binary file added docs/src/assets/mcp.png
1 change: 1 addition & 0 deletions docs/src/assets/mcp.png.txt
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
Logo of the Model Context Protocol project.
2 changes: 1 addition & 1 deletion docs/src/content/docs/guides/agentic-tools.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -2,7 +2,7 @@
title: Agentic tools
description: Using agentic tools in your script
sidebar:
order: 5.1
---

import { Steps } from "@astrojs/starlight/components"
Expand Down
32 changes: 22 additions & 10 deletions docs/src/content/docs/index.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -38,9 +38,9 @@ import testExplorerAlt from "../../assets/vscode-test-explorer.png.txt?raw"

Programmatically assemble prompts for LLMs using JavaScript. Orchestrate LLMs, tools, and data in a single script.

- JavaScript toolbox to work with prompts
- Abstraction to make it easy and productive
- Seamless Visual Studio Code integration

## Hello world

Expand Down Expand Up @@ -136,13 +136,25 @@ defTool("weather", "live weather",
{ ... "sunny" }
)
```

or use [@agentic tools](/genaiscript/guides/agentic-tools/)

```js wrap
import { WeatherClient } from "@agentic/weather"
defTool(new WeatherClient())
```
````

</Card>

<Card title="Model Context Protocol Client" icon="setting">

Use [tools](https://modelcontextprotocol.io/docs/concepts/tools) exposed in [MCP Servers](/genaiscript/reference/scripts/mcp-tools)

````js wrap
defTool({ "memory": {
command: "npx",
args: ["-y", "@modelcontextprotocol/server-memory"]
}})
````

</Card>

Expand Down Expand Up @@ -174,9 +186,9 @@ Scripts are [files](/genaiscript/reference/scripts/)! They can be versioned, sha

<FileTree>

- genaisrc
  - my-script.genai.mjs
  - another-great-script.genai.mjs

</FileTree>

Expand Down Expand Up @@ -250,7 +262,7 @@ The quick brown fox jumps over the lazy dog.

<FileTree>

- poem.txt extracted by genaiscript

</FileTree>

Expand Down
54 changes: 54 additions & 0 deletions docs/src/content/docs/reference/scripts/mcp-tools.mdx
Original file line number Diff line number Diff line change
@@ -0,0 +1,54 @@
---
title: Model Context Protocol Tools
sidebar:
order: 5.09
---
import { Image } from "astro:assets"
import logoPng from "../../../../assets/mcp.png"
import logoPngTxt from "../../../../assets/mcp.png.txt?raw"

<Image src={logoPng} alt={logoPngTxt} />

[Model Context Protocol](https://modelcontextprotocol.io/) (MCP) is an emerging standard
for portable tool definitions.

MCP defines a protocol for sharing [tools](https://modelcontextprotocol.io/docs/concepts/tools)
and consuming them regardless of the underlying framework or runtime.

**GenAIScript implements a client for MCP tools**.

## Configuring servers

You can use [defTool](/genaiscript/reference/scripts/tools) to declare a set of server configurations,
using the same syntax as in the [Claude configuration file](https://github.com/modelcontextprotocol/servers?tab=readme-ov-file#using-an-mcp-client).

```js
defTool({
memory: {
command: "npx",
args: ["-y", "@modelcontextprotocol/server-memory"],
},
filesystem: {
command: "npx",
args: [
"-y",
"@modelcontextprotocol/server-filesystem",
path.resolve("."),
],
},
})
```

GenAIScript will launch the server and register all the tools listed by the server.
The tool identifier will be `server_toolname` to avoid clashes.
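The `server_toolname` naming scheme described above can be sketched as a small helper. This is an illustration of the prefixing idea only, not GenAIScript's internal implementation; the `servers` shape and tool names are hypothetical.

```javascript
// Sketch: prefix each MCP tool name with its server id, as in `server_toolname`,
// so tools from different servers cannot clash.
// The input shape ({ serverId: [toolNames] }) is illustrative.
function registerMcpTools(servers) {
    const registered = [];
    for (const [serverId, tools] of Object.entries(servers)) {
        for (const tool of tools) {
            registered.push(`${serverId}_${tool}`);
        }
    }
    return registered;
}

// e.g. a "memory" server exposing two hypothetical tools
const ids = registerMcpTools({
    memory: ["create_entities", "read_graph"],
});
// ids → ["memory_create_entities", "memory_read_graph"]
```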

## Lifecycle of servers

Servers are started when rendering the prompt and stopped once the chat session is completed.

This means that if you define servers in an [inline prompt](/genaiscript/reference/prompts/inline),
the server will be started/stopped for each inline prompt.
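The start/stop lifecycle above can be sketched as a try/finally wrapper. This is a rough illustration, not GenAIScript's actual code; `startServer` is a stub standing in for real MCP process management (spawning the configured command and stopping it afterwards).

```javascript
// Stub: a real client would spawn `config.command` with `config.args`.
async function startServer(id, config) {
    return {
        id,
        config,
        running: true,
        async stop() {
            this.running = false;
        },
    };
}

// Start all configured servers, run the chat session, and always stop
// the servers when the session completes — even if `run` throws.
async function withMcpServers(configs, run) {
    const servers = [];
    try {
        for (const [id, config] of Object.entries(configs)) {
            servers.push(await startServer(id, config));
        }
        return await run(servers);
    } finally {
        for (const server of servers) await server.stop();
    }
}
```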

## Finding servers

The list of available servers can be found in the [Model Context Protocol Servers project](https://github.com/modelcontextprotocol/servers).
53 changes: 43 additions & 10 deletions docs/src/content/docs/reference/scripts/tools.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -72,7 +72,40 @@ to evaluate a math expression.
title="math-agent.genai.mjs"
/>

## Model Context Protocol Tools

[Model Context Protocol](https://modelcontextprotocol.io/) (MCP) is an open protocol
that enables seamless integration between LLM applications and external data sources and [tools](https://modelcontextprotocol.io/docs/concepts/tools).

You can leverage [MCP servers](https://github.com/modelcontextprotocol/servers) to provide tools to your LLM.

```js
defTool({
memory: {
command: "npx",
args: ["-y", "@modelcontextprotocol/server-memory"],
},
})
```

See [Model Context Protocol Tools](/genaiscript/reference/scripts/mcp-tools) for more information.


## Agentic Tools

[Agentic](https://agentic.so) is
a standard library of AI functions / tools
which are optimized for both normal TS-usage as well as LLM-based usage.
You can register any agentic tool in your script using `defTool`.

```js
import { calculator } from "@agentic/calculator"
defTool(calculator)
```

See [Agentic tools](/genaiscript/guides/agentic-tools) for more information.

## Fallback Tool Support

Some LLM models do not have built-in tool support.
For those models, tool support can be enabled through system prompts. Performance may be lower than with built-in tools, but tools remain usable.
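The fallback mechanism can be sketched in two steps: describe the tools in the system prompt, then parse a structured "tool call" out of the model's plain-text reply. The prompt wording and the JSON shape below are illustrative assumptions, not GenAIScript's actual fallback format.

```javascript
// Render tool definitions into system-prompt text for models
// without native tool support.
function toolsAsSystemPrompt(tools) {
    const list = tools.map((t) => `- ${t.name}: ${t.description}`).join("\n");
    return (
        'You may call a tool by replying with JSON of the form {"tool": name, "args": {...}}.\n' +
        `Available tools:\n${list}`
    );
}

// Extract a tool call from the model's free-form reply, if any.
function parseToolCall(reply) {
    const match = reply.match(/\{[\s\S]*\}/); // first JSON-looking object
    if (!match) return undefined;
    try {
        const { tool, args } = JSON.parse(match[0]);
        return tool ? { tool, args } : undefined;
    } catch {
        return undefined; // reply was ordinary prose, not a tool call
    }
}
```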
Expand All @@ -86,15 +119,15 @@ tools so it will happen automatically for those models.

To enable this mode, you can either

- add the `fallbackTools` option to the script

```js "fallbackTools: true"
script({
fallbackTools: true,
})
```

- or add the `--fallback-tools` flag to the CLI

```sh "--fallback-tools"
npx genaiscript run ... --fallback-tools
Expand All @@ -119,26 +152,26 @@ script({
defTool("current_weather", ...)
```

then use the tool id in the `tools` field.

```js 'tools: ["current_weather"]'
script({
    ...,
    tools: ["current_weather"],
})
```

### Example

Let's illustrate how tools come together with a question answering script.

In the script below, we add the `retrieval_web_search` tool. This tool
will call into `retrieval.webSearch` as needed.

```js file="answers.genai.mjs"
script({
title: "Answer questions",
    tools: ["retrieval_web_search"]
})

def("FILES", env.files)
Expand All @@ -151,8 +184,8 @@ $`Answer the questions in FILES using a web search.
We can then apply this script to the `questions.md` file below.

```md file="questions.md"
- What is the weather in Seattle?
- What laws were voted in the USA congress last week?
```

After the first request, the LLM requests a call to the `retrieval_web_search` tool for each question.
Expand Down
4 changes: 4 additions & 0 deletions packages/core/src/chat.ts
Original file line number Diff line number Diff line change
Expand Up @@ -3,6 +3,7 @@ import { MarkdownTrace } from "./trace"
import { PromptImage, PromptPrediction, renderPromptNode } from "./promptdom"
import { LanguageModelConfiguration, host } from "./host"
import { GenerationOptions } from "./generation"
import { dispose } from "./dispose"
import {
JSON5TryParse,
JSON5parse,
Expand Down Expand Up @@ -782,6 +783,7 @@ export async function executeChatSession(
prediction: PromptPrediction,
completer: ChatCompletionHandler,
chatParticipants: ChatParticipant[],
disposables: AsyncDisposable[],
genOptions: GenerationOptions
): Promise<RunPromptResult> {
const {
Expand Down Expand Up @@ -814,6 +816,7 @@ export async function executeChatSession(
}
)
: undefined

try {
trace.startDetails(`🧠 llm chat`)
if (toolDefinitions?.length)
Expand Down Expand Up @@ -923,6 +926,7 @@ export async function executeChatSession(
}
}
} finally {
await dispose(disposables, { trace })
stats.trace(trace)
trace.endDetails()
}
Expand Down
19 changes: 19 additions & 0 deletions packages/core/src/dispose.ts
Original file line number Diff line number Diff line change
@@ -0,0 +1,19 @@
import { TraceOptions } from "./trace"
import { arrayify, logError } from "./util"

export async function dispose(
disposables: ElementOrArray<AsyncDisposable>,
options: TraceOptions
) {
const { trace } = options || {}
for (const disposable of arrayify(disposables)) {
if (disposable !== undefined && disposable[Symbol.asyncDispose]) {
try {
await disposable[Symbol.asyncDispose]()
} catch (e) {
logError(e)
trace?.error(e)
}
}
}
}