Add initial direct client docs
dltn committed Nov 23, 2024
1 parent 31e983a commit beab798
Showing 3 changed files with 44 additions and 1 deletion.
2 changes: 1 addition & 1 deletion docs/source/distributions/building_distro.md
@@ -1,7 +1,7 @@
# Build your own Distribution


-This guide will walk you through the steps to get started with building a Llama Stack distributiom from scratch with your choice of API providers.
+This guide will walk you through the steps to get started with building a Llama Stack distribution from scratch with your choice of API providers.


## Llama Stack Build
42 changes: 42 additions & 0 deletions docs/source/distributions/importing_as_library.md
@@ -0,0 +1,42 @@
# Importing Llama Stack as a Python Library

Llama Stack is typically used in a client-server configuration. To get started quickly, however, you can import Llama Stack as a library and call its APIs directly, without setting up a server. For [example](https://github.com/meta-llama/llama-stack-client-python/blob/main/src/llama_stack_client/lib/direct/test.py):

```python
from llama_stack_client.lib.direct.direct import LlamaStackDirectClient

# Create a client from a pre-built distribution template (here, "ollama")
client = await LlamaStackDirectClient.from_template("ollama")
await client.initialize()
```

This parses the template's config and sets up the inline implementations and remote clients your distribution needs.

You can then access APIs such as `models` and `inference` on the client and call their methods directly:

```python
response = await client.models.list()
print(response)
```

```python
from llama_stack_client.types import UserMessage

response = await client.inference.chat_completion(
    messages=[UserMessage(content="What is the capital of France?", role="user")],
    model="Llama3.1-8B-Instruct",
    stream=False,
)
print("\nChat completion response:")
print(response)
```
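
The direct client exposes the same inference API, so a streamed response should also be possible. Here is a minimal sketch, assuming that `stream=True` yields an async iterator of chunk objects; the exact chunk shape is not shown in this commit:

```python
# Sketch only: assumes stream=True returns an async iterator of chunks
response = await client.inference.chat_completion(
    messages=[UserMessage(content="What is the capital of France?", role="user")],
    model="Llama3.1-8B-Instruct",
    stream=True,
)
async for chunk in response:
    print(chunk)
```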

If you've created a [custom distribution](https://llama-stack.readthedocs.io/en/latest/distributions/building_distro.html), you can also import it with the `from_config` constructor:

```python
import yaml

from llama_stack.distribution.configure import parse_and_maybe_upgrade_config

# config_path should point to your distribution's run config (e.g. run.yaml)
with open(config_path, "r") as f:
    config_dict = yaml.safe_load(f)

run_config = parse_and_maybe_upgrade_config(config_dict)

client = await LlamaStackDirectClient.from_config(run_config)
```
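
Note that these snippets use top-level `await`, which works in a notebook or async REPL but not in a plain script. Here is a small sketch of wrapping the same calls in `asyncio.run`; the `main` coroutine is illustrative, not part of the library:

```python
import asyncio

from llama_stack_client.lib.direct.direct import LlamaStackDirectClient


async def main():
    # Same calls as above, wrapped in a coroutine so a script can run them
    client = await LlamaStackDirectClient.from_template("ollama")
    await client.initialize()

    response = await client.models.list()
    print(response)


asyncio.run(main())
```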
1 change: 1 addition & 0 deletions docs/source/distributions/index.md
@@ -3,6 +3,7 @@
 :maxdepth: 3
 :hidden:
+importing_as_library
 self_hosted_distro/index
 remote_hosted_distro/index
 building_distro
