# Importing Llama Stack as a Python Library

Llama Stack is typically used in a client-server configuration. To get started quickly, you can import Llama Stack as a library and call the APIs directly, without needing to set up a server. For [example](https://github.com/meta-llama/llama-stack-client-python/blob/main/src/llama_stack_client/lib/direct/test.py):

```python
from llama_stack_client.lib.direct.direct import LlamaStackDirectClient

client = await LlamaStackDirectClient.from_template("ollama")
await client.initialize()
```

This will parse your config and set up the inline implementations and remote clients your distribution needs.

Then you can access APIs such as `models` and `inference` on the client and call their methods directly:

```python
response = await client.models.list()
print(response)
```

```python
from llama_stack_client.types import UserMessage

response = await client.inference.chat_completion(
    messages=[UserMessage(content="What is the capital of France?", role="user")],
    model="Llama3.1-8B-Instruct",
    stream=False,
)
print("\nChat completion response:")
print(response)
```
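
Passing `stream=True` instead returns the response incrementally. The sketch below assumes each streamed chunk carries a text fragment at `chunk.event.delta`; check the client's stream chunk types for the exact shape.

```python
# Hedged sketch: consume a streamed chat completion chunk by chunk.
# Assumes each chunk exposes a text fragment at `chunk.event.delta`;
# consult llama_stack_client's response types for the exact field names.
response = await client.inference.chat_completion(
    messages=[UserMessage(content="What is the capital of France?", role="user")],
    model="Llama3.1-8B-Instruct",
    stream=True,
)
async for chunk in response:
    print(chunk.event.delta, end="", flush=True)
```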

If you've created a [custom distribution](https://llama-stack.readthedocs.io/en/latest/distributions/building_distro.html), you can also import it with the `from_config` constructor:

```python
import yaml

from llama_stack.distribution.configure import parse_and_maybe_upgrade_config

# config_path is the path to your distribution's run configuration (run.yaml)
with open(config_path, "r") as f:
    config_dict = yaml.safe_load(f)

run_config = parse_and_maybe_upgrade_config(config_dict)

client = await LlamaStackDirectClient.from_config(run_config)
```
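
The snippets above use top-level `await`, so they must run inside an async function (or in a REPL with top-level await support, such as `python -m asyncio`). A minimal sketch of a complete script, using only the calls shown above:

```python
import asyncio

from llama_stack_client.lib.direct.direct import LlamaStackDirectClient
from llama_stack_client.types import UserMessage


async def main():
    # Build a client from the "ollama" template and initialize it
    client = await LlamaStackDirectClient.from_template("ollama")
    await client.initialize()

    response = await client.inference.chat_completion(
        messages=[UserMessage(content="What is the capital of France?", role="user")],
        model="Llama3.1-8B-Instruct",
        stream=False,
    )
    print(response)


asyncio.run(main())
```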