Commit
add examples
monofuel committed Sep 25, 2024
1 parent d94a002 commit 6f89788
Showing 3 changed files with 15 additions and 2 deletions.
12 changes: 10 additions & 2 deletions README.md
@@ -1,6 +1,5 @@
# llama_leap

- WIP
- Nim library to work with the Ollama API

## Example
@@ -9,13 +8,16 @@
- you may pass an alternate to `newOllamaAPI()`

```nim
import llama_leap
let ollama = newOllamaAPI()
echo ollama.generate("llama2", "How are you today?")
```

## Generate

- Only the non-streaming generate API is currently supported
- streaming is coming soon (TM)

```nim
# simple interface
@@ -75,4 +77,10 @@ echo "Embedding Length: " & $resp.embedding.len

- ensure ollama is running on the default port
- `./ollama serve`
- run `nim c -r tests/test_llama_leap.nim`
- run `nimble test`

## Related Repos

- [openai_leap](https://github.com/monofuel/openai_leap) is a Nim client for the OpenAI API.
- [vertex_leap](https://github.com/monofuel/vertex_leap) is a client for Google's VertexAI API.
- [mono_llm](https://github.com/monofuel/mono_llm) is a higher-level Nim library that creates a unified interface for OpenAI, Ollama, and VertexAI.
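
The README above notes that `newOllamaAPI()` accepts an alternate endpoint. A minimal sketch of that usage, assuming the parameter is the Ollama base URL passed as a string (the exact signature is an assumption, not confirmed by this diff):

```nim
import llama_leap

# Assumption: newOllamaAPI accepts an alternate base URL string,
# per the README note "you may pass an alternate to newOllamaAPI()".
# The default Ollama port is 11434.
let ollama = newOllamaAPI("http://localhost:11434")

# Non-streaming generate, as shown in the README example above.
echo ollama.generate("llama2", "How are you today?")
```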
1 change: 1 addition & 0 deletions examples/config.nims
@@ -0,0 +1 @@
--path:"../src"
4 changes: 4 additions & 0 deletions examples/example.nim
@@ -0,0 +1,4 @@
import llama_leap

let ollama = newOllamaAPI()
echo ollama.generate("llama2", "How are you today?")
