
update docs
svilupp committed Dec 13, 2023
1 parent f009c5f commit 29a299e
Showing 2 changed files with 6 additions and 4 deletions.
README.md: 5 changes (3 additions & 2 deletions)
@@ -409,7 +409,8 @@ It all just works, because we have registered the models in the `PromptingTools.
 Under the hood, we use a dedicated schema `MistralOpenAISchema` that leverages most of the OpenAI-specific code base, so you can always provide that explicitly as the first argument:
 
 ```julia
-msg = aigenerate(MistralOpenAISchema(), "Say Hi!"; model="mistral-tiny", api_key=ENV["MISTRALAI_API_KEY"])
+const PT = PromptingTools
+msg = aigenerate(PT.MistralOpenAISchema(), "Say Hi!"; model="mistral-tiny", api_key=ENV["MISTRALAI_API_KEY"])
 ```
 As you can see, we can load your API key either from the ENV or via the Preferences.jl mechanism (see `?PREFERENCES` for more information).
 
@@ -420,7 +421,7 @@ As long as they are compatible with the OpenAI API (eg, sending `messages` with
 # Set your API key and the necessary base URL for the API
 api_key = "..."
 prompt = "Say hi!"
-msg = aigenerate(CustomOpenAISchema(), prompt; model="my_model", api_key, api_kwargs=(; url="http://localhost:8081"))
+msg = aigenerate(PT.CustomOpenAISchema(), prompt; model="my_model", api_key, api_kwargs=(; url="http://localhost:8081"))
 ```
 
 As you can see, it also works for any local models that you might have running on your computer!
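A minimal sketch of the two key-loading paths mentioned in the hunk above (ENV vs. the Preferences.jl mechanism). The `set_preferences!` helper and the exact preference name are assumptions here; see `?PREFERENCES` for the supported keys:

```julia
using PromptingTools
const PT = PromptingTools

# Option 1: read the key from the environment (as in the diff above)
ENV["MISTRALAI_API_KEY"] = "<your key>"

# Option 2: persist it via Preferences.jl (assumed helper; check `?PREFERENCES`)
# PT.set_preferences!("MISTRALAI_API_KEY" => "<your key>")

# With the key configured, no explicit `api_key` keyword is needed:
msg = aigenerate(PT.MistralOpenAISchema(), "Say Hi!"; model = "mistral-tiny")
```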
docs/src/examples/readme_examples.md: 5 changes (3 additions & 2 deletions)
@@ -303,7 +303,8 @@ It all just works, because we have registered the models in the `PromptingTools.
 Under the hood, we use a dedicated schema `MistralOpenAISchema` that leverages most of the OpenAI-specific code base, so you can always provide that explicitly as the first argument:
 
 ```julia
-msg = aigenerate(MistralOpenAISchema(), "Say Hi!"; model="mistral-tiny", api_key=ENV["MISTRALAI_API_KEY"])
+const PT = PromptingTools
+msg = aigenerate(PT.MistralOpenAISchema(), "Say Hi!"; model="mistral-tiny", api_key=ENV["MISTRALAI_API_KEY"])
 ```
 As you can see, we can load your API key either from the ENV or via the Preferences.jl mechanism (see `?PREFERENCES` for more information).
 
@@ -314,7 +315,7 @@ As long as they are compatible with the OpenAI API (eg, sending `messages` with
 # Set your API key and the necessary base URL for the API
 api_key = "..."
 prompt = "Say hi!"
-msg = aigenerate(CustomOpenAISchema(), prompt; model="my_model", api_key, api_kwargs=(; url="http://localhost:8081"))
+msg = aigenerate(PT.CustomOpenAISchema(), prompt; model="my_model", api_key, api_kwargs=(; url="http://localhost:8081"))
 ```
 
 As you can see, it also works for any local models that you might have running on your computer!
