Preferences.jl integration + new model registry
svilupp authored Dec 10, 2023
2 parents ef6ae26 + 815e40f commit c038343
Showing 21 changed files with 4,912 additions and 4,237 deletions.
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -10,10 +10,15 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Introduced a set of utilities for working with generated Julia code (eg, extract code-fenced Julia code with `PromptingTools.extract_code_blocks`) or simply apply `AICode` to the AI messages. `AICode` tries to extract, parse and eval Julia code; if it fails, both stdout and errors are captured. It is useful for generating Julia code and, in the future, creating self-healing code agents (a sketch follows after this list)
- Introduced the ability to have multi-turn conversations. Set the keyword argument `return_all=true` and the `ai*` functions will return the whole conversation, not just the last message. To continue a previous conversation, provide it via the keyword argument `conversation` (sketched after this list)
- Introduced schema `NoSchema` that does not change the message format; it merely replaces the placeholders with user-provided variables. It serves as the first pass of the schema pipeline and allows more code reuse across schemas
- Support for project-based and global user preferences with Preferences.jl. See `?PREFERENCES` docstring for more information. It allows you to persist your configuration and model aliases across sessions and projects (eg, if you would like to default to Ollama models instead of OpenAI's)
- Refactored `MODEL_REGISTRY` around the `ModelSpec` struct, so you can record the name, schema(!) and token cost of new models in a single place. The biggest benefit is that your `ai*` calls will now automatically look up the right model schema, eg, no need to define the schema explicitly for your Ollama models! See `?ModelSpec` for more information and `?register_model!` for an example of how to register a new model (also sketched after this list)
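
A minimal sketch of the code-extraction utilities above (the sample answer string is made up; the exact `AICode` field names follow the `?AICode` docstring):

```julia
using PromptingTools
const PT = PromptingTools

# A made-up AI answer that contains a code-fenced Julia block
answer = "Sure! Try this:\n```julia\nadd_two(x) = x + 2\nadd_two(40)\n```"

blocks = PT.extract_code_blocks(answer)  # vector of the fenced code snippets
cb = PT.AICode(first(blocks))            # tries to parse and eval the snippet
# If evaluation fails, stdout and the error are captured in the AICode object
@info "Evaluation result" cb.output cb.error
```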
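
A sketch of the multi-turn flow described above (the questions are placeholders; assumes a configured OpenAI API key):

```julia
using PromptingTools

# First turn: ask for the whole conversation back, not just the last message
conv = aigenerate("What is the capital of France?"; return_all = true)

# Next turn: pass the previous conversation back in to continue it
conv = aigenerate("And what about Spain?"; conversation = conv, return_all = true)

last(conv)  # the latest AIMessage
```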
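
And a sketch of registering a local model (the alias is hypothetical; the keyword names follow `?register_model!` and should be treated as assumptions):

```julia
using PromptingTools

# Register a hypothetical local Ollama model once; subsequent ai* calls
# look up its schema from MODEL_REGISTRY automatically
PromptingTools.register_model!(;
    name = "my-local-mistral",
    schema = PromptingTools.OllamaManagedSchema(),
    cost_of_token_prompt = 0.0,
    cost_of_token_generation = 0.0,
    description = "Mistral served locally via Ollama")

aigenerate("Say hi!"; model = "my-local-mistral")
```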

### Fixed
- Changed type of global `PROMPT_SCHEMA::AbstractPromptSchema` for an easier switch to local models as a default option

### Breaking Changes
- `API_KEY` global variable has been renamed to `OPENAI_API_KEY` to align with the name of the environment variable and preferences

## [0.2.0]

### Added
2 changes: 2 additions & 0 deletions Project.toml
@@ -10,6 +10,7 @@ JSON3 = "0f8b85d8-7281-11e9-16c2-39a750bddbf1"
Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
OpenAI = "e9f21f70-7185-4079-aca2-91159181367c"
PrecompileTools = "aea7be01-6a6a-4083-8856-8a6e6704d82a"
Preferences = "21216c6a-2e73-6563-6e65-726566657250"

[compat]
Aqua = "0.7"
@@ -19,6 +20,7 @@ JSON3 = "1"
Logging = "<0.0.1, 1"
OpenAI = "0.8.7"
PrecompileTools = "1"
Preferences = "1"
Test = "<0.0.1, 1"
julia = "1.9,1.10"

8 changes: 6 additions & 2 deletions README.md
@@ -514,7 +514,7 @@ Resources:

If you use a local model (eg, with Ollama), it's free. If you use any commercial APIs (eg, OpenAI), you will likely pay per "token" (a sub-word unit).

For example, a simple request with a simple question and 1 sentence response in return (”Is statement XYZ a positive comment”) will cost you ~$0.0001 (ie, one hundredth of a cent)
For example, a simple request with a simple question and 1 sentence response in return (”Is statement XYZ a positive comment”) will cost you ~$0.0001 (ie, one-hundredth of a cent)

**Is it worth paying for?**

@@ -546,10 +546,14 @@ A better way:
- On a Mac, add the configuration line to your terminal's configuration file (eg, `~/.zshrc`). It will get automatically loaded every time you launch the terminal
- On Windows, set it as a system variable in "Environment Variables" settings (see the Resources)

We also support Preferences.jl, so you can simply run: `PromptingTools.set_preferences!("OPENAI_API_KEY" => "your-api-key")` and it will be persisted across sessions.
To see the current preferences, run `PromptingTools.get_preferences("OPENAI_API_KEY")`.
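
A minimal sketch of that workflow (the `MODEL_CHAT` key and the model name are assumptions based on the `?PREFERENCES` docstring; the calls write to `LocalPreferences.toml`):

```julia
using PromptingTools

# Persist your key (and, optionally, a default chat model) across sessions;
# this writes to LocalPreferences.toml in your active project
PromptingTools.set_preferences!("OPENAI_API_KEY" => "your-api-key",
    "MODEL_CHAT" => "gpt-3.5-turbo")

# Inspect what is currently set
PromptingTools.get_preferences("OPENAI_API_KEY")
```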

Be careful NOT TO COMMIT `LocalPreferences.toml` to GitHub, as it would show your API Key to the world!

Resources:
- [OpenAI Guide](https://platform.openai.com/docs/quickstart?context=python)

Note: In the future, we hope to add `Preferences.jl`-based workflow to set the API key and other preferences.

### Understanding the API Keyword Arguments in `aigenerate` (`api_kwargs`)

1 change: 1 addition & 0 deletions docs/generate_examples.jl
@@ -5,6 +5,7 @@ example_files = joinpath(@__DIR__, "..", "examples") |> x -> readdir(x; join = t
output_dir = joinpath(@__DIR__, "src", "examples")

# Run the production loop
filter!(endswith(".jl"), example_files)
for fn in example_files
Literate.markdown(fn, output_dir; execute = true)
end
32 changes: 22 additions & 10 deletions docs/src/examples/working_with_aitemplates.md
@@ -37,14 +37,26 @@ msg = aigenerate(:JuliaExpertAsk; ask = "How do I add packages?")
````

````
AIMessage("To add packages in Julia, you can use the built-in package manager called `Pkg`. Here are the steps:
AIMessage("To add packages in Julia, you can use the `Pkg` module. Here are the steps:
1. Open the Julia REPL (Read-Eval-Print Loop).
2. Press the `]` key to enter the package manager mode.
3. Use the `add` command followed by the name of the package you want to install. For example, to install the `DataFrames` package, type: `add DataFrames`.
4. Press the `backspace` or `ctrl + C` key to exit the package manager mode and return to the REPL.
1. Start Julia by running the Julia REPL (Read-Eval-Print Loop).
2. Press the `]` key to enter the Pkg mode.
3. To add a package, use the `add` command followed by the package name.
4. Press the backspace key to exit Pkg mode and return to the Julia REPL.
After following these steps, the specified package will be installed and available for use in your Julia environment.")
For example, to add the `Example` package, you would enter:
```julia
]add Example
```
After the package is added, you can start using it in your Julia code by using the `using` keyword. For the `Example` package, you would add the following line to your code:
```julia
using Example
```
Note: The first time you add a package, Julia may take some time to download and compile the package and its dependencies.")
````

You can see that it had a placeholder for the actual question (`ask`) that we provided as a keyword argument.
@@ -90,8 +102,8 @@ msgs = PT.render(AITemplate(:JuliaExpertAsk))

````
2-element Vector{PromptingTools.AbstractChatMessage}:
SystemMessage("You are a world-class Julia language programmer with the knowledge of the latest syntax. Your communication is brief and concise. You're precise and answer only when you're confident in the high quality of your answer.")
UserMessage{String}("# Question\n\n{{ask}}", [:ask], :usermessage)
PromptingTools.SystemMessage("You are a world-class Julia language programmer with the knowledge of the latest syntax. Your communication is brief and concise. You're precise and answer only when you're confident in the high quality of your answer.")
PromptingTools.UserMessage{String}("# Question\n\n{{ask}}", [:ask], :usermessage)
````

Now, you know exactly what's in the template!
@@ -107,8 +119,8 @@ tpl = [PT.SystemMessage("You are a world-class Julia language programmer with th

````
2-element Vector{PromptingTools.AbstractChatMessage}:
SystemMessage("You are a world-class Julia language programmer with the knowledge of the latest syntax. You're also a senior Data Scientist and proficient in data analysis in Julia. Your communication is brief and concise. You're precise and answer only when you're confident in the high quality of your answer.")
UserMessage{String}("# Question\n\n{{ask}}", [:ask], :usermessage)
PromptingTools.SystemMessage("You are a world-class Julia language programmer with the knowledge of the latest syntax. You're also a senior Data Scientist and proficient in data analysis in Julia. Your communication is brief and concise. You're precise and answer only when you're confident in the high quality of your answer.")
PromptingTools.UserMessage{String}("# Question\n\n{{ask}}", [:ask], :usermessage)
````

Templates are saved in the `templates` directory of the package. The name of the file becomes the template name (eg, you would call it with `:JuliaDataExpertAsk`)