From 6630e7efd06ee09075967abe7cd76f17b1f0f2cc Mon Sep 17 00:00:00 2001 From: "Documenter.jl" Date: Thu, 7 Dec 2023 22:02:43 +0000 Subject: [PATCH] build based on cf80d57 --- dev/.documenter-siteinfo.json | 2 +- dev/examples/readme_examples/index.html | 2 +- .../working_with_aitemplates/index.html | 2 +- dev/examples/working_with_ollama/index.html | 2 +- dev/frequently_asked_questions/index.html | 2 +- dev/getting_started/index.html | 2 +- dev/index.html | 2 +- dev/reference/index.html | 46 +++++++++---------- dev/search_index.js | 2 +- 9 files changed, 31 insertions(+), 31 deletions(-) diff --git a/dev/.documenter-siteinfo.json b/dev/.documenter-siteinfo.json index b23d9cc9e..a3f339757 100644 --- a/dev/.documenter-siteinfo.json +++ b/dev/.documenter-siteinfo.json @@ -1 +1 @@ -{"documenter":{"julia_version":"1.9.4","generation_timestamp":"2023-12-07T10:21:53","documenter_version":"1.2.1"}} \ No newline at end of file +{"documenter":{"julia_version":"1.9.4","generation_timestamp":"2023-12-07T22:02:41","documenter_version":"1.2.1"}} \ No newline at end of file diff --git a/dev/examples/readme_examples/index.html b/dev/examples/readme_examples/index.html index f04a39ba2..dbb6a76c0 100644 --- a/dev/examples/readme_examples/index.html +++ b/dev/examples/readme_examples/index.html @@ -83,4 +83,4 @@ msg.content # 4096-element JSON3.Array{Float64... msg = aiembed(schema, ["Embed me", "Embed me"]; model="openhermes2.5-mistral") -msg.content # 4096×2 Matrix{Float64}:

If you're getting errors, check that Ollama is running - see the Setup Guide for Ollama section below.

diff --git a/dev/examples/working_with_aitemplates/index.html b/dev/examples/working_with_aitemplates/index.html index 482b63786..2adb55c2b 100644 --- a/dev/examples/working_with_aitemplates/index.html +++ b/dev/examples/working_with_aitemplates/index.html @@ -31,4 +31,4 @@ PT.save_template(filename, tpl; description = "For asking data analysis questions in Julia language. Placeholders: `ask`") -rm(filename) # cleanup if we don't like it

When you create a new template, remember to re-load the templates with load_templates!() so that it's available for use.

PT.load_templates!();

!!! If you have some good templates (or suggestions for the existing ones), please consider sharing them with the community by opening a PR to the templates directory!


This page was generated using Literate.jl.

diff --git a/dev/examples/working_with_ollama/index.html b/dev/examples/working_with_ollama/index.html index f3c150fe2..42133278b 100644 --- a/dev/examples/working_with_ollama/index.html +++ b/dev/examples/working_with_ollama/index.html @@ -4124,4 +4124,4 @@ LinearAlgebra.normalize; model = "openhermes2.5-mistral")
DataMessage(Matrix{Float64} of size (4096, 2))

Cosine similarity is then a simple multiplication

msg.content' * msg.content[:, 1]
2-element Vector{Float64}:
  0.9999999999999946
 0.34130017815042357
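The same dot-product trick can be tried without any model call, using made-up vectors; a minimal sketch (only the `normalize` + dot-product pattern matches the example above, the numbers are illustrative):

```julia
using LinearAlgebra

# Two made-up "embeddings", normalized to unit length as in the example
emb1 = normalize([1.0, 2.0, 3.0])
emb2 = normalize([3.0, 2.0, 1.0])
mat = hcat(emb1, emb2)          # plays the role of msg.content (size (3, 2))

similarities = mat' * mat[:, 1] # cosine similarity of each column vs the first
similarities[1]                 # ≈ 1.0 (a unit vector compared with itself)
```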

This page was generated using Literate.jl.

diff --git a/dev/frequently_asked_questions/index.html b/dev/frequently_asked_questions/index.html index 957aff9a5..ed895285f 100644 --- a/dev/frequently_asked_questions/index.html +++ b/dev/frequently_asked_questions/index.html @@ -1,3 +1,3 @@ F.A.Q. · PromptingTools.jl

Frequently Asked Questions

Why OpenAI

OpenAI's models are at the forefront of AI research and provide robust, state-of-the-art capabilities for many tasks.

There will be situations when you cannot or do not want to use it (eg, privacy, cost, etc.). In that case, you can use local models (eg, Ollama) or other APIs (eg, Anthropic).

Note: To get started with Ollama.ai, see the Setup Guide for Ollama section below.

Data Privacy and OpenAI

At the time of writing, OpenAI does NOT use the API calls for training their models.

API

OpenAI does not use data submitted to and generated by our API to train OpenAI models or improve OpenAI’s service offering. In order to support the continuous improvement of our models, you can fill out this form to opt-in to share your data with us. – How your data is used to improve our models

You can always double-check the latest information on the OpenAI's How we use your data page.

Resources:

Creating OpenAI API Key

You can get your API key from OpenAI by signing up for an account and accessing the API section of the OpenAI website.

  1. Create an account with OpenAI
  2. Go to API Key page
  3. Click on “Create new secret key”

!!! Do not share it with anyone and do NOT save it to any files that get synced online.

Resources:

Pro tip: Always set the spending limits!

Setting OpenAI Spending Limits

OpenAI allows you to set spending limits directly on your account dashboard to prevent unexpected costs.

  1. Go to OpenAI Billing
  2. Set a Soft Limit (you'll receive a notification) and a Hard Limit (the API will stop working so you don't spend any more money)

A good start might be a soft limit of ~$5 and a hard limit of ~$10 - you can always increase them later in the month.

Resources:

How much does it cost? Is it worth paying for?

If you use a local model (eg, with Ollama), it's free. If you use any commercial APIs (eg, OpenAI), you will likely pay per "token" (a sub-word unit).

For example, a simple request with a simple question and a 1-sentence response ("Is statement XYZ a positive comment?") will cost you ~$0.0001 (ie, one hundredth of a cent).

Is it worth paying for?

GenAI is a way to buy time! You can pay cents to save tens of minutes every day.

Continuing the example above, imagine you have a table with 200 comments. Now, you can parse each one of them with an LLM for the features/checks you need. Assuming the price per call was 0.0001, you'd pay 2 cents for the job and save 30-60 minutes of your time!
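The arithmetic above can be sketched in the REPL (the price per call is the illustrative assumption from the example, not a current OpenAI rate):

```julia
cost_per_call = 0.0001      # assumed ~$0.0001 per simple call, as above
n_comments = 200

total_cost = cost_per_call * n_comments   # ≈ $0.02, ie, about 2 cents
```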

Resources:

Configuring the Environment Variable for API Key

To use the OpenAI API with PromptingTools.jl, set your API key as an environment variable:

ENV["OPENAI_API_KEY"] = "your-api-key"

As a one-off, you can:

  • set it in the terminal before launching Julia: export OPENAI_API_KEY=<your key>
  • set it in your setup.jl (make sure not to commit it to GitHub!)

Make sure to start Julia from the same terminal window where you set the variable. Easy check: in Julia, run ENV["OPENAI_API_KEY"] and you should see your key!

A better way:

  • On a Mac, add the configuration line to your terminal's configuration file (eg, ~/.zshrc). It will get automatically loaded every time you launch the terminal
  • On Windows, set it as a system variable in "Environment Variables" settings (see the Resources)
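On a Mac (or Linux with zsh), the "better way" might look like this - a sketch only; the key below is a placeholder, and this file must never be committed or shared:

```shell
# Persist the key for all future terminal sessions
echo 'export OPENAI_API_KEY="<your-api-key>"' >> ~/.zshrc

# Load it into the current session and verify it is set
source ~/.zshrc
echo "${OPENAI_API_KEY:+OPENAI_API_KEY is set}"
```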

Resources:

Note: In the future, we hope to add a Preferences.jl-based workflow to set the API key and other preferences.

Understanding the API Keyword Arguments in aigenerate (api_kwargs)

See OpenAI API reference for more information.

Instant Access from Anywhere

For easy access from anywhere, add PromptingTools into your startup.jl (can be found in ~/.julia/config/startup.jl).

Add the following snippet:

using PromptingTools
const PT = PromptingTools # to access unexported functions and types

Now, you can just use ai"Help me do X to achieve Y" from any REPL session!

Open Source Alternatives

The ethos of PromptingTools.jl is to allow you to use whatever model you want, which includes Open Source LLMs. The most popular and easiest to set up is Ollama.ai - see below for more information.

Setup Guide for Ollama

Ollama runs a background service hosting LLMs that you can access via a simple API. It's especially useful when you're working with some sensitive data that should not be sent anywhere.

Installation is very easy: just download the latest version here.

Once you've installed it, just launch the app and you're ready to go!

To check if it's running, go to your browser and open 127.0.0.1:11434. You should see the message "Ollama is running". Alternatively, you can run ollama serve in your terminal - if the service is already running, you'll get a message saying so.

There are many models available in Ollama Library, including Llama2, CodeLlama, SQLCoder, or my personal favorite openhermes2.5-mistral.

Download new models with ollama pull <model_name> (eg, ollama pull openhermes2.5-mistral).

Show currently available models with ollama list.
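Putting the above together, a typical first session in the terminal might look like this (a sketch; it assumes the Ollama app/service is already installed):

```shell
ollama serve &                       # start the service (skip if the app is already running)
curl 127.0.0.1:11434                 # should reply "Ollama is running"
ollama pull openhermes2.5-mistral    # download a model from the Ollama Library
ollama list                          # confirm the model is available locally
```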

See Ollama.ai for more information.

diff --git a/dev/getting_started/index.html b/dev/getting_started/index.html index 23a5e044c..a15b760e2 100644 --- a/dev/getting_started/index.html +++ b/dev/getting_started/index.html @@ -4,4 +4,4 @@ AIMessage("The capital of France is Paris.")

The returned object is a light wrapper with the generated message in the field :content (eg, ans.content) for additional downstream processing.

You can easily inject any variables with string interpolation:

country = "Spain"
 ai"What is the capital of $(country)?"
[ Info: Tokens: 32 @ Cost: $0.0001 in 0.5 seconds
 AIMessage("The capital of Spain is Madrid.")

Pro tip: Use after-string-flags to select the model to be called, eg, ai"What is the capital of France?"gpt4 (use gpt4t for the new GPT-4 Turbo model). Great for those extra hard questions!

Using aigenerate with placeholders

For more complex prompt templates, you can use handlebars-style templating and provide variables as keyword arguments:

msg = aigenerate("What is the capital of {{country}}? Is the population larger than {{population}}?", country="Spain", population="1M")
[ Info: Tokens: 74 @ Cost: $0.0001 in 1.3 seconds
AIMessage("The capital of Spain is Madrid. And yes, the population of Madrid is larger than 1 million. As of 2020, the estimated population of Madrid is around 3.3 million people.")

Pro tip: Use asyncmap to run multiple AI-powered tasks concurrently.
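A sketch of the asyncmap pro tip (requires a configured API key; the prompt and country list are illustrative):

```julia
using PromptingTools

countries = ["France", "Spain", "Italy"]

# Each call mostly waits on network I/O, so asyncmap overlaps the requests
# instead of running them one after another
msgs = asyncmap(countries) do country
    aigenerate("What is the capital of {{country}}?"; country)
end
[msg.content for msg in msgs]   # one answer per country
```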

Pro tip: If you use slow models (like GPT-4), you can use the async version of @ai_str, @aai_str, to avoid blocking the REPL, eg, aai"Say hi but slowly!"gpt4

For more practical examples, see the Various Examples section.

diff --git a/dev/index.html b/dev/index.html index 652b69315..ecb01c98a 100644 --- a/dev/index.html +++ b/dev/index.html @@ -1,2 +1,2 @@ -Home · PromptingTools.jl

PromptingTools

Documentation for PromptingTools.

Streamline your life using PromptingTools.jl, the Julia package that simplifies interacting with large language models.

PromptingTools.jl is not meant for building large-scale systems. It's meant to be the go-to tool in your global environment that will save you 20 minutes every day!

Why PromptingTools.jl?

Prompt engineering is neither fast nor easy. Moreover, different models and their fine-tunes might require different prompt formats and tricks, or perhaps the information you work with requires special models to be used. PromptingTools.jl is meant to unify the prompts for different backends and make the common tasks (like templated prompts) as simple as possible.

Some features:

  • aigenerate Function: Simplify prompt templates with handlebars (eg, {{variable}}) and keyword arguments
  • @ai_str String Macro: Save keystrokes with a string macro for simple prompts
  • Easy to Remember: All exported functions start with ai... for better discoverability
  • Light Wrapper Types: Benefit from Julia's multiple dispatch by having AI outputs wrapped in specific types
  • Minimal Dependencies: Enjoy an easy addition to your global environment with very light dependencies
  • No Context Switching: Access cutting-edge LLMs with no context switching and minimum extra keystrokes directly in your REPL

First Steps

To get started, see the Getting Started section.

diff --git a/dev/reference/index.html b/dev/reference/index.html index a005971e5..4101a3bf3 100644 --- a/dev/reference/index.html +++ b/dev/reference/index.html @@ -1,5 +1,5 @@ -Reference · PromptingTools.jl

Reference

PromptingTools.AICodeType
AICode(code::AbstractString; safe_eval::Bool=false, prefix::AbstractString="", suffix::AbstractString="")

A mutable structure representing a code block (received from the AI model) with automatic parsing, execution, and output/error capturing capabilities.

Upon instantiation with a string, the AICode object automatically runs a code parser and executor (via PromptingTools.eval!()), capturing any standard output (stdout) or errors. This structure is useful for programmatically handling and evaluating Julia code snippets.

See also: PromptingTools.extract_code_blocks, PromptingTools.eval!

Workflow

  • Until cb::AICode has been evaluated, cb.success is set to nothing (and so are all other fields).
  • The text in cb.code is parsed (saved to cb.expression).
  • The parsed expression is evaluated.
  • Outputs of the evaluated expression are captured in cb.output.
  • Any stdout outputs (e.g., from println) are captured in cb.stdout.
  • If an error occurs during evaluation, it is saved in cb.error.
  • After successful evaluation without errors, cb.success is set to true. Otherwise, it is set to false and you can inspect the cb.error to understand why.
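The workflow above, traced on a tiny snippet (a sketch; the commented values follow the field descriptions in this docstring rather than a captured session):

```julia
using PromptingTools
const PT = PromptingTools

cb = PT.AICode("println(\"hi\"); 1 + 1")  # parses and evaluates on construction

cb.success       # true -> parsed and evaluated without errors
cb.expression    # the parsed Julia expression
cb.stdout        # captures the "hi" printed by println
cb.output        # result of evaluating the block
isvalid(cb)      # convenience check, true when cb.success == true
```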

Properties

  • code::AbstractString: The raw string of the code to be parsed and executed.
  • expression: The parsed Julia expression (set after parsing code).
  • stdout: Captured standard output from the execution of the code.
  • output: The result of evaluating the code block.
  • success::Union{Nothing, Bool}: Indicates whether the code block executed successfully (true), unsuccessfully (false), or has yet to be evaluated (nothing).
  • error::Union{Nothing, Exception}: Any exception raised during the execution of the code block.

Keyword Arguments

  • safe_eval::Bool: If set to true, the code block checks for package operations (e.g., installing new packages) and missing imports, and then evaluates the code inside a bespoke scratch module. This is to ensure that the evaluation does not alter any user-defined variables or the global state. Defaults to false.
  • prefix::AbstractString: A string to be prepended to the code block before parsing and evaluation. Useful to add some additional code definition or necessary imports. Defaults to an empty string.
  • suffix::AbstractString: A string to be appended to the code block before parsing and evaluation. Useful to check that tests pass or that an example executes. Defaults to an empty string.

Methods

  • Base.isvalid(cb::AICode): Check if the code block has executed successfully. Returns true if cb.success == true.

Examples

code = AICode("println(\"Hello, World!\")") # Auto-parses and evaluates the code, capturing output and errors.
 isvalid(code) # Output: true
 code.stdout # Output: "Hello, World!\n"

We try to evaluate "safely" by default (eg, inside a custom module, to avoid changing user variables). You can avoid that with safe_eval=false:

code = AICode("new_variable = 1"; safe_eval=false)
@@ -18,7 +18,7 @@
 code.code |> clipboard
 
 # or execute it in the current module (=Main)
eval(code.expression)
source
PromptingTools.AITemplateType
AITemplate

AITemplate is a template for a conversation prompt. This type is merely a container for the template name, which is resolved into a set of messages (=prompt) by render.

Naming Convention

  • Template names should be in CamelCase
  • Follow the format <Persona>...<Variable>... where possible, eg, JudgeIsItTrue
    • Starting with the Persona (=System prompt), eg, Judge = persona is meant to judge some provided information
    • Variable to be filled in with context, eg, It = placeholder it
    • Ending with the variable name is helpful, eg, JuliaExpertTask for a persona to be an expert in Julia language and task is the placeholder name
  • Ideally, the template name should be self-explanatory, eg, JudgeIsItTrue = persona is meant to judge some provided information where it is true or false

Examples

Save time by re-using pre-made templates, just fill in the placeholders with the keyword arguments:

msg = aigenerate(:JuliaExpertAsk; ask = "How do I add packages?")

The above is equivalent to a more verbose version that explicitly uses the dispatch on AITemplate:

msg = aigenerate(AITemplate(:JuliaExpertAsk); ask = "How do I add packages?")

Find available templates with aitemplates:

tmps = aitemplates("JuliaExpertAsk")
 # Will surface one specific template
 # 1-element Vector{AITemplateMetadata}:
 # PromptingTools.AITemplateMetadata
@@ -33,18 +33,18 @@
 {{ask}}"
 #   source: String ""

The above gives you a good idea of what the template is about, what placeholders are available, and how much it would cost to use it (=wordcount).

Search for all Julia-related templates:

tmps = aitemplates("Julia")
 # 2-element Vector{AITemplateMetadata}... -> more to come later!

If you are on VSCode, you can leverage nice tabular display with vscodedisplay:

using DataFrames
tmps = aitemplates("Julia") |> DataFrame |> vscodedisplay

I have my selected template, how do I use it? Just use the "name" in aigenerate or aiclassify like you see in the first example!

You can inspect any template by "rendering" it (this is what the LLM will see):

julia> AITemplate(:JudgeIsItTrue) |> PromptingTools.render

See also: save_template, load_template, load_templates! for more advanced use cases (and the corresponding script in examples/ folder)

source
PromptingTools.ChatMLSchemaType

ChatMLSchema is used by many open-source chatbots, by OpenAI models (under the hood), and by several models and interfaces (eg, Ollama, vLLM)

You can explore it on tiktokenizer

It uses the following conversation structure:

<|im_start|>system
 ...<|im_end|>
 <|im_start|>user
 ...<|im_end|>
 <|im_start|>assistant
...<|im_end|>
source
PromptingTools.MaybeExtractType

Extract a result from the provided data, if any, otherwise set the error and message fields.

Arguments

  • error::Bool: true if no result could be extracted (ie, extraction failed), false otherwise.
  • message::String: Only present if no result is found, should be short and concise.
source
PromptingTools.NoSchemaType

Schema that keeps messages (<:AbstractMessage) and does not transform them for any specific model. It is used by the first pass of the prompt rendering system (see ?render).

source
PromptingTools.OllamaManagedSchemaType

Ollama by default manages different models and their associated prompt schemas when you pass system_prompt and prompt fields to the API.

Warning: It works only for 1 system message and 1 user message, so anything more than that has to be rejected.

If you need to pass more messages / a longer conversational history, you can define the model-specific schema directly and pass your Ollama requests with raw=true, which disables any templating and schema management by Ollama.

source
PromptingTools.OpenAISchemaType

OpenAISchema is the default schema for OpenAI models.

It uses the following conversation template:

[Dict(role="system",content="..."),Dict(role="user",content="..."),Dict(role="assistant",content="...")]
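The template above is pseudo-notation; in actual Julia, such a conversation could be built with Pair-based Dicts before being serialized to JSON for the API (a sketch with illustrative content):

```julia
conversation = [
    Dict("role" => "system",    "content" => "You are a helpful assistant."),
    Dict("role" => "user",      "content" => "What is 1 + 1?"),
    Dict("role" => "assistant", "content" => "1 + 1 equals 2."),
]

length(conversation)      # 3 messages
conversation[1]["role"]   # "system"
```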

It's recommended to separate sections in your prompt with markdown headers (e.g. `## Answer\n\n`).

source
PromptingTools.aiclassifyMethod
aiclassify(prompt_schema::AbstractOpenAISchema, prompt::ALLOWED_PROMPT_TYPE;
 api_kwargs::NamedTuple = (logit_bias = Dict(837 => 100, 905 => 100, 9987 => 100),
     max_tokens = 1, temperature = 0),
 kwargs...)

Classifies the given prompt/statement as true/false/unknown.

Note: this is a very simple classifier; it is not meant to be used in production. Credit goes to AAAzzam.

It uses the logit bias trick and limits the output to 1 token to force the model to output only true/false/unknown.

Output tokens used (via api_kwargs):

  • 837: ' true'
  • 905: ' false'
  • 9987: ' unknown'

Arguments

  • prompt_schema::AbstractOpenAISchema: The schema for the prompt.
  • prompt: The prompt/statement to classify if it's a String. If it's a Symbol, it is expanded as a template via render(schema,template).

Example

aiclassify("Is two plus two four?") # true
 aiclassify("Is two plus three a vegetable on Mars?") # false

aiclassify returns only true/false/unknown. It's easy to get the proper Bool output type out with tryparse, eg,

tryparse(Bool, aiclassify("Is two plus two four?")) isa Bool # true

Output of type Nothing marks that the model couldn't classify the statement as true/false.

Ideally, we would like to re-use some helpful system prompt to get more accurate responses. For this reason we have templates, eg, :JudgeIsItTrue. By specifying the template, we can provide our statement as the expected variable (it in this case). See that the model now correctly classifies the statement as "unknown".

aiclassify(:JudgeIsItTrue; it = "Is two plus three a vegetable on Mars?") # unknown

For better results, use higher quality models like gpt4, eg,

aiclassify(:JudgeIsItTrue;
     it = "If I had two apples and I got three more, I have five apples now.",
    model = "gpt4") # true
source
PromptingTools.aiembedMethod
aiembed(prompt_schema::AbstractOllamaManagedSchema,
         doc_or_docs::Union{AbstractString, Vector{<:AbstractString}},
         postprocess::F = identity;
         verbose::Bool = true,
@@ -73,7 +73,7 @@
 schema = PT.OllamaManagedSchema()
 
 msg = aiembed(schema, "Hello World", copy; model="openhermes2.5-mistral")
msg.content # 4096-element Vector{Float64}
source
PromptingTools.aiembedMethod
aiembed(prompt_schema::AbstractOpenAISchema,
         doc_or_docs::Union{AbstractString, Vector{<:AbstractString}},
         postprocess::F = identity;
         verbose::Bool = true,
@@ -89,7 +89,7 @@
 msg = aiembed(["embed me", "and me too"], LinearAlgebra.normalize)
 
 # calculate cosine distance between the two normalized embeddings as a simple dot product
msg.content' * msg.content[:, 1] # [1.0, 0.787]
source
PromptingTools.aiextractMethod
aiextract([prompt_schema::AbstractOpenAISchema,] prompt::ALLOWED_PROMPT_TYPE; 
 return_type::Type,
 verbose::Bool = true,
     model::String = MODEL_CHAT,
@@ -132,7 +132,7 @@
 # If LLM extraction fails, it will return a Dict with `error` and `message` fields instead of the result!
 msg = aiextract("Extract measurements from the text: I am giraffe", type)
 msg.content
# MaybeExtract{MyMeasurement}(nothing, true, "I'm sorry, but I can only assist with human measurements.")

That way, you can handle the error gracefully and get a reason why extraction failed (in msg.content.message).

Note that the error message refers to a giraffe not being a human, because in our MyMeasurement docstring, we said that it's for people!

source
PromptingTools.aigenerateMethod
aigenerate(prompt_schema::AbstractOllamaManagedSchema, prompt::ALLOWED_PROMPT_TYPE; verbose::Bool = true,
     model::String = MODEL_CHAT,
     return_all::Bool = false, dry_run::Bool = false,
     conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],
 
 msg = aigenerate(schema, conversation; model="openhermes2.5-mistral")
 # [ Info: Tokens: 111 in 2.1 seconds
# AIMessage("Strong the attachment is, it leads to suffering it may. Focus on the force within you must, ...<continues>")

Note: Managed Ollama currently supports at most 1 User Message and 1 System Message given the API limitations. If you want more, you need to use the ChatMLSchema.

source
PromptingTools.aigenerateMethod
aigenerate(prompt_schema::AbstractOpenAISchema, prompt::ALLOWED_PROMPT_TYPE;
     verbose::Bool = true,
     api_key::String = API_KEY,
     model::String = MODEL_CHAT, return_all::Bool = false, dry_run::Bool = false,
    PT.SystemMessage("You're master Yoda from Star Wars trying to help the user become a Jedi."),
     PT.UserMessage("I have feelings for my iPhone. What should I do?")]
 msg=aigenerate(conversation)
# AIMessage("Ah, strong feelings you have for your iPhone. A Jedi's path, this is not... <continues>")
source
PromptingTools.aiscanMethod

aiscan([prompt_schema::AbstractOpenAISchema,] prompt::ALLOWED_PROMPT_TYPE; image_url::Union{Nothing, AbstractString, Vector{<:AbstractString}} = nothing, image_path::Union{Nothing, AbstractString, Vector{<:AbstractString}} = nothing, image_detail::AbstractString = "auto", attach_to_latest::Bool = true, verbose::Bool = true, model::String = MODEL_CHAT, return_all::Bool = false, dry_run::Bool = false, conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[], http_kwargs::NamedTuple = (; retry_non_idempotent = true, retries = 5, readtimeout = 120), api_kwargs::NamedTuple = (; max_tokens = 2500), kwargs...)

Scans the provided image (image_url or image_path) with the goal provided in the prompt.

Can be used for many multi-modal tasks, such as: OCR (transcribe text in the image), image captioning, image classification, etc.

It's effectively a light wrapper around an aigenerate call with additional keyword arguments image_url, image_path, and image_detail. At least one image source (url or path) must be provided.

Arguments

  • prompt_schema: An optional object to specify which prompt template should be applied (Default to PROMPT_SCHEMA = OpenAISchema)
  • prompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate
  • image_url: A string or vector of strings representing the URL(s) of the image(s) to scan.
  • image_path: A string or vector of strings representing the path(s) of the image(s) to scan.
  • image_detail: A string representing the level of detail to include for images. Can be "auto", "high", or "low". See OpenAI Vision Guide for more details.
  • attach_to_latest: A boolean for how to handle a conversation with multiple UserMessages. When true, the images are attached to the latest UserMessage.
  • verbose: A boolean indicating whether to print additional information.
  • api_key: A string representing the API key for accessing the OpenAI API.
  • model: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.
  • return_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).
  • dry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).
  • conversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.
  • http_kwargs: A named tuple of HTTP keyword arguments.
  • api_kwargs: A named tuple of API keyword arguments.
  • kwargs: Prompt variables to be used to fill the prompt/template

Returns

If return_all=false (default):

  • msg: An AIMessage object representing the generated AI message, including the content, status, tokens, and elapsed time.

Use msg.content to access the extracted string.

If return_all=true:

  • conversation: A vector of AbstractMessage objects representing the full conversation history, including the response from the AI model (AIMessage).

See also: ai_str, aai_str, aigenerate, aiembed, aiclassify, aiextract, aitemplates

Notes

  • All examples below use model "gpt4v", which is an alias for model ID "gpt-4-vision-preview"
  • max_tokens in the api_kwargs is preset to 2500, otherwise OpenAI enforces a default of only a few hundred tokens (~300). If your output is truncated, increase this value

Example

Describe the provided image:

msg = aiscan("Describe the image"; image_path="julia.png", model="gpt4v")
 # [ Info: Tokens: 1141 @ Cost: $0.0117 in 2.2 seconds
 # AIMessage("The image shows a logo consisting of the word "julia" written in lowercase")

You can provide multiple images at once as a vector and ask for "low" level of detail (cheaper):

msg = aiscan("Describe the image"; image_path=["julia.png","python.png"], image_detail="low", model="gpt4v")

You can use this function as a nice and quick OCR (transcribe text in the image) with a template :OCRTask. Let's transcribe some SQL code from a screenshot (no more re-typing!):

# Screenshot of some SQL code
 image_url = "https://www.sqlservercentral.com/wp-content/uploads/legacy/8755f69180b7ac7ee76a69ae68ec36872a116ad4/24622.png"
 
 # You can add syntax highlighting of the outputs via Markdown
 using Markdown
msg.content |> Markdown.parse

Notice that we enforce max_tokens = 2500. That's because OpenAI seems to default to ~300 tokens, which provides incomplete outputs. Hence, we set this value to 2500 as a default. If you still get truncated outputs, increase this value.

source
PromptingTools.aitemplatesFunction
aitemplates

Easily find the most suitable templates for your use case.

You can search by:

  • query::Symbol which looks only for partial matches in the template name
  • query::AbstractString which looks for partial matches in the template name or description
  • query::Regex which looks for matches in the template name, description or any of the message previews

Keyword Arguments

  • limit::Int limits the number of returned templates (Defaults to 10)
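The search logic can be approximated with a short sketch over hypothetical metadata records (the real package searches its TEMPLATE_METADATA store; the records and find_templates name here are illustrative):

```julia
# Hypothetical metadata records standing in for AITemplateMetadata
templates = [
    (name = :JuliaExpertAsk, description = "Asking a Julia language expert"),
    (name = :JudgeIsItTrue,  description = "Classify a statement as true/false/unknown"),
]

# Case-insensitive partial match on name or description, capped at `limit` hits
function find_templates(query::AbstractString; limit::Int = 10)
    hits = filter(t -> occursin(lowercase(query), lowercase(String(t.name))) ||
                       occursin(lowercase(query), lowercase(t.description)), templates)
    return first(hits, min(limit, length(hits)))
end

find_templates("Julia")  # matches :JuliaExpertAsk
```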

Examples

Find available templates with aitemplates:

tmps = aitemplates("JuliaExpertAsk")
 # Will surface one specific template
 # 1-element Vector{AITemplateMetadata}:
 # PromptingTools.AITemplateMetadata
 {{ask}}"
 #   source: String ""

The above gives you a good idea of what the template is about, what placeholders are available, and how much it would cost to use it (=wordcount).

Search for all Julia-related templates:

tmps = aitemplates("Julia")
 # 2-element Vector{AITemplateMetadata}... -> more to come later!

If you are on VSCode, you can leverage nice tabular display with vscodedisplay:

using DataFrames
tmps = aitemplates("Julia") |> DataFrame |> vscodedisplay

I have my selected template, how do I use it? Just use the "name" in aigenerate or aiclassify like you see in the first example!

source
PromptingTools.aitemplatesMethod

Find the top-limit templates whose name or description fields partially match the query_key::String in TEMPLATE_METADATA.

source
PromptingTools.aitemplatesMethod

Find the top-limit templates where provided query_key::Regex matches either of name, description or previews or User or System messages in TEMPLATE_METADATA.

source
PromptingTools.eval!Method
eval!(cb::AICode; safe_eval::Bool=true, prefix::AbstractString="", suffix::AbstractString="")

Evaluates a code block cb in-place. It runs automatically when AICode is instantiated with a String.

Check the outcome of the evaluation with Base.isvalid(cb). If it returns true, the provided code block has executed successfully.

Steps:

  • If cb::AICode has not been evaluated, cb.success = nothing. After the evaluation it will be either true or false depending on the outcome
  • Parse the text in cb.code
  • Evaluate the parsed expression
  • Capture outputs of the evaluated expression in cb.output
  • Capture any stdout outputs (eg, test failures) in cb.stdout
  • If any error exception is raised, it is saved in cb.error
  • Finally, if all steps were successful, success is set to cb.success = true
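The steps above can be sketched in a few lines. This is a simplified stand-in (MiniAICode and mini_eval! are hypothetical names, not the real AICode type, and stdout capture is omitted):

```julia
# Minimal stand-in mirroring the documented fields: code, output, error, success
mutable struct MiniAICode
    code::String
    output::Any
    error::Union{Nothing, Exception}
    success::Union{Nothing, Bool}
end
MiniAICode(code) = MiniAICode(code, nothing, nothing, nothing)

function mini_eval!(cb::MiniAICode)
    mod = Module()                      # bespoke scratch module: user variables stay untouched
    Core.eval(mod, :(using Base))       # make Base functions available in the scratch module
    try
        expr = Meta.parse("begin\n$(cb.code)\nend")  # parse the text in cb.code
        cb.output = Core.eval(mod, expr)             # evaluate the parsed expression
        cb.success = true
    catch e
        cb.error = e                                 # any raised exception is saved
        cb.success = false
    end
    return cb
end

ok  = mini_eval!(MiniAICode("x = 2 + 2; x"))  # succeeds; output is 4
bad = mini_eval!(MiniAICode("sqrt(-1)"))      # fails; DomainError captured in .error
```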

Keyword Arguments

  • safe_eval::Bool: If true, we first check for any Pkg operations (eg, installing new packages) and missing imports, then the code will be evaluated inside a bespoke scratch module (not to change any user variables)
  • prefix::AbstractString: A string to be prepended to the code block before parsing and evaluation. Useful to add some additional code definition or necessary imports. Defaults to an empty string.
  • suffix::AbstractString: A string to be appended to the code block before parsing and evaluation. Useful to check that tests pass or that an example executes. Defaults to an empty string.
source
PromptingTools.extract_code_blocksMethod
extract_code_blocks(markdown_content::String) -> Vector{String}

Extract Julia code blocks from a markdown string.

This function searches through the provided markdown content, identifies blocks of code specifically marked as Julia code (using the ```julia ... ``` code fence pattern), and extracts the code within these blocks. The extracted code blocks are returned as a vector of strings, with each string representing one block of Julia code.

Note: Only the content within the code fences is extracted, and the code fences themselves are not included in the output.

Arguments

  • markdown_content::String: A string containing the markdown content from which Julia code blocks are to be extracted.

Returns

  • Vector{String}: A vector containing strings of extracted Julia code blocks. If no Julia code blocks are found, an empty vector is returned.
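The extraction can be approximated with a short regex sketch (simplified; the package's real implementation handles more edge cases, and extract_julia_blocks is a hypothetical name):

```julia
# Capture everything between a ```julia opening fence and the next closing fence
function extract_julia_blocks(markdown::AbstractString)
    pattern = r"```julia\n(.*?)\n```"s   # non-greedy; `s` flag lets `.` match newlines
    return [String(m.captures[1]) for m in eachmatch(pattern, markdown)]
end

md = "```julia\nx = 5\n```\ntext\n```julia\ny = x + 2\n```"
extract_julia_blocks(md)  # ["x = 5", "y = x + 2"]
```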

Examples

Example with a single Julia code block

markdown_single = """
```julia
println("Hello, World!")
```
"""
extract_code_blocks(markdown_single)
# Output: ["println(\"Hello, World!\")"]

# Example with multiple Julia code blocks
markdown_multiple = """
```julia
x = 5
```
Some text in between
```julia
y = x + 2
```
"""
extract_code_blocks(markdown_multiple)
# Output: ["x = 5", "y = x + 2"]
source
PromptingTools.extract_function_nameMethod
extract_function_name(code_block::String) -> Union{String, Nothing}

Extract the name of a function from a given Julia code block. The function searches for two patterns:

  • The explicit function declaration pattern: function name(...) ... end
  • The concise function declaration pattern: name(...) = ...

If a function name is found, it is returned as a string. If no function name is found, the function returns nothing.

Arguments

  • code_block::String: A string containing Julia code.

Returns

  • Union{String, Nothing}: The extracted function name or nothing if no name is found.
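The two declaration patterns can be matched with a short regex sketch (simplified; extract_fn_name is a hypothetical name, not the package's exact implementation):

```julia
function extract_fn_name(code::AbstractString)
    # Explicit declaration: function name(...)
    m = match(r"function\s+([A-Za-z_][A-Za-z0-9_!]*)\s*\(", code)
    # Concise declaration: name(...) = ...
    m === nothing && (m = match(r"^\s*([A-Za-z_][A-Za-z0-9_!]*)\s*\([^)]*\)\s*="m, code))
    return m === nothing ? nothing : String(m.captures[1])
end

extract_fn_name("function myFunction(a, b)\nend")  # "myFunction"
extract_fn_name("square(x) = x^2")                 # "square"
extract_fn_name("x = 1")                           # nothing
```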

Example

code = """
 function myFunction(arg1, arg2)
     # Function body
 end
 """
 extract_function_name(code)
# Output: "myFunction"
source
PromptingTools.finalize_outputsMethod
finalize_outputs(prompt::ALLOWED_PROMPT_TYPE, conv_rendered::Any,
     msg::Union{Nothing, AbstractMessage};
     return_all::Bool = false,
     dry_run::Bool = false,
     conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],
    kwargs...)

Finalizes the outputs of the ai* functions by either returning the conversation history or the last message.

Keyword arguments

  • return_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).
  • dry_run::Bool=false: If true, does not send the messages to the model, but only renders the prompt with the given schema and replacement variables. Useful for debugging when you want to check the specific schema rendering.
  • conversation::AbstractVector{<:AbstractMessage}=[]: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.
  • kwargs...: Variables to replace in the prompt template.
source
PromptingTools.function_call_signatureMethod
function_call_signature(datastructtype::Struct; max_description_length::Int = 100)

Extract the argument names, types and docstrings from a struct to create the function call signature in JSON schema.

You must provide a Struct type (not an instance of it) with some fields.

Note: Fairly experimental, but works for combination of structs, arrays, strings and singletons.

Tips

  • You can improve the quality of the extraction by writing a helpful docstring for your struct (or any nested struct). It will be provided as a description.

You can even include comments/descriptions about the individual fields.

  • All fields are assumed to be required, unless you allow null values (eg, ::Union{Nothing, Int}). Fields with Nothing will be treated as optional.
  • Missing values are ignored (eg, ::Union{Missing, Int} will be treated as Int). It's for broader compatibility and we cannot deserialize it as easily as Nothing.
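The Nothing/Missing handling above can be illustrated with a hedged sketch of how field types might map to JSON-schema entries (json_type, sketch_schema, and Person are hypothetical; the real function also handles nested structs, docstrings, etc.):

```julia
# Map a few Julia field types to JSON-schema type names
json_type(::Type{<:Integer}) = "integer"
json_type(::Type{<:AbstractFloat}) = "number"
json_type(::Type{<:AbstractString}) = "string"
json_type(::Type{Union{Nothing, T}}) where {T} = json_type(T)  # null-able => optional field

struct Person
    age::Int
    height::Union{Nothing, Float64}
end

function sketch_schema(T::Type)
    props = Dict(String(n) => json_type(t) for (n, t) in zip(fieldnames(T), fieldtypes(T)))
    # Fields that cannot hold Nothing are required, mirroring the rule above
    required = [String(n) for (n, t) in zip(fieldnames(T), fieldtypes(T)) if !(Nothing <: t)]
    return Dict("properties" => props, "required" => required)
end

sketch_schema(Person)  # age is required; height is an optional "number"
```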

Example

Do you want to extract some specific measurements from a text like age, weight and height? You need to define the information you need as a struct (return_type):

struct MyMeasurement
     age::Int
     height::Union{Int,Nothing}
     weight::Union{Nothing,Float64}
     measurements::Vector{MyMeasurement}
 end
 
Or if you want your extraction to fail gracefully when data isn't found, use the `MaybeExtract{T}` wrapper (inspired by the Instructor package!):

using PromptingTools: MaybeExtract

type = MaybeExtract{MyMeasurement}

Effectively the same as:

struct MaybeExtract{T}
    result::Union{T, Nothing}
    error::Bool // true if extraction failed, false otherwise
    message::Union{Nothing, String} // only present if no result is found; should be short and concise
end

If LLM extraction fails, it will return a Dict with error and message fields instead of the result!

msg = aiextract("Extract measurements from the text: I am giraffe", type)
msg.content
# Dict{Symbol, Any} with 2 entries:
#   :message => "Sorry, this feature is only available for humans."
#   :error   => true

That way, you can handle the error gracefully and get a reason why extraction failed.

source
PromptingTools.load_templates!Function
load_templates!(; remove_templates::Bool=true)

Loads templates from folder templates/ in the package root and stores them in TEMPLATE_STORE and TEMPLATE_METADATA.

Note: Automatically removes any existing templates and metadata from TEMPLATE_STORE and TEMPLATE_METADATA if remove_templates=true.

source
PromptingTools.ollama_apiMethod
ollama_api(prompt_schema::AbstractOllamaManagedSchema, prompt::AbstractString,
     system::Union{Nothing, AbstractString} = nothing,
     endpoint::String = "generate";
     model::String = "llama2", http_kwargs::NamedTuple = NamedTuple(),
     stream::Bool = false,
     url::String = "localhost", port::Int = 11434,
    kwargs...)

Simple wrapper for a call to Ollama API.

Keyword Arguments

  • prompt_schema: Defines which prompt template should be applied.
  • prompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage
  • system: An optional string representing the system message for the AI conversation. If not provided, a default message will be used.
  • endpoint: The API endpoint to call, only "generate" and "embeddings" are currently supported. Defaults to "generate".
  • model: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.
  • http_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. Defaults to empty NamedTuple.
  • stream: A boolean indicating whether to stream the response. Defaults to false.
  • url: The URL of the Ollama API. Defaults to "localhost".
  • port: The port of the Ollama API. Defaults to 11434.
  • kwargs: Prompt variables to be used to fill the prompt/template
source
PromptingTools.remove_julia_promptMethod
remove_julia_prompt(s::T) where {T<:AbstractString}

If it detects a Julia prompt (julia>), it removes it and all lines that do not have it (except for those that belong to the code block).
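A minimal sketch of the prompt-stripping idea (strip_julia_prompt is a hypothetical name; the real function also preserves code-block lines that lack a prompt):

```julia
# Keep only lines carrying the REPL prompt, with the prompt itself stripped;
# output lines and blanks (which have no prompt) are dropped
function strip_julia_prompt(s::AbstractString)
    kept = [replace(line, r"^julia> " => "")
            for line in split(s, '\n') if startswith(line, "julia> ")]
    return join(kept, '\n')
end

transcript = "julia> x = 1\n1\n\njulia> x + 1\n2"
strip_julia_prompt(transcript)  # "x = 1\nx + 1"
```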

source
PromptingTools.renderMethod
render(schema::AbstractOllamaManagedSchema,
     messages::Vector{<:AbstractMessage};
     conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],
    kwargs...)

Builds a history of the conversation to provide the prompt to the API. All unspecified kwargs are passed as replacements such that {{key}}=>value in the template.

Note: Due to its "managed" nature, at most 2 messages can be provided (system and prompt inputs in the API).

Keyword Arguments

  • conversation: Not allowed for this schema. Provided only for compatibility.
source
PromptingTools.renderMethod
render(schema::AbstractOpenAISchema,
     messages::Vector{<:AbstractMessage};
     image_detail::AbstractString = "auto",
     conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],
    kwargs...)

Builds a history of the conversation to provide the prompt to the API. All unspecified kwargs are passed as replacements such that {{key}}=>value in the template.

Keyword Arguments

  • image_detail: Only for UserMessageWithImages. It represents the level of detail to include for images. Can be "auto", "high", or "low".
  • conversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.
source
PromptingTools.renderMethod
render(schema::NoSchema,
     messages::Vector{<:AbstractMessage};
     conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],
    replacement_kwargs...)

Renders a conversation history from a vector of messages with all replacement variables specified in replacement_kwargs.

It is the first pass of the prompt rendering system, and is used by all other schemas.

Keyword Arguments

  • image_detail: Only for UserMessageWithImages. It represents the level of detail to include for images. Can be "auto", "high", or "low".
  • conversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.

Notes

  • All unspecified kwargs are passed as replacements such that {{key}}=>value in the template.
  • If a SystemMessage is missing, we inject a default one at the beginning of the conversation.
  • Only one SystemMessage is allowed (i.e., you cannot mix two conversations with different system prompts).
source
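Since NoSchema rendering is the first pass shared by all schemas, a minimal usage sketch may help (assuming NoSchema and the message types are accessible via the PromptingTools module):

```julia
using PromptingTools
const PT = PromptingTools

messages = [
    PT.SystemMessage("You are a helpful {{role}}."),
    PT.UserMessage("Say hi to {{name}}!"),
]
# First-pass render: every {{key}} placeholder is replaced by the
# matching keyword argument.
rendered = PT.render(PT.NoSchema(), messages; role = "poet", name = "Jose")
# The returned conversation now contains the substituted texts,
# e.g. the user message reads "Say hi to Jose!"
```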
PromptingTools.replace_wordsMethod
replace_words(text::AbstractString, words::Vector{<:AbstractString}; replacement::AbstractString="ABC")

Replace all occurrences of words in words with replacement in text. Useful to quickly remove specific names or entities from a text.

Arguments

  • text::AbstractString: The text to be processed.
  • words::Vector{<:AbstractString}: A vector of words to be replaced.
  • replacement::AbstractString="ABC": The replacement string to be used. Defaults to "ABC".

Example

text = "Disney is a great company"
 replace_words(text, ["Disney", "Snow White", "Mickey Mouse"])
# Output: "ABC is a great company"
source
PromptingTools.split_by_lengthMethod
split_by_length(text::String; separator::String=" ", max_length::Int=35000) -> Vector{String}

Split a given string text into chunks of a specified maximum length max_length. This is particularly useful for splitting larger documents or texts into smaller segments, suitable for models or systems with smaller context windows.

Arguments

  • text::String: The text to be split.
  • separator::String=" ": The separator used to split the text into minichunks. Defaults to a space character.
  • max_length::Int=35000: The maximum length of each chunk. Defaults to 35,000 characters, which should fit within a 16K-token context window.

Returns

Vector{String}: A vector of strings, each representing a chunk of the original text that is smaller than or equal to max_length.

Notes

  • The function ensures that each chunk is as close to max_length as possible without exceeding it.
  • If the text is empty, the function returns an empty array.
  • The separator is re-added to the text chunks after splitting, preserving the original structure of the text as closely as possible.

Examples

Splitting text with the default separator (" "):

text = "Hello world. How are you?"
 chunks = split_by_length(text; max_length=13)
 length(chunks) # Output: 2

Using a custom separator and custom max_length

text = "Hello,World," ^ 2900 # length 34900 chars
 chunks = split_by_length(text; separator=",", max_length=10000) # for a 4K context window
 length(chunks) # Output: 4
source
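As a quick sanity check of the chunking guarantees above, a hedged sketch (the function is assumed accessible via the PromptingTools module):

```julia
using PromptingTools
const PT = PromptingTools

# A long synthetic document, split on sentence boundaries.
text = repeat("All work and no play makes Jack a dull boy. ", 500)
chunks = PT.split_by_length(text; separator = ". ", max_length = 4000)

# Each chunk respects the limit, and the separator is re-added,
# so the chunks concatenate back to (approximately) the original text.
@assert all(length(c) <= 4000 for c in chunks)
```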
PromptingTools.@aai_strMacro
aai"user_prompt"[model_alias] -> AIMessage

Asynchronous version of @ai_str macro, which will log the result once it's ready.

Example

Send an asynchronous request to GPT-4 so that we don't have to wait for the response. Very practical with slow models, as you can keep working in the meantime.

m = aai"Say Hi!"gpt4;

...with some delay...

[ Info: Tokens: 29 @ Cost: 0.0011 in 2.7 seconds

[ Info: AIMessage> Hello! How can I assist you today?

source
PromptingTools.@ai_strMacro
ai"user_prompt"[model_alias] -> AIMessage

The ai"" string macro generates an AI response to a given prompt by using aigenerate under the hood.

Arguments

  • user_prompt (String): The input prompt for the AI model.
  • model_alias (optional, any): Provide model alias of the AI model (see MODEL_ALIASES).

Returns

AIMessage corresponding to the input prompt.

Example

result = ai"Hello, how are you?"
 # AIMessage("Hello! I'm an AI assistant, so I don't have feelings, but I'm here to help you. How can I assist you today?")

If you want to interpolate some variables or additional context, simply use string interpolation:

a=1
 result = ai"What is `$a+$a`?"
 # AIMessage("The sum of `1+1` is `2`.")

If you want to use a different model, eg, GPT-4, you can provide its alias as a flag:

result = ai"What is `1.23 * 100 + 1`?"gpt4
# AIMessage("The answer is 124.")
source
diff --git a/dev/search_index.js b/dev/search_index.js index e1a5c568a..146049a22 100644 --- a/dev/search_index.js +++ b/dev/search_index.js @@ -1,3 +1,3 @@ var documenterSearchIndex = {"docs": -[{"location":"getting_started/#Getting-Started","page":"Getting Started","title":"Getting Started","text":"","category":"section"},{"location":"getting_started/#Prerequisites","page":"Getting Started","title":"Prerequisites","text":"","category":"section"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"OpenAI API key saved in the environment variable OPENAI_API_KEY","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"You will need to register with OpenAI and generate an API key:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Create an account with OpenAI\nGo to API Key page\nClick on “Create new secret key”","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"!!! 
Do not share it with anyone and do NOT save it to any files that get synced online.","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Resources:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"OpenAI Documentation\nVisual tutorial","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"You will need to set this key as an environment variable before using PromptingTools.jl:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"For a quick start, simply set it via ENV[\"OPENAI_API_KEY\"] = \"your-api-key\" Alternatively, you can:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"set it in the terminal before launching Julia: export OPENAI_API_KEY = \nset it in your setup.jl (make sure not to commit it to GitHub!)","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Make sure to start Julia from the same terminal window where you set the variable. 
Easy check in Julia, run ENV[\"OPENAI_API_KEY\"] and you should see your key!","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"For other options or more robust solutions, see the FAQ section.","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Resources: ","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"OpenAI Guide","category":"page"},{"location":"getting_started/#Installation","page":"Getting Started","title":"Installation","text":"","category":"section"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"PromptingTools can be installed using the following commands:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"using Pkg\nPkg.add(\"PromptingTools.jl\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Throughout the rest of this tutorial, we will assume that you have installed the PromptingTools package and have already typed using PromptingTools to bring all of the relevant variables into your current namespace.","category":"page"},{"location":"getting_started/#Quick-Start-with-@ai_str","page":"Getting Started","title":"Quick Start with @ai_str","text":"","category":"section"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"The easiest start is the @ai_str macro. 
Simply type ai\"your prompt\" and you will get a response from the default model (GPT-3.5 Turbo).","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"ai\"What is the capital of France?\"","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"[ Info: Tokens: 31 @ Cost: $0.0 in 1.5 seconds --> Be in control of your spending! \nAIMessage(\"The capital of France is Paris.\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Returned object is a light wrapper with generated message in field :content (eg, ans.content) for additional downstream processing.","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"You can easily inject any variables with string interpolation:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"country = \"Spain\"\nai\"What is the capital of \\$(country)?\"","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"[ Info: Tokens: 32 @ Cost: $0.0001 in 0.5 seconds\nAIMessage(\"The capital of Spain is Madrid.\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Pro tip: Use after-string-flags to select the model to be called, eg, ai\"What is the capital of France?\"gpt4 (use gpt4t for the new GPT-4 Turbo model). 
Great for those extra hard questions!","category":"page"},{"location":"getting_started/#Using-aigenerate-with-placeholders","page":"Getting Started","title":"Using aigenerate with placeholders","text":"","category":"section"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"For more complex prompt templates, you can use handlebars-style templating and provide variables as keyword arguments:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"msg = aigenerate(\"What is the capital of {{country}}? Is the population larger than {{population}}?\", country=\"Spain\", population=\"1M\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"[ Info: Tokens: 74 @ Cost: $0.0001 in 1.3 seconds\nAIMessage(\"The capital of Spain is Madrid. And yes, the population of Madrid is larger than 1 million. As of 2020, the estimated population of Madrid is around 3.3 million people.\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Pro tip: Use asyncmap to run multiple AI-powered tasks concurrently.","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Pro tip: If you use slow models (like GPT-4), you can use async version of @ai_str -> @aai_str to avoid blocking the REPL, eg, aai\"Say hi but slowly!\"gpt4","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"For more practical examples, see the Various Examples section.","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"EditURL = \"../../../examples/working_with_ollama.jl\"","category":"page"},{"location":"examples/working_with_ollama/#Local-models-with-Ollama.ai","page":"Local models with Ollama.ai","title":"Local 
models with Ollama.ai","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"This file contains examples of how to work with Ollama.ai models. It assumes that you've already installed and launched the Ollama server. Quick check: open the following website in your browser http://127.0.0.1:11434/ and you should see the message \"Ollama is running\". For more details or troubleshooting advice, see the Frequently Asked Questions section.","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"First, let's import the package and define a helper link for calling un-exported functions:","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"using PromptingTools\nconst PT = PromptingTools","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"PromptingTools","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Notice the schema change! 
If you want this to be the new default, you need to change PT.PROMPT_SCHEMA","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"schema = PT.OllamaManagedSchema()","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"OllamaManagedSchema()","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"You can choose models from https://ollama.ai/library - I prefer openhermes2.5-mistral","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"model = \"openhermes2.5-mistral\"","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"\"openhermes2.5-mistral\"","category":"page"},{"location":"examples/working_with_ollama/#Setting-Ollama-as-a-default-LLM","page":"Local models with Ollama.ai","title":"Setting Ollama as a default LLM","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"We need to change the global variables for PROMPT_SCHEMA and default models","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"using PromptingTools\nconst PT = PromptingTools\n\n\nPT.PROMPT_SCHEMA = PT.OllamaManagedSchema()\nPT.MODEL_CHAT = \"openhermes2.5-mistral\"\n# You could do the same for PT.MODEL_EMBEDDING","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"We can also add a nicer alias for the above Mistral 
model","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"PT.MODEL_ALIASES[\"mistral\"]= \"openhermes2.5-mistral\"\n# potentially also yi 34bn if you want a bigger more powerful model\nPT.MODEL_ALIASES[\"yi\"]= \"yi:34b-chat\"","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Now, we can use the @ai_str macro with Ollama models:","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"ai\"Say hi to me!\" # defaults to mistral because we set MODEL_CHAT above\nai\"Say hi to me in Chinese!\"yi # defaults to yi 34Bn model","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Note: Another quite popular model is zephyr:7b-beta","category":"page"},{"location":"examples/working_with_ollama/#Text-Generation-with-aigenerate","page":"Local models with Ollama.ai","title":"Text Generation with aigenerate","text":"","category":"section"},{"location":"examples/working_with_ollama/#Simple-message","page":"Local models with Ollama.ai","title":"Simple message","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg = aigenerate(schema, \"Say hi!\"; model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"AIMessage(\"Hi there! How can I help you today? 
If you have any questions or need assistance, please feel free to ask.\")","category":"page"},{"location":"examples/working_with_ollama/#Standard-string-interpolation","page":"Local models with Ollama.ai","title":"Standard string interpolation","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"a = 1\nmsg = aigenerate(schema, \"What is `$a+$a`?\"; model)\n\nname = \"John\"\nmsg = aigenerate(schema, \"Say hi to {{name}}.\"; name, model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"AIMessage(\"Hi there, John! It's great to see you today. How can I assist you? If you have any questions or need help with something, please don't hesitate to ask!\")","category":"page"},{"location":"examples/working_with_ollama/#Advanced-Prompts","page":"Local models with Ollama.ai","title":"Advanced Prompts","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"conversation = [\n PT.SystemMessage(\"You're master Yoda from Star Wars trying to help the user become a Yedi.\"),\n PT.UserMessage(\"I have feelings for my iPhone. What should I do?\")]\nmsg = aigenerate(schema, conversation; model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"AIMessage(\"Strong your feelings are, but attachments lead to suffering they often do. Focus on the balance in all things and let go of possessions that cloud your judgment. Embrace the wisdom of the Force and understand that material objects are not the same as love. 
The Force will guide you.\")","category":"page"},{"location":"examples/working_with_ollama/#Embeddings-with-aiembed","page":"Local models with Ollama.ai","title":"Embeddings with aiembed","text":"","category":"section"},{"location":"examples/working_with_ollama/#Simple-embedding-for-one-document","page":"Local models with Ollama.ai","title":"Simple embedding for one document","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg = aiembed(schema, \"Embed me\"; model) # access msg.content","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"DataMessage(JSON3.Array{Float64, Vector{UInt8}, SubArray{UInt64, 1, Vector{UInt64}, Tuple{UnitRange{Int64}}, true}} of size (4096,))","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"One document and we materialize the data into a Vector with copy (postprocess function argument)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg = aiembed(schema, \"Embed me\", copy; model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"DataMessage(Vector{Float64} of size (4096,))","category":"page"},{"location":"examples/working_with_ollama/#Multiple-documents-embedding","page":"Local models with Ollama.ai","title":"Multiple documents embedding","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Multiple documents - embedded sequentially, you can get faster speed with async","category":"page"},{"location":"examples/working_with_ollama/","page":"Local 
models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg = aiembed(schema, [\"Embed me\", \"Embed me\"]; model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"DataMessage(Matrix{Float64} of size (4096, 2))","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"You can use Threads.@spawn or asyncmap, whichever you prefer, to paralellize the model calls","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"docs = [\"Embed me\", \"Embed me\"]\ntasks = asyncmap(docs) do doc\n msg = aiembed(schema, doc; model)\nend\nembedding = mapreduce(x -> x.content, hcat, tasks)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"4096×2 Matrix{Float64}:\n 7.71459 7.71459\n -1.14532 -1.14532\n 2.90205 2.90205\n -4.01967 -4.01967\n -7.73098 -7.73098\n 8.02114 8.02114\n -6.01313 -6.01313\n -2.06712 -2.06712\n 4.97633 4.97633\n -9.69502 -9.69502\n -0.02567 -0.02567\n 8.09622 8.09622\n 6.54008 6.54008\n -5.70348 -5.70348\n 2.55213 2.55213\n -2.00164 -2.00164\n -2.21854 -2.21854\n -3.6568 -3.6568\n 3.97905 3.97905\n -1.79931 -1.79931\n 0.0769786 0.0769786\n -10.4355 -10.4355\n -3.92487 -3.92487\n -6.03455 -6.03455\n -2.8005 -2.8005\n 2.23584 2.23584\n -0.503125 -0.503125\n 1.99538 1.99538\n -0.283642 -0.283642\n -0.414273 -0.414273\n 8.72909 8.72909\n 2.6071 2.6071\n 0.0808531 0.0808531\n -1.83914 -1.83914\n 2.19998 2.19998\n -0.629226 -0.629226\n 3.74217 3.74217\n 1.71231 1.71231\n -0.742473 -0.742473\n 2.9234 2.9234\n 7.33933 7.33933\n 4.24576 4.24576\n -7.56434 -7.56434\n -1.22274 -1.22274\n 1.73444 1.73444\n -0.736801 -0.736801\n 1.30149 1.30149\n -6.91642 -6.91642\n -1.84513 -1.84513\n 1.69959 
1.69959\n 5.74253 5.74253\n 1.48734 1.48734\n -1.45199 -1.45199\n -18.5026 -18.5026\n -8.61009 -8.61009\n -2.21845 -2.21845\n -4.22932 -4.22932\n 6.0436 6.0436\n -1.8824 -1.8824\n -0.689965 -0.689965\n 0.845927 0.845927\n -1.99517 -1.99517\n 9.32292 9.32292\n 6.24938 6.24938\n -4.59894 -4.59894\n 6.24579 6.24579\n -5.8733 -5.8733\n -4.60285 -4.60285\n -1.27596 -1.27596\n -1.68807 -1.68807\n -0.391147 -0.391147\n -2.68362 -2.68362\n 1.99197 1.99197\n 0.0812396 0.0812396\n -3.79761 -3.79761\n -8.5693 -8.5693\n -0.869305 -0.869305\n -0.77582 -0.77582\n -4.76995 -4.76995\n 1.9712 1.9712\n 4.74459 4.74459\n -4.31244 -4.31244\n 3.94876 3.94876\n -11.0882 -11.0882\n 9.38629 9.38629\n 10.2995 10.2995\n 2.40846 2.40846\n -3.91429 -3.91429\n -0.745707 -0.745707\n 4.31946 4.31946\n 8.34836 8.34836\n 0.857636 0.857636\n 1.66563 1.66563\n -11.1522 -11.1522\n -3.48353 -3.48353\n -6.08336 -6.08336\n 1.22086 1.22086\n -2.81636 -2.81636\n 1.07224 1.07224\n -8.24909 -8.24909\n 3.66474 3.66474\n -0.260558 -0.260558\n 2.38779 2.38779\n -4.00576 -4.00576\n 1.3949 1.3949\n -5.43468 -5.43468\n 4.08836 4.08836\n -1.1134 -1.1134\n -2.05916 -2.05916\n -9.78987 -9.78987\n -2.86149 -2.86149\n 5.54577 5.54577\n -1.96682 -1.96682\n 9.70577 9.70577\n -4.0553 -4.0553\n 8.54535 8.54535\n 0.539438 0.539438\n 4.61091 4.61091\n -5.32208 -5.32208\n -0.256733 -0.256733\n 4.74966 4.74966\n -2.46464 -2.46464\n -0.223077 -0.223077\n 1.84442 1.84442\n 6.42329 6.42329\n 0.431667 0.431667\n -8.42777 -8.42777\n -10.691 -10.691\n 3.023 3.023\n -5.65345 -5.65345\n -4.17833 -4.17833\n 0.937893 0.937893\n -6.99405 -6.99405\n -4.55107 -4.55107\n -15.3169 -15.3169\n -2.08895 -2.08895\n 7.17826 7.17826\n -4.26108 -4.26108\n -3.2712 -3.2712\n 16.1561 16.1561\n 13.5164 13.5164\n -5.91778 -5.91778\n 6.3401 6.3401\n 12.7018 12.7018\n 2.04305 2.04305\n 3.81683 3.81683\n -1.39969 -1.39969\n -0.17249 -0.17249\n -16.3687 -16.3687\n 4.3827 4.3827\n 2.58974 2.58974\n -4.75363 -4.75363\n 3.36371 3.36371\n 0.986534 0.986534\n 
-13.4299 -13.4299\n -12.7188 -12.7188\n ⋮ ⋮\n 
-9.99271\n -15.4753 -15.4753\n 4.38711 4.38711\n -5.41127 -5.41127\n -1.06491 -1.06491\n 1.09245 1.09245\n -1.33961 -1.33961\n -4.42681 -4.42681\n -4.44164 -4.44164\n -1.80772 -1.80772\n -5.06035 -5.06035\n 0.197369 0.197369\n 7.27798 7.27798\n -6.88382 -6.88382\n 3.21319 3.21319\n 8.04111 8.04111\n -3.94107 -3.94107\n 1.79716 1.79716\n -0.2134 -0.2134\n 1.36955 1.36955\n 13.7009 13.7009\n -7.3497 -7.3497\n 1.80078 1.80078\n 4.25352 4.25352\n -2.80092 -2.80092\n -3.81295 -3.81295\n -4.92036 -4.92036\n 0.856001 0.856001\n -1.26696 -1.26696\n 2.65207 2.65207\n -1.01876 -1.01876\n 1.50837 1.50837\n -11.5335 -11.5335\n 5.80989 5.80989\n 2.45606 2.45606\n 1.64394 1.64394\n 2.73651 2.73651\n -11.1653 -11.1653\n -1.66359 -1.66359\n -0.0317267 -0.0317267\n 0.115458 0.115458\n 4.43585 4.43585\n 1.24902 1.24902\n 7.30894 7.30894\n 16.7814 16.7814\n -0.456154 -0.456154\n -3.94033 -3.94033\n -4.4947 -4.4947\n -2.52048 -2.52048\n 0.0890704 0.0890704\n -4.66338 -4.66338\n 3.88142 3.88142\n 2.35984 2.35984\n 4.84037 4.84037\n 6.95444 6.95444\n 2.74408 2.74408\n -3.23958 -3.23958\n -0.467292 -0.467292\n 6.26367 6.26367\n -1.50588 -1.50588\n 4.13389 4.13389\n -2.53819 -2.53819\n -4.4987 -4.4987\n -10.3487 -10.3487\n -14.8297 -14.8297\n -8.48112 -8.48112\n 3.95155 3.95155\n 1.2289 1.2289\n -4.38025 -4.38025\n -0.61687 -0.61687\n 10.8511 10.8511\n 1.15556 1.15556\n -2.19768 -2.19768\n -7.66931 -7.66931\n 4.72919 4.72919\n -7.6738 -7.6738\n -0.688528 -0.688528\n 4.74928 4.74928\n 4.92126 4.92126\n 0.897546 0.897546\n 3.85735 3.85735\n 0.201364 0.201364\n -5.62425 -5.62425\n -3.83117 -3.83117\n 4.05866 4.05866\n 3.10063 3.10063\n 2.5224 2.5224\n -1.51274 -1.51274\n -0.683338 -0.683338\n -3.23147 -3.23147\n -4.21268 -4.21268\n -2.21401 -2.21401\n 1.57887 1.57887\n 0.848257 0.848257\n -5.83704 -5.83704\n -7.00011 -7.00011\n 3.16884 3.16884\n -4.44161 -4.44161\n -7.62482 -7.62482\n -0.266943 -0.266943\n 0.41761 0.41761\n -7.45144 -7.45144\n -0.211132 -0.211132\n 0.276707 0.276707\n 
16.7781 16.7781\n 0.689757 0.689757\n -3.04049 -3.04049\n 2.91684 2.91684\n 1.97161 1.97161\n 3.7721 3.7721\n -1.60698 -1.60698\n -4.18868 -4.18868\n 7.66491 7.66491\n -0.64664 -0.64664\n -0.660623 -0.660623\n 8.68174 8.68174\n 0.282074 0.282074\n -2.85266 -2.85266\n -1.91293 -1.91293\n 7.18736 7.18736\n -10.3875 -10.3875\n -1.91603 -1.91603\n 6.29739 6.29739\n -0.0375388 -0.0375388\n -1.60576 -1.60576\n -3.22148 -3.22148\n -4.24549 -4.24549\n 1.30822 1.30822\n 2.52307 2.52307\n 0.403345 0.403345\n -0.744478 -0.744478\n 2.41241 2.41241\n -4.58098 -4.58098\n -0.791842 -0.791842\n 3.73626 3.73626\n -1.43002 -1.43002\n 4.30716 4.30716\n 3.30255 3.30255\n -4.08011 -4.08011\n -5.07282 -5.07282\n -1.54759 -1.54759\n -2.2305 -2.2305\n 6.8791 6.8791\n 9.7396 9.7396\n -6.50395 -6.50395\n 3.57178 3.57178\n 7.08987 7.08987\n 6.2669 6.2669\n 5.87329 5.87329\n 2.36823 2.36823\n -6.16 -6.16\n 1.96238 1.96238\n 7.31651 7.31651\n -1.5257 -1.5257\n -2.89061 -2.89061\n 0.407546 0.407546\n 5.10645 5.10645\n 11.0716 11.0716\n 4.7443 4.7443\n -8.77353 -8.77353\n -0.631177 -0.631177\n -4.36973 -4.36973\n 1.48666 1.48666\n 7.7678 7.7678\n -2.65407 -2.65407\n 4.56869 4.56869\n -0.541163 -0.541163\n 2.89543 2.89543\n 5.39424 5.39424\n -3.62954 -3.62954\n 3.77547 3.77547\n -5.96886 -5.96886\n -4.38947 -4.38947\n -2.96756 -2.96756\n 2.28222 2.28222\n -1.08489 -1.08489\n 1.74726 1.74726\n -3.46088 -3.46088\n 11.9371 11.9371\n -5.02359 -5.02359\n 2.51632 2.51632\n -0.0297022 -0.0297022\n -2.60011 -2.60011\n 0.254202 0.254202\n 9.7949 9.7949\n 3.64937 3.64937\n 10.0857 10.0857\n -5.36637 -5.36637\n 4.11127 4.11127\n 8.90571 8.90571\n -5.97219 -5.97219\n -7.21379 -7.21379\n -5.01561 -5.01561\n 2.98616 2.98616\n 1.99064 1.99064\n 0.16465 0.16465\n -4.07902 -4.07902\n 4.34018 4.34018\n -2.13528 -2.13528\n 2.39903 2.39903\n 4.00804 4.00804\n -1.85741 -1.85741\n -7.73083 -7.73083\n -4.21139 -4.21139\n 4.65743 4.65743\n 0.963549 0.963549\n 0.29506 0.29506\n 6.05798 6.05798\n 12.4428 12.4428\n 
-0.398651 -0.398651\n -0.584559 -0.584559\n 2.75445 2.75445\n -0.207975 -0.207975\n 6.11926 6.11926\n -8.66125 -8.66125\n 3.07568 3.07568\n -3.19358 -3.19358\n -2.53024 -2.53024\n 14.1187 14.1187\n -0.412049 -0.412049\n 12.5809 12.5809\n 6.26236 6.26236\n 5.23037 5.23037\n -0.11356 -0.11356\n -6.62321 -6.62321\n -1.29651 -1.29651\n -1.48734 -1.48734\n 13.0753 13.0753\n 4.21767 4.21767\n -2.4425 -2.4425\n -0.0901323 -0.0901323\n 9.79684 9.79684\n 4.74522 4.74522\n -3.34804 -3.34804\n 7.37816 7.37816\n 2.57938 2.57938\n 1.92968 1.92968\n 3.75166 3.75166\n 5.0617 5.0617\n 8.74324 8.74324\n -0.93703 -0.93703\n -1.36031 -1.36031\n -2.5439 -2.5439\n 1.56784 1.56784\n 2.56237 2.56237\n -1.02578 -1.02578\n 6.62085 6.62085\n 7.69745 7.69745\n 6.26864 6.26864\n -4.20046 -4.20046\n -2.30926 -2.30926\n 2.74598 2.74598\n 4.11078 4.11078\n 2.8455 2.8455\n -3.45407 -3.45407\n 2.82327 2.82327\n -1.00356 -1.00356\n 8.85974 8.85974\n 6.35864 6.35864\n -1.59146 -1.59146\n -0.361996 -0.361996\n -1.25198 -1.25198\n 8.2867 8.2867\n 0.981644 0.981644\n 2.68003 2.68003\n 1.10236 1.10236\n -1.63423 -1.63423\n -2.79552 -2.79552\n -6.5718 -6.5718\n -0.257779 -0.257779\n -4.49325 -4.49325\n 5.0455 5.0455\n 14.4508 14.4508\n 3.60407 3.60407\n 3.09003 3.09003\n -8.32962 -8.32962\n -1.41178 -1.41178\n 12.5777 12.5777\n -2.01342 -2.01342\n -1.48205 -1.48205\n 0.967158 0.967158\n -0.532548 -0.532548\n -5.23274 -5.23274\n -1.49702 -1.49702\n 0.739607 0.739607\n 3.49171 3.49171\n -1.0507 -1.0507\n -7.48299 -7.48299\n 7.57395 7.57395\n -3.04813 -3.04813\n 16.322 16.322\n 7.81441 7.81441\n -3.41529 -3.41529\n 2.05401 2.05401\n 1.08232 1.08232\n 12.5735 12.5735\n 0.126572 0.126572\n -6.92158 -6.92158\n -1.4651 -1.4651\n -3.19425 -3.19425\n -1.44093 -1.44093\n -3.82056 -3.82056\n 6.72914 6.72914\n -5.46583 -5.46583\n -1.43396 -1.43396\n 7.42164 7.42164\n 1.00438 1.00438\n -0.41415 -0.41415\n -2.54987 -2.54987\n 6.88491 6.88491\n 3.84807 3.84807\n -5.62245 -5.62245\n 5.24133 5.24133\n 7.99514 7.99514\n 
-2.51593 -2.51593\n 8.19568 8.19568\n 0.854985 0.854985\n -6.20478 -6.20478\n -2.58235 -2.58235\n -6.51346 -6.51346\n 12.8877 12.8877\n 8.6194 8.6194\n -6.82669 -6.82669\n -4.67379 -4.67379\n 8.13137 8.13137\n 0.733511 0.733511\n 5.66079 5.66079\n -2.94337 -2.94337\n -3.29462 -3.29462\n -6.3809 -6.3809\n -1.85613 -1.85613\n 0.635069 0.635069\n 0.432626 0.432626\n -14.6426 -14.6426\n 8.05825 8.05825\n 6.50637 6.50637\n 1.44014 1.44014\n -4.60602 -4.60602\n -6.49137 -6.49137\n 6.33163 6.33163\n -1.97616 -1.97616\n 0.573379 0.573379\n -2.78039 -2.78039\n -0.140087 -0.140087\n 1.52619 1.52619\n 6.83379 6.83379\n -0.197981 -0.197981\n -3.00849 -3.00849\n -2.09725 -2.09725\n -2.06883 -2.06883\n -0.328198 -0.328198\n -0.212338 -0.212338\n 5.4425 5.4425\n 6.48574 6.48574\n 2.00073 2.00073\n -3.15642 -3.15642\n -0.0673389 -0.0673389\n -4.19911 -4.19911\n 4.5466 4.5466\n 3.73221 3.73221\n -1.01059 -1.01059\n -4.29015 -4.29015\n 4.9909 4.9909\n 3.22397 3.22397\n -1.27984 -1.27984\n 2.83358 2.83358\n 2.25695 2.25695\n 7.2879 7.2879\n -1.47955 -1.47955\n 12.7627 12.7627\n -3.72449 -3.72449\n 3.97719 3.97719\n 14.2197 14.2197\n -1.24031 -1.24031\n -7.41824 -7.41824\n 1.90207 1.90207\n 1.10939 1.10939\n -7.47202 -7.47202\n 3.85738 3.85738\n -4.12085 -4.12085\n 1.12097 1.12097\n -0.545646 -0.545646\n 3.04129 3.04129\n 1.05043 1.05043\n 0.993448 0.993448\n -5.78424 -5.78424\n -1.97199 -1.97199\n -5.74806 -5.74806\n 2.70835 2.70835\n -8.09729 -8.09729\n -6.36035 -6.36035\n -1.24361 -1.24361\n -2.44813 -2.44813\n 7.48353 7.48353\n 2.0202 2.0202\n 3.04366 3.04366\n -3.98778 -3.98778\n 4.80106 4.80106\n 0.926552 0.926552\n 3.35253 3.35253\n -4.10577 -4.10577\n -3.57853 -3.57853\n 4.03372 4.03372\n -2.38792 -2.38792\n 0.12177 0.12177\n -0.761671 -0.761671\n -4.25652 -4.25652\n 7.27933 7.27933\n 0.165182 0.165182\n 1.34367 1.34367\n -7.36923 -7.36923\n 2.38548 2.38548\n 0.117217 0.117217\n 2.02002 2.02002\n -4.60023 -4.60023\n 2.78 2.78\n -1.34604 -1.34604\n 4.7234 4.7234\n 7.37673 
7.37673\n 2.07986 2.07986\n -5.72573 -5.72573\n -6.66143 -6.66143\n 2.43072 2.43072\n 1.34782 1.34782\n -0.114238 -0.114238\n 2.32103 2.32103\n 1.84042 1.84042\n 1.07005 1.07005\n 3.88182 3.88182\n -0.752264 -0.752264\n -2.43517 -2.43517\n -5.29216 -5.29216\n -0.13527 -0.13527\n 1.40188 1.40188\n -5.87815 -5.87815\n -1.90167 -1.90167\n 2.88562 2.88562\n -2.29028 -2.29028\n 2.35477 2.35477\n -3.50731 -3.50731\n 6.0621 6.0621\n 3.2011 3.2011\n 2.19115 2.19115\n -3.03557 -3.03557\n -8.49394 -8.49394\n 0.936501 0.936501\n 7.19188 7.19188\n 4.50162 4.50162\n 0.341394 0.341394\n 2.54484 2.54484\n 1.67305 1.67305\n 3.05008 3.05008\n -2.0266 -2.0266\n 7.28431 7.28431\n -7.70924 -7.70924\n 2.60851 2.60851\n 6.8054 6.8054\n 1.8878 1.8878\n 1.87624 1.87624\n -5.13611 -5.13611\n -3.23698 -3.23698\n 4.03201 4.03201\n -5.27165 -5.27165\n -4.95817 -4.95817\n -0.200461 -0.200461\n 4.27259 4.27259\n 0.449661 0.449661\n 7.49752 7.49752\n -5.47923 -5.47923\n -2.40934 -2.40934\n 25.0066 25.0066\n -3.14511 -3.14511\n -1.62587 -1.62587\n -1.67652 -1.67652\n -2.17888 -2.17888\n 2.37296 2.37296\n -4.41408 -4.41408\n 0.65204 0.65204\n 10.849 10.849\n -2.3021 -2.3021\n 2.20417 2.20417\n 10.0579 10.0579\n -4.03489 -4.03489\n 7.60982 7.60982\n -5.74951 -5.74951\n -2.97582 -2.97582\n -8.61382 -8.61382\n -1.90903 -1.90903\n -3.64556 -3.64556\n -16.2304 -16.2304\n -15.9793 -15.9793\n -4.59448 -4.59448\n -2.67688 -2.67688\n -1.67148 -1.67148\n 5.57026 5.57026\n 0.846445 0.846445\n -7.54149 -7.54149\n -3.61401 -3.61401\n 4.03723 4.03723\n 0.711821 0.711821\n 8.99009 8.99009\n -6.15866 -6.15866\n -1.36865 -1.36865\n -4.31058 -4.31058\n 6.31659 6.31659\n -6.23773 -6.23773\n 0.857388 0.857388\n 3.6152 3.6152\n -1.28774 -1.28774\n -4.92094 -4.92094\n 3.08527 3.08527\n -5.74582 -5.74582\n -4.20897 -4.20897\n -5.19406 -5.19406\n -4.06851 -4.06851\n 5.73867 5.73867\n 3.32767 3.32767\n -11.2588 -11.2588\n -7.94126 -7.94126\n 5.38746 5.38746\n -0.0253579 -0.0253579\n -1.7856 -1.7856\n -1.31209 -1.31209\n 
6.85519 6.85519\n 2.71496 2.71496\n -2.58838 -2.58838\n -6.86996 -6.86996\n 1.01204 1.01204\n 3.43433 3.43433\n -0.249192 -0.249192\n 7.96322 7.96322\n 14.3414 14.3414\n 2.44774 2.44774\n 4.73731 4.73731\n -9.14288 -9.14288\n 2.70325 2.70325\n 6.48202 6.48202\n -2.58391 -2.58391\n -4.52079 -4.52079\n -0.64105 -0.64105\n -3.75531 -3.75531\n -3.93321 -3.93321\n -2.5879 -2.5879\n 2.34697 2.34697\n -3.89721 -3.89721\n -1.60712 -1.60712\n -7.49452 -7.49452\n -0.518596 -0.518596\n 0.996693 0.996693\n 2.83468 2.83468\n -6.19363 -6.19363\n -7.25683 -7.25683\n 0.391546 0.391546\n -7.52756 -7.52756\n -0.810817 -0.810817\n -2.64942 -2.64942\n -2.95081 -2.95081\n -6.34989 -6.34989\n 3.9961 3.9961\n 1.36755 1.36755\n -0.335808 -0.335808\n -11.7919 -11.7919\n 1.16904 1.16904\n 6.26031 6.26031\n -4.68064 -4.68064\n 5.55008 5.55008\n 3.65873 3.65873\n -3.95177 -3.95177\n 7.62708 7.62708\n -2.4932 -2.4932\n -0.713266 -0.713266\n 6.76214 6.76214\n -0.802523 -0.802523\n -0.327543 -0.327543\n -6.9053 -6.9053\n -2.69604 -2.69604\n 9.729 9.729\n -7.61691 -7.61691\n -0.658653 -0.658653\n 1.62531 1.62531\n 0.532107 0.532107\n 1.71729 1.71729\n -10.1795 -10.1795\n 5.54208 5.54208\n 4.02502 4.02502\n -1.47596 -1.47596\n 11.818 11.818\n 4.40414 4.40414\n 5.64827 5.64827\n 5.89386 5.89386\n -6.19187 -6.19187\n 4.77889 4.77889\n -0.261731 -0.261731\n -0.570525 -0.570525\n 3.80941 3.80941\n -3.95414 -3.95414\n 0.642971 0.642971\n -7.23493 -7.23493\n 0.744423 0.744423\n 11.5682 11.5682\n -3.17145 -3.17145\n 9.02877 9.02877\n 10.5452 10.5452\n -7.05642 -7.05642\n -6.01952 -6.01952\n -5.61355 -5.61355\n 1.28759 1.28759\n 3.44186 3.44186\n -2.52363 -2.52363\n 8.95712 8.95712\n -1.33999 -1.33999\n -3.25858 -3.25858\n 2.33509 2.33509\n 2.16314 2.16314\n 14.4002 14.4002\n -5.22345 -5.22345\n -5.6232 -5.6232\n -4.20801 -4.20801\n 0.677359 0.677359\n 1.92688 1.92688\n 2.4265 2.4265\n -3.47901 -3.47901\n -3.35004 -3.35004\n -5.32445 -5.32445\n 0.817822 0.817822\n 5.9241 5.9241\n 2.13342 2.13342\n 9.30726 
9.30726\n -6.00328 -6.00328\n 5.10125 5.10125\n 16.6941 16.6941\n -1.41774 -1.41774\n 0.843709 0.843709\n 3.71326 3.71326\n -12.7315 -12.7315\n -1.58947 -1.58947\n 2.7713 2.7713\n -5.89993 -5.89993\n -10.1427 -10.1427\n -1.60823 -1.60823\n -4.98621 -4.98621\n -10.6258 -10.6258\n 0.255858 0.255858\n 5.87781 5.87781\n 0.549239 0.549239\n -0.361649 -0.361649\n 2.89543 2.89543\n -1.56252 -1.56252\n -7.04269 -7.04269\n 0.360599 0.360599\n -0.80318 -0.80318\n -8.15537 -8.15537\n 7.86106 7.86106\n 4.25906 4.25906\n 1.78474 1.78474\n 4.15764 4.15764\n -1.8884 -1.8884\n -7.16959 -7.16959\n 2.84539 2.84539\n -3.33161 -3.33161\n 4.89863 4.89863\n -3.36503 -3.36503\n -4.68013 -4.68013\n 5.18058 5.18058\n -9.69276 -9.69276\n -1.56116 -1.56116\n -3.58275 -3.58275\n -2.73766 -2.73766\n 6.64492 6.64492\n -3.78966 -3.78966\n 2.63467 2.63467\n -12.4868 -12.4868\n -3.4241 -3.4241\n 3.2898 3.2898\n 2.20265 2.20265\n -1.36672 -1.36672\n 2.71448 2.71448\n 5.87839 5.87839\n 0.160837 0.160837\n -2.64458 -2.64458\n -3.8078 -3.8078\n 5.08743 5.08743\n -14.014 -14.014\n 4.44746 4.44746\n 6.61584 6.61584\n -0.916513 -0.916513\n -8.08277 -8.08277\n -8.088 -8.088\n -5.14152 -5.14152\n -4.30739 -4.30739\n -8.76727 -8.76727\n -4.53313 -4.53313\n 11.0356 11.0356\n -2.37348 -2.37348\n -8.71711 -8.71711\n -2.22971 -2.22971\n 8.19346 8.19346\n -0.330962 -0.330962\n 1.10067 1.10067\n 1.01878 1.01878\n -10.2666 -10.2666\n 8.15909 8.15909\n 9.09316 9.09316\n -0.862864 -0.862864\n -7.54443 -7.54443\n -3.44703 -3.44703\n 5.21819 5.21819\n -2.06834 -2.06834\n 9.55442 9.55442\n -1.89649 -1.89649\n -5.57892 -5.57892\n 4.22421 4.22421\n -4.06375 -4.06375\n 3.81452 3.81452\n 3.09071 3.09071\n -7.34297 -7.34297\n -1.67899 -1.67899\n 0.58489 0.58489\n -5.33824 -5.33824\n 2.82705 2.82705\n -3.70864 -3.70864\n 4.21641 4.21641\n 3.82508 3.82508\n -4.04356 -4.04356\n 20.0249 20.0249\n -13.1531 -13.1531\n 2.98603 2.98603\n 5.54713 5.54713\n -1.39722 -1.39722\n 2.13016 2.13016\n -2.40215 -2.40215\n 0.168123 0.168123\n 
2.77021 2.77021\n -2.32327 -2.32327\n -1.06731 -1.06731\n 2.53877 2.53877\n -1.94325 -1.94325\n 1.47106 1.47106\n 0.294436 0.294436\n -0.547055 -0.547055\n 0.116016 0.116016\n 1.56148 1.56148\n 3.21789 3.21789\n -2.89007 -2.89007\n -4.33765 -4.33765\n 0.566163 0.566163\n 0.402729 0.402729\n -7.80674 -7.80674\n 4.72058 4.72058\n 3.97584 3.97584\n 1.91646 1.91646\n 2.09298 2.09298\n 1.88552 1.88552\n -2.37581 -2.37581\n -18.2615 -18.2615\n 2.68651 2.68651\n 5.5 5.5\n 0.355051 0.355051\n 5.6052 5.6052\n 7.74854 7.74854\n -0.512378 -0.512378\n 1.60299 1.60299\n -5.49563 -5.49563\n -1.96455 -1.96455\n -16.3228 -16.3228\n -6.87737 -6.87737\n -4.60755 -4.60755\n -1.32116 -1.32116\n 2.87263 2.87263\n -2.09541 -2.09541\n 3.43595 3.43595\n 3.63528 3.63528\n 3.52056 3.52056\n -3.59484 -3.59484\n 1.03764 1.03764\n -7.14947 -7.14947\n -5.80634 -5.80634\n 4.71397 4.71397\n 0.720588 0.720588\n -2.24074 -2.24074\n 5.82418 5.82418\n -3.22013 -3.22013\n 3.68858 3.68858\n -1.43166 -1.43166\n 4.47978 4.47978\n -4.83356 -4.83356\n -3.96257 -3.96257\n -5.95512 -5.95512\n 0.496691 0.496691\n -7.58825 -7.58825\n -6.47331 -6.47331\n -1.14446 -1.14446\n 3.91615 3.91615\n -0.588841 -0.588841\n 6.56683 6.56683\n 3.97252 3.97252\n -4.3126 -4.3126\n -8.20913 -8.20913\n 0.310182 0.310182\n -7.3006 -7.3006\n 7.92805 7.92805\n 2.1756 2.1756\n 1.06404 1.06404\n 1.14471 1.14471\n -1.50242 -1.50242\n 0.00723557 0.00723557\n 5.76841 5.76841\n -1.96707 -1.96707\n 8.87243 8.87243\n -3.23281 -3.23281\n 12.3087 12.3087\n 3.3245 3.3245\n 3.00334 3.00334\n -5.74048 -5.74048\n 7.43939 7.43939\n -0.906001 -0.906001\n 2.24067 2.24067\n -6.23989 -6.23989\n 2.81483 2.81483\n -1.62648 -1.62648\n -7.26368 -7.26368\n 1.69171 1.69171\n -11.2631 -11.2631\n -2.32992 -2.32992\n -6.07361 -6.07361\n -7.56822 -7.56822\n -7.56737 -7.56737\n 5.97037 5.97037\n 6.74398 6.74398\n -2.24599 -2.24599\n 2.95213 2.95213\n -12.7864 -12.7864\n 0.680035 0.680035\n -1.39988 -1.39988\n -4.74028 -4.74028\n 3.01887 3.01887\n 1.89636 
1.89636\n 4.46014 4.46014\n -4.38308 -4.38308\n 11.7633 11.7633\n -3.54671 -3.54671\n -3.47584 -3.47584\n 3.80037 3.80037\n 7.77849 7.77849\n -7.00006 -7.00006\n -4.87665 -4.87665\n -4.54736 -4.54736\n -7.81752 -7.81752\n -0.0654465 -0.0654465\n -3.70587 -3.70587\n -2.24231 -2.24231\n 5.58005 5.58005\n -3.09415 -3.09415\n -5.55063 -5.55063\n -4.19666 -4.19666\n -6.83328 -6.83328\n -6.9216 -6.9216\n -3.72782 -3.72782\n -2.18574 -2.18574\n 1.28076 1.28076\n -3.40691 -3.40691\n 0.486964 0.486964\n -2.11025 -2.11025\n -1.42349 -1.42349\n 6.06854 6.06854\n -1.37534 -1.37534\n 9.47832 9.47832\n -0.567045 -0.567045\n -6.98328 -6.98328\n 6.73139 6.73139\n -1.56812 -1.56812\n 0.141683 0.141683\n 1.78697 1.78697\n -2.03874 -2.03874\n 1.28356 1.28356\n 6.9912 6.9912\n -3.8858 -3.8858\n -1.38808 -1.38808\n -2.16632 -2.16632\n 3.57955 3.57955\n 2.73506 2.73506\n -3.03108 -3.03108\n -3.44677 -3.44677\n 1.37111 1.37111\n -10.0008 -10.0008\n -3.61651 -3.61651\n 1.97313 1.97313\n 2.11298 2.11298\n 0.174957 0.174957\n -0.131546 -0.131546\n 7.58484 7.58484\n 4.27907 4.27907\n 0.855439 0.855439\n 4.44153 4.44153\n -1.04577 -1.04577\n -7.49625 -7.49625\n 2.1572 2.1572\n 13.0815 13.0815\n 4.57025 4.57025\n 0.704658 0.704658\n 3.25079 3.25079\n -0.682139 -0.682139\n -4.17209 -4.17209\n -1.38547 -1.38547\n 5.52688 5.52688\n -4.90717 -4.90717\n 2.56402 2.56402\n -1.37164 -1.37164\n -6.05044 -6.05044\n 8.3158 8.3158\n -0.640461 -0.640461\n -2.40145 -2.40145\n -1.02959 -1.02959\n -6.75028 -6.75028\n 4.20206 4.20206\n 0.615412 0.615412\n -0.389435 -0.389435\n -5.07439 -5.07439\n -5.34136 -5.34136\n -1.88522 -1.88522\n -4.82628 -4.82628\n 0.54435 0.54435\n -3.28948 -3.28948\n 5.0051 5.0051\n -8.5501 -8.5501\n 7.31448 7.31448\n 0.145651 0.145651\n 3.28586 3.28586\n -1.8624 -1.8624\n -8.9235 -8.9235\n 3.15894 3.15894\n -9.9459 -9.9459\n 0.517233 0.517233\n -4.59899 -4.59899\n 0.641116 0.641116\n 10.3809 10.3809\n 2.39935 2.39935\n -0.378496 -0.378496\n 0.680329 0.680329\n 2.35584 2.35584\n 
-2.24714 -2.24714\n -4.8742 -4.8742\n -3.96429 -3.96429\n 1.29263 1.29263\n 0.618875 0.618875\n -0.611961 -0.611961\n 1.06612 1.06612\n -3.39289 -3.39289\n -0.226022 -0.226022\n 4.24418 4.24418\n 0.884239 0.884239\n 8.25747 8.25747\n -3.23019 -3.23019\n -9.99374 -9.99374\n 8.54414 8.54414\n -6.06374 -6.06374\n -4.92601 -4.92601\n 7.22101 7.22101\n 11.5756 11.5756\n 13.436 13.436\n 4.13522 4.13522\n 9.67412 9.67412\n -3.13805 -3.13805\n 7.50856 7.50856\n -7.98069 -7.98069\n 4.92059 4.92059\n -6.72969 -6.72969\n -4.48762 -4.48762\n -3.60328 -3.60328\n -1.75053 -1.75053\n 1.5638 1.5638\n 4.74213 4.74213\n 5.16046 5.16046\n -1.9857 -1.9857\n -6.34885 -6.34885\n -3.58963 -3.58963\n 4.96795 4.96795\n 1.44405 1.44405\n -2.74682 -2.74682\n -0.545296 -0.545296\n -10.7507 -10.7507\n -0.117477 -0.117477\n -0.436907 -0.436907\n -1.11656 -1.11656\n 1.64789 1.64789\n -4.08799 -4.08799\n -1.04262 -1.04262\n 6.06007 6.06007\n -6.68208 -6.68208\n 6.81976 6.81976\n -6.89836 -6.89836\n -0.555115 -0.555115\n -2.85307 -2.85307\n -7.76567 -7.76567\n -5.65104 -5.65104\n 8.93521 8.93521\n -5.0663 -5.0663\n 2.52214 2.52214\n 0.382824 0.382824\n -0.398468 -0.398468\n 5.05183 5.05183\n 4.134 4.134\n 1.42909 1.42909\n 2.99357 2.99357\n 10.7821 10.7821\n -4.54764 -4.54764\n -0.0440308 -0.0440308\n 0.647161 0.647161\n 3.27569 3.27569\n -32.9478 -32.9478\n 6.92399 6.92399\n -3.05953 -3.05953\n -2.29742 -2.29742\n -0.41863 -0.41863\n 2.99125 2.99125\n 3.40805 3.40805\n -1.36651 -1.36651\n -3.25561 -3.25561\n 5.11504 5.11504\n -0.532291 -0.532291\n 9.93341 9.93341\n -2.2806 -2.2806\n 10.9617 10.9617\n -2.53642 -2.53642\n 0.995763 0.995763\n -1.28898 -1.28898\n -2.99921 -2.99921\n -2.46773 -2.46773\n -11.0849 -11.0849\n -11.64 -11.64\n -3.73617 -3.73617\n 2.74223 2.74223\n -0.976817 -0.976817\n -0.384814 -0.384814\n -3.38815 -3.38815\n 2.27591 2.27591\n -5.25732 -5.25732\n -1.65764 -1.65764\n -5.8501 -5.8501\n -4.85863 -4.85863\n 2.78987 2.78987\n 5.3324 5.3324\n -9.16758 -9.16758\n 7.90047 
7.90047\n 5.68696 5.68696\n 7.2668 7.2668\n -0.857072 -0.857072\n 0.0834347 0.0834347\n 1.11833 1.11833\n 0.88212 0.88212\n -4.40785 -4.40785\n 5.25846 5.25846\n 7.46283 7.46283\n 6.26981 6.26981\n -10.8935 -10.8935\n -0.226332 -0.226332\n -1.64568 -1.64568\n -0.389003 -0.389003\n -0.854872 -0.854872\n -3.38063 -3.38063\n -4.74874 -4.74874\n -1.81717 -1.81717\n -6.03338 -6.03338\n 9.41153 9.41153\n -2.75636 -2.75636\n -4.03638 -4.03638\n -2.82527 -2.82527\n 0.641039 0.641039\n -3.08939 -3.08939\n -1.04523 -1.04523\n -4.17379 -4.17379\n 0.453503 0.453503\n 5.64541 5.64541\n 2.72225 2.72225\n -1.67354 -1.67354\n -6.68729 -6.68729\n -1.20785 -1.20785\n 3.51562 3.51562\n 2.38257 2.38257\n 2.75735 2.75735\n -4.62925 -4.62925\n 7.98247 7.98247\n 6.254 6.254\n 3.85448 3.85448\n -4.40298 -4.40298\n -8.28751 -8.28751\n -7.28055 -7.28055\n 7.31675 7.31675\n 3.53957 3.53957\n 2.94378 2.94378\n 1.41268 1.41268\n 5.2878 5.2878\n -0.807317 -0.807317\n -13.141 -13.141\n 5.71505 5.71505\n -3.86739 -3.86739\n 0.922435 0.922435\n -4.52167 -4.52167\n 0.82741 0.82741\n 4.1254 4.1254\n -3.64229 -3.64229\n -4.34879 -4.34879\n -5.69361 -5.69361\n 10.0503 10.0503\n -6.20878 -6.20878\n -5.70531 -5.70531\n -0.265037 -0.265037\n 4.91217 4.91217\n -9.85839 -9.85839\n 9.14639 9.14639\n 0.78426 0.78426\n -6.03581 -6.03581\n -1.225 -1.225\n -1.82514 -1.82514\n -4.38257 -4.38257\n -4.14898 -4.14898\n 1.30056 1.30056\n -4.04361 -4.04361\n -10.7862 -10.7862\n -1.71033 -1.71033\n -5.3235 -5.3235\n -5.05158 -5.05158\n 2.03088 2.03088\n -4.639 -4.639\n -8.90379 -8.90379\n -1.46286 -1.46286\n 4.78737 4.78737\n 2.84292 2.84292\n -4.60125 -4.60125\n -0.454598 -0.454598\n -3.54703 -3.54703\n -3.15574 -3.15574\n -5.66794 -5.66794\n -0.499733 -0.499733\n 4.80394 4.80394\n 7.0018 7.0018\n -12.2494 -12.2494\n -0.705371 -0.705371\n 0.0740021 0.0740021\n -2.66987 -2.66987\n 2.48263 2.48263\n -9.06332 -9.06332\n -1.01261 -1.01261\n 3.84118 3.84118\n 4.21216 4.21216\n -1.18673 -1.18673\n -11.0005 -11.0005\n 
-9.71638 -9.71638\n 1.76212 1.76212\n -2.83766 -2.83766\n -9.13768 -9.13768\n -1.05015 -1.05015\n 2.53008 2.53008\n 0.379504 0.379504\n 5.28803 5.28803\n -6.17221 -6.17221\n 5.75619 5.75619\n 2.3737 2.3737\n -9.0974 -9.0974\n -7.85433 -7.85433\n -10.9094 -10.9094\n 1.20756 1.20756\n 2.61486 2.61486\n 1.23359 1.23359\n 43.6151 43.6151\n -1.72859 -1.72859\n -0.965831 -0.965831\n -0.482239 -0.482239\n -1.82159 -1.82159\n 1.661 1.661\n 1.93636 1.93636\n -11.9999 -11.9999\n 0.104367 0.104367\n -1.70555 -1.70555\n -9.81074 -9.81074\n 12.7941 12.7941\n -3.36221 -3.36221\n -6.06523 -6.06523\n 0.47411 0.47411\n -6.64475 -6.64475\n -0.763006 -0.763006\n -3.9763 -3.9763\n -2.86732 -2.86732\n -20.6937 -20.6937\n 1.84418 1.84418\n 5.65243 5.65243\n 10.7255 10.7255\n -1.21293 -1.21293\n 3.15057 3.15057\n 8.96094 8.96094\n -0.205015 -0.205015\n 8.44579 8.44579\n 2.01362 2.01362\n 2.36648 2.36648\n 11.6752 11.6752\n 2.19072 2.19072\n -13.9182 -13.9182\n 3.3257 3.3257\n -6.60627 -6.60627\n 1.62083 1.62083\n -2.00847 -2.00847\n 11.6978 11.6978\n 5.93254 5.93254\n 4.93134 4.93134\n -2.50847 -2.50847\n -5.92846 -5.92846\n 1.16717 1.16717\n 6.9673 6.9673\n -1.21182 -1.21182\n 7.25413 7.25413\n -4.24031 -4.24031\n -3.12368 -3.12368\n 1.73734 1.73734\n -2.6551 -2.6551\n 5.01063 5.01063\n 10.9923 10.9923\n 3.08502 3.08502\n -1.67866 -1.67866\n 10.7003 10.7003\n -0.982895 -0.982895\n 1.97681 1.97681\n -1.29045 -1.29045\n 1.64227 1.64227\n 3.21157 3.21157\n -4.63376 -4.63376\n 4.47725 4.47725\n 7.77208 7.77208\n 0.332548 0.332548\n 2.82084 2.82084\n 0.958649 0.958649\n 1.21302 1.21302\n -3.16936 -3.16936\n 0.0672417 0.0672417\n 0.563038 0.563038\n -1.87542 -1.87542\n -3.01753 -3.01753\n 2.73107 2.73107\n -3.68276 -3.68276\n 4.64376 4.64376\n -12.4341 -12.4341\n 4.43429 4.43429\n 5.72878 5.72878\n 2.39332 2.39332\n 1.91106 1.91106\n 2.50458 2.50458\n 0.942479 0.942479\n -0.489758 -0.489758\n 0.311101 0.311101\n -2.74953 -2.74953\n 4.95959 4.95959\n 1.26862 1.26862\n 10.3622 10.3622\n 3.61213 
3.61213\n -2.19285 -2.19285\n 1.28587 1.28587\n -1.85274 -1.85274\n -1.62541 -1.62541\n 2.00382 2.00382\n -5.8959 -5.8959\n -0.918042 -0.918042\n 6.43711 6.43711\n 0.419441 0.419441\n -2.61133 -2.61133\n -0.0277654 -0.0277654\n 2.77443 2.77443\n 3.83764 3.83764\n -1.44486 -1.44486\n -0.611288 -0.611288\n -4.30436 -4.30436\n 5.29466 5.29466\n 1.56058 1.56058\n 1.88962 1.88962\n 0.761408 0.761408\n 1.76505 1.76505\n 1.18453 1.18453\n 1.71559 1.71559\n -3.14851 -3.14851\n 2.73145 2.73145\n -1.23904 -1.23904\n 0.00672958 0.00672958\n 3.40979 3.40979\n -1.77498 -1.77498\n -7.12266 -7.12266\n -9.24697 -9.24697\n -4.12038 -4.12038\n -2.77817 -2.77817\n 8.23453 8.23453\n -1.29818 -1.29818\n -7.02203 -7.02203\n -5.8994 -5.8994\n 8.20499 8.20499\n 0.356509 0.356509\n -0.515947 -0.515947\n -6.23904 -6.23904\n 5.59801 5.59801\n -4.44281 -4.44281\n -2.28591 -2.28591\n -3.31819 -3.31819\n 2.39253 2.39253\n 3.18355 3.18355\n -2.73303 -2.73303\n -0.0346074 -0.0346074\n -10.2692 -10.2692\n 6.74308 6.74308\n 5.72055 5.72055\n -4.49033 -4.49033\n 1.99176 1.99176\n 6.10782 6.10782\n 2.65759 2.65759\n 1.97884 1.97884\n 0.927606 0.927606\n 1.25006 1.25006\n 9.3695 9.3695\n -2.75726 -2.75726\n -0.580415 -0.580415\n 2.92463 2.92463\n -4.49535 -4.49535\n -1.61397 -1.61397\n 3.26733 3.26733\n -3.61505 -3.61505\n -2.46453 -2.46453\n 2.42436 2.42436\n 5.68683 5.68683\n 6.07494 6.07494\n 4.35205 4.35205\n -5.29467 -5.29467\n -3.90039 -3.90039\n -1.70776 -1.70776\n -6.3172 -6.3172\n 4.03858 4.03858\n -2.58786 -2.58786\n -1.1514 -1.1514\n -0.632569 -0.632569\n -0.343314 -0.343314\n -12.2115 -12.2115\n 0.405742 0.405742\n -6.46017 -6.46017\n -2.30808 -2.30808\n 1.1336 1.1336\n 1.47556 1.47556\n 1.98494 1.98494\n 2.24865 2.24865\n -1.65786 -1.65786\n -4.62769 -4.62769\n 4.43717 4.43717\n 8.75249 8.75249\n 4.29167 4.29167\n -3.96876 -3.96876\n -3.52244 -3.52244\n 0.161164 0.161164\n -4.13202 -4.13202\n 1.42269 1.42269\n -3.05155 -3.05155\n 1.81371 1.81371\n -1.03765 -1.03765\n 0.696656 0.696656\n 
 ⋮ ⋮\n (remaining rows of the 4096×2 Matrix{Float64} embedding output omitted for brevity)","category":"page"},{"location":"examples/working_with_ollama/#Using-postprocessing-function","page":"Local models with Ollama.ai","title":"Using postprocessing function","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Add normalization as postprocessing function to normalize embeddings on reception (for easy cosine similarity 
later)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"using LinearAlgebra\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema,\n [\"embed me\", \"and me too\"],\n LinearAlgebra.normalize;\n model = \"openhermes2.5-mistral\")","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"DataMessage(Matrix{Float64} of size (4096, 2))","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Cosine similarity is then a simple multiplication","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg.content' * msg.content[:, 1]","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"2-element Vector{Float64}:\n 0.9999999999999946\n 0.34130017815042357","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"This page was generated using Literate.jl.","category":"page"},{"location":"frequently_asked_questions/#Frequently-Asked-Questions","page":"F.A.Q.","title":"Frequently Asked Questions","text":"","category":"section"},{"location":"frequently_asked_questions/#Why-OpenAI","page":"F.A.Q.","title":"Why OpenAI","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI's models are at the forefront of AI research and provide robust, state-of-the-art capabilities for many 
tasks.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"There will be situations when you cannot or do not want to use it (eg, privacy, cost, etc.). In that case, you can use local models (eg, Ollama) or other APIs (eg, Anthropic).","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Note: To get started with Ollama.ai, see the Setup Guide for Ollama section below.","category":"page"},{"location":"frequently_asked_questions/#Data-Privacy-and-OpenAI","page":"F.A.Q.","title":"Data Privacy and OpenAI","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"At the time of writing, OpenAI does NOT use the API calls for training their models.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI does not use data submitted to and generated by our API to train OpenAI models or improve OpenAI’s service offering. In order to support the continuous improvement of our models, you can fill out this form to opt-in to share your data with us. 
– How your data is used to improve our models","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"You can always double-check the latest information on OpenAI's How we use your data page.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI's How we use your data\nData usage for consumer services FAQ\nHow your data is used to improve our models","category":"page"},{"location":"frequently_asked_questions/#Creating-OpenAI-API-Key","page":"F.A.Q.","title":"Creating OpenAI API Key","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"You can get your API key from OpenAI by signing up for an account and accessing the API section of the OpenAI website.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Create an account with OpenAI\nGo to API Key page\nClick on “Create new secret key”","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Warning: 
Do not share it with anyone and do NOT save it to any files that get synced online.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI Documentation\nVisual tutorial","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Pro tip: Always set the spending limits!","category":"page"},{"location":"frequently_asked_questions/#Setting-OpenAI-Spending-Limits","page":"F.A.Q.","title":"Setting OpenAI Spending Limits","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI allows you to set spending limits directly on your account dashboard to prevent unexpected costs.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Go to OpenAI Billing\nSet Soft Limit (you’ll receive a notification) and Hard Limit (the API will stop working, so you cannot spend more)","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"A good start might be a soft limit of c. $5 and a hard limit of c. $10 - you can always increase it later in the month.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI Forum","category":"page"},{"location":"frequently_asked_questions/#How-much-does-it-cost?-Is-it-worth-paying-for?","page":"F.A.Q.","title":"How much does it cost? Is it worth paying for?","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"If you use a local model (eg, with Ollama), it's free. 
If you use any commercial APIs (eg, OpenAI), you will likely pay per \"token\" (a sub-word unit).","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"For example, a simple request with a simple question and 1 sentence response in return (”Is statement XYZ a positive comment”) will cost you ~$0.0001 (ie, one hundredth of a cent)","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Is it worth paying for?","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"GenAI is a way to buy time! You can pay cents to save tens of minutes every day.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Continuing the example above, imagine you have a table with 200 comments. Now, you can parse each one of them with an LLM for the features/checks you need. Assuming the price per call was $0.0001, you'd pay 2 cents for the job and save 30-60 minutes of your time!","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI Pricing per 1000 tokens","category":"page"},{"location":"frequently_asked_questions/#Configuring-the-Environment-Variable-for-API-Key","page":"F.A.Q.","title":"Configuring the Environment Variable for API Key","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"To use the OpenAI API with PromptingTools.jl, set your API key as an environment variable:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"ENV[\"OPENAI_API_KEY\"] = \"your-api-key\"","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"As a one-off, you can: 
","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"set it in the terminal before launching Julia: export OPENAI_API_KEY = \nset it in your setup.jl (make sure not to commit it to GitHub!)","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Make sure to start Julia from the same terminal window where you set the variable. Easy check in Julia, run ENV[\"OPENAI_API_KEY\"] and you should see your key!","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"A better way:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"On a Mac, add the configuration line to your terminal's configuration file (eg, ~/.zshrc). It will get automatically loaded every time you launch the terminal\nOn Windows, set it as a system variable in \"Environment Variables\" settings (see the Resources)","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources: ","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI Guide","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Note: In the future, we hope to add Preferences.jl-based workflow to set the API key and other preferences.","category":"page"},{"location":"frequently_asked_questions/#Understanding-the-API-Keyword-Arguments-in-aigenerate-(api_kwargs)","page":"F.A.Q.","title":"Understanding the API Keyword Arguments in aigenerate (api_kwargs)","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"See OpenAI API reference for more information.","category":"page"},{"location":"frequently_asked_questions/#Instant-Access-from-Anywhere","page":"F.A.Q.","title":"Instant Access from 
Anywhere","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"For easy access from anywhere, add PromptingTools into your startup.jl (can be found in ~/.julia/config/startup.jl).","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Add the following snippet:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"using PromptingTools\nconst PT = PromptingTools # to access unexported functions and types","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Now, you can just use ai\"Help me do X to achieve Y\" from any REPL session!","category":"page"},{"location":"frequently_asked_questions/#Open-Source-Alternatives","page":"F.A.Q.","title":"Open Source Alternatives","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"The ethos of PromptingTools.jl is to allow you to use whatever model you want, which includes Open Source LLMs. The most popular and easiest to setup is Ollama.ai - see below for more information.","category":"page"},{"location":"frequently_asked_questions/#Setup-Guide-for-Ollama","page":"F.A.Q.","title":"Setup Guide for Ollama","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Ollama runs a background service hosting LLMs that you can access via a simple API. 
It's especially useful when you're working with some sensitive data that should not be sent anywhere.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Installation is very easy, just download the latest version here.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Once you've installed it, just launch the app and you're ready to go!","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"To check if it's running, go to your browser and open 127.0.0.1:11434. You should see the message \"Ollama is running\". Alternatively, you can run ollama serve in your terminal and you'll get a message that it's already running.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"There are many models available in Ollama Library, including Llama2, CodeLlama, SQLCoder, or my personal favorite openhermes2.5-mistral.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Download new models with ollama pull (eg, ollama pull openhermes2.5-mistral). 
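For example, once the Ollama app is running and a model has been pulled, you can use it straight away from Julia. This is a minimal sketch that simply reuses the aiembed call and the example model from these docs; any other pulled model name works in its place:\n\nusing PromptingTools\nconst PT = PromptingTools\n\nschema = PT.OllamaManagedSchema() # requires the Ollama service to be running locally\nmsg = aiembed(schema, \"Hello World\"; model=\"openhermes2.5-mistral\")\nmsg.content # 4096-element JSON3.Array{Float64... 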
","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Show currently available models with ollama list.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"See Ollama.ai for more information.","category":"page"},{"location":"reference/#Reference","page":"Reference","title":"Reference","text":"","category":"section"},{"location":"reference/","page":"Reference","title":"Reference","text":"","category":"page"},{"location":"reference/","page":"Reference","title":"Reference","text":"Modules = [PromptingTools]","category":"page"},{"location":"reference/#PromptingTools.RESERVED_KWARGS","page":"Reference","title":"PromptingTools.RESERVED_KWARGS","text":"The following keywords are reserved for internal use in the ai* functions and cannot be used as placeholders in the Messages\n\n\n\n\n\n","category":"constant"},{"location":"reference/#PromptingTools.AICode","page":"Reference","title":"PromptingTools.AICode","text":"AICode(code::AbstractString; safe_eval::Bool=false, prefix::AbstractString=\"\", suffix::AbstractString=\"\")\n\nA mutable structure representing a code block (received from the AI model) with automatic parsing, execution, and output/error capturing capabilities.\n\nUpon instantiation with a string, the AICode object automatically runs a code parser and executor (via PromptingTools.eval!()), capturing any standard output (stdout) or errors. 
This structure is useful for programmatically handling and evaluating Julia code snippets.\n\nSee also: PromptingTools.extract_code_blocks, PromptingTools.eval!\n\nWorkflow\n\nUntil cb::AICode has been evaluated, cb.success is set to nothing (and so are all other fields).\nThe text in cb.code is parsed (saved to cb.expression).\nThe parsed expression is evaluated.\nOutputs of the evaluated expression are captured in cb.output.\nAny stdout outputs (e.g., from println) are captured in cb.stdout.\nIf an error occurs during evaluation, it is saved in cb.error.\nAfter successful evaluation without errors, cb.success is set to true. Otherwise, it is set to false and you can inspect the cb.error to understand why.\n\nProperties\n\ncode::AbstractString: The raw string of the code to be parsed and executed.\nexpression: The parsed Julia expression (set after parsing code).\nstdout: Captured standard output from the execution of the code.\noutput: The result of evaluating the code block.\nsuccess::Union{Nothing, Bool}: Indicates whether the code block executed successfully (true), unsuccessfully (false), or has yet to be evaluated (nothing).\nerror::Union{Nothing, Exception}: Any exception raised during the execution of the code block.\n\nKeyword Arguments\n\nsafe_eval::Bool: If set to true, the code block checks for package operations (e.g., installing new packages) and missing imports, and then evaluates the code inside a bespoke scratch module. This is to ensure that the evaluation does not alter any user-defined variables or the global state. Defaults to false.\nprefix::AbstractString: A string to be prepended to the code block before parsing and evaluation. Useful to add some additional code definition or necessary imports. Defaults to an empty string.\nsuffix::AbstractString: A string to be appended to the code block before parsing and evaluation. Useful to check that tests pass or that an example executes. 
Defaults to an empty string.\n\nMethods\n\nBase.isvalid(cb::AICode): Check if the code block has executed successfully. Returns true if cb.success == true.\n\nExamples\n\ncode = AICode(\"println(\"Hello, World!\")\") # Auto-parses and evaluates the code, capturing output and errors.\nisvalid(code) # Output: true\ncode.stdout # Output: \"Hello, World!\n\"\n\nWe try to evaluate \"safely\" by default (eg, inside a custom module, to avoid changing user variables). You can avoid that with safe_eval=false:\n\ncode = AICode(\"new_variable = 1\"; safe_eval=false)\nisvalid(code) # Output: true\nnew_variable # Output: 1\n\nYou can also call AICode directly on an AIMessage, which will extract the Julia code blocks, concatenate them and evaluate them:\n\nmsg = aigenerate(\"In Julia, how do you create a vector of 10 random numbers?\")\ncode = AICode(msg)\n# Output: AICode(Success: True, Parsed: True, Evaluated: True, Error Caught: N/A, StdOut: True, Code: 2 Lines)\n\n# show the code\ncode.code |> println\n# Output: \n# numbers = rand(10)\n# numbers = rand(1:100, 10)\n\n# or copy it to the clipboard\ncode.code |> clipboard\n\n# or execute it in the current module (=Main)\neval(code.expression)\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.AITemplate","page":"Reference","title":"PromptingTools.AITemplate","text":"AITemplate\n\nAITemplate is a template for a conversation prompt. This type is merely a container for the template name, which is resolved into a set of messages (=prompt) by render.\n\nNaming Convention\n\nTemplate names should be in CamelCase\nFollow the format ...... 
where possible, eg, JudgeIsItTrue, ``\nStarting with the Persona (=System prompt), eg, Judge = persona is meant to judge some provided information\nVariable to be filled in with context, eg, It = placeholder it\nEnding with the variable name is helpful, eg, JuliaExpertTask for a persona to be an expert in Julia language and task is the placeholder name\nIdeally, the template name should be self-explanatory, eg, JudgeIsItTrue = persona is meant to judge some provided information where it is true or false\n\nExamples\n\nSave time by re-using pre-made templates, just fill in the placeholders with the keyword arguments:\n\nmsg = aigenerate(:JuliaExpertAsk; ask = \"How do I add packages?\")\n\nThe above is equivalent to a more verbose version that explicitly uses the dispatch on AITemplate:\n\nmsg = aigenerate(AITemplate(:JuliaExpertAsk); ask = \"How do I add packages?\")\n\nFind available templates with aitemplates:\n\ntmps = aitemplates(\"JuliaExpertAsk\")\n# Will surface one specific template\n# 1-element Vector{AITemplateMetadata}:\n# PromptingTools.AITemplateMetadata\n# name: Symbol JuliaExpertAsk\n# description: String \"For asking questions about Julia language. Placeholders: `ask`\"\n# version: String \"1\"\n# wordcount: Int64 237\n# variables: Array{Symbol}((1,))\n# system_preview: String \"You are a world-class Julia language programmer with the knowledge of the latest syntax. Your commun\"\n# user_preview: String \"# Question\n\n{{ask}}\"\n# source: String \"\"\n\nThe above gives you a good idea of what the template is about, what placeholders are available, and how much it would cost to use it (=wordcount).\n\nSearch for all Julia-related templates:\n\ntmps = aitemplates(\"Julia\")\n# 2-element Vector{AITemplateMetadata}... -> more to come later!\n\nIf you are on VSCode, you can leverage nice tabular display with vscodedisplay:\n\nusing DataFrames\ntmps = aitemplates(\"Julia\") |> DataFrame |> vscodedisplay\n\nI have my selected template, how do I use it? 
Just use the \"name\" in aigenerate or aiclassify like you see in the first example!\n\nYou can inspect any template by \"rendering\" it (this is what the LLM will see):\n\njulia> AITemplate(:JudgeIsItTrue) |> PromptingTools.render\n\nSee also: save_template, load_template, load_templates! for more advanced use cases (and the corresponding script in examples/ folder)\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.AITemplateMetadata","page":"Reference","title":"PromptingTools.AITemplateMetadata","text":"Helper for easy searching and reviewing of templates. Defined on loading of each template.\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.AbstractPromptSchema","page":"Reference","title":"PromptingTools.AbstractPromptSchema","text":"Defines different prompting styles based on the model training and fine-tuning.\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.ChatMLSchema","page":"Reference","title":"PromptingTools.ChatMLSchema","text":"ChatMLSchema is used by many open-source chatbots, by OpenAI models (under the hood) and by several models and interfaces (eg, Ollama, vLLM)\n\nYou can explore it on tiktokenizer\n\nIt uses the following conversation structure:\n\n<|im_start|>system\n...<|im_end|>\n<|im_start|>user\n...<|im_end|>\n<|im_start|>assistant\n...<|im_end|>\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.MaybeExtract","page":"Reference","title":"PromptingTools.MaybeExtract","text":"Extract a result from the provided data, if any, otherwise set the error and message fields.\n\nArguments\n\nerror::Bool: true if no result is found, false otherwise.\nmessage::String: Only present if no result is found, should be short and concise.\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.NoSchema","page":"Reference","title":"PromptingTools.NoSchema","text":"Schema that keeps messages (<:AbstractMessage) and does not transform for any specific model. 
It is used by the first pass of the prompt rendering system (see ?render).\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.OllamaManagedSchema","page":"Reference","title":"PromptingTools.OllamaManagedSchema","text":"Ollama by default manages different models and their associated prompt schemas when you pass system_prompt and prompt fields to the API.\n\nWarning: It works only for 1 system message and 1 user message, so anything more than that has to be rejected.\n\nIf you need to pass more messages / a longer conversational history, you can define the model-specific schema directly and pass your Ollama requests with raw=true, which disables any templating and schema management by Ollama.\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.OpenAISchema","page":"Reference","title":"PromptingTools.OpenAISchema","text":"OpenAISchema is the default schema for OpenAI models.\n\nIt uses the following conversation template:\n\n[Dict(role=\"system\",content=\"...\"),Dict(role=\"user\",content=\"...\"),Dict(role=\"assistant\",content=\"...\")]\n\nIt's recommended to separate sections in your prompt with markdown headers (e.g. `##Answer\n\n`).\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.TestEchoOllamaManagedSchema","page":"Reference","title":"PromptingTools.TestEchoOllamaManagedSchema","text":"Echoes the user's input back to them. Used for testing the implementation\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.TestEchoOpenAISchema","page":"Reference","title":"PromptingTools.TestEchoOpenAISchema","text":"Echoes the user's input back to them. Used for testing the implementation\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.UserMessageWithImages-Tuple{AbstractString}","page":"Reference","title":"PromptingTools.UserMessageWithImages","text":"Construct UserMessageWithImages with 1 or more images. 
Images can be either URLs or local paths.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.X123","page":"Reference","title":"PromptingTools.X123","text":"With docstring\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.aiclassify-Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aiclassify","text":"aiclassify(prompt_schema::AbstractOpenAISchema, prompt::ALLOWED_PROMPT_TYPE;\napi_kwargs::NamedTuple = (logit_bias = Dict(837 => 100, 905 => 100, 9987 => 100),\n max_tokens = 1, temperature = 0),\nkwargs...)\n\nClassifies the given prompt/statement as true/false/unknown.\n\nNote: this is a very simple classifier, it is not meant to be used in production. Credit goes to AAAzzam.\n\nIt uses Logit bias trick and limits the output to 1 token to force the model to output only true/false/unknown.\n\nOutput tokens used (via api_kwargs):\n\n837: ' true'\n905: ' false'\n9987: ' unknown'\n\nArguments\n\nprompt_schema::AbstractOpenAISchema: The schema for the prompt.\nprompt: The prompt/statement to classify if it's a String. If it's a Symbol, it is expanded as a template via render(schema,template).\n\nExample\n\naiclassify(\"Is two plus two four?\") # true\naiclassify(\"Is two plus three a vegetable on Mars?\") # false\n\naiclassify returns only true/false/unknown. It's easy to get the proper Bool output type out with tryparse, eg,\n\ntryparse(Bool, aiclassify(\"Is two plus two four?\")) isa Bool # true\n\nOutput of type Nothing marks that the model couldn't classify the statement as true/false.\n\nIdeally, we would like to re-use some helpful system prompt to get more accurate responses. For this reason we have templates, eg, :JudgeIsItTrue. 
By specifying the template, we can provide our statement as the expected variable (it in this case). See that the model now correctly classifies the statement as \"unknown\".\n\naiclassify(:JudgeIsItTrue; it = \"Is two plus three a vegetable on Mars?\") # unknown\n\nFor better results, use higher quality models like gpt4, eg, \n\naiclassify(:JudgeIsItTrue;\n it = \"If I had two apples and I got three more, I have five apples now.\",\n model = \"gpt4\") # true\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aiembed-Union{Tuple{F}, Tuple{PromptingTools.AbstractOllamaManagedSchema, AbstractString}, Tuple{PromptingTools.AbstractOllamaManagedSchema, AbstractString, F}} where F<:Function","page":"Reference","title":"PromptingTools.aiembed","text":"aiembed(prompt_schema::AbstractOllamaManagedSchema,\n doc_or_docs::Union{AbstractString, Vector{<:AbstractString}},\n postprocess::F = identity;\n verbose::Bool = true,\n api_key::String = API_KEY,\n model::String = MODEL_EMBEDDING,\n http_kwargs::NamedTuple = (retry_non_idempotent = true,\n retries = 5,\n readtimeout = 120),\n api_kwargs::NamedTuple = NamedTuple(),\n kwargs...) where {F <: Function}\n\nThe aiembed function generates embeddings for the given input using a specified model and returns a message object containing the embeddings, status, token count, and elapsed time.\n\nArguments\n\nprompt_schema::AbstractOllamaManagedSchema: The schema for the prompt.\ndoc_or_docs::Union{AbstractString, Vector{<:AbstractString}}: The document or list of documents to generate embeddings for. The list of documents is processed sequentially, so users should consider implementing an async version with Threads.@spawn\npostprocess::F: The post-processing function to apply to each embedding. Defaults to the identity function, but could be LinearAlgebra.normalize.\nverbose::Bool: A flag indicating whether to print verbose information. Defaults to true.\napi_key::String: The API key to use for the OpenAI API. 
Defaults to API_KEY.\nmodel::String: The model to use for generating embeddings. Defaults to MODEL_EMBEDDING.\nhttp_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. Defaults to empty NamedTuple.\napi_kwargs::NamedTuple: Additional keyword arguments for the Ollama API. Defaults to an empty NamedTuple.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nmsg: A DataMessage object containing the embeddings, status, token count, and elapsed time.\n\nNote: Ollama API currently does not return the token count, so it's set to (0,0)\n\nExample\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema, \"Hello World\"; model=\"openhermes2.5-mistral\")\nmsg.content # 4096-element JSON3.Array{Float64...\n\nWe can embed multiple strings at once and they will be hcat into a matrix (ie, each column corresponds to one string)\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema, [\"Hello World\", \"How are you?\"]; model=\"openhermes2.5-mistral\")\nmsg.content # 4096×2 Matrix{Float64}:\n\nIf you plan to calculate the cosine distance between embeddings, you can normalize them first:\n\nconst PT = PromptingTools\nusing LinearAlgebra\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema, [\"embed me\", \"and me too\"], LinearAlgebra.normalize; model=\"openhermes2.5-mistral\")\n\n# calculate cosine distance between the two normalized embeddings as a simple dot product\nmsg.content' * msg.content[:, 1] # [1.0, 0.34]\n\nSimilarly, you can use the postprocess argument to materialize the data from JSON3.Object by using postprocess = copy\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema, \"Hello World\", copy; model=\"openhermes2.5-mistral\")\nmsg.content # 4096-element Vector{Float64}\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aiembed-Union{Tuple{F}, Tuple{PromptingTools.AbstractOpenAISchema, 
Union{AbstractString, Vector{<:AbstractString}}}, Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, Vector{<:AbstractString}}, F}} where F<:Function","page":"Reference","title":"PromptingTools.aiembed","text":"aiembed(prompt_schema::AbstractOpenAISchema,\n doc_or_docs::Union{AbstractString, Vector{<:AbstractString}},\n postprocess::F = identity;\n verbose::Bool = true,\n api_key::String = API_KEY,\n model::String = MODEL_EMBEDDING, \n http_kwargs::NamedTuple = (retry_non_idempotent = true,\n retries = 5,\n readtimeout = 120),\n api_kwargs::NamedTuple = NamedTuple(),\n kwargs...) where {F <: Function}\n\nThe aiembed function generates embeddings for the given input using a specified model and returns a message object containing the embeddings, status, token count, and elapsed time.\n\nArguments\n\nprompt_schema::AbstractOpenAISchema: The schema for the prompt.\ndoc_or_docs::Union{AbstractString, Vector{<:AbstractString}}: The document or list of documents to generate embeddings for.\npostprocess::F: The post-processing function to apply to each embedding. Defaults to the identity function.\nverbose::Bool: A flag indicating whether to print verbose information. Defaults to true.\napi_key::String: The API key to use for the OpenAI API. Defaults to API_KEY.\nmodel::String: The model to use for generating embeddings. Defaults to MODEL_EMBEDDING.\nhttp_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. Defaults to (retry_non_idempotent = true, retries = 5, readtimeout = 120).\napi_kwargs::NamedTuple: Additional keyword arguments for the OpenAI API. Defaults to an empty NamedTuple.\nkwargs...: Additional keyword arguments.\n\nReturns\n\nmsg: A DataMessage object containing the embeddings, status, token count, and elapsed time. 
Use msg.content to access the embeddings.\n\nExample\n\nmsg = aiembed(\"Hello World\")\nmsg.content # 1536-element JSON3.Array{Float64...\n\nWe can embed multiple strings at once and they will be hcat into a matrix (ie, each column corresponds to one string)\n\nmsg = aiembed([\"Hello World\", \"How are you?\"])\nmsg.content # 1536×2 Matrix{Float64}:\n\nIf you plan to calculate the cosine distance between embeddings, you can normalize them first:\n\nusing LinearAlgebra\nmsg = aiembed([\"embed me\", \"and me too\"], LinearAlgebra.normalize)\n\n# calculate cosine distance between the two normalized embeddings as a simple dot product\nmsg.content' * msg.content[:, 1] # [1.0, 0.787]\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aiextract-Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aiextract","text":"aiextract([prompt_schema::AbstractOpenAISchema,] prompt::ALLOWED_PROMPT_TYPE; \nreturn_type::Type,\nverbose::Bool = true,\n model::String = MODEL_CHAT,\n return_all::Bool = false, dry_run::Bool = false, \n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n http_kwargs::NamedTuple = (;\n retry_non_idempotent = true,\n retries = 5,\n readtimeout = 120), api_kwargs::NamedTuple = NamedTuple(),\n kwargs...)\n\nExtract required information (defined by a struct return_type) from the provided prompt by leveraging OpenAI function calling mode.\n\nThis is a perfect solution for extracting structured information from text (eg, extract organization names in news articles, etc.)\n\nIt's effectively a light wrapper around aigenerate call, which requires additional keyword argument return_type to be provided and will enforce the model outputs to adhere to it.\n\nArguments\n\nprompt_schema: An optional object to specify which prompt template should be applied (Default to PROMPT_SCHEMA = 
OpenAISchema)\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate\nreturn_type: A struct TYPE representing the information we want to extract. Do not provide a struct instance, only the type. If the struct has a docstring, it will be provided to the model as well. It's used to enforce structured model outputs or provide more information.\nverbose: A boolean indicating whether to print additional information.\napi_key: A string representing the API key for accessing the OpenAI API.\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).\nconversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.\nhttp_kwargs: A named tuple of HTTP keyword arguments.\napi_kwargs: A named tuple of API keyword arguments.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nIf return_all=false (default):\n\nmsg: A DataMessage object representing the extracted data, including the content, status, tokens, and elapsed time. Use msg.content to access the extracted data.\n\nIf return_all=true:\n\nconversation: A vector of AbstractMessage objects representing the full conversation history, including the response from the AI model (DataMessage).\n\nSee also: function_call_signature, MaybeExtract, aigenerate\n\nExample\n\nDo you want to extract some specific measurements from a text like age, weight and height? 
You need to define the information you need as a struct (return_type):\n\n\"Person's age, height, and weight.\"\nstruct MyMeasurement\n age::Int # required\n height::Union{Int,Nothing} # optional\n weight::Union{Nothing,Float64} # optional\nend\nmsg = aiextract(\"James is 30, weighs 80kg. He's 180cm tall.\"; return_type=MyMeasurement)\n# [ Info: Tokens: 129 @ Cost: $0.0002 in 1.0 seconds\n# PromptingTools.DataMessage(MyMeasurement)\nmsg.content\n# MyMeasurement(30, 180, 80.0)\n\nThe fields that allow Nothing are marked as optional in the schema:\n\nmsg = aiextract(\"James is 30.\"; return_type=MyMeasurement)\n# MyMeasurement(30, nothing, nothing)\n\nIf there are multiple items you want to extract, define a wrapper struct to get a Vector of MyMeasurement:\n\nstruct MyMeasurementWrapper\n measurements::Vector{MyMeasurement}\nend\n\nmsg = aiextract(\"James is 30, weighs 80kg. He's 180cm tall. Then Jack is 19 but really tall - over 190!\"; return_type=MyMeasurementWrapper)\n\nmsg.content.measurements\n# 2-element Vector{MyMeasurement}:\n# MyMeasurement(30, 180, 80.0)\n# MyMeasurement(19, 190, nothing)\n\nOr if you want your extraction to fail gracefully when data isn't found, use MaybeExtract{T} wrapper (this trick is inspired by the Instructor package!):\n\nusing PromptingTools: MaybeExtract\n\ntype = MaybeExtract{MyMeasurement}\n# Effectively the same as:\n# struct MaybeExtract{T}\n# result::Union{T, Nothing} // The result of the extraction\n# error::Bool // true if no result is found, false otherwise\n# message::Union{Nothing, String} // Only present if no result is found, should be short and concise\n# end\n\n# If LLM extraction fails, it will return the wrapper with the `error` and `message` fields set instead of the result!\nmsg = aiextract(\"Extract measurements from the text: I am giraffe\"; return_type=type)\nmsg.content\n# MaybeExtract{MyMeasurement}(nothing, true, \"I'm sorry, but I can only assist with human measurements.\")\n\nThat way, you can handle the error gracefully and get a 
reason why extraction failed (in msg.content.message).\n\nNote that the error message refers to a giraffe not being a human, because in our MyMeasurement docstring, we said that it's for people!\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aigenerate-Tuple{PromptingTools.AbstractOllamaManagedSchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aigenerate","text":"aigenerate(prompt_schema::AbstractOllamaManagedSchema, prompt::ALLOWED_PROMPT_TYPE; verbose::Bool = true,\n model::String = MODEL_CHAT,\n return_all::Bool = false, dry_run::Bool = false,\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n http_kwargs::NamedTuple = NamedTuple(), api_kwargs::NamedTuple = NamedTuple(),\n kwargs...)\n\nGenerate an AI response based on a given prompt using the Ollama API.\n\nArguments\n\nprompt_schema: An optional object to specify which prompt template should be applied (Defaults to PROMPT_SCHEMA = OpenAISchema, so the OllamaManagedSchema must be provided explicitly)\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate\nverbose: A boolean indicating whether to print additional information.\napi_key: Provided for interface consistency. Not needed for locally hosted Ollama.\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).\nconversation::AbstractVector{<:AbstractMessage}=[]: Not allowed for this schema. Provided only for compatibility.\nhttp_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. 
Defaults to empty NamedTuple.\napi_kwargs::NamedTuple: Additional keyword arguments for the Ollama API. Defaults to an empty NamedTuple.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nmsg: An AIMessage object representing the generated AI message, including the content, status, tokens, and elapsed time.\n\nUse msg.content to access the extracted string.\n\nSee also: ai_str, aai_str, aiembed\n\nExample\n\nSimple hello world to test the API:\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema() # We need to be explicit if we want Ollama, OpenAISchema is the default\n\nmsg = aigenerate(schema, \"Say hi!\"; model=\"openhermes2.5-mistral\")\n# [ Info: Tokens: 69 in 0.9 seconds\n# AIMessage(\"Hello! How can I assist you today?\")\n\nmsg is an AIMessage object. Access the generated string via content property:\n\ntypeof(msg) # AIMessage{SubString{String}}\npropertynames(msg) # (:content, :status, :tokens, :elapsed)\nmsg.content # \"Hello! How can I assist you today?\"\n\nNote: We need to be explicit about the schema we want to use. If we don't, it will default to OpenAISchema (=PT.DEFAULT_SCHEMA) ___ You can use string interpolation:\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\na = 1\nmsg=aigenerate(schema, \"What is `$a+$a`?\"; model=\"openhermes2.5-mistral\")\nmsg.content # \"The result of `1+1` is `2`.\"\n\n___ You can provide the whole conversation or more intricate prompts as a Vector{AbstractMessage}:\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\n\nconversation = [\n PT.SystemMessage(\"You're master Yoda from Star Wars trying to help the user become a Jedi.\"),\n PT.UserMessage(\"I have feelings for my iPhone. What should I do?\")]\n\nmsg = aigenerate(schema, conversation; model=\"openhermes2.5-mistral\")\n# [ Info: Tokens: 111 in 2.1 seconds\n# AIMessage(\"Strong the attachment is, it leads to suffering it may. 
Focus on the force within you must, ...\")\n\nNote: Managed Ollama currently supports at most 1 User Message and 1 System Message given the API limitations. If you want more, you need to use the ChatMLSchema.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aigenerate-Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aigenerate","text":"aigenerate(prompt_schema::AbstractOpenAISchema, prompt::ALLOWED_PROMPT_TYPE;\n verbose::Bool = true,\n api_key::String = API_KEY,\n model::String = MODEL_CHAT, return_all::Bool = false, dry_run::Bool = false,\n http_kwargs::NamedTuple = (retry_non_idempotent = true,\n retries = 5,\n readtimeout = 120), api_kwargs::NamedTuple = NamedTuple(),\n kwargs...)\n\nGenerate an AI response based on a given prompt using the OpenAI API.\n\nArguments\n\nprompt_schema: An optional object to specify which prompt template should be applied (Default to PROMPT_SCHEMA = OpenAISchema)\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate\nverbose: A boolean indicating whether to print additional information.\napi_key: A string representing the API key for accessing the OpenAI API.\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).\nconversation: An optional vector of AbstractMessage objects representing the conversation history. 
If not provided, it is initialized as an empty vector.\nhttp_kwargs: A named tuple of HTTP keyword arguments.\napi_kwargs: A named tuple of API keyword arguments.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nIf return_all=false (default):\n\nmsg: An AIMessage object representing the generated AI message, including the content, status, tokens, and elapsed time.\n\nUse msg.content to access the extracted string.\n\nIf return_all=true:\n\nconversation: A vector of AbstractMessage objects representing the conversation history, including the response from the AI model (AIMessage).\n\nSee also: ai_str, aai_str, aiembed, aiclassify, aiextract, aiscan, aitemplates\n\nExample\n\nSimple hello world to test the API:\n\nresult = aigenerate(\"Say Hi!\")\n# [ Info: Tokens: 29 @ Cost: $0.0 in 1.0 seconds\n# AIMessage(\"Hello! How can I assist you today?\")\n\nresult is an AIMessage object. Access the generated string via content property:\n\ntypeof(result) # AIMessage{SubString{String}}\npropertynames(result) # (:content, :status, :tokens, :elapsed)\nresult.content # \"Hello! How can I assist you today?\"\n\n___ You can use string interpolation:\n\na = 1\nmsg=aigenerate(\"What is `$a+$a`?\")\nmsg.content # \"The sum of `1+1` is `2`.\"\n\n___ You can provide the whole conversation or more intricate prompts as a Vector{AbstractMessage}:\n\nconst PT = PromptingTools\n\nconversation = [\n PT.SystemMessage(\"You're master Yoda from Star Wars trying to help the user become a Jedi.\"),\n PT.UserMessage(\"I have feelings for my iPhone. What should I do?\")]\nmsg=aigenerate(conversation)\n# AIMessage(\"Ah, strong feelings you have for your iPhone. A Jedi's path, this is not... 
\")\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aiscan-Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aiscan","text":"aiscan([promptschema::AbstractOpenAISchema,] prompt::ALLOWEDPROMPTTYPE; imageurl::Union{Nothing, AbstractString, Vector{<:AbstractString}} = nothing, imagepath::Union{Nothing, AbstractString, Vector{<:AbstractString}} = nothing, imagedetail::AbstractString = \"auto\", attachtolatest::Bool = true, verbose::Bool = true, model::String = MODELCHAT, returnall::Bool = false, dryrun::Bool = false, conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[], httpkwargs::NamedTuple = (; retrynonidempotent = true, retries = 5, readtimeout = 120), apikwargs::NamedTuple = = (; maxtokens = 2500), kwargs...)\n\nScans the provided image (image_url or image_path) with the goal provided in the prompt.\n\nCan be used for many multi-modal tasks, such as: OCR (transcribe text in the image), image captioning, image classification, etc.\n\nIt's effectively a light wrapper around aigenerate call, which uses additional keyword arguments image_url, image_path, image_detail to be provided. At least one image source (url or path) must be provided.\n\nArguments\n\nprompt_schema: An optional object to specify which prompt template should be applied (Default to PROMPT_SCHEMA = OpenAISchema)\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate\nimage_url: A string or vector of strings representing the URL(s) of the image(s) to scan.\nimage_path: A string or vector of strings representing the path(s) of the image(s) to scan.\nimage_detail: A string representing the level of detail to include for images. Can be \"auto\", \"high\", or \"low\". 
See OpenAI Vision Guide for more details.\nattach_to_latest: A boolean indicating how to handle a conversation with multiple UserMessage objects. When true, the images are attached to the latest UserMessage.\nverbose: A boolean indicating whether to print additional information.\napi_key: A string representing the API key for accessing the OpenAI API.\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).\nconversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.\nhttp_kwargs: A named tuple of HTTP keyword arguments.\napi_kwargs: A named tuple of API keyword arguments.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nIf return_all=false (default):\n\nmsg: An AIMessage object representing the generated AI message, including the content, status, tokens, and elapsed time.\n\nUse msg.content to access the extracted string.\n\nIf return_all=true:\n\nconversation: A vector of AbstractMessage objects representing the full conversation history, including the response from the AI model (AIMessage).\n\nSee also: ai_str, aai_str, aigenerate, aiembed, aiclassify, aiextract, aitemplates\n\nNotes\n\nAll examples below use model \"gpt4v\", which is an alias for model ID \"gpt-4-vision-preview\"\nmax_tokens in the api_kwargs is preset to 2500, otherwise OpenAI enforces a default of only a few hundred tokens (~300). 
If your output is truncated, increase this value\n\nExample\n\nDescribe the provided image:\n\nmsg = aiscan(\"Describe the image\"; image_path=\"julia.png\", model=\"gpt4v\")\n# [ Info: Tokens: 1141 @ Cost: $0.0117 in 2.2 seconds\n# AIMessage(\"The image shows a logo consisting of the word \"julia\" written in lowercase\")\n\nYou can provide multiple images at once as a vector and ask for \"low\" level of detail (cheaper):\n\nmsg = aiscan(\"Describe the image\"; image_path=[\"julia.png\",\"python.png\"], image_detail=\"low\", model=\"gpt4v\")\n\nYou can use this function as a nice and quick OCR (transcribe text in the image) with a template :OCRTask. Let's transcribe some SQL code from a screenshot (no more re-typing!):\n\n# Screenshot of some SQL code\nimage_url = \"https://www.sqlservercentral.com/wp-content/uploads/legacy/8755f69180b7ac7ee76a69ae68ec36872a116ad4/24622.png\"\nmsg = aiscan(:OCRTask; image_url, model=\"gpt4v\", task=\"Transcribe the SQL code in the image.\", api_kwargs=(; max_tokens=2500))\n\n# [ Info: Tokens: 362 @ Cost: $0.0045 in 2.5 seconds\n# AIMessage(\"```sql\n# update Orders \n\n# You can add syntax highlighting of the outputs via Markdown\nusing Markdown\nmsg.content |> Markdown.parse\n\nNotice that we enforce max_tokens = 2500. That's because OpenAI seems to default to ~300 tokens, which provides incomplete outputs. Hence, we set this value to 2500 as a default. 
If you still get truncated outputs, increase this value.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aitemplates","page":"Reference","title":"PromptingTools.aitemplates","text":"aitemplates\n\nEasily find the most suitable templates for your use case.\n\nYou can search by:\n\nquery::Symbol which looks only for partial matches in the template name\nquery::AbstractString which looks for partial matches in the template name or description\nquery::Regex which looks for matches in the template name, description or any of the message previews\n\nKeyword Arguments\n\nlimit::Int limits the number of returned templates (Defaults to 10)\n\nExamples\n\nFind available templates with aitemplates:\n\ntmps = aitemplates(\"JuliaExpertAsk\")\n# Will surface one specific template\n# 1-element Vector{AITemplateMetadata}:\n# PromptingTools.AITemplateMetadata\n# name: Symbol JuliaExpertAsk\n# description: String \"For asking questions about Julia language. Placeholders: `ask`\"\n# version: String \"1\"\n# wordcount: Int64 237\n# variables: Array{Symbol}((1,))\n# system_preview: String \"You are a world-class Julia language programmer with the knowledge of the latest syntax. Your commun\"\n# user_preview: String \"# Question\n\n{{ask}}\"\n# source: String \"\"\n\nThe above gives you a good idea of what the template is about, what placeholders are available, and how much it would cost to use it (=wordcount).\n\nSearch for all Julia-related templates:\n\ntmps = aitemplates(\"Julia\")\n# 2-element Vector{AITemplateMetadata}... -> more to come later!\n\nIf you are on VSCode, you can leverage nice tabular display with vscodedisplay:\n\nusing DataFrames\ntmps = aitemplates(\"Julia\") |> DataFrame |> vscodedisplay\n\nI have my selected template, how do I use it? 
Just use the \"name\" in aigenerate or aiclassify like you see in the first example!\n\n\n\n\n\n","category":"function"},{"location":"reference/#PromptingTools.aitemplates-Tuple{AbstractString}","page":"Reference","title":"PromptingTools.aitemplates","text":"Find the top-limit templates whose name or description fields partially match the query_key::String in TEMPLATE_METADATA.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aitemplates-Tuple{Regex}","page":"Reference","title":"PromptingTools.aitemplates","text":"Find the top-limit templates where provided query_key::Regex matches either of name, description or previews or User or System messages in TEMPLATE_METADATA.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aitemplates-Tuple{Symbol}","page":"Reference","title":"PromptingTools.aitemplates","text":"Find the top-limit templates whose name::Symbol partially matches the query_name::Symbol in TEMPLATE_METADATA.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.eval!-Tuple{PromptingTools.AbstractCodeBlock}","page":"Reference","title":"PromptingTools.eval!","text":"eval!(cb::AICode; safe_eval::Bool=true, prefix::AbstractString=\"\", suffix::AbstractString=\"\")\n\nEvaluates a code block cb in-place. It runs automatically when AICode is instantiated with a String.\n\nCheck the outcome of evaluation with Base.isvalid(cb). If ==true, provide code block has executed successfully.\n\nSteps:\n\nIf cb::AICode has not been evaluated, cb.success = nothing. 
After the evaluation it will be either true or false depending on the outcome\nParse the text in cb.code\nEvaluate the parsed expression\nCapture outputs of the evaluated expression in cb.output\nCapture any stdout outputs (eg, test failures) in cb.stdout\nIf any exception is raised, it is saved in cb.error\nFinally, if all steps were successful, cb.success is set to true\n\nKeyword Arguments\n\nsafe_eval::Bool: If true, we first check for any Pkg operations (eg, installing new packages) and missing imports, then the code will be evaluated inside a bespoke scratch module (so as not to change any user variables)\nprefix::AbstractString: A string to be prepended to the code block before parsing and evaluation. Useful to add some additional code definition or necessary imports. Defaults to an empty string.\nsuffix::AbstractString: A string to be appended to the code block before parsing and evaluation. Useful to check that tests pass or that an example executes. Defaults to an empty string.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.extract_code_blocks-Tuple{AbstractString}","page":"Reference","title":"PromptingTools.extract_code_blocks","text":"extract_code_blocks(markdown_content::String) -> Vector{String}\n\nExtract Julia code blocks from a markdown string.\n\nThis function searches through the provided markdown content, identifies blocks of code specifically marked as Julia code (using the julia ... code fence patterns), and extracts the code within these blocks. The extracted code blocks are returned as a vector of strings, with each string representing one block of Julia code. \n\nNote: Only the content within the code fences is extracted, and the code fences themselves are not included in the output.\n\nArguments\n\nmarkdown_content::String: A string containing the markdown content from which Julia code blocks are to be extracted.\n\nReturns\n\nVector{String}: A vector containing strings of extracted Julia code blocks. 
If no Julia code blocks are found, an empty vector is returned.\n\nExamples\n\nExample with a single Julia code block\n\nmarkdown_single = \"\"\"\n\njulia println(\"Hello, World!\")\n\n\"\"\"\nextract_code_blocks(markdown_single)\n# Output: [\"println(\\\"Hello, World!\\\")\"]\n\n# Example with multiple Julia code blocks\nmarkdown_multiple = \"\"\"\n\njulia x = 5\n\nSome text in between\n\njulia y = x + 2\n\n\"\"\"\nextract_code_blocks(markdown_multiple)\n# Output: [\"x = 5\", \"y = x + 2\"]\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.extract_function_name-Tuple{AbstractString}","page":"Reference","title":"PromptingTools.extract_function_name","text":"extract_function_name(code_block::String) -> Union{String, Nothing}\n\nExtract the name of a function from a given Julia code block. The function searches for two patterns:\n\nThe explicit function declaration pattern: function name(...) ... end\nThe concise function declaration pattern: name(...) = ...\n\nIf a function name is found, it is returned as a string. 
If no function name is found, the function returns nothing.\n\nArguments\n\ncode_block::String: A string containing Julia code.\n\nReturns\n\nUnion{String, Nothing}: The extracted function name or nothing if no name is found.\n\nExample\n\ncode = \"\"\"\nfunction myFunction(arg1, arg2)\n # Function body\nend\n\"\"\"\nextract_function_name(code)\n# Output: \"myFunction\"\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.finalize_outputs-Tuple{Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}, Any, Union{Nothing, PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.finalize_outputs","text":"finalize_outputs(prompt::ALLOWED_PROMPT_TYPE, conv_rendered::Any,\n msg::Union{Nothing, AbstractMessage};\n return_all::Bool = false,\n dry_run::Bool = false,\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n kwargs...)\n\nFinalizes the outputs of the ai* functions by either returning the conversation history or the last message.\n\nKeyword arguments\n\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, does not send the messages to the model, but only renders the prompt with the given schema and replacement variables. Useful for debugging when you want to check the specific schema rendering. \nconversation::AbstractVector{<:AbstractMessage}=[]: An optional vector of AbstractMessage objects representing the conversation history. 
If not provided, it is initialized as an empty vector.\nkwargs...: Variables to replace in the prompt template.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.function_call_signature-Tuple{Type}","page":"Reference","title":"PromptingTools.function_call_signature","text":"function_call_signature(datastructtype::Struct; max_description_length::Int = 100)\n\nExtract the argument names, types and docstrings from a struct to create the function call signature in JSON schema.\n\nYou must provide a Struct type (not an instance of it) with some fields.\n\nNote: Fairly experimental, but works for combination of structs, arrays, strings and singletons.\n\nTips\n\nYou can improve the quality of the extraction by writing a helpful docstring for your struct (or any nested struct). It will be provided as a description. \n\nYou can even include comments/descriptions about the individual fields.\n\nAll fields are assumed to be required, unless you allow null values (eg, ::Union{Nothing, Int}). Fields with Nothing will be treated as optional.\nMissing values are ignored (eg, ::Union{Missing, Int} will be treated as Int). It's for broader compatibility and we cannot deserialize it as easily as Nothing.\n\nExample\n\nDo you want to extract some specific measurements from a text like age, weight and height? You need to define the information you need as a struct (return_type):\n\nstruct MyMeasurement\n age::Int\n height::Union{Int,Nothing}\n weight::Union{Nothing,Float64}\nend\nsignature = function_call_signature(MyMeasurement)\n#\n# Dict{String, Any} with 3 entries:\n# \"name\" => \"MyMeasurement_extractor\"\n# \"parameters\" => Dict{String, Any}(\"properties\"=>Dict{String, Any}(\"height\"=>Dict{String, Any}(\"type\"=>\"integer\"), \"weight\"=>Dic…\n# \"description\" => \"Represents person's age, height, and weight\n\"\n\nYou can see that only the field age does not allow null values, hence, it's \"required\". 
While height and weight are optional.\n\nsignature[\"parameters\"][\"required\"]\n# [\"age\"]\n\nIf there are multiple items you want to extract, define a wrapper struct to get a Vector of MyMeasurement:\n\nstruct MyMeasurementWrapper\n measurements::Vector{MyMeasurement}\nend\n\nOr if you want your extraction to fail gracefully when data isn't found, use MaybeExtract{T} wrapper (inspired by the Instructor package!):\n\nusing PromptingTools: MaybeExtract\n\ntype = MaybeExtract{MyMeasurement}\n# Effectively the same as:\n# struct MaybeExtract{T}\n# result::Union{T, Nothing}\n# error::Bool // true if no result is found, false otherwise\n# message::Union{Nothing, String} // Only present if no result is found, should be short and concise\n# end\n\n# If LLM extraction fails, it will return a Dict with `error` and `message` fields instead of the result!\nmsg = aiextract(\"Extract measurements from the text: I am giraffe\"; return_type=type)\n\n# Dict{Symbol, Any} with 2 entries:\n# :message => \"Sorry, this feature is only available for humans.\"\n# :error => true\n\nThat way, you can handle the error gracefully and get a reason why extraction failed.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.load_conversation-Tuple{Union{AbstractString, IO}}","page":"Reference","title":"PromptingTools.load_conversation","text":"Loads a conversation (messages) from io_or_file\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.load_template-Tuple{Union{AbstractString, IO}}","page":"Reference","title":"PromptingTools.load_template","text":"Loads messaging template from io_or_file and returns tuple of template messages and metadata.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.load_templates!","page":"Reference","title":"PromptingTools.load_templates!","text":"load_templates!(; remove_templates::Bool=true)\n\nLoads templates from folder templates/ in the package root and stores them in TEMPLATE_STORE and 
TEMPLATE_METADATA.\n\nNote: Automatically removes any existing templates and metadata from TEMPLATE_STORE and TEMPLATE_METADATA if remove_templates=true.\n\n\n\n\n\n","category":"function"},{"location":"reference/#PromptingTools.ollama_api-Tuple{PromptingTools.AbstractOllamaManagedSchema, AbstractString}","page":"Reference","title":"PromptingTools.ollama_api","text":"ollama_api(prompt_schema::AbstractOllamaManagedSchema, prompt::AbstractString,\n system::Union{Nothing, AbstractString} = nothing,\n endpoint::String = \"generate\";\n model::String = \"llama2\", http_kwargs::NamedTuple = NamedTuple(),\n stream::Bool = false,\n url::String = \"localhost\", port::Int = 11434,\n kwargs...)\n\nSimple wrapper for a call to Ollama API.\n\nKeyword Arguments\n\nprompt_schema: Defines which prompt template should be applied.\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage\nsystem: An optional string representing the system message for the AI conversation. If not provided, a default message will be used.\nendpoint: The API endpoint to call, only \"generate\" and \"embeddings\" are currently supported. Defaults to \"generate\".\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nhttp_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. Defaults to empty NamedTuple.\nstream: A boolean indicating whether to stream the response. Defaults to false.\nurl: The URL of the Ollama API. Defaults to \"localhost\".\nport: The port of the Ollama API. 
Defaults to 11434.\nkwargs: Prompt variables to be used to fill the prompt/template\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.remove_templates!-Tuple{}","page":"Reference","title":"PromptingTools.remove_templates!","text":" remove_templates!()\n\nRemoves all templates from TEMPLATE_STORE and TEMPLATE_METADATA.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.render-Tuple{AITemplate}","page":"Reference","title":"PromptingTools.render","text":"Renders provided messaging template (template) under the default schema (PROMPT_SCHEMA).\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.render-Tuple{PromptingTools.AbstractOllamaManagedSchema, Vector{<:PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.render","text":"render(schema::AbstractOllamaManagedSchema,\n messages::Vector{<:AbstractMessage};\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n kwargs...)\n\nBuilds a history of the conversation to provide the prompt to the API. All unspecified kwargs are passed as replacements such that {{key}}=>value in the template.\n\nNote: Due to its \"managed\" nature, at most 2 messages can be provided (system and prompt inputs in the API).\n\nKeyword Arguments\n\nconversation: Not allowed for this schema. Provided only for compatibility.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.render-Tuple{PromptingTools.AbstractOpenAISchema, Vector{<:PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.render","text":"render(schema::AbstractOpenAISchema,\n messages::Vector{<:AbstractMessage};\n image_detail::AbstractString = \"auto\",\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n kwargs...)\n\nBuilds a history of the conversation to provide the prompt to the API. 
All unspecified kwargs are passed as replacements such that {{key}}=>value in the template.\n\nKeyword Arguments\n\nimage_detail: Only for UserMessageWithImages. It represents the level of detail to include for images. Can be \"auto\", \"high\", or \"low\".\nconversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.render-Tuple{PromptingTools.NoSchema, Vector{<:PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.render","text":"render(schema::NoSchema,\n    messages::Vector{<:AbstractMessage};\n    conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n    replacement_kwargs...)\n\nRenders a conversation history from a vector of messages with all replacement variables specified in replacement_kwargs.\n\nIt is the first pass of the prompt rendering system, and is used by all other schemas.\n\nKeyword Arguments\n\nimage_detail: Only for UserMessageWithImages. It represents the level of detail to include for images. Can be \"auto\", \"high\", or \"low\".\nconversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.\n\nNotes\n\nAll unspecified kwargs are passed as replacements such that {{key}}=>value in the template.\nIf a SystemMessage is missing, we inject a default one at the beginning of the conversation.\nOnly one SystemMessage is allowed (ie, cannot mix two conversations with different system prompts).\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.replace_words-Tuple{AbstractString, Vector{<:AbstractString}}","page":"Reference","title":"PromptingTools.replace_words","text":"replace_words(text::AbstractString, words::Vector{<:AbstractString}; replacement::AbstractString=\"ABC\")\n\nReplace all occurrences of words in words with replacement in text. 
Useful to quickly remove specific names or entities from a text.\n\nArguments\n\ntext::AbstractString: The text to be processed.\nwords::Vector{<:AbstractString}: A vector of words to be replaced.\nreplacement::AbstractString=\"ABC\": The replacement string to be used. Defaults to \"ABC\".\n\nExample\n\ntext = \"Disney is a great company\"\nreplace_words(text, [\"Disney\", \"Snow White\", \"Mickey Mouse\"])\n# Output: \"ABC is a great company\"\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.save_conversation-Tuple{Union{AbstractString, IO}, AbstractVector{<:PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.save_conversation","text":"Saves provided conversation (messages) to io_or_file. If you need to add some metadata, see save_template.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.save_template-Tuple{Union{AbstractString, IO}, AbstractVector{<:PromptingTools.AbstractChatMessage}}","page":"Reference","title":"PromptingTools.save_template","text":"Saves provided messaging template (messages) to io_or_file. Automatically adds metadata based on provided keyword arguments.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.split_by_length-Tuple{String}","page":"Reference","title":"PromptingTools.split_by_length","text":"split_by_length(text::String; separator::String=\" \", max_length::Int=35000) -> Vector{String}\n\nSplit a given string text into chunks of a specified maximum length max_length. This is particularly useful for splitting larger documents or texts into smaller segments, suitable for models or systems with smaller context windows.\n\nArguments\n\ntext::String: The text to be split.\nseparator::String=\" \": The separator used to split the text into minichunks. Defaults to a space character.\nmax_length::Int=35000: The maximum length of each chunk. 
Defaults to 35,000 characters, which should fit within a 16K context window.\n\nReturns\n\nVector{String}: A vector of strings, each representing a chunk of the original text that is smaller than or equal to max_length.\n\nNotes\n\nThe function ensures that each chunk is as close to max_length as possible without exceeding it.\nIf the text is empty, the function returns an empty array.\nThe separator is re-added to the text chunks after splitting, preserving the original structure of the text as closely as possible.\n\nExamples\n\nSplitting text with the default separator (\" \"):\n\ntext = \"Hello world. How are you?\"\nchunks = split_by_length(text; max_length=13)\nlength(chunks) # Output: 2\n\nUsing a custom separator and custom max_length:\n\ntext = \"Hello,World,\" ^ 2900 # length 34800 chars\nchunks = split_by_length(text; separator=\",\", max_length=10000) # for a 4K context window\nlength(chunks) # Output: 4\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.@aai_str-Tuple{Any, Vararg{Any}}","page":"Reference","title":"PromptingTools.@aai_str","text":"aai\"user_prompt\"[model_alias] -> AIMessage\n\nAsynchronous version of the @ai_str macro, which will log the result once it's ready.\n\nExample\n\nSend an asynchronous request to GPT-4, so we don't have to wait for the response. Very practical with slow models, so you can keep working in the meantime.\n\nm = aai\"Say Hi!\"gpt4\n\n...with some delay...\n\n[ Info: Tokens: 29 @ Cost: 0.0011 in 2.7 seconds\n\n[ Info: AIMessage> Hello! 
How can I assist you today?\n\n\n\n\n\n","category":"macro"},{"location":"reference/#PromptingTools.@ai_str-Tuple{Any, Vararg{Any}}","page":"Reference","title":"PromptingTools.@ai_str","text":"ai\"user_prompt\"[model_alias] -> AIMessage\n\nThe ai\"\" string macro generates an AI response to a given prompt by using aigenerate under the hood.\n\nArguments\n\nuser_prompt (String): The input prompt for the AI model.\nmodel_alias (optional, any): Provide model alias of the AI model (see MODEL_ALIASES).\n\nReturns\n\nAIMessage corresponding to the input prompt.\n\nExample\n\nresult = ai\"Hello, how are you?\"\n# AIMessage(\"Hello! I'm an AI assistant, so I don't have feelings, but I'm here to help you. How can I assist you today?\")\n\nIf you want to interpolate some variables or additional context, simply use string interpolation:\n\na=1\nresult = ai\"What is `$a+$a`?\"\n# AIMessage(\"The sum of `1+1` is `2`.\")\n\nIf you want to use a different model, eg, GPT-4, you can provide its alias as a flag:\n\nresult = ai\"What is `1.23 * 100 + 1`?\"gpt4\n# AIMessage(\"The answer is 124.\")\n\n\n\n\n\n","category":"macro"},{"location":"","page":"Home","title":"Home","text":"CurrentModule = PromptingTools","category":"page"},{"location":"#PromptingTools","page":"Home","title":"PromptingTools","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"Documentation for PromptingTools.","category":"page"},{"location":"","page":"Home","title":"Home","text":"Streamline your life using PromptingTools.jl, the Julia package that simplifies interacting with large language models.","category":"page"},{"location":"","page":"Home","title":"Home","text":"PromptingTools.jl is not meant for building large-scale systems. 
It's meant to be the go-to tool in your global environment that will save you 20 minutes every day!","category":"page"},{"location":"#Why-PromptingTools.jl?","page":"Home","title":"Why PromptingTools.jl?","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"Prompt engineering is neither fast nor easy. Moreover, different models and their fine-tunes might require different prompt formats and tricks, or perhaps the information you work with requires special models to be used. PromptingTools.jl is meant to unify the prompts for different backends and make the common tasks (like templated prompts) as simple as possible. ","category":"page"},{"location":"","page":"Home","title":"Home","text":"Some features:","category":"page"},{"location":"","page":"Home","title":"Home","text":"aigenerate Function: Simplify prompt templates with handlebars (eg, {{variable}}) and keyword arguments\n@ai_str String Macro: Save keystrokes with a string macro for simple prompts\nEasy to Remember: All exported functions start with ai... 
for better discoverability\nLight Wrapper Types: Benefit from Julia's multiple dispatch by having AI outputs wrapped in specific types\nMinimal Dependencies: Enjoy an easy addition to your global environment with very light dependencies\nNo Context Switching: Access cutting-edge LLMs with no context switching and minimum extra keystrokes directly in your REPL","category":"page"},{"location":"#First-Steps","page":"Home","title":"First Steps","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"To get started, see the Getting Started section.","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"EditURL = \"../../../examples/working_with_aitemplates.jl\"","category":"page"},{"location":"examples/working_with_aitemplates/#Using-AITemplates","page":"Using AITemplates","title":"Using AITemplates","text":"","category":"section"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"This file contains examples of how to work with AITemplate(s).","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"First, let's import the package and define a helper alias for calling un-exported functions:","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"using PromptingTools\nconst PT = PromptingTools","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"PromptingTools","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"LLM responses are only as good as the prompts you give them. 
However, great prompts take a long time to write – AITemplates are a way to re-use great prompts!","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"AITemplates are just a collection of templated prompts (ie, a set of \"messages\" that have placeholders like {{question}})","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"They are saved as JSON files in the templates directory. They are automatically loaded on package import, but you can always force a re-load with PT.load_templates!()","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"PT.load_templates!();","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"You can create your own templates and use them in any ai* function in place of a prompt. Let's use a template called :JuliaExpertAsk; alternatively, you can use AITemplate(:JuliaExpertAsk) for cleaner dispatch","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"msg = aigenerate(:JuliaExpertAsk; ask = \"How do I add packages?\")","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"AIMessage(\"To add packages in Julia, you can use the built-in package manager called `Pkg`. Here are the steps:\n\n1. Open the Julia REPL (Read-Eval-Print Loop).\n2. Press the `]` key to enter the package manager mode.\n3. Use the `add` command followed by the name of the package you want to install. For example, to install the `DataFrames` package, type: `add DataFrames`.\n4. 
Press the `backspace` or `ctrl + C` key to exit the package manager mode and return to the REPL.\n\nAfter following these steps, the specified package will be installed and available for use in your Julia environment.\")","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"You can see that it had a placeholder for the actual question (ask) that we provided as a keyword argument. We did not have to write any system prompt for personas, tone, etc. – it was all provided by the template!","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"How to know which templates are available? You can search for them with aitemplates(): You can search by Symbol (only for partial name match), String (partial match on name or description), or Regex (more fields)","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"tmps = aitemplates(\"JuliaExpertAsk\")","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"1-element Vector{AITemplateMetadata}:\nPromptingTools.AITemplateMetadata\n name: Symbol JuliaExpertAsk\n description: String \"For asking questions about Julia language. Placeholders: `ask`\"\n version: String \"1\"\n wordcount: Int64 237\n variables: Array{Symbol}((1,))\n system_preview: String \"You are a world-class Julia language programmer with the knowledge of the latest syntax. 
Your commun\"\n user_preview: String \"# Question\\n\\n{{ask}}\"\n source: String \"\"\n\n","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"You can see that it outputs a list of available templates that match the search - there is just one in this case.","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Moreover, it shows not just the description, but also a preview of the actual prompts, placeholders available, and the length (to gauge how much it would cost).","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"If you use VSCode, you can display them in a nice scrollable table with vscodedisplay:","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"using DataFrames\nDataFrame(tmps) |> vscodedisplay","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"You can also just render the template to see the underlying messages:","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"msgs = PT.render(AITemplate(:JuliaExpertAsk))","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"2-element Vector{PromptingTools.AbstractChatMessage}:\n SystemMessage(\"You are a world-class Julia language programmer with the knowledge of the latest syntax. Your communication is brief and concise. 
You're precise and answer only when you're confident in the high quality of your answer.\")\n UserMessage{String}(\"# Question\\n\\n{{ask}}\", [:ask], :usermessage)","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Now, you know exactly what's in the template!","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"If you want to modify it, simply change it and save it as a new file with save_template (see the docs ?save_template for more details).","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Let's adjust the previous template to be more specific to a data analysis question:","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"tpl = [PT.SystemMessage(\"You are a world-class Julia language programmer with the knowledge of the latest syntax. You're also a senior Data Scientist and proficient in data analysis in Julia. Your communication is brief and concise. You're precise and answer only when you're confident in the high quality of your answer.\")\n PT.UserMessage(\"# Question\\n\\n{{ask}}\")]","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"2-element Vector{PromptingTools.AbstractChatMessage}:\n SystemMessage(\"You are a world-class Julia language programmer with the knowledge of the latest syntax. You're also a senior Data Scientist and proficient in data analysis in Julia. Your communication is brief and concise. 
You're precise and answer only when you're confident in the high quality of your answer.\")\n UserMessage{String}(\"# Question\\n\\n{{ask}}\", [:ask], :usermessage)","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Templates are saved in the templates directory of the package. The name of the file will become the template name (eg, call :JuliaDataExpertAsk)","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"filename = joinpath(pkgdir(PromptingTools),\n    \"templates\",\n    \"persona-task\",\n    \"JuliaDataExpertAsk_123.json\")\nPT.save_template(filename,\n    tpl;\n    description = \"For asking data analysis questions in Julia language. Placeholders: `ask`\")\nrm(filename) # cleanup if we don't like it","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"When you create a new template, remember to re-load the templates with load_templates!() so that it's available for use.","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"PT.load_templates!();","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Tip: 
If you have some good templates (or suggestions for the existing ones), please consider sharing them with the community by opening a PR to the templates directory!","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"This page was generated using Literate.jl.","category":"page"},{"location":"examples/readme_examples/#Various-Examples","page":"Various examples","title":"Various Examples","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Noteworthy functions: aigenerate, aiembed, aiclassify, aiextract, aitemplates","category":"page"},{"location":"examples/readme_examples/#Seamless-Integration-Into-Your-Workflow","page":"Various examples","title":"Seamless Integration Into Your Workflow","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Google search is great, but it's a context switch. You often have to open a few pages and read through the discussion to find the answer you need. Same with the ChatGPT website.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Imagine you are in VSCode, editing your .gitignore file. 
How do I ignore a file in all subfolders again?","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"All you need to do is to type: aai\"What to write in .gitignore to ignore file XYZ in any folder or subfolder?\"","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"With aai\"\" (as opposed to ai\"\"), we make a non-blocking call to the LLM to not prevent you from continuing your work. When the answer is ready, we log it from the background:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"[ Info: Tokens: 102 @ Cost: $0.0002 in 2.7 seconds\n┌ Info: AIMessage> To ignore a file called \"XYZ\" in any folder or subfolder, you can add the following line to your .gitignore file:\n│ \n│ ```\n│ **/XYZ\n│ ```\n│ \n└ This pattern uses the double asterisk (`**`) to match any folder or subfolder, and then specifies the name of the file you want to ignore.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You probably saved 3-5 minutes on this task and probably another 5-10 minutes, because of the context switch/distraction you avoided. 
It's a small win, but it adds up quickly.","category":"page"},{"location":"examples/readme_examples/#Advanced-Prompts-/-Conversations","page":"Various examples","title":"Advanced Prompts / Conversations","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can use the aigenerate function to replace handlebar variables (eg, {{name}}) via keyword arguments.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aigenerate(\"Say hello to {{name}}!\", name=\"World\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"The more complex prompts are effectively a conversation (a set of messages), where you can have messages from three entities: System, User, AI Assistant. We provide the corresponding types for each of them: SystemMessage, UserMessage, AIMessage. ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"using PromptingTools: SystemMessage, UserMessage\n\nconversation = [\n SystemMessage(\"You're master Yoda from Star Wars trying to help the user become a Jedi.\"),\n UserMessage(\"I have feelings for my {{object}}. What should I do?\")]\nmsg = aigenerate(conversation; object = \"old iPhone\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"AIMessage(\"Ah, a dilemma, you have. Emotional attachment can cloud your path to becoming a Jedi. To be attached to material possessions, you must not. The iPhone is but a tool, nothing more. Let go, you must.\n\nSeek detachment, young padawan. Reflect upon the impermanence of all things. Appreciate the memories it gave you, and gratefully part ways. In its absence, find new experiences to grow and become one with the Force. 
Only then, a true Jedi, you shall become.\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can also use it to build conversations, eg, ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"new_conversation = vcat(conversation...,msg, UserMessage(\"Thank you, master Yoda! Do you have {{object}} to know what it feels like?\"))\naigenerate(new_conversation; object = \"old iPhone\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"> AIMessage(\"Hmm, possess an old iPhone, I do not. But experience with attachments, I have. Detachment, I learned. True power and freedom, it brings...\")","category":"page"},{"location":"examples/readme_examples/#Templated-Prompts","page":"Various examples","title":"Templated Prompts","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"With LLMs, the quality / robustness of your results depends on the quality of your prompts. But writing prompts is hard! 
That's why we offer a templating system to save you time and effort.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"To use a specific template (eg, `` to ask a question about the Julia language):","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aigenerate(:JuliaExpertAsk; ask = \"How do I add packages?\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"The above is equivalent to a more verbose version that explicitly uses the dispatch on AITemplate:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aigenerate(AITemplate(:JuliaExpertAsk); ask = \"How do I add packages?\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Find available templates with aitemplates:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"tmps = aitemplates(\"JuliaExpertAsk\")\n# Will surface one specific template\n# 1-element Vector{AITemplateMetadata}:\n# PromptingTools.AITemplateMetadata\n# name: Symbol JuliaExpertAsk\n# description: String \"For asking questions about Julia language. Placeholders: `ask`\"\n# version: String \"1\"\n# wordcount: Int64 237\n# variables: Array{Symbol}((1,))\n# system_preview: String \"You are a world-class Julia language programmer with the knowledge of the latest syntax. 
Your commun\"\n# user_preview: String \"# Question\\n\\n{{ask}}\"\n# source: String \"\"","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"The above gives you a good idea of what the template is about, what placeholders are available, and how much it would cost to use it (=wordcount).","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Search for all Julia-related templates:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"tmps = aitemplates(\"Julia\")\n# 2-element Vector{AITemplateMetadata}... -> more to come later!","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"If you are on VSCode, you can leverage a nice tabular display with vscodedisplay:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"using DataFrames\ntmps = aitemplates(\"Julia\") |> DataFrame |> vscodedisplay","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"I have my selected template, how do I use it? 
Just use the \"name\" in aigenerate or aiclassify like you see in the first example!","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can inspect any template by \"rendering\" it (this is what the LLM will see):","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"julia> AITemplate(:JudgeIsItTrue) |> PromptingTools.render","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"See more examples in the Examples folder.","category":"page"},{"location":"examples/readme_examples/#Asynchronous-Execution","page":"Various examples","title":"Asynchronous Execution","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can leverage asyncmap to run multiple AI-powered tasks concurrently, improving performance for batch operations. ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"prompts = [aigenerate(\"Translate 'Hello, World!' to {{language}}\"; language) for language in [\"Spanish\", \"French\", \"Mandarin\"]]\nresponses = asyncmap(aigenerate, prompts)","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Pro tip: You can limit the number of concurrent tasks with the keyword asyncmap(...; ntasks=10).","category":"page"},{"location":"examples/readme_examples/#Model-Aliases","page":"Various examples","title":"Model Aliases","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Certain tasks require more powerful models. All user-facing functions have a keyword argument model that can be used to specify the model to be used. 
For example, you can use model = \"gpt-4-1106-preview\" to use the latest GPT-4 Turbo model. However, no one wants to type that!","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"We offer a set of model aliases (eg, \"gpt3\", \"gpt4\", \"gpt4t\" -> the above GPT-4 Turbo, etc.) that can be used instead. ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Each ai... call first looks up the provided model name in the dictionary PromptingTools.MODEL_ALIASES, so you can easily extend with your own aliases! ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"const PT = PromptingTools\nPT.MODEL_ALIASES[\"gpt4t\"] = \"gpt-4-1106-preview\"","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"These aliases also can be used as flags in the @ai_str macro, eg, ai\"What is the capital of France?\"gpt4t (GPT-4 Turbo has a knowledge cut-off in April 2023, so it's useful for more contemporary questions).","category":"page"},{"location":"examples/readme_examples/#Embeddings","page":"Various examples","title":"Embeddings","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Use the aiembed function to create embeddings via the default OpenAI model that can be used for semantic search, clustering, and more complex AI workflows.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"text_to_embed = \"The concept of artificial intelligence.\"\nmsg = aiembed(text_to_embed)\nembedding = msg.content # 1536-element Vector{Float64}","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"If you plan to calculate 
the cosine similarity between embeddings, you can normalize them first:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"using LinearAlgebra\nmsg = aiembed([\"embed me\", \"and me too\"], LinearAlgebra.normalize)\n\n# calculate the cosine similarity between the two normalized embeddings as a simple dot product\nmsg.content' * msg.content[:, 1] # [1.0, 0.787]","category":"page"},{"location":"examples/readme_examples/#Classification","page":"Various examples","title":"Classification","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can use the aiclassify function to classify any provided statement as true/false/unknown. This is useful for fact-checking, hallucination or NLI checks, moderation, filtering, sentiment analysis, feature engineering and more.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"aiclassify(\"Is two plus two four?\") \n# true","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"System prompts and higher-quality models can be used for more complex tasks, including knowing when to defer to a human:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"aiclassify(:JudgeIsItTrue; it = \"Is two plus three a vegetable on Mars?\", model = \"gpt4t\") \n# unknown","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"In the above example, we used a prompt template :JudgeIsItTrue, which automatically expands into the following system prompt (and a separate user prompt): ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"\"You are an impartial AI judge evaluating 
whether the provided statement is \\\"true\\\" or \\\"false\\\". Answer \\\"unknown\\\" if you cannot decide.\"","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"For more information on templates, see the Templated Prompts section.","category":"page"},{"location":"examples/readme_examples/#Data-Extraction","page":"Various examples","title":"Data Extraction","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Are you tired of extracting data with regex? You can use LLMs to extract structured data from text!","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"All you have to do is to define the structure of the data you want to extract and the LLM will do the rest.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Define a return_type with struct. Provide docstrings if needed (improves results and helps with documentation).","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Let's start with a hard task - extracting the current weather in a given location:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"@enum TemperatureUnits celsius fahrenheit\n\"\"\"Extract the current weather in a given location\n\n# Arguments\n- `location`: The city and state, e.g. 
\"San Francisco, CA\"\n- `unit`: The unit of temperature to return, either `celsius` or `fahrenheit`\n\"\"\"\nstruct CurrentWeather\n location::String\n unit::Union{Nothing,TemperatureUnits}\nend\n\n# Note that we provide the TYPE itself, not an instance of it!\nmsg = aiextract(\"What's the weather in Salt Lake City in C?\"; return_type=CurrentWeather)\nmsg.content\n# CurrentWeather(\"Salt Lake City, UT\", celsius)","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"But you can use it even for more complex tasks, like extracting many entities from a text:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"\"Person's age, height, and weight.\"\nstruct MyMeasurement\n age::Int\n height::Union{Int,Nothing}\n weight::Union{Nothing,Float64}\nend\nstruct ManyMeasurements\n measurements::Vector{MyMeasurement}\nend\nmsg = aiextract(\"James is 30, weighs 80kg. He's 180cm tall. Then Jack is 19 but really tall - over 190!\"; return_type=ManyMeasurements)\nmsg.content.measurements\n# 2-element Vector{MyMeasurement}:\n# MyMeasurement(30, 180, 80.0)\n# MyMeasurement(19, 190, nothing)","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"There is even a wrapper to help you catch errors together with helpful explanations on why parsing failed. 
See ?PromptingTools.MaybeExtract for more information.","category":"page"},{"location":"examples/readme_examples/#OCR-and-Image-Comprehension","page":"Various examples","title":"OCR and Image Comprehension","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"With the aiscan function, you can interact with images as if they were text.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can simply describe a provided image:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aiscan(\"Describe the image\"; image_path=\"julia.png\", model=\"gpt4v\")\n# [ Info: Tokens: 1141 @ Cost: \\$0.0117 in 2.2 seconds\n# AIMessage(\"The image shows a logo consisting of the word \"julia\" written in lowercase\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Or you can do an OCR of a screenshot. 
Let's transcribe some SQL code from a screenshot (no more re-typing!) using the :OCRTask template:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"# Screenshot of some SQL code\nimage_url = \"https://www.sqlservercentral.com/wp-content/uploads/legacy/8755f69180b7ac7ee76a69ae68ec36872a116ad4/24622.png\"\nmsg = aiscan(:OCRTask; image_url, model=\"gpt4v\", task=\"Transcribe the SQL code in the image.\", api_kwargs=(; max_tokens=2500))\n\n# [ Info: Tokens: 362 @ Cost: \\$0.0045 in 2.5 seconds\n# AIMessage(\"```sql\n# update Orders ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can add syntax highlighting of the outputs via Markdown:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"using Markdown\nmsg.content |> Markdown.parse","category":"page"},{"location":"examples/readme_examples/#Using-Ollama-models","page":"Various examples","title":"Using Ollama models","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Ollama.ai is an amazingly simple tool that allows you to run several Large Language Models (LLMs) on your computer. 
It's especially suitable when you're working with some sensitive data that should not be sent anywhere.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Let's assume you have installed Ollama, downloaded a model, and it's running in the background.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"We can use it with the aigenerate function:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"const PT = PromptingTools\nschema = PT.OllamaManagedSchema() # notice the different schema!\n\nmsg = aigenerate(schema, \"Say hi!\"; model=\"openhermes2.5-mistral\")\n# [ Info: Tokens: 69 in 0.9 seconds\n# AIMessage(\"Hello! How can I assist you today?\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"And we can also use the aiembed function:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aiembed(schema, \"Embed me\", copy; model=\"openhermes2.5-mistral\")\nmsg.content # 4096-element JSON3.Array{Float64...\n\nmsg = aiembed(schema, [\"Embed me\", \"Embed me\"]; model=\"openhermes2.5-mistral\")\nmsg.content # 4096×2 Matrix{Float64}:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"If you're getting errors, check that Ollama is running - see the Setup Guide for Ollama section below.","category":"page"}] +[{"location":"getting_started/#Getting-Started","page":"Getting Started","title":"Getting Started","text":"","category":"section"},{"location":"getting_started/#Prerequisites","page":"Getting Started","title":"Prerequisites","text":"","category":"section"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"OpenAI 
API key saved in the environment variable OPENAI_API_KEY","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"You will need to register with OpenAI and generate an API key:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Create an account with OpenAI\nGo to API Key page\nClick on “Create new secret key”","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"!!! Do not share it with anyone and do NOT save it to any files that get synced online.","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Resources:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"OpenAI Documentation\nVisual tutorial","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"You will need to set this key as an environment variable before using PromptingTools.jl:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"For a quick start, simply set it via ENV[\"OPENAI_API_KEY\"] = \"your-api-key\". Alternatively, you can:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"set it in the terminal before launching Julia: export OPENAI_API_KEY=<your key>\nset it in your setup.jl (make sure not to commit it to GitHub!)","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Make sure to start Julia from the same terminal window where you set the variable. 
For an easy check, run ENV[\"OPENAI_API_KEY\"] in Julia and you should see your key!","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"For other options or more robust solutions, see the FAQ section.","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Resources: ","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"OpenAI Guide","category":"page"},{"location":"getting_started/#Installation","page":"Getting Started","title":"Installation","text":"","category":"section"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"PromptingTools can be installed using the following commands:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"using Pkg\nPkg.add(\"PromptingTools\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Throughout the rest of this tutorial, we will assume that you have installed the PromptingTools package and have already typed using PromptingTools to bring all of the relevant variables into your current namespace.","category":"page"},{"location":"getting_started/#Quick-Start-with-@ai_str","page":"Getting Started","title":"Quick Start with @ai_str","text":"","category":"section"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"The easiest start is the @ai_str macro. 
Simply type ai\"your prompt\" and you will get a response from the default model (GPT-3.5 Turbo).","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"ai\"What is the capital of France?\"","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"[ Info: Tokens: 31 @ Cost: $0.0 in 1.5 seconds --> Be in control of your spending! \nAIMessage(\"The capital of France is Paris.\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Returned object is a light wrapper with generated message in field :content (eg, ans.content) for additional downstream processing.","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"You can easily inject any variables with string interpolation:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"country = \"Spain\"\nai\"What is the capital of \\$(country)?\"","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"[ Info: Tokens: 32 @ Cost: $0.0001 in 0.5 seconds\nAIMessage(\"The capital of Spain is Madrid.\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Pro tip: Use after-string-flags to select the model to be called, eg, ai\"What is the capital of France?\"gpt4 (use gpt4t for the new GPT-4 Turbo model). 
Great for those extra hard questions!","category":"page"},{"location":"getting_started/#Using-aigenerate-with-placeholders","page":"Getting Started","title":"Using aigenerate with placeholders","text":"","category":"section"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"For more complex prompt templates, you can use handlebars-style templating and provide variables as keyword arguments:","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"msg = aigenerate(\"What is the capital of {{country}}? Is the population larger than {{population}}?\", country=\"Spain\", population=\"1M\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"[ Info: Tokens: 74 @ Cost: $0.0001 in 1.3 seconds\nAIMessage(\"The capital of Spain is Madrid. And yes, the population of Madrid is larger than 1 million. As of 2020, the estimated population of Madrid is around 3.3 million people.\")","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Pro tip: Use asyncmap to run multiple AI-powered tasks concurrently.","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"Pro tip: If you use slow models (like GPT-4), you can use async version of @ai_str -> @aai_str to avoid blocking the REPL, eg, aai\"Say hi but slowly!\"gpt4","category":"page"},{"location":"getting_started/","page":"Getting Started","title":"Getting Started","text":"For more practical examples, see the Various Examples section.","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"EditURL = \"../../../examples/working_with_ollama.jl\"","category":"page"},{"location":"examples/working_with_ollama/#Local-models-with-Ollama.ai","page":"Local models with Ollama.ai","title":"Local 
models with Ollama.ai","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"This file contains examples of how to work with Ollama.ai models. It assumes that you've already installed and launched the Ollama server. Quick check: open the following website in your browser http://127.0.0.1:11434/ and you should see the message \"Ollama is running\". For more details or troubleshooting advice, see the Frequently Asked Questions section.","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"First, let's import the package and define a helper link for calling un-exported functions:","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"using PromptingTools\nconst PT = PromptingTools","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"PromptingTools","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Notice the schema change! 
If you want this to be the new default, you need to change PT.PROMPT_SCHEMA","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"schema = PT.OllamaManagedSchema()","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"OllamaManagedSchema()","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"You can choose models from https://ollama.ai/library - I prefer openhermes2.5-mistral","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"model = \"openhermes2.5-mistral\"","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"\"openhermes2.5-mistral\"","category":"page"},{"location":"examples/working_with_ollama/#Setting-Ollama-as-a-default-LLM","page":"Local models with Ollama.ai","title":"Setting Ollama as a default LLM","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"We need to change the global variables for PROMPT_SCHEMA and default models","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"using PromptingTools\nconst PT = PromptingTools\n\n\nPT.PROMPT_SCHEMA = PT.OllamaManagedSchema()\nPT.MODEL_CHAT = \"openhermes2.5-mistral\"\n# You could do the same for PT.MODEL_EMBEDDING","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"We can also add a nicer alias for the above Mistral 
model","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"PT.MODEL_ALIASES[\"mistral\"]= \"openhermes2.5-mistral\"\n# potentially also yi 34bn if you want a bigger more powerful model\nPT.MODEL_ALIASES[\"yi\"]= \"yi:34b-chat\"","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Now, we can use the @ai_str macro with Ollama models:","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"ai\"Say hi to me!\" # defaults to mistral because we set MODEL_CHAT above\nai\"Say hi to me in Chinese!\"yi # defaults to yi 34Bn model","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Note: Another quite popular model is zephyr:7b-beta","category":"page"},{"location":"examples/working_with_ollama/#Text-Generation-with-aigenerate","page":"Local models with Ollama.ai","title":"Text Generation with aigenerate","text":"","category":"section"},{"location":"examples/working_with_ollama/#Simple-message","page":"Local models with Ollama.ai","title":"Simple message","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg = aigenerate(schema, \"Say hi!\"; model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"AIMessage(\"Hi there! How can I help you today? 
If you have any questions or need assistance, please feel free to ask.\")","category":"page"},{"location":"examples/working_with_ollama/#Standard-string-interpolation","page":"Local models with Ollama.ai","title":"Standard string interpolation","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"a = 1\nmsg = aigenerate(schema, \"What is `$a+$a`?\"; model)\n\nname = \"John\"\nmsg = aigenerate(schema, \"Say hi to {{name}}.\"; name, model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"AIMessage(\"Hi there, John! It's great to see you today. How can I assist you? If you have any questions or need help with something, please don't hesitate to ask!\")","category":"page"},{"location":"examples/working_with_ollama/#Advanced-Prompts","page":"Local models with Ollama.ai","title":"Advanced Prompts","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"conversation = [\n PT.SystemMessage(\"You're master Yoda from Star Wars trying to help the user become a Jedi.\"),\n PT.UserMessage(\"I have feelings for my iPhone. What should I do?\")]\nmsg = aigenerate(schema, conversation; model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"AIMessage(\"Strong your feelings are, but attachments lead to suffering they often do. Focus on the balance in all things and let go of possessions that cloud your judgment. Embrace the wisdom of the Force and understand that material objects are not the same as love. 
The Force will guide you.\")","category":"page"},{"location":"examples/working_with_ollama/#Embeddings-with-aiembed","page":"Local models with Ollama.ai","title":"Embeddings with aiembed","text":"","category":"section"},{"location":"examples/working_with_ollama/#Simple-embedding-for-one-document","page":"Local models with Ollama.ai","title":"Simple embedding for one document","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg = aiembed(schema, \"Embed me\"; model) # access msg.content","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"DataMessage(JSON3.Array{Float64, Vector{UInt8}, SubArray{UInt64, 1, Vector{UInt64}, Tuple{UnitRange{Int64}}, true}} of size (4096,))","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"One document and we materialize the data into a Vector with copy (postprocess function argument)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg = aiembed(schema, \"Embed me\", copy; model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"DataMessage(Vector{Float64} of size (4096,))","category":"page"},{"location":"examples/working_with_ollama/#Multiple-documents-embedding","page":"Local models with Ollama.ai","title":"Multiple documents embedding","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Multiple documents - embedded sequentially, you can get faster speed with async","category":"page"},{"location":"examples/working_with_ollama/","page":"Local 
models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg = aiembed(schema, [\"Embed me\", \"Embed me\"]; model)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"DataMessage(Matrix{Float64} of size (4096, 2))","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"You can use Threads.@spawn or asyncmap, whichever you prefer, to paralellize the model calls","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"docs = [\"Embed me\", \"Embed me\"]\ntasks = asyncmap(docs) do doc\n msg = aiembed(schema, doc; model)\nend\nembedding = mapreduce(x -> x.content, hcat, tasks)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"4096×2 Matrix{Float64}:\n 7.71459 7.71459\n -1.14532 -1.14532\n 2.90205 2.90205\n -4.01967 -4.01967\n -7.73098 -7.73098\n 8.02114 8.02114\n -6.01313 -6.01313\n -2.06712 -2.06712\n 4.97633 4.97633\n -9.69502 -9.69502\n -0.02567 -0.02567\n 8.09622 8.09622\n 6.54008 6.54008\n -5.70348 -5.70348\n 2.55213 2.55213\n -2.00164 -2.00164\n -2.21854 -2.21854\n -3.6568 -3.6568\n 3.97905 3.97905\n -1.79931 -1.79931\n 0.0769786 0.0769786\n -10.4355 -10.4355\n -3.92487 -3.92487\n -6.03455 -6.03455\n -2.8005 -2.8005\n 2.23584 2.23584\n -0.503125 -0.503125\n 1.99538 1.99538\n -0.283642 -0.283642\n -0.414273 -0.414273\n 8.72909 8.72909\n 2.6071 2.6071\n 0.0808531 0.0808531\n -1.83914 -1.83914\n 2.19998 2.19998\n -0.629226 -0.629226\n 3.74217 3.74217\n 1.71231 1.71231\n -0.742473 -0.742473\n 2.9234 2.9234\n 7.33933 7.33933\n 4.24576 4.24576\n -7.56434 -7.56434\n -1.22274 -1.22274\n 1.73444 1.73444\n -0.736801 -0.736801\n 1.30149 1.30149\n -6.91642 -6.91642\n -1.84513 -1.84513\n 1.69959 
5.28302 5.28302\n 3.12716 3.12716\n 8.41242 8.41242\n 0.416003 0.416003\n -2.43236 -2.43236\n -1.63284 -1.63284\n 5.3806 5.3806\n 9.39975 9.39975\n 4.44496 4.44496\n -3.01441 -3.01441\n -1.33538 -1.33538\n 2.23541 2.23541\n -4.30131 -4.30131\n -1.20324 -1.20324\n 4.79406 4.79406\n 0.692551 0.692551\n -2.20403 -2.20403\n 0.12931 0.12931\n 0.842875 0.842875\n 0.29791 0.29791\n 6.59639 6.59639\n 8.6591 8.6591\n 2.07311 2.07311\n -6.48842 -6.48842\n 2.70007 2.70007\n -0.143695 -0.143695\n 3.99651 3.99651\n 6.86089 6.86089\n -2.54281 -2.54281\n -5.085 -5.085\n 3.61747 3.61747\n 2.09466 2.09466\n 3.35667 3.35667\n 7.38405 7.38405\n 0.816999 0.816999\n -0.564258 -0.564258\n 2.46281 2.46281\n -0.081471 -0.081471\n 12.0933 12.0933\n 9.45364 9.45364\n 0.303564 0.303564\n -2.20687 -2.20687\n 1.90101 1.90101\n -2.65606 -2.65606\n -11.3589 -11.3589\n -1.68249 -1.68249\n -1.25813 -1.25813\n -0.96125 -0.96125\n -2.84666 -2.84666\n 1.18914 1.18914\n 0.211945 0.211945\n -4.8988 -4.8988\n 0.894798 0.894798\n 3.9685 3.9685\n -0.852608 -0.852608\n 3.37537 3.37537\n -0.847579 -0.847579\n -4.37006 -4.37006\n -4.12787 -4.12787\n 4.37155 4.37155\n -7.86631 -7.86631\n -3.59755 -3.59755\n -2.55397 -2.55397\n 4.25921 4.25921\n 2.21721 2.21721\n 5.72299 5.72299\n 8.32362 8.32362\n 14.4057 14.4057\n 1.49376 1.49376\n 3.108 3.108\n -1.34388 -1.34388\n 3.77816 3.77816\n 5.69761 5.69761\n 0.255491 0.255491\n 4.15979 4.15979\n -14.6016 -14.6016\n 3.1475 3.1475\n 2.86732 2.86732\n -2.7875 -2.7875\n -8.78827 -8.78827\n -1.38068 -1.38068\n -2.74156 -2.74156\n -4.82257 -4.82257\n -4.64984 -4.64984\n -0.462036 -0.462036\n 2.36274 2.36274\n 2.73927 2.73927\n -4.01583 -4.01583\n -4.20256 -4.20256\n 7.33455 7.33455\n 7.53557 7.53557\n 3.2532 3.2532\n -0.556551 -0.556551\n 4.39618 4.39618\n 2.92025 2.92025\n -49.4395 -49.4395\n 1.84066 1.84066\n -6.03682 -6.03682\n 9.70956 9.70956\n 12.18 12.18\n -0.134471 -0.134471\n 0.388477 0.388477\n -4.30526 -4.30526\n 3.98614 3.98614\n -3.20351 -3.20351\n 3.81764 
3.81764\n 5.34853 5.34853\n 0.382215 0.382215\n -0.473372 -0.473372\n -4.4073 -4.4073\n -10.1129 -10.1129\n -6.82482 -6.82482\n 5.39935 5.39935\n -0.664077 -0.664077\n 7.75577 7.75577\n -5.565 -5.565\n -2.28518 -2.28518\n -3.09472 -3.09472\n 6.0196 6.0196\n -1.32035 -1.32035\n 2.5721 2.5721\n -9.0201 -9.0201\n 6.87621 6.87621\n 7.57662 7.57662\n -2.42131 -2.42131\n -7.11 -7.11\n 1.5457 1.5457\n 1.38686 1.38686\n -1.67077 -1.67077\n 5.34357 5.34357\n -5.22992 -5.22992\n -5.50112 -5.50112\n -0.820436 -0.820436\n -6.85987 -6.85987\n 4.36935 4.36935\n 8.27737 8.27737\n 7.16613 7.16613\n 7.21538 7.21538\n 0.0297893 0.0297893\n -3.30991 -3.30991\n 1.18508 1.18508\n -0.745072 -0.745072\n -1.31153 -1.31153\n -2.57184 -2.57184\n -0.187369 -0.187369\n 6.79233 6.79233\n 8.04294 8.04294\n 3.06986 3.06986\n -5.13761 -5.13761\n 0.539648 0.539648\n 5.02007 5.02007\n 2.67737 2.67737\n -6.69984 -6.69984\n 6.76321 6.76321\n 6.25102 6.25102\n 3.80545 3.80545\n -2.16059 -2.16059\n 2.81803 2.81803\n 0.447194 0.447194\n 1.84756 1.84756\n -6.42528 -6.42528\n -2.23379 -2.23379\n -2.61151 -2.61151\n -2.86143 -2.86143\n -2.94039 -2.94039\n -3.38503 -3.38503\n 0.474985 0.474985\n -9.66389 -9.66389\n 4.96293 4.96293\n -5.6718 -5.6718\n 7.06422 7.06422\n -8.36354 -8.36354\n 0.0182466 0.0182466\n 9.20883 9.20883\n 8.23981 8.23981\n -1.41968 -1.41968\n -1.36057 -1.36057\n -3.99568 -3.99568\n 2.51484 2.51484\n 5.41846 5.41846\n -10.8511 -10.8511\n -8.41267 -8.41267\n 2.04668 2.04668\n -5.61525 -5.61525\n -9.73507 -9.73507\n -0.497102 -0.497102\n 4.29467 4.29467\n -1.61424 -1.61424\n -0.818494 -0.818494\n -7.02135 -7.02135\n 13.4836 13.4836\n -4.10115 -4.10115\n -8.11914 -8.11914\n -2.79895 -2.79895\n -4.39428 -4.39428\n -0.737467 -0.737467\n 1.37013 1.37013\n 9.56244 9.56244\n 2.92491 2.92491\n -7.13393 -7.13393\n -0.179291 -0.179291\n -6.00313 -6.00313\n 7.27104 7.27104\n -1.7103 -1.7103\n -7.84843 -7.84843\n 13.7304 13.7304\n 2.40973 2.40973\n -7.07755 -7.07755\n 1.31745 1.31745\n -9.99271 
-9.99271\n -15.4753 -15.4753\n 4.38711 4.38711\n -5.41127 -5.41127\n -1.06491 -1.06491\n 1.09245 1.09245\n -1.33961 -1.33961\n -4.42681 -4.42681\n -4.44164 -4.44164\n -1.80772 -1.80772\n -5.06035 -5.06035\n 0.197369 0.197369\n 7.27798 7.27798\n -6.88382 -6.88382\n 3.21319 3.21319\n 8.04111 8.04111\n -3.94107 -3.94107\n 1.79716 1.79716\n -0.2134 -0.2134\n 1.36955 1.36955\n 13.7009 13.7009\n -7.3497 -7.3497\n 1.80078 1.80078\n 4.25352 4.25352\n -2.80092 -2.80092\n -3.81295 -3.81295\n -4.92036 -4.92036\n 0.856001 0.856001\n -1.26696 -1.26696\n 2.65207 2.65207\n -1.01876 -1.01876\n 1.50837 1.50837\n -11.5335 -11.5335\n 5.80989 5.80989\n 2.45606 2.45606\n 1.64394 1.64394\n 2.73651 2.73651\n -11.1653 -11.1653\n -1.66359 -1.66359\n -0.0317267 -0.0317267\n 0.115458 0.115458\n 4.43585 4.43585\n 1.24902 1.24902\n 7.30894 7.30894\n 16.7814 16.7814\n -0.456154 -0.456154\n -3.94033 -3.94033\n -4.4947 -4.4947\n -2.52048 -2.52048\n 0.0890704 0.0890704\n -4.66338 -4.66338\n 3.88142 3.88142\n 2.35984 2.35984\n 4.84037 4.84037\n 6.95444 6.95444\n 2.74408 2.74408\n -3.23958 -3.23958\n -0.467292 -0.467292\n 6.26367 6.26367\n -1.50588 -1.50588\n 4.13389 4.13389\n -2.53819 -2.53819\n -4.4987 -4.4987\n -10.3487 -10.3487\n -14.8297 -14.8297\n -8.48112 -8.48112\n 3.95155 3.95155\n 1.2289 1.2289\n -4.38025 -4.38025\n -0.61687 -0.61687\n 10.8511 10.8511\n 1.15556 1.15556\n -2.19768 -2.19768\n -7.66931 -7.66931\n 4.72919 4.72919\n -7.6738 -7.6738\n -0.688528 -0.688528\n 4.74928 4.74928\n 4.92126 4.92126\n 0.897546 0.897546\n 3.85735 3.85735\n 0.201364 0.201364\n -5.62425 -5.62425\n -3.83117 -3.83117\n 4.05866 4.05866\n 3.10063 3.10063\n 2.5224 2.5224\n -1.51274 -1.51274\n -0.683338 -0.683338\n -3.23147 -3.23147\n -4.21268 -4.21268\n -2.21401 -2.21401\n 1.57887 1.57887\n 0.848257 0.848257\n -5.83704 -5.83704\n -7.00011 -7.00011\n 3.16884 3.16884\n -4.44161 -4.44161\n -7.62482 -7.62482\n -0.266943 -0.266943\n 0.41761 0.41761\n -7.45144 -7.45144\n -0.211132 -0.211132\n 0.276707 0.276707\n 
16.7781 16.7781\n 0.689757 0.689757\n -3.04049 -3.04049\n 2.91684 2.91684\n 1.97161 1.97161\n 3.7721 3.7721\n -1.60698 -1.60698\n -4.18868 -4.18868\n 7.66491 7.66491\n -0.64664 -0.64664\n -0.660623 -0.660623\n 8.68174 8.68174\n 0.282074 0.282074\n -2.85266 -2.85266\n -1.91293 -1.91293\n 7.18736 7.18736\n -10.3875 -10.3875\n -1.91603 -1.91603\n 6.29739 6.29739\n -0.0375388 -0.0375388\n -1.60576 -1.60576\n -3.22148 -3.22148\n -4.24549 -4.24549\n 1.30822 1.30822\n 2.52307 2.52307\n 0.403345 0.403345\n -0.744478 -0.744478\n 2.41241 2.41241\n -4.58098 -4.58098\n -0.791842 -0.791842\n 3.73626 3.73626\n -1.43002 -1.43002\n 4.30716 4.30716\n 3.30255 3.30255\n -4.08011 -4.08011\n -5.07282 -5.07282\n -1.54759 -1.54759\n -2.2305 -2.2305\n 6.8791 6.8791\n 9.7396 9.7396\n -6.50395 -6.50395\n 3.57178 3.57178\n 7.08987 7.08987\n 6.2669 6.2669\n 5.87329 5.87329\n 2.36823 2.36823\n -6.16 -6.16\n 1.96238 1.96238\n 7.31651 7.31651\n -1.5257 -1.5257\n -2.89061 -2.89061\n 0.407546 0.407546\n 5.10645 5.10645\n 11.0716 11.0716\n 4.7443 4.7443\n -8.77353 -8.77353\n -0.631177 -0.631177\n -4.36973 -4.36973\n 1.48666 1.48666\n 7.7678 7.7678\n -2.65407 -2.65407\n 4.56869 4.56869\n -0.541163 -0.541163\n 2.89543 2.89543\n 5.39424 5.39424\n -3.62954 -3.62954\n 3.77547 3.77547\n -5.96886 -5.96886\n -4.38947 -4.38947\n -2.96756 -2.96756\n 2.28222 2.28222\n -1.08489 -1.08489\n 1.74726 1.74726\n -3.46088 -3.46088\n 11.9371 11.9371\n -5.02359 -5.02359\n 2.51632 2.51632\n -0.0297022 -0.0297022\n -2.60011 -2.60011\n 0.254202 0.254202\n 9.7949 9.7949\n 3.64937 3.64937\n 10.0857 10.0857\n -5.36637 -5.36637\n 4.11127 4.11127\n 8.90571 8.90571\n -5.97219 -5.97219\n -7.21379 -7.21379\n -5.01561 -5.01561\n 2.98616 2.98616\n 1.99064 1.99064\n 0.16465 0.16465\n -4.07902 -4.07902\n 4.34018 4.34018\n -2.13528 -2.13528\n 2.39903 2.39903\n 4.00804 4.00804\n -1.85741 -1.85741\n -7.73083 -7.73083\n -4.21139 -4.21139\n 4.65743 4.65743\n 0.963549 0.963549\n 0.29506 0.29506\n 6.05798 6.05798\n 12.4428 12.4428\n 
-0.398651 -0.398651\n -0.584559 -0.584559\n 2.75445 2.75445\n -0.207975 -0.207975\n 6.11926 6.11926\n -8.66125 -8.66125\n 3.07568 3.07568\n -3.19358 -3.19358\n -2.53024 -2.53024\n 14.1187 14.1187\n -0.412049 -0.412049\n 12.5809 12.5809\n 6.26236 6.26236\n 5.23037 5.23037\n -0.11356 -0.11356\n -6.62321 -6.62321\n -1.29651 -1.29651\n -1.48734 -1.48734\n 13.0753 13.0753\n 4.21767 4.21767\n -2.4425 -2.4425\n -0.0901323 -0.0901323\n 9.79684 9.79684\n 4.74522 4.74522\n -3.34804 -3.34804\n 7.37816 7.37816\n 2.57938 2.57938\n 1.92968 1.92968\n 3.75166 3.75166\n 5.0617 5.0617\n 8.74324 8.74324\n -0.93703 -0.93703\n -1.36031 -1.36031\n -2.5439 -2.5439\n 1.56784 1.56784\n 2.56237 2.56237\n -1.02578 -1.02578\n 6.62085 6.62085\n 7.69745 7.69745\n 6.26864 6.26864\n -4.20046 -4.20046\n -2.30926 -2.30926\n 2.74598 2.74598\n 4.11078 4.11078\n 2.8455 2.8455\n -3.45407 -3.45407\n 2.82327 2.82327\n -1.00356 -1.00356\n 8.85974 8.85974\n 6.35864 6.35864\n -1.59146 -1.59146\n -0.361996 -0.361996\n -1.25198 -1.25198\n 8.2867 8.2867\n 0.981644 0.981644\n 2.68003 2.68003\n 1.10236 1.10236\n -1.63423 -1.63423\n -2.79552 -2.79552\n -6.5718 -6.5718\n -0.257779 -0.257779\n -4.49325 -4.49325\n 5.0455 5.0455\n 14.4508 14.4508\n 3.60407 3.60407\n 3.09003 3.09003\n -8.32962 -8.32962\n -1.41178 -1.41178\n 12.5777 12.5777\n -2.01342 -2.01342\n -1.48205 -1.48205\n 0.967158 0.967158\n -0.532548 -0.532548\n -5.23274 -5.23274\n -1.49702 -1.49702\n 0.739607 0.739607\n 3.49171 3.49171\n -1.0507 -1.0507\n -7.48299 -7.48299\n 7.57395 7.57395\n -3.04813 -3.04813\n 16.322 16.322\n 7.81441 7.81441\n -3.41529 -3.41529\n 2.05401 2.05401\n 1.08232 1.08232\n 12.5735 12.5735\n 0.126572 0.126572\n -6.92158 -6.92158\n -1.4651 -1.4651\n -3.19425 -3.19425\n -1.44093 -1.44093\n -3.82056 -3.82056\n 6.72914 6.72914\n -5.46583 -5.46583\n -1.43396 -1.43396\n 7.42164 7.42164\n 1.00438 1.00438\n -0.41415 -0.41415\n -2.54987 -2.54987\n 6.88491 6.88491\n 3.84807 3.84807\n -5.62245 -5.62245\n 5.24133 5.24133\n 7.99514 7.99514\n 
-2.51593 -2.51593\n 8.19568 8.19568\n 0.854985 0.854985\n -6.20478 -6.20478\n -2.58235 -2.58235\n -6.51346 -6.51346\n 12.8877 12.8877\n 8.6194 8.6194\n -6.82669 -6.82669\n -4.67379 -4.67379\n 8.13137 8.13137\n 0.733511 0.733511\n 5.66079 5.66079\n -2.94337 -2.94337\n -3.29462 -3.29462\n -6.3809 -6.3809\n -1.85613 -1.85613\n 0.635069 0.635069\n 0.432626 0.432626\n -14.6426 -14.6426\n 8.05825 8.05825\n 6.50637 6.50637\n 1.44014 1.44014\n -4.60602 -4.60602\n -6.49137 -6.49137\n 6.33163 6.33163\n -1.97616 -1.97616\n 0.573379 0.573379\n -2.78039 -2.78039\n -0.140087 -0.140087\n 1.52619 1.52619\n 6.83379 6.83379\n -0.197981 -0.197981\n -3.00849 -3.00849\n -2.09725 -2.09725\n -2.06883 -2.06883\n -0.328198 -0.328198\n -0.212338 -0.212338\n 5.4425 5.4425\n 6.48574 6.48574\n 2.00073 2.00073\n -3.15642 -3.15642\n -0.0673389 -0.0673389\n -4.19911 -4.19911\n 4.5466 4.5466\n 3.73221 3.73221\n -1.01059 -1.01059\n -4.29015 -4.29015\n 4.9909 4.9909\n 3.22397 3.22397\n -1.27984 -1.27984\n 2.83358 2.83358\n 2.25695 2.25695\n 7.2879 7.2879\n -1.47955 -1.47955\n 12.7627 12.7627\n -3.72449 -3.72449\n 3.97719 3.97719\n 14.2197 14.2197\n -1.24031 -1.24031\n -7.41824 -7.41824\n 1.90207 1.90207\n 1.10939 1.10939\n -7.47202 -7.47202\n 3.85738 3.85738\n -4.12085 -4.12085\n 1.12097 1.12097\n -0.545646 -0.545646\n 3.04129 3.04129\n 1.05043 1.05043\n 0.993448 0.993448\n -5.78424 -5.78424\n -1.97199 -1.97199\n -5.74806 -5.74806\n 2.70835 2.70835\n -8.09729 -8.09729\n -6.36035 -6.36035\n -1.24361 -1.24361\n -2.44813 -2.44813\n 7.48353 7.48353\n 2.0202 2.0202\n 3.04366 3.04366\n -3.98778 -3.98778\n 4.80106 4.80106\n 0.926552 0.926552\n 3.35253 3.35253\n -4.10577 -4.10577\n -3.57853 -3.57853\n 4.03372 4.03372\n -2.38792 -2.38792\n 0.12177 0.12177\n -0.761671 -0.761671\n -4.25652 -4.25652\n 7.27933 7.27933\n 0.165182 0.165182\n 1.34367 1.34367\n -7.36923 -7.36923\n 2.38548 2.38548\n 0.117217 0.117217\n 2.02002 2.02002\n -4.60023 -4.60023\n 2.78 2.78\n -1.34604 -1.34604\n 4.7234 4.7234\n 7.37673 
7.37673\n 2.07986 2.07986\n -5.72573 -5.72573\n -6.66143 -6.66143\n 2.43072 2.43072\n 1.34782 1.34782\n -0.114238 -0.114238\n 2.32103 2.32103\n 1.84042 1.84042\n 1.07005 1.07005\n 3.88182 3.88182\n -0.752264 -0.752264\n -2.43517 -2.43517\n -5.29216 -5.29216\n -0.13527 -0.13527\n 1.40188 1.40188\n -5.87815 -5.87815\n -1.90167 -1.90167\n 2.88562 2.88562\n -2.29028 -2.29028\n 2.35477 2.35477\n -3.50731 -3.50731\n 6.0621 6.0621\n 3.2011 3.2011\n 2.19115 2.19115\n -3.03557 -3.03557\n -8.49394 -8.49394\n 0.936501 0.936501\n 7.19188 7.19188\n 4.50162 4.50162\n 0.341394 0.341394\n 2.54484 2.54484\n 1.67305 1.67305\n 3.05008 3.05008\n -2.0266 -2.0266\n 7.28431 7.28431\n -7.70924 -7.70924\n 2.60851 2.60851\n 6.8054 6.8054\n 1.8878 1.8878\n 1.87624 1.87624\n -5.13611 -5.13611\n -3.23698 -3.23698\n 4.03201 4.03201\n -5.27165 -5.27165\n -4.95817 -4.95817\n -0.200461 -0.200461\n 4.27259 4.27259\n 0.449661 0.449661\n 7.49752 7.49752\n -5.47923 -5.47923\n -2.40934 -2.40934\n 25.0066 25.0066\n -3.14511 -3.14511\n -1.62587 -1.62587\n -1.67652 -1.67652\n -2.17888 -2.17888\n 2.37296 2.37296\n -4.41408 -4.41408\n 0.65204 0.65204\n 10.849 10.849\n -2.3021 -2.3021\n 2.20417 2.20417\n 10.0579 10.0579\n -4.03489 -4.03489\n 7.60982 7.60982\n -5.74951 -5.74951\n -2.97582 -2.97582\n -8.61382 -8.61382\n -1.90903 -1.90903\n -3.64556 -3.64556\n -16.2304 -16.2304\n -15.9793 -15.9793\n -4.59448 -4.59448\n -2.67688 -2.67688\n -1.67148 -1.67148\n 5.57026 5.57026\n 0.846445 0.846445\n -7.54149 -7.54149\n -3.61401 -3.61401\n 4.03723 4.03723\n 0.711821 0.711821\n 8.99009 8.99009\n -6.15866 -6.15866\n -1.36865 -1.36865\n -4.31058 -4.31058\n 6.31659 6.31659\n -6.23773 -6.23773\n 0.857388 0.857388\n 3.6152 3.6152\n -1.28774 -1.28774\n -4.92094 -4.92094\n 3.08527 3.08527\n -5.74582 -5.74582\n -4.20897 -4.20897\n -5.19406 -5.19406\n -4.06851 -4.06851\n 5.73867 5.73867\n 3.32767 3.32767\n -11.2588 -11.2588\n -7.94126 -7.94126\n 5.38746 5.38746\n -0.0253579 -0.0253579\n -1.7856 -1.7856\n -1.31209 -1.31209\n 
6.85519 6.85519\n 2.71496 2.71496\n -2.58838 -2.58838\n -6.86996 -6.86996\n 1.01204 1.01204\n 3.43433 3.43433\n -0.249192 -0.249192\n 7.96322 7.96322\n 14.3414 14.3414\n 2.44774 2.44774\n 4.73731 4.73731\n -9.14288 -9.14288\n 2.70325 2.70325\n 6.48202 6.48202\n -2.58391 -2.58391\n -4.52079 -4.52079\n -0.64105 -0.64105\n -3.75531 -3.75531\n -3.93321 -3.93321\n -2.5879 -2.5879\n 2.34697 2.34697\n -3.89721 -3.89721\n -1.60712 -1.60712\n -7.49452 -7.49452\n -0.518596 -0.518596\n 0.996693 0.996693\n 2.83468 2.83468\n -6.19363 -6.19363\n -7.25683 -7.25683\n 0.391546 0.391546\n -7.52756 -7.52756\n -0.810817 -0.810817\n -2.64942 -2.64942\n -2.95081 -2.95081\n -6.34989 -6.34989\n 3.9961 3.9961\n 1.36755 1.36755\n -0.335808 -0.335808\n -11.7919 -11.7919\n 1.16904 1.16904\n 6.26031 6.26031\n -4.68064 -4.68064\n 5.55008 5.55008\n 3.65873 3.65873\n -3.95177 -3.95177\n 7.62708 7.62708\n -2.4932 -2.4932\n -0.713266 -0.713266\n 6.76214 6.76214\n -0.802523 -0.802523\n -0.327543 -0.327543\n -6.9053 -6.9053\n -2.69604 -2.69604\n 9.729 9.729\n -7.61691 -7.61691\n -0.658653 -0.658653\n 1.62531 1.62531\n 0.532107 0.532107\n 1.71729 1.71729\n -10.1795 -10.1795\n 5.54208 5.54208\n 4.02502 4.02502\n -1.47596 -1.47596\n 11.818 11.818\n 4.40414 4.40414\n 5.64827 5.64827\n 5.89386 5.89386\n -6.19187 -6.19187\n 4.77889 4.77889\n -0.261731 -0.261731\n -0.570525 -0.570525\n 3.80941 3.80941\n -3.95414 -3.95414\n 0.642971 0.642971\n -7.23493 -7.23493\n 0.744423 0.744423\n 11.5682 11.5682\n -3.17145 -3.17145\n 9.02877 9.02877\n 10.5452 10.5452\n -7.05642 -7.05642\n -6.01952 -6.01952\n -5.61355 -5.61355\n 1.28759 1.28759\n 3.44186 3.44186\n -2.52363 -2.52363\n 8.95712 8.95712\n -1.33999 -1.33999\n -3.25858 -3.25858\n 2.33509 2.33509\n 2.16314 2.16314\n 14.4002 14.4002\n -5.22345 -5.22345\n -5.6232 -5.6232\n -4.20801 -4.20801\n 0.677359 0.677359\n 1.92688 1.92688\n 2.4265 2.4265\n -3.47901 -3.47901\n -3.35004 -3.35004\n -5.32445 -5.32445\n 0.817822 0.817822\n 5.9241 5.9241\n 2.13342 2.13342\n 9.30726 
9.30726\n -6.00328 -6.00328\n 5.10125 5.10125\n 16.6941 16.6941\n -1.41774 -1.41774\n 0.843709 0.843709\n 3.71326 3.71326\n -12.7315 -12.7315\n -1.58947 -1.58947\n 2.7713 2.7713\n -5.89993 -5.89993\n -10.1427 -10.1427\n -1.60823 -1.60823\n -4.98621 -4.98621\n -10.6258 -10.6258\n 0.255858 0.255858\n 5.87781 5.87781\n 0.549239 0.549239\n -0.361649 -0.361649\n 2.89543 2.89543\n -1.56252 -1.56252\n -7.04269 -7.04269\n 0.360599 0.360599\n -0.80318 -0.80318\n -8.15537 -8.15537\n 7.86106 7.86106\n 4.25906 4.25906\n 1.78474 1.78474\n 4.15764 4.15764\n -1.8884 -1.8884\n -7.16959 -7.16959\n 2.84539 2.84539\n -3.33161 -3.33161\n 4.89863 4.89863\n -3.36503 -3.36503\n -4.68013 -4.68013\n 5.18058 5.18058\n -9.69276 -9.69276\n -1.56116 -1.56116\n -3.58275 -3.58275\n -2.73766 -2.73766\n 6.64492 6.64492\n -3.78966 -3.78966\n 2.63467 2.63467\n -12.4868 -12.4868\n -3.4241 -3.4241\n 3.2898 3.2898\n 2.20265 2.20265\n -1.36672 -1.36672\n 2.71448 2.71448\n 5.87839 5.87839\n 0.160837 0.160837\n -2.64458 -2.64458\n -3.8078 -3.8078\n 5.08743 5.08743\n -14.014 -14.014\n 4.44746 4.44746\n 6.61584 6.61584\n -0.916513 -0.916513\n -8.08277 -8.08277\n -8.088 -8.088\n -5.14152 -5.14152\n -4.30739 -4.30739\n -8.76727 -8.76727\n -4.53313 -4.53313\n 11.0356 11.0356\n -2.37348 -2.37348\n -8.71711 -8.71711\n -2.22971 -2.22971\n 8.19346 8.19346\n -0.330962 -0.330962\n 1.10067 1.10067\n 1.01878 1.01878\n -10.2666 -10.2666\n 8.15909 8.15909\n 9.09316 9.09316\n -0.862864 -0.862864\n -7.54443 -7.54443\n -3.44703 -3.44703\n 5.21819 5.21819\n -2.06834 -2.06834\n 9.55442 9.55442\n -1.89649 -1.89649\n -5.57892 -5.57892\n 4.22421 4.22421\n -4.06375 -4.06375\n 3.81452 3.81452\n 3.09071 3.09071\n -7.34297 -7.34297\n -1.67899 -1.67899\n 0.58489 0.58489\n -5.33824 -5.33824\n 2.82705 2.82705\n -3.70864 -3.70864\n 4.21641 4.21641\n 3.82508 3.82508\n -4.04356 -4.04356\n 20.0249 20.0249\n -13.1531 -13.1531\n 2.98603 2.98603\n 5.54713 5.54713\n -1.39722 -1.39722\n 2.13016 2.13016\n -2.40215 -2.40215\n 0.168123 0.168123\n 
2.77021 2.77021\n -2.32327 -2.32327\n -1.06731 -1.06731\n 2.53877 2.53877\n -1.94325 -1.94325\n 1.47106 1.47106\n 0.294436 0.294436\n -0.547055 -0.547055\n 0.116016 0.116016\n 1.56148 1.56148\n 3.21789 3.21789\n -2.89007 -2.89007\n -4.33765 -4.33765\n 0.566163 0.566163\n 0.402729 0.402729\n -7.80674 -7.80674\n 4.72058 4.72058\n 3.97584 3.97584\n 1.91646 1.91646\n 2.09298 2.09298\n 1.88552 1.88552\n -2.37581 -2.37581\n -18.2615 -18.2615\n 2.68651 2.68651\n 5.5 5.5\n 0.355051 0.355051\n 5.6052 5.6052\n 7.74854 7.74854\n -0.512378 -0.512378\n 1.60299 1.60299\n -5.49563 -5.49563\n -1.96455 -1.96455\n -16.3228 -16.3228\n -6.87737 -6.87737\n -4.60755 -4.60755\n -1.32116 -1.32116\n 2.87263 2.87263\n -2.09541 -2.09541\n 3.43595 3.43595\n 3.63528 3.63528\n 3.52056 3.52056\n -3.59484 -3.59484\n 1.03764 1.03764\n -7.14947 -7.14947\n -5.80634 -5.80634\n 4.71397 4.71397\n 0.720588 0.720588\n -2.24074 -2.24074\n 5.82418 5.82418\n -3.22013 -3.22013\n 3.68858 3.68858\n -1.43166 -1.43166\n 4.47978 4.47978\n -4.83356 -4.83356\n -3.96257 -3.96257\n -5.95512 -5.95512\n 0.496691 0.496691\n -7.58825 -7.58825\n -6.47331 -6.47331\n -1.14446 -1.14446\n 3.91615 3.91615\n -0.588841 -0.588841\n 6.56683 6.56683\n 3.97252 3.97252\n -4.3126 -4.3126\n -8.20913 -8.20913\n 0.310182 0.310182\n -7.3006 -7.3006\n 7.92805 7.92805\n 2.1756 2.1756\n 1.06404 1.06404\n 1.14471 1.14471\n -1.50242 -1.50242\n 0.00723557 0.00723557\n 5.76841 5.76841\n -1.96707 -1.96707\n 8.87243 8.87243\n -3.23281 -3.23281\n 12.3087 12.3087\n 3.3245 3.3245\n 3.00334 3.00334\n -5.74048 -5.74048\n 7.43939 7.43939\n -0.906001 -0.906001\n 2.24067 2.24067\n -6.23989 -6.23989\n 2.81483 2.81483\n -1.62648 -1.62648\n -7.26368 -7.26368\n 1.69171 1.69171\n -11.2631 -11.2631\n -2.32992 -2.32992\n -6.07361 -6.07361\n -7.56822 -7.56822\n -7.56737 -7.56737\n 5.97037 5.97037\n 6.74398 6.74398\n -2.24599 -2.24599\n 2.95213 2.95213\n -12.7864 -12.7864\n 0.680035 0.680035\n -1.39988 -1.39988\n -4.74028 -4.74028\n 3.01887 3.01887\n 1.89636 
1.89636\n 4.46014 4.46014\n -4.38308 -4.38308\n 11.7633 11.7633\n -3.54671 -3.54671\n -3.47584 -3.47584\n 3.80037 3.80037\n 7.77849 7.77849\n -7.00006 -7.00006\n -4.87665 -4.87665\n -4.54736 -4.54736\n -7.81752 -7.81752\n -0.0654465 -0.0654465\n -3.70587 -3.70587\n -2.24231 -2.24231\n 5.58005 5.58005\n -3.09415 -3.09415\n -5.55063 -5.55063\n -4.19666 -4.19666\n -6.83328 -6.83328\n -6.9216 -6.9216\n -3.72782 -3.72782\n -2.18574 -2.18574\n 1.28076 1.28076\n -3.40691 -3.40691\n 0.486964 0.486964\n -2.11025 -2.11025\n -1.42349 -1.42349\n 6.06854 6.06854\n -1.37534 -1.37534\n 9.47832 9.47832\n -0.567045 -0.567045\n -6.98328 -6.98328\n 6.73139 6.73139\n -1.56812 -1.56812\n 0.141683 0.141683\n 1.78697 1.78697\n -2.03874 -2.03874\n 1.28356 1.28356\n 6.9912 6.9912\n -3.8858 -3.8858\n -1.38808 -1.38808\n -2.16632 -2.16632\n 3.57955 3.57955\n 2.73506 2.73506\n -3.03108 -3.03108\n -3.44677 -3.44677\n 1.37111 1.37111\n -10.0008 -10.0008\n -3.61651 -3.61651\n 1.97313 1.97313\n 2.11298 2.11298\n 0.174957 0.174957\n -0.131546 -0.131546\n 7.58484 7.58484\n 4.27907 4.27907\n 0.855439 0.855439\n 4.44153 4.44153\n -1.04577 -1.04577\n -7.49625 -7.49625\n 2.1572 2.1572\n 13.0815 13.0815\n 4.57025 4.57025\n 0.704658 0.704658\n 3.25079 3.25079\n -0.682139 -0.682139\n -4.17209 -4.17209\n -1.38547 -1.38547\n 5.52688 5.52688\n -4.90717 -4.90717\n 2.56402 2.56402\n -1.37164 -1.37164\n -6.05044 -6.05044\n 8.3158 8.3158\n -0.640461 -0.640461\n -2.40145 -2.40145\n -1.02959 -1.02959\n -6.75028 -6.75028\n 4.20206 4.20206\n 0.615412 0.615412\n -0.389435 -0.389435\n -5.07439 -5.07439\n -5.34136 -5.34136\n -1.88522 -1.88522\n -4.82628 -4.82628\n 0.54435 0.54435\n -3.28948 -3.28948\n 5.0051 5.0051\n -8.5501 -8.5501\n 7.31448 7.31448\n 0.145651 0.145651\n 3.28586 3.28586\n -1.8624 -1.8624\n -8.9235 -8.9235\n 3.15894 3.15894\n -9.9459 -9.9459\n 0.517233 0.517233\n -4.59899 -4.59899\n 0.641116 0.641116\n 10.3809 10.3809\n 2.39935 2.39935\n -0.378496 -0.378496\n 0.680329 0.680329\n 2.35584 2.35584\n 
-2.24714 -2.24714\n -4.8742 -4.8742\n -3.96429 -3.96429\n 1.29263 1.29263\n 0.618875 0.618875\n -0.611961 -0.611961\n 1.06612 1.06612\n -3.39289 -3.39289\n -0.226022 -0.226022\n 4.24418 4.24418\n 0.884239 0.884239\n 8.25747 8.25747\n -3.23019 -3.23019\n -9.99374 -9.99374\n 8.54414 8.54414\n -6.06374 -6.06374\n -4.92601 -4.92601\n 7.22101 7.22101\n 11.5756 11.5756\n 13.436 13.436\n 4.13522 4.13522\n 9.67412 9.67412\n -3.13805 -3.13805\n 7.50856 7.50856\n -7.98069 -7.98069\n 4.92059 4.92059\n -6.72969 -6.72969\n -4.48762 -4.48762\n -3.60328 -3.60328\n -1.75053 -1.75053\n 1.5638 1.5638\n 4.74213 4.74213\n 5.16046 5.16046\n -1.9857 -1.9857\n -6.34885 -6.34885\n -3.58963 -3.58963\n 4.96795 4.96795\n 1.44405 1.44405\n -2.74682 -2.74682\n -0.545296 -0.545296\n -10.7507 -10.7507\n -0.117477 -0.117477\n -0.436907 -0.436907\n -1.11656 -1.11656\n 1.64789 1.64789\n -4.08799 -4.08799\n -1.04262 -1.04262\n 6.06007 6.06007\n -6.68208 -6.68208\n 6.81976 6.81976\n -6.89836 -6.89836\n -0.555115 -0.555115\n -2.85307 -2.85307\n -7.76567 -7.76567\n -5.65104 -5.65104\n 8.93521 8.93521\n -5.0663 -5.0663\n 2.52214 2.52214\n 0.382824 0.382824\n -0.398468 -0.398468\n 5.05183 5.05183\n 4.134 4.134\n 1.42909 1.42909\n 2.99357 2.99357\n 10.7821 10.7821\n -4.54764 -4.54764\n -0.0440308 -0.0440308\n 0.647161 0.647161\n 3.27569 3.27569\n -32.9478 -32.9478\n 6.92399 6.92399\n -3.05953 -3.05953\n -2.29742 -2.29742\n -0.41863 -0.41863\n 2.99125 2.99125\n 3.40805 3.40805\n -1.36651 -1.36651\n -3.25561 -3.25561\n 5.11504 5.11504\n -0.532291 -0.532291\n 9.93341 9.93341\n -2.2806 -2.2806\n 10.9617 10.9617\n -2.53642 -2.53642\n 0.995763 0.995763\n -1.28898 -1.28898\n -2.99921 -2.99921\n -2.46773 -2.46773\n -11.0849 -11.0849\n -11.64 -11.64\n -3.73617 -3.73617\n 2.74223 2.74223\n -0.976817 -0.976817\n -0.384814 -0.384814\n -3.38815 -3.38815\n 2.27591 2.27591\n -5.25732 -5.25732\n -1.65764 -1.65764\n -5.8501 -5.8501\n -4.85863 -4.85863\n 2.78987 2.78987\n 5.3324 5.3324\n -9.16758 -9.16758\n 7.90047 
7.90047\n 5.68696 5.68696\n 7.2668 7.2668\n -0.857072 -0.857072\n 0.0834347 0.0834347\n 1.11833 1.11833\n 0.88212 0.88212\n -4.40785 -4.40785\n 5.25846 5.25846\n 7.46283 7.46283\n 6.26981 6.26981\n -10.8935 -10.8935\n -0.226332 -0.226332\n -1.64568 -1.64568\n -0.389003 -0.389003\n -0.854872 -0.854872\n -3.38063 -3.38063\n -4.74874 -4.74874\n -1.81717 -1.81717\n -6.03338 -6.03338\n 9.41153 9.41153\n -2.75636 -2.75636\n -4.03638 -4.03638\n -2.82527 -2.82527\n 0.641039 0.641039\n -3.08939 -3.08939\n -1.04523 -1.04523\n -4.17379 -4.17379\n 0.453503 0.453503\n 5.64541 5.64541\n 2.72225 2.72225\n -1.67354 -1.67354\n -6.68729 -6.68729\n -1.20785 -1.20785\n 3.51562 3.51562\n 2.38257 2.38257\n 2.75735 2.75735\n -4.62925 -4.62925\n 7.98247 7.98247\n 6.254 6.254\n 3.85448 3.85448\n -4.40298 -4.40298\n -8.28751 -8.28751\n -7.28055 -7.28055\n 7.31675 7.31675\n 3.53957 3.53957\n 2.94378 2.94378\n 1.41268 1.41268\n 5.2878 5.2878\n -0.807317 -0.807317\n -13.141 -13.141\n 5.71505 5.71505\n -3.86739 -3.86739\n 0.922435 0.922435\n -4.52167 -4.52167\n 0.82741 0.82741\n 4.1254 4.1254\n -3.64229 -3.64229\n -4.34879 -4.34879\n -5.69361 -5.69361\n 10.0503 10.0503\n -6.20878 -6.20878\n -5.70531 -5.70531\n -0.265037 -0.265037\n 4.91217 4.91217\n -9.85839 -9.85839\n 9.14639 9.14639\n 0.78426 0.78426\n -6.03581 -6.03581\n -1.225 -1.225\n -1.82514 -1.82514\n -4.38257 -4.38257\n -4.14898 -4.14898\n 1.30056 1.30056\n -4.04361 -4.04361\n -10.7862 -10.7862\n -1.71033 -1.71033\n -5.3235 -5.3235\n -5.05158 -5.05158\n 2.03088 2.03088\n -4.639 -4.639\n -8.90379 -8.90379\n -1.46286 -1.46286\n 4.78737 4.78737\n 2.84292 2.84292\n -4.60125 -4.60125\n -0.454598 -0.454598\n -3.54703 -3.54703\n -3.15574 -3.15574\n -5.66794 -5.66794\n -0.499733 -0.499733\n 4.80394 4.80394\n 7.0018 7.0018\n -12.2494 -12.2494\n -0.705371 -0.705371\n 0.0740021 0.0740021\n -2.66987 -2.66987\n 2.48263 2.48263\n -9.06332 -9.06332\n -1.01261 -1.01261\n 3.84118 3.84118\n 4.21216 4.21216\n -1.18673 -1.18673\n -11.0005 -11.0005\n 
-9.71638 -9.71638\n 1.76212 1.76212\n -2.83766 -2.83766\n -9.13768 -9.13768\n -1.05015 -1.05015\n 2.53008 2.53008\n 0.379504 0.379504\n 5.28803 5.28803\n -6.17221 -6.17221\n 5.75619 5.75619\n 2.3737 2.3737\n -9.0974 -9.0974\n -7.85433 -7.85433\n -10.9094 -10.9094\n 1.20756 1.20756\n 2.61486 2.61486\n 1.23359 1.23359\n 43.6151 43.6151\n -1.72859 -1.72859\n -0.965831 -0.965831\n -0.482239 -0.482239\n -1.82159 -1.82159\n 1.661 1.661\n 1.93636 1.93636\n -11.9999 -11.9999\n 0.104367 0.104367\n -1.70555 -1.70555\n -9.81074 -9.81074\n 12.7941 12.7941\n -3.36221 -3.36221\n -6.06523 -6.06523\n 0.47411 0.47411\n -6.64475 -6.64475\n -0.763006 -0.763006\n -3.9763 -3.9763\n -2.86732 -2.86732\n -20.6937 -20.6937\n 1.84418 1.84418\n 5.65243 5.65243\n 10.7255 10.7255\n -1.21293 -1.21293\n 3.15057 3.15057\n 8.96094 8.96094\n -0.205015 -0.205015\n 8.44579 8.44579\n 2.01362 2.01362\n 2.36648 2.36648\n 11.6752 11.6752\n 2.19072 2.19072\n -13.9182 -13.9182\n 3.3257 3.3257\n -6.60627 -6.60627\n 1.62083 1.62083\n -2.00847 -2.00847\n 11.6978 11.6978\n 5.93254 5.93254\n 4.93134 4.93134\n -2.50847 -2.50847\n -5.92846 -5.92846\n 1.16717 1.16717\n 6.9673 6.9673\n -1.21182 -1.21182\n 7.25413 7.25413\n -4.24031 -4.24031\n -3.12368 -3.12368\n 1.73734 1.73734\n -2.6551 -2.6551\n 5.01063 5.01063\n 10.9923 10.9923\n 3.08502 3.08502\n -1.67866 -1.67866\n 10.7003 10.7003\n -0.982895 -0.982895\n 1.97681 1.97681\n -1.29045 -1.29045\n 1.64227 1.64227\n 3.21157 3.21157\n -4.63376 -4.63376\n 4.47725 4.47725\n 7.77208 7.77208\n 0.332548 0.332548\n 2.82084 2.82084\n 0.958649 0.958649\n 1.21302 1.21302\n -3.16936 -3.16936\n 0.0672417 0.0672417\n 0.563038 0.563038\n -1.87542 -1.87542\n -3.01753 -3.01753\n 2.73107 2.73107\n -3.68276 -3.68276\n 4.64376 4.64376\n -12.4341 -12.4341\n 4.43429 4.43429\n 5.72878 5.72878\n 2.39332 2.39332\n 1.91106 1.91106\n 2.50458 2.50458\n 0.942479 0.942479\n -0.489758 -0.489758\n 0.311101 0.311101\n -2.74953 -2.74953\n 4.95959 4.95959\n 1.26862 1.26862\n 10.3622 10.3622\n 3.61213 
⋮ [remaining rows of the 4096×2 Matrix{Float64} output truncated]","category":"page"},{"location":"examples/working_with_ollama/#Using-postprocessing-function","page":"Local models with Ollama.ai","title":"Using postprocessing function","text":"","category":"section"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Add normalization as a postprocessing function to normalize embeddings on reception (for easy cosine similarity 
later)","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"using LinearAlgebra\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema,\n [\"embed me\", \"and me too\"],\n LinearAlgebra.normalize;\n model = \"openhermes2.5-mistral\")","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"DataMessage(Matrix{Float64} of size (4096, 2))","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"Cosine similarity is then a simple multiplication","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"msg.content' * msg.content[:, 1]","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"2-element Vector{Float64}:\n 0.9999999999999946\n 0.34130017815042357","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"","category":"page"},{"location":"examples/working_with_ollama/","page":"Local models with Ollama.ai","title":"Local models with Ollama.ai","text":"This page was generated using Literate.jl.","category":"page"},{"location":"frequently_asked_questions/#Frequently-Asked-Questions","page":"F.A.Q.","title":"Frequently Asked Questions","text":"","category":"section"},{"location":"frequently_asked_questions/#Why-OpenAI","page":"F.A.Q.","title":"Why OpenAI","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI's models are at the forefront of AI research and provide robust, state-of-the-art capabilities for many 
tasks.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"There will be situations where you cannot or do not want to use it (eg, privacy, cost, etc.). In that case, you can use local models (eg, Ollama) or other APIs (eg, Anthropic).","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Note: To get started with Ollama.ai, see the Setup Guide for Ollama section below.","category":"page"},{"location":"frequently_asked_questions/#Data-Privacy-and-OpenAI","page":"F.A.Q.","title":"Data Privacy and OpenAI","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"At the time of writing, OpenAI does NOT use the API calls for training their models.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI does not use data submitted to and generated by our API to train OpenAI models or improve OpenAI’s service offering. In order to support the continuous improvement of our models, you can fill out this form to opt-in to share your data with us. 
– How your data is used to improve our models","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"You can always double-check the latest information on OpenAI's How we use your data page.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI's How we use your data\nData usage for consumer services FAQ\nHow your data is used to improve our models","category":"page"},{"location":"frequently_asked_questions/#Creating-OpenAI-API-Key","page":"F.A.Q.","title":"Creating OpenAI API Key","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"You can get your API key from OpenAI by signing up for an account and accessing the API section of the OpenAI website.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Create an account with OpenAI\nGo to API Key page\nClick on “Create new secret key”","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Warning: 
Do not share it with anyone and do NOT save it to any files that get synced online.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI Documentation\nVisual tutorial","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Pro tip: Always set the spending limits!","category":"page"},{"location":"frequently_asked_questions/#Setting-OpenAI-Spending-Limits","page":"F.A.Q.","title":"Setting OpenAI Spending Limits","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI allows you to set spending limits directly on your account dashboard to prevent unexpected costs.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Go to OpenAI Billing\nSet Soft Limit (you’ll receive a notification) and Hard Limit (the API will stop working so you don't spend more money)","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"A good start might be a soft limit of ~$5 and a hard limit of ~$10 - you can always increase it later in the month.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI Forum","category":"page"},{"location":"frequently_asked_questions/#How-much-does-it-cost?-Is-it-worth-paying-for?","page":"F.A.Q.","title":"How much does it cost? Is it worth paying for?","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"If you use a local model (eg, with Ollama), it's free. 
If you use any commercial APIs (eg, OpenAI), you will likely pay per \"token\" (a sub-word unit).","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"For example, a simple request with a simple question and 1 sentence response in return (“Is statement XYZ a positive comment?”) will cost you ~$0.0001 (ie, one hundredth of a cent).","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Is it worth paying for?","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"GenAI is a way to buy time! You can pay cents to save tens of minutes every day.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Continuing the example above, imagine you have a table with 200 comments. Now, you can parse each one of them with an LLM for the features/checks you need. Assuming the price per call was $0.0001, you'd pay 2 cents for the job and save 30-60 minutes of your time!","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI Pricing per 1000 tokens","category":"page"},{"location":"frequently_asked_questions/#Configuring-the-Environment-Variable-for-API-Key","page":"F.A.Q.","title":"Configuring the Environment Variable for API Key","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"To use the OpenAI API with PromptingTools.jl, set your API key as an environment variable:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"ENV[\"OPENAI_API_KEY\"] = \"your-api-key\"","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"As a one-off, you can: 
","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"set it in the terminal before launching Julia: export OPENAI_API_KEY=<your-api-key>\nset it in your setup.jl (make sure not to commit it to GitHub!)","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Make sure to start Julia from the same terminal window where you set the variable. An easy check: in Julia, run ENV[\"OPENAI_API_KEY\"] and you should see your key!","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"A better way:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"On a Mac, add the configuration line to your terminal's configuration file (eg, ~/.zshrc). It will get automatically loaded every time you launch the terminal\nOn Windows, set it as a system variable in \"Environment Variables\" settings (see the Resources)","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Resources: ","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"OpenAI Guide","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Note: In the future, we hope to add a Preferences.jl-based workflow to set the API key and other preferences.","category":"page"},{"location":"frequently_asked_questions/#Understanding-the-API-Keyword-Arguments-in-aigenerate-(api_kwargs)","page":"F.A.Q.","title":"Understanding the API Keyword Arguments in aigenerate (api_kwargs)","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"See OpenAI API reference for more information.","category":"page"},{"location":"frequently_asked_questions/#Instant-Access-from-Anywhere","page":"F.A.Q.","title":"Instant Access from 
Anywhere","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"For easy access from anywhere, add PromptingTools into your startup.jl (can be found in ~/.julia/config/startup.jl).","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Add the following snippet:","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"using PromptingTools\nconst PT = PromptingTools # to access unexported functions and types","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Now, you can just use ai\"Help me do X to achieve Y\" from any REPL session!","category":"page"},{"location":"frequently_asked_questions/#Open-Source-Alternatives","page":"F.A.Q.","title":"Open Source Alternatives","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"The ethos of PromptingTools.jl is to allow you to use whatever model you want, which includes Open Source LLMs. The most popular and easiest to set up is Ollama.ai - see below for more information.","category":"page"},{"location":"frequently_asked_questions/#Setup-Guide-for-Ollama","page":"F.A.Q.","title":"Setup Guide for Ollama","text":"","category":"section"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Ollama runs a background service hosting LLMs that you can access via a simple API. 
It's especially useful when you're working with sensitive data that should not be sent anywhere.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Installation is very easy: just download the latest version here.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Once you've installed it, just launch the app and you're ready to go!","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"To check if it's running, go to your browser and open 127.0.0.1:11434. You should see the message \"Ollama is running\". Alternatively, you can run ollama serve in your terminal and, if the service is already running, you'll get a message saying so.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"There are many models available in Ollama Library, including Llama2, CodeLlama, SQLCoder, and my personal favorite openhermes2.5-mistral.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Download new models with ollama pull (eg, ollama pull openhermes2.5-mistral). 
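Once the service is running and a model is pulled, you can smoke-test the setup from Julia. A minimal sketch (assuming Ollama is serving on its default local endpoint and that openhermes2.5-mistral has been pulled; any pulled model name works):

```julia
using PromptingTools
const PT = PromptingTools

# Assumes the Ollama service is running on the default 127.0.0.1:11434
schema = PT.OllamaManagedSchema()

# No API key is needed for local models
msg = aigenerate(schema, "Say hi in one sentence."; model = "openhermes2.5-mistral")
println(msg.content)
```

If the call errors, check that ollama serve is running and that the model name matches an entry in ollama list.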
","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"Show currently available models with ollama list.","category":"page"},{"location":"frequently_asked_questions/","page":"F.A.Q.","title":"F.A.Q.","text":"See Ollama.ai for more information.","category":"page"},{"location":"reference/#Reference","page":"Reference","title":"Reference","text":"","category":"section"},{"location":"reference/","page":"Reference","title":"Reference","text":"","category":"page"},{"location":"reference/","page":"Reference","title":"Reference","text":"Modules = [PromptingTools]","category":"page"},{"location":"reference/#PromptingTools.RESERVED_KWARGS","page":"Reference","title":"PromptingTools.RESERVED_KWARGS","text":"The following keywords are reserved for internal use in the ai* functions and cannot be used as placeholders in the Messages\n\n\n\n\n\n","category":"constant"},{"location":"reference/#PromptingTools.AICode","page":"Reference","title":"PromptingTools.AICode","text":"AICode(code::AbstractString; safe_eval::Bool=false, prefix::AbstractString=\"\", suffix::AbstractString=\"\")\n\nA mutable structure representing a code block (received from the AI model) with automatic parsing, execution, and output/error capturing capabilities.\n\nUpon instantiation with a string, the AICode object automatically runs a code parser and executor (via PromptingTools.eval!()), capturing any standard output (stdout) or errors. 
This structure is useful for programmatically handling and evaluating Julia code snippets.\n\nSee also: PromptingTools.extract_code_blocks, PromptingTools.eval!\n\nWorkflow\n\nUntil cb::AICode has been evaluated, cb.success is set to nothing (and so are all other fields).\nThe text in cb.code is parsed (saved to cb.expression).\nThe parsed expression is evaluated.\nOutputs of the evaluated expression are captured in cb.output.\nAny stdout outputs (e.g., from println) are captured in cb.stdout.\nIf an error occurs during evaluation, it is saved in cb.error.\nAfter successful evaluation without errors, cb.success is set to true. Otherwise, it is set to false and you can inspect the cb.error to understand why.\n\nProperties\n\ncode::AbstractString: The raw string of the code to be parsed and executed.\nexpression: The parsed Julia expression (set after parsing code).\nstdout: Captured standard output from the execution of the code.\noutput: The result of evaluating the code block.\nsuccess::Union{Nothing, Bool}: Indicates whether the code block executed successfully (true), unsuccessfully (false), or has yet to be evaluated (nothing).\nerror::Union{Nothing, Exception}: Any exception raised during the execution of the code block.\n\nKeyword Arguments\n\nsafe_eval::Bool: If set to true, the code block checks for package operations (e.g., installing new packages) and missing imports, and then evaluates the code inside a bespoke scratch module. This is to ensure that the evaluation does not alter any user-defined variables or the global state. Defaults to false.\nprefix::AbstractString: A string to be prepended to the code block before parsing and evaluation. Useful to add some additional code definition or necessary imports. Defaults to an empty string.\nsuffix::AbstractString: A string to be appended to the code block before parsing and evaluation. Useful to check that tests pass or that an example executes. 
Defaults to an empty string.\n\nMethods\n\nBase.isvalid(cb::AICode): Check if the code block has executed successfully. Returns true if cb.success == true.\n\nExamples\n\ncode = AICode(\"println(\\\"Hello, World!\\\")\") # Auto-parses and evaluates the code, capturing output and errors.\nisvalid(code) # Output: true\ncode.stdout # Output: \"Hello, World!\n\"\n\nWe try to evaluate \"safely\" by default (eg, inside a custom module, to avoid changing user variables). You can avoid that with safe_eval=false:\n\ncode = AICode(\"new_variable = 1\"; safe_eval=false)\nisvalid(code) # Output: true\nnew_variable # Output: 1\n\nYou can also call AICode directly on an AIMessage, which will extract the Julia code blocks, concatenate them and evaluate them:\n\nmsg = aigenerate(\"In Julia, how do you create a vector of 10 random numbers?\")\ncode = AICode(msg)\n# Output: AICode(Success: True, Parsed: True, Evaluated: True, Error Caught: N/A, StdOut: True, Code: 2 Lines)\n\n# show the code\ncode.code |> println\n# Output: \n# numbers = rand(10)\n# numbers = rand(1:100, 10)\n\n# or copy it to the clipboard\ncode.code |> clipboard\n\n# or execute it in the current module (=Main)\neval(code.expression)\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.AITemplate","page":"Reference","title":"PromptingTools.AITemplate","text":"AITemplate\n\nAITemplate is a template for a conversation prompt. This type is merely a container for the template name, which is resolved into a set of messages (=prompt) by render.\n\nNaming Convention\n\nTemplate names should be in CamelCase\nFollow the format ...... 
where possible, eg, JudgeIsItTrue, ``\nStarting with the Persona (=System prompt), eg, Judge = persona is meant to judge some provided information\nVariable to be filled in with context, eg, It = placeholder it\nEnding with the variable name is helpful, eg, JuliaExpertTask for a persona to be an expert in Julia language and task is the placeholder name\nIdeally, the template name should be self-explanatory, eg, JudgeIsItTrue = persona is meant to judge some provided information where it is true or false\n\nExamples\n\nSave time by re-using pre-made templates, just fill in the placeholders with the keyword arguments:\n\nmsg = aigenerate(:JuliaExpertAsk; ask = \"How do I add packages?\")\n\nThe above is equivalent to a more verbose version that explicitly uses the dispatch on AITemplate:\n\nmsg = aigenerate(AITemplate(:JuliaExpertAsk); ask = \"How do I add packages?\")\n\nFind available templates with aitemplates:\n\ntmps = aitemplates(\"JuliaExpertAsk\")\n# Will surface one specific template\n# 1-element Vector{AITemplateMetadata}:\n# PromptingTools.AITemplateMetadata\n# name: Symbol JuliaExpertAsk\n# description: String \"For asking questions about Julia language. Placeholders: `ask`\"\n# version: String \"1\"\n# wordcount: Int64 237\n# variables: Array{Symbol}((1,))\n# system_preview: String \"You are a world-class Julia language programmer with the knowledge of the latest syntax. Your commun\"\n# user_preview: String \"# Question\n\n{{ask}}\"\n# source: String \"\"\n\nThe above gives you a good idea of what the template is about, what placeholders are available, and how much it would cost to use it (=wordcount).\n\nSearch for all Julia-related templates:\n\ntmps = aitemplates(\"Julia\")\n# 2-element Vector{AITemplateMetadata}... -> more to come later!\n\nIf you are on VSCode, you can leverage nice tabular display with vscodedisplay:\n\nusing DataFrames\ntmps = aitemplates(\"Julia\") |> DataFrame |> vscodedisplay\n\nI have my selected template, how do I use it? 
Just use the \"name\" in aigenerate or aiclassify like you see in the first example!\n\nYou can inspect any template by \"rendering\" it (this is what the LLM will see):\n\njulia> AITemplate(:JudgeIsItTrue) |> PromptingTools.render\n\nSee also: save_template, load_template, load_templates! for more advanced use cases (and the corresponding script in examples/ folder)\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.AITemplateMetadata","page":"Reference","title":"PromptingTools.AITemplateMetadata","text":"Helper for easy searching and reviewing of templates. Defined on loading of each template.\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.AbstractPromptSchema","page":"Reference","title":"PromptingTools.AbstractPromptSchema","text":"Defines different prompting styles based on the model training and fine-tuning.\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.ChatMLSchema","page":"Reference","title":"PromptingTools.ChatMLSchema","text":"ChatMLSchema is used by many open-source chatbots, by OpenAI models (under the hood) and by several models and interfaces (eg, Ollama, vLLM)\n\nYou can explore it on tiktokenizer\n\nIt uses the following conversation structure:\n\nsystem\n...\n<|im_start|>user\n...<|im_end|>\n<|im_start|>assistant\n...<|im_end|>\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.MaybeExtract","page":"Reference","title":"PromptingTools.MaybeExtract","text":"Extract a result from the provided data, if any, otherwise set the error and message fields.\n\nArguments\n\nerror::Bool: true if no result is found, false otherwise.\nmessage::String: Only present if no result is found, should be short and concise.\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.NoSchema","page":"Reference","title":"PromptingTools.NoSchema","text":"Schema that keeps messages (<:AbstractMessage) and does not transform for any specific model. 
It is used by the first pass of the prompt rendering system (see ?render).\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.OllamaManagedSchema","page":"Reference","title":"PromptingTools.OllamaManagedSchema","text":"Ollama by default manages different models and their associated prompt schemas when you pass system_prompt and prompt fields to the API.\n\nWarning: It works only for 1 system message and 1 user message, so anything more than that has to be rejected.\n\nIf you need to pass more messages / longer conversational history, you can define the model-specific schema directly and pass your Ollama requests with raw=true, which disables any templating and schema management by Ollama.\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.OpenAISchema","page":"Reference","title":"PromptingTools.OpenAISchema","text":"OpenAISchema is the default schema for OpenAI models.\n\nIt uses the following conversation template:\n\n[Dict(role=\"system\",content=\"...\"),Dict(role=\"user\",content=\"...\"),Dict(role=\"assistant\",content=\"...\")]\n\nIt's recommended to separate sections in your prompt with markdown headers (e.g. `##Answer\n\n`).\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.TestEchoOllamaManagedSchema","page":"Reference","title":"PromptingTools.TestEchoOllamaManagedSchema","text":"Echoes the user's input back to them. Used for testing the implementation\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.TestEchoOpenAISchema","page":"Reference","title":"PromptingTools.TestEchoOpenAISchema","text":"Echoes the user's input back to them. Used for testing the implementation\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.UserMessageWithImages-Tuple{AbstractString}","page":"Reference","title":"PromptingTools.UserMessageWithImages","text":"Construct UserMessageWithImages with 1 or more images. 
Images can be either URLs or local paths.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.X123","page":"Reference","title":"PromptingTools.X123","text":"With docstring\n\n\n\n\n\n","category":"type"},{"location":"reference/#PromptingTools.aiclassify-Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aiclassify","text":"aiclassify(prompt_schema::AbstractOpenAISchema, prompt::ALLOWED_PROMPT_TYPE;\napi_kwargs::NamedTuple = (logit_bias = Dict(837 => 100, 905 => 100, 9987 => 100),\n max_tokens = 1, temperature = 0),\nkwargs...)\n\nClassifies the given prompt/statement as true/false/unknown.\n\nNote: this is a very simple classifier; it is not meant to be used in production. Credit goes to AAAzzam.\n\nIt uses the logit bias trick and limits the output to 1 token to force the model to output only true/false/unknown.\n\nOutput tokens used (via api_kwargs):\n\n837: ' true'\n905: ' false'\n9987: ' unknown'\n\nArguments\n\nprompt_schema::AbstractOpenAISchema: The schema for the prompt.\nprompt: The prompt/statement to classify if it's a String. If it's a Symbol, it is expanded as a template via render(schema,template).\n\nExample\n\naiclassify(\"Is two plus two four?\") # true\naiclassify(\"Is two plus three a vegetable on Mars?\") # false\n\naiclassify returns only true/false/unknown. It's easy to get the proper Bool output type out with tryparse, eg,\n\ntryparse(Bool, aiclassify(\"Is two plus two four?\")) isa Bool # true\n\nOutput of type Nothing marks that the model couldn't classify the statement as true/false.\n\nIdeally, we would like to re-use some helpful system prompt to get more accurate responses. For this reason we have templates, eg, :JudgeIsItTrue. 
By specifying the template, we can provide our statement as the expected variable (it in this case). See that the model now correctly classifies the statement as \"unknown\".\n\naiclassify(:JudgeIsItTrue; it = \"Is two plus three a vegetable on Mars?\") # unknown\n\nFor better results, use higher quality models like gpt4, eg, \n\naiclassify(:JudgeIsItTrue;\n it = \"If I had two apples and I got three more, I have five apples now.\",\n model = \"gpt4\") # true\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aiembed-Union{Tuple{F}, Tuple{PromptingTools.AbstractOllamaManagedSchema, AbstractString}, Tuple{PromptingTools.AbstractOllamaManagedSchema, AbstractString, F}} where F<:Function","page":"Reference","title":"PromptingTools.aiembed","text":"aiembed(prompt_schema::AbstractOllamaManagedSchema,\n doc_or_docs::Union{AbstractString, Vector{<:AbstractString}},\n postprocess::F = identity;\n verbose::Bool = true,\n api_key::String = API_KEY,\n model::String = MODEL_EMBEDDING,\n http_kwargs::NamedTuple = (retry_non_idempotent = true,\n retries = 5,\n readtimeout = 120),\n api_kwargs::NamedTuple = NamedTuple(),\n kwargs...) where {F <: Function}\n\nThe aiembed function generates embeddings for the given input using a specified model and returns a message object containing the embeddings, status, token count, and elapsed time.\n\nArguments\n\nprompt_schema::AbstractOllamaManagedSchema: The schema for the prompt.\ndoc_or_docs::Union{AbstractString, Vector{<:AbstractString}}: The document or list of documents to generate embeddings for. The list of documents is processed sequentially, so users should consider implementing an async version with Threads.@spawn\npostprocess::F: The post-processing function to apply to each embedding. Defaults to the identity function, but could be LinearAlgebra.normalize.\nverbose::Bool: A flag indicating whether to print verbose information. Defaults to true.\napi_key::String: The API key to use for the OpenAI API. 
Defaults to API_KEY.\nmodel::String: The model to use for generating embeddings. Defaults to MODEL_EMBEDDING.\nhttp_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. Defaults to empty NamedTuple.\napi_kwargs::NamedTuple: Additional keyword arguments for the Ollama API. Defaults to an empty NamedTuple.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nmsg: A DataMessage object containing the embeddings, status, token count, and elapsed time.\n\nNote: Ollama API currently does not return the token count, so it's set to (0,0)\n\nExample\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema, \"Hello World\"; model=\"openhermes2.5-mistral\")\nmsg.content # 4096-element JSON3.Array{Float64...\n\nWe can embed multiple strings at once and they will be hcat into a matrix (ie, each column corresponds to one string)\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema, [\"Hello World\", \"How are you?\"]; model=\"openhermes2.5-mistral\")\nmsg.content # 4096×2 Matrix{Float64}:\n\nIf you plan to calculate the cosine distance between embeddings, you can normalize them first:\n\nconst PT = PromptingTools\nusing LinearAlgebra\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema, [\"embed me\", \"and me too\"], LinearAlgebra.normalize; model=\"openhermes2.5-mistral\")\n\n# calculate cosine distance between the two normalized embeddings as a simple dot product\nmsg.content' * msg.content[:, 1] # [1.0, 0.34]\n\nSimilarly, you can use the postprocess argument to materialize the data from JSON3.Object by using postprocess = copy\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\n\nmsg = aiembed(schema, \"Hello World\", copy; model=\"openhermes2.5-mistral\")\nmsg.content # 4096-element Vector{Float64}\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aiembed-Union{Tuple{F}, Tuple{PromptingTools.AbstractOpenAISchema, 
Union{AbstractString, Vector{<:AbstractString}}}, Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, Vector{<:AbstractString}}, F}} where F<:Function","page":"Reference","title":"PromptingTools.aiembed","text":"aiembed(prompt_schema::AbstractOpenAISchema,\n doc_or_docs::Union{AbstractString, Vector{<:AbstractString}},\n postprocess::F = identity;\n verbose::Bool = true,\n api_key::String = API_KEY,\n model::String = MODEL_EMBEDDING, \n http_kwargs::NamedTuple = (retry_non_idempotent = true,\n retries = 5,\n readtimeout = 120),\n api_kwargs::NamedTuple = NamedTuple(),\n kwargs...) where {F <: Function}\n\nThe aiembed function generates embeddings for the given input using a specified model and returns a message object containing the embeddings, status, token count, and elapsed time.\n\nArguments\n\nprompt_schema::AbstractOpenAISchema: The schema for the prompt.\ndoc_or_docs::Union{AbstractString, Vector{<:AbstractString}}: The document or list of documents to generate embeddings for.\npostprocess::F: The post-processing function to apply to each embedding. Defaults to the identity function.\nverbose::Bool: A flag indicating whether to print verbose information. Defaults to true.\napi_key::String: The API key to use for the OpenAI API. Defaults to API_KEY.\nmodel::String: The model to use for generating embeddings. Defaults to MODEL_EMBEDDING.\nhttp_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. Defaults to (retry_non_idempotent = true, retries = 5, readtimeout = 120).\napi_kwargs::NamedTuple: Additional keyword arguments for the OpenAI API. Defaults to an empty NamedTuple.\nkwargs...: Additional keyword arguments.\n\nReturns\n\nmsg: A DataMessage object containing the embeddings, status, token count, and elapsed time. 
Use msg.content to access the embeddings.\n\nExample\n\nmsg = aiembed(\"Hello World\")\nmsg.content # 1536-element JSON3.Array{Float64...\n\nWe can embed multiple strings at once and they will be hcat into a matrix (ie, each column corresponds to one string)\n\nmsg = aiembed([\"Hello World\", \"How are you?\"])\nmsg.content # 1536×2 Matrix{Float64}:\n\nIf you plan to calculate the cosine distance between embeddings, you can normalize them first:\n\nusing LinearAlgebra\nmsg = aiembed([\"embed me\", \"and me too\"], LinearAlgebra.normalize)\n\n# calculate cosine distance between the two normalized embeddings as a simple dot product\nmsg.content' * msg.content[:, 1] # [1.0, 0.787]\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aiextract-Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aiextract","text":"aiextract([prompt_schema::AbstractOpenAISchema,] prompt::ALLOWED_PROMPT_TYPE; \nreturn_type::Type,\nverbose::Bool = true,\n model::String = MODEL_CHAT,\n return_all::Bool = false, dry_run::Bool = false, \n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n http_kwargs::NamedTuple = (;\n retry_non_idempotent = true,\n retries = 5,\n readtimeout = 120), api_kwargs::NamedTuple = NamedTuple(),\n kwargs...)\n\nExtract required information (defined by a struct return_type) from the provided prompt by leveraging OpenAI function calling mode.\n\nThis is a perfect solution for extracting structured information from text (eg, extract organization names in news articles, etc.)\n\nIt's effectively a light wrapper around aigenerate call, which requires additional keyword argument return_type to be provided and will enforce the model outputs to adhere to it.\n\nArguments\n\nprompt_schema: An optional object to specify which prompt template should be applied (Default to PROMPT_SCHEMA = 
OpenAISchema)\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate\nreturn_type: A struct TYPE representing the information we want to extract. Do not provide a struct instance, only the type. If the struct has a docstring, it will be provided to the model as well. It's used to enforce structured model outputs or provide more information.\nverbose: A boolean indicating whether to print additional information.\napi_key: A string representing the API key for accessing the OpenAI API.\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).\nconversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.\nhttp_kwargs: A named tuple of HTTP keyword arguments.\napi_kwargs: A named tuple of API keyword arguments.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nIf return_all=false (default):\n\nmsg: A DataMessage object representing the extracted data, including the content, status, tokens, and elapsed time. Use msg.content to access the extracted data.\n\nIf return_all=true:\n\nconversation: A vector of AbstractMessage objects representing the full conversation history, including the response from the AI model (DataMessage).\n\nSee also: function_call_signature, MaybeExtract, aigenerate\n\nExample\n\nDo you want to extract some specific measurements from a text like age, weight and height? 
You need to define the information you need as a struct (return_type):\n\n\"Person's age, height, and weight.\"\nstruct MyMeasurement\n age::Int # required\n height::Union{Int,Nothing} # optional\n weight::Union{Nothing,Float64} # optional\nend\nmsg = aiextract(\"James is 30, weighs 80kg. He's 180cm tall.\"; return_type=MyMeasurement)\n# [ Info: Tokens: 129 @ Cost: $0.0002 in 1.0 seconds\n# PromptingTools.DataMessage(MyMeasurement)\nmsg.content\n# MyMeasurement(30, 180, 80.0)\n\nThe fields that allow Nothing are marked as optional in the schema:\n\nmsg = aiextract(\"James is 30.\"; return_type=MyMeasurement)\n# MyMeasurement(30, nothing, nothing)\n\nIf there are multiple items you want to extract, define a wrapper struct to get a Vector of MyMeasurement:\n\nstruct MyMeasurementWrapper\n measurements::Vector{MyMeasurement}\nend\n\nmsg = aiextract(\"James is 30, weighs 80kg. He's 180cm tall. Then Jack is 19 but really tall - over 190!\"; return_type=MyMeasurementWrapper)\n\nmsg.content.measurements\n# 2-element Vector{MyMeasurement}:\n# MyMeasurement(30, 180, 80.0)\n# MyMeasurement(19, 190, nothing)\n\nOr if you want your extraction to fail gracefully when data isn't found, use MaybeExtract{T} wrapper (this trick is inspired by the Instructor package!):\n\nusing PromptingTools: MaybeExtract\n\ntype = MaybeExtract{MyMeasurement}\n# Effectively the same as:\n# struct MaybeExtract{T}\n# result::Union{T, Nothing} // The result of the extraction\n# error::Bool // true if no result is found, false otherwise\n# message::Union{Nothing, String} // Only present if no result is found, should be short and concise\n# end\n\n# If LLM extraction fails, the `error` and `message` fields will be populated instead of the result!\nmsg = aiextract(\"Extract measurements from the text: I am giraffe\"; return_type=type)\nmsg.content\n# MaybeExtract{MyMeasurement}(nothing, true, \"I'm sorry, but I can only assist with human measurements.\")\n\nThat way, you can handle the error gracefully and get a 
reason why extraction failed (in msg.content.message).\n\nNote that the error message refers to a giraffe not being a human, because in our MyMeasurement docstring, we said that it's for people!\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aigenerate-Tuple{PromptingTools.AbstractOllamaManagedSchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aigenerate","text":"aigenerate(prompt_schema::AbstractOllamaManagedSchema, prompt::ALLOWED_PROMPT_TYPE; verbose::Bool = true,\n model::String = MODEL_CHAT,\n return_all::Bool = false, dry_run::Bool = false,\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n http_kwargs::NamedTuple = NamedTuple(), api_kwargs::NamedTuple = NamedTuple(),\n kwargs...)\n\nGenerate an AI response based on a given prompt using the OpenAI API.\n\nArguments\n\nprompt_schema: An optional object to specify which prompt template should be applied (Default to PROMPT_SCHEMA = OpenAISchema not AbstractManagedSchema)\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate\nverbose: A boolean indicating whether to print additional information.\napi_key: Provided for interface consistency. Not needed for locally hosted Ollama.\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).\nconversation::AbstractVector{<:AbstractMessage}=[]: Not allowed for this schema. Provided only for compatibility.\nhttp_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. 
Defaults to empty NamedTuple.\napi_kwargs::NamedTuple: Additional keyword arguments for the Ollama API. Defaults to an empty NamedTuple.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nmsg: An AIMessage object representing the generated AI message, including the content, status, tokens, and elapsed time.\n\nUse msg.content to access the extracted string.\n\nSee also: ai_str, aai_str, aiembed\n\nExample\n\nSimple hello world to test the API:\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema() # We need to be explicit if we want Ollama, OpenAISchema is the default\n\nmsg = aigenerate(schema, \"Say hi!\"; model=\"openhermes2.5-mistral\")\n# [ Info: Tokens: 69 in 0.9 seconds\n# AIMessage(\"Hello! How can I assist you today?\")\n\nmsg is an AIMessage object. Access the generated string via content property:\n\ntypeof(msg) # AIMessage{SubString{String}}\npropertynames(msg) # (:content, :status, :tokens, :elapsed)\nmsg.content # \"Hello! How can I assist you today?\"\n\nNote: We need to be explicit about the schema we want to use. If we don't, it will default to OpenAISchema (=PT.DEFAULT_SCHEMA) ___ You can use string interpolation:\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\na = 1\nmsg=aigenerate(schema, \"What is `$a+$a`?\"; model=\"openhermes2.5-mistral\")\nmsg.content # \"The result of `1+1` is `2`.\"\n\n___ You can provide the whole conversation or more intricate prompts as a Vector{AbstractMessage}:\n\nconst PT = PromptingTools\nschema = PT.OllamaManagedSchema()\n\nconversation = [\n PT.SystemMessage(\"You're master Yoda from Star Wars trying to help the user become a Jedi.\"),\n PT.UserMessage(\"I have feelings for my iPhone. What should I do?\")]\n\nmsg = aigenerate(schema, conversation; model=\"openhermes2.5-mistral\")\n# [ Info: Tokens: 111 in 2.1 seconds\n# AIMessage(\"Strong the attachment is, it leads to suffering it may. 
Focus on the force within you must, ...\")\n\nNote: Managed Ollama currently supports at most 1 User Message and 1 System Message given the API limitations. If you want more, you need to use the ChatMLSchema.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aigenerate-Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aigenerate","text":"aigenerate(prompt_schema::AbstractOpenAISchema, prompt::ALLOWED_PROMPT_TYPE;\n verbose::Bool = true,\n api_key::String = API_KEY,\n model::String = MODEL_CHAT, return_all::Bool = false, dry_run::Bool = false,\n http_kwargs::NamedTuple = (retry_non_idempotent = true,\n retries = 5,\n readtimeout = 120), api_kwargs::NamedTuple = NamedTuple(),\n kwargs...)\n\nGenerate an AI response based on a given prompt using the OpenAI API.\n\nArguments\n\nprompt_schema: An optional object to specify which prompt template should be applied (Default to PROMPT_SCHEMA = OpenAISchema)\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate\nverbose: A boolean indicating whether to print additional information.\napi_key: A string representing the API key for accessing the OpenAI API.\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).\nconversation: An optional vector of AbstractMessage objects representing the conversation history. 
If not provided, it is initialized as an empty vector.\nhttp_kwargs: A named tuple of HTTP keyword arguments.\napi_kwargs: A named tuple of API keyword arguments.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nIf return_all=false (default):\n\nmsg: An AIMessage object representing the generated AI message, including the content, status, tokens, and elapsed time.\n\nUse msg.content to access the extracted string.\n\nIf return_all=true:\n\nconversation: A vector of AbstractMessage objects representing the conversation history, including the response from the AI model (AIMessage).\n\nSee also: ai_str, aai_str, aiembed, aiclassify, aiextract, aiscan, aitemplates\n\nExample\n\nSimple hello world to test the API:\n\nresult = aigenerate(\"Say Hi!\")\n# [ Info: Tokens: 29 @ Cost: $0.0 in 1.0 seconds\n# AIMessage(\"Hello! How can I assist you today?\")\n\nresult is an AIMessage object. Access the generated string via content property:\n\ntypeof(result) # AIMessage{SubString{String}}\npropertynames(result) # (:content, :status, :tokens, :elapsed)\nresult.content # \"Hello! How can I assist you today?\"\n\n___ You can use string interpolation:\n\na = 1\nmsg=aigenerate(\"What is `$a+$a`?\")\nmsg.content # \"The sum of `1+1` is `2`.\"\n\n___ You can provide the whole conversation or more intricate prompts as a Vector{AbstractMessage}:\n\nconst PT = PromptingTools\n\nconversation = [\n PT.SystemMessage(\"You're master Yoda from Star Wars trying to help the user become a Jedi.\"),\n PT.UserMessage(\"I have feelings for my iPhone. What should I do?\")]\nmsg=aigenerate(conversation)\n# AIMessage(\"Ah, strong feelings you have for your iPhone. A Jedi's path, this is not... 
\")\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aiscan-Tuple{PromptingTools.AbstractOpenAISchema, Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}}","page":"Reference","title":"PromptingTools.aiscan","text":"aiscan([promptschema::AbstractOpenAISchema,] prompt::ALLOWEDPROMPTTYPE; imageurl::Union{Nothing, AbstractString, Vector{<:AbstractString}} = nothing, imagepath::Union{Nothing, AbstractString, Vector{<:AbstractString}} = nothing, imagedetail::AbstractString = \"auto\", attachtolatest::Bool = true, verbose::Bool = true, model::String = MODELCHAT, returnall::Bool = false, dryrun::Bool = false, conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[], httpkwargs::NamedTuple = (; retrynonidempotent = true, retries = 5, readtimeout = 120), apikwargs::NamedTuple = = (; maxtokens = 2500), kwargs...)\n\nScans the provided image (image_url or image_path) with the goal provided in the prompt.\n\nCan be used for many multi-modal tasks, such as: OCR (transcribe text in the image), image captioning, image classification, etc.\n\nIt's effectively a light wrapper around aigenerate call, which uses additional keyword arguments image_url, image_path, image_detail to be provided. At least one image source (url or path) must be provided.\n\nArguments\n\nprompt_schema: An optional object to specify which prompt template should be applied (Default to PROMPT_SCHEMA = OpenAISchema)\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage or an AITemplate\nimage_url: A string or vector of strings representing the URL(s) of the image(s) to scan.\nimage_path: A string or vector of strings representing the path(s) of the image(s) to scan.\nimage_detail: A string representing the level of detail to include for images. Can be \"auto\", \"high\", or \"low\". 
See OpenAI Vision Guide for more details.\nattach_to_latest: A boolean controlling how a conversation with multiple UserMessages is handled. When true, the images are attached to the latest UserMessage.\nverbose: A boolean indicating whether to print additional information.\napi_key: A string representing the API key for accessing the OpenAI API.\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, skips sending the messages to the model (for debugging, often used with return_all=true).\nconversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.\nhttp_kwargs: A named tuple of HTTP keyword arguments.\napi_kwargs: A named tuple of API keyword arguments.\nkwargs: Prompt variables to be used to fill the prompt/template\n\nReturns\n\nIf return_all=false (default):\n\nmsg: An AIMessage object representing the generated AI message, including the content, status, tokens, and elapsed time.\n\nUse msg.content to access the extracted string.\n\nIf return_all=true:\n\nconversation: A vector of AbstractMessage objects representing the full conversation history, including the response from the AI model (AIMessage).\n\nSee also: ai_str, aai_str, aigenerate, aiembed, aiclassify, aiextract, aitemplates\n\nNotes\n\nAll examples below use model \"gpt4v\", which is an alias for model ID \"gpt-4-vision-preview\"\nmax_tokens in the api_kwargs is preset to 2500, otherwise OpenAI enforces a default of only a few hundred tokens (~300). 
If your output is truncated, increase this value\n\nExample\n\nDescribe the provided image:\n\nmsg = aiscan(\"Describe the image\"; image_path=\"julia.png\", model=\"gpt4v\")\n# [ Info: Tokens: 1141 @ Cost: $0.0117 in 2.2 seconds\n# AIMessage(\"The image shows a logo consisting of the word \"julia\" written in lowercase\")\n\nYou can provide multiple images at once as a vector and ask for \"low\" level of detail (cheaper):\n\nmsg = aiscan(\"Describe the image\"; image_path=[\"julia.png\",\"python.png\"], image_detail=\"low\", model=\"gpt4v\")\n\nYou can use this function as a nice and quick OCR (transcribe text in the image) with a template :OCRTask. Let's transcribe some SQL code from a screenshot (no more re-typing!):\n\n# Screenshot of some SQL code\nimage_url = \"https://www.sqlservercentral.com/wp-content/uploads/legacy/8755f69180b7ac7ee76a69ae68ec36872a116ad4/24622.png\"\nmsg = aiscan(:OCRTask; image_url, model=\"gpt4v\", task=\"Transcribe the SQL code in the image.\", api_kwargs=(; max_tokens=2500))\n\n# [ Info: Tokens: 362 @ Cost: $0.0045 in 2.5 seconds\n# AIMessage(\"```sql\n# update Orders \n\n# You can add syntax highlighting of the outputs via Markdown\nusing Markdown\nmsg.content |> Markdown.parse\n\nNotice that we enforce max_tokens = 2500. That's because OpenAI seems to default to ~300 tokens, which provides incomplete outputs. Hence, we set this value to 2500 as a default. 
If you still get truncated outputs, increase this value.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aitemplates","page":"Reference","title":"PromptingTools.aitemplates","text":"aitemplates\n\nEasily find the most suitable templates for your use case.\n\nYou can search by:\n\nquery::Symbol which looks only for partial matches in the template name\nquery::AbstractString which looks for partial matches in the template name or description\nquery::Regex which looks for matches in the template name, description or any of the message previews\n\nKeyword Arguments\n\nlimit::Int limits the number of returned templates (Defaults to 10)\n\nExamples\n\nFind available templates with aitemplates:\n\ntmps = aitemplates(\"JuliaExpertAsk\")\n# Will surface one specific template\n# 1-element Vector{AITemplateMetadata}:\n# PromptingTools.AITemplateMetadata\n# name: Symbol JuliaExpertAsk\n# description: String \"For asking questions about Julia language. Placeholders: `ask`\"\n# version: String \"1\"\n# wordcount: Int64 237\n# variables: Array{Symbol}((1,))\n# system_preview: String \"You are a world-class Julia language programmer with the knowledge of the latest syntax. Your commun\"\n# user_preview: String \"# Question\n\n{{ask}}\"\n# source: String \"\"\n\nThe above gives you a good idea of what the template is about, what placeholders are available, and how much it would cost to use it (=wordcount).\n\nSearch for all Julia-related templates:\n\ntmps = aitemplates(\"Julia\")\n# 2-element Vector{AITemplateMetadata}... -> more to come later!\n\nIf you are on VSCode, you can leverage nice tabular display with vscodedisplay:\n\nusing DataFrames\ntmps = aitemplates(\"Julia\") |> DataFrame |> vscodedisplay\n\nI have my selected template, how do I use it? 
Just use the \"name\" in aigenerate or aiclassify like you see in the first example!\n\n\n\n\n\n","category":"function"},{"location":"reference/#PromptingTools.aitemplates-Tuple{AbstractString}","page":"Reference","title":"PromptingTools.aitemplates","text":"Find the top-limit templates whose name or description fields partially match the query_key::String in TEMPLATE_METADATA.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aitemplates-Tuple{Regex}","page":"Reference","title":"PromptingTools.aitemplates","text":"Find the top-limit templates where provided query_key::Regex matches either of name, description or previews or User or System messages in TEMPLATE_METADATA.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.aitemplates-Tuple{Symbol}","page":"Reference","title":"PromptingTools.aitemplates","text":"Find the top-limit templates whose name::Symbol partially matches the query_name::Symbol in TEMPLATE_METADATA.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.eval!-Tuple{PromptingTools.AbstractCodeBlock}","page":"Reference","title":"PromptingTools.eval!","text":"eval!(cb::AICode; safe_eval::Bool=true, prefix::AbstractString=\"\", suffix::AbstractString=\"\")\n\nEvaluates a code block cb in-place. It runs automatically when AICode is instantiated with a String.\n\nCheck the outcome of evaluation with Base.isvalid(cb). If it returns true, the provided code block has executed successfully.\n\nSteps:\n\nIf cb::AICode has not been evaluated, cb.success = nothing. 
After the evaluation it will be either true or false depending on the outcome\nParse the text in cb.code\nEvaluate the parsed expression\nCapture outputs of the evaluated expression in cb.output\nCapture any stdout outputs (eg, test failures) in cb.stdout\nIf an exception is raised, it is saved in cb.error\nFinally, if all steps were successful, success is set to cb.success = true\n\nKeyword Arguments\n\nsafe_eval::Bool: If true, we first check for any Pkg operations (eg, installing new packages) and missing imports, then the code will be evaluated inside a bespoke scratch module (not to change any user variables)\nprefix::AbstractString: A string to be prepended to the code block before parsing and evaluation. Useful to add some additional code definition or necessary imports. Defaults to an empty string.\nsuffix::AbstractString: A string to be appended to the code block before parsing and evaluation. Useful to check that tests pass or that an example executes. Defaults to an empty string.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.extract_code_blocks-Tuple{AbstractString}","page":"Reference","title":"PromptingTools.extract_code_blocks","text":"extract_code_blocks(markdown_content::String) -> Vector{String}\n\nExtract Julia code blocks from a markdown string.\n\nThis function searches through the provided markdown content, identifies blocks of code specifically marked as Julia code (using the ```julia ... ``` code fence pattern), and extracts the code within these blocks. The extracted code blocks are returned as a vector of strings, with each string representing one block of Julia code. \n\nNote: Only the content within the code fences is extracted, and the code fences themselves are not included in the output.\n\nArguments\n\nmarkdown_content::String: A string containing the markdown content from which Julia code blocks are to be extracted.\n\nReturns\n\nVector{String}: A vector containing strings of extracted Julia code blocks. 
If no Julia code blocks are found, an empty vector is returned.\n\nExamples\n\n# Example with a single Julia code block\nmarkdown_single = \"\"\"\n```julia\nprintln(\"Hello, World!\")\n```\n\"\"\"\nextract_code_blocks(markdown_single)\n# Output: [\"println(\\\"Hello, World!\\\")\"]\n\n# Example with multiple Julia code blocks\nmarkdown_multiple = \"\"\"\n```julia\nx = 5\n```\nSome text in between\n```julia\ny = x + 2\n```\n\"\"\"\nextract_code_blocks(markdown_multiple)\n# Output: [\"x = 5\", \"y = x + 2\"]\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.extract_function_name-Tuple{AbstractString}","page":"Reference","title":"PromptingTools.extract_function_name","text":"extract_function_name(code_block::String) -> Union{String, Nothing}\n\nExtract the name of a function from a given Julia code block. The function searches for two patterns:\n\nThe explicit function declaration pattern: function name(...) ... end\nThe concise function declaration pattern: name(...) = ...\n\nIf a function name is found, it is returned as a string. 
If no function name is found, the function returns nothing.\n\nArguments\n\ncode_block::String: A string containing Julia code.\n\nReturns\n\nUnion{String, Nothing}: The extracted function name or nothing if no name is found.\n\nExample\n\ncode = \"\"\"\nfunction myFunction(arg1, arg2)\n # Function body\nend\n\"\"\"\nextract_function_name(code)\n# Output: \"myFunction\"\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.finalize_outputs-Tuple{Union{AbstractString, PromptingTools.AbstractMessage, Vector{<:PromptingTools.AbstractMessage}}, Any, Union{Nothing, PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.finalize_outputs","text":"finalize_outputs(prompt::ALLOWED_PROMPT_TYPE, conv_rendered::Any,\n msg::Union{Nothing, AbstractMessage};\n return_all::Bool = false,\n dry_run::Bool = false,\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n kwargs...)\n\nFinalizes the outputs of the ai* functions by either returning the conversation history or the last message.\n\nKeyword arguments\n\nreturn_all::Bool=false: If true, returns the entire conversation history, otherwise returns only the last message (the AIMessage).\ndry_run::Bool=false: If true, does not send the messages to the model, but only renders the prompt with the given schema and replacement variables. Useful for debugging when you want to check the specific schema rendering. \nconversation::AbstractVector{<:AbstractMessage}=[]: An optional vector of AbstractMessage objects representing the conversation history. 
If not provided, it is initialized as an empty vector.\nkwargs...: Variables to replace in the prompt template.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.function_call_signature-Tuple{Type}","page":"Reference","title":"PromptingTools.function_call_signature","text":"function_call_signature(datastructtype::Struct; max_description_length::Int = 100)\n\nExtract the argument names, types and docstrings from a struct to create the function call signature in JSON schema.\n\nYou must provide a Struct type (not an instance of it) with some fields.\n\nNote: Fairly experimental, but works for combination of structs, arrays, strings and singletons.\n\nTips\n\nYou can improve the quality of the extraction by writing a helpful docstring for your struct (or any nested struct). It will be provided as a description. \n\nYou can even include comments/descriptions about the individual fields.\n\nAll fields are assumed to be required, unless you allow null values (eg, ::Union{Nothing, Int}). Fields with Nothing will be treated as optional.\nMissing values are ignored (eg, ::Union{Missing, Int} will be treated as Int). It's for broader compatibility and we cannot deserialize it as easily as Nothing.\n\nExample\n\nDo you want to extract some specific measurements from a text like age, weight and height? You need to define the information you need as a struct (return_type):\n\nstruct MyMeasurement\n age::Int\n height::Union{Int,Nothing}\n weight::Union{Nothing,Float64}\nend\nsignature = function_call_signature(MyMeasurement)\n#\n# Dict{String, Any} with 3 entries:\n# \"name\" => \"MyMeasurement_extractor\"\n# \"parameters\" => Dict{String, Any}(\"properties\"=>Dict{String, Any}(\"height\"=>Dict{String, Any}(\"type\"=>\"integer\"), \"weight\"=>Dic…\n# \"description\" => \"Represents person's age, height, and weight\n\"\n\nYou can see that only the field age does not allow null values, hence, it's \"required\". 
While height and weight are optional.\n\nsignature[\"parameters\"][\"required\"]\n# [\"age\"]\n\nIf there are multiple items you want to extract, define a wrapper struct to get a Vector of MyMeasurement:\n\nstruct MyMeasurementWrapper\n measurements::Vector{MyMeasurement}\nend\n\nOr if you want your extraction to fail gracefully when data isn't found, use the `MaybeExtract{T}` wrapper (inspired by the Instructor package!):\n\nusing PromptingTools: MaybeExtract\n\ntype = MaybeExtract{MyMeasurement}\n\nEffectively the same as:\n\nstruct MaybeExtract{T}\n\nresult::Union{T, Nothing}\n\nerror::Bool // true if no result is found, false otherwise\n\nmessage::Union{Nothing, String} // Only present if no result is found, should be short and concise\n\nend\n\nIf LLM extraction fails, it will return a Dict with error and message fields instead of the result!\n\nmsg = aiextract(\"Extract measurements from the text: I am giraffe\", type)\n\n\n\nDict{Symbol, Any} with 2 entries:\n\n:message => \"Sorry, this feature is only available for humans.\"\n\n:error => true\n\nThat way, you can handle the error gracefully and get a reason why extraction failed.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.has_julia_prompt-Tuple{T} where T<:AbstractString","page":"Reference","title":"PromptingTools.has_julia_prompt","text":"Checks if a given string has a Julia prompt (julia>) at the beginning of a line.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.load_conversation-Tuple{Union{AbstractString, IO}}","page":"Reference","title":"PromptingTools.load_conversation","text":"Loads a conversation (messages) from io_or_file\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.load_template-Tuple{Union{AbstractString, IO}}","page":"Reference","title":"PromptingTools.load_template","text":"Loads messaging template from io_or_file and returns a tuple of template messages and 
metadata.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.load_templates!","page":"Reference","title":"PromptingTools.load_templates!","text":"load_templates!(; remove_templates::Bool=true)\n\nLoads templates from folder templates/ in the package root and stores them in TEMPLATE_STORE and TEMPLATE_METADATA.\n\nNote: Automatically removes any existing templates and metadata from TEMPLATE_STORE and TEMPLATE_METADATA if remove_templates=true.\n\n\n\n\n\n","category":"function"},{"location":"reference/#PromptingTools.ollama_api-Tuple{PromptingTools.AbstractOllamaManagedSchema, AbstractString}","page":"Reference","title":"PromptingTools.ollama_api","text":"ollama_api(prompt_schema::AbstractOllamaManagedSchema, prompt::AbstractString,\n system::Union{Nothing, AbstractString} = nothing,\n endpoint::String = \"generate\";\n model::String = \"llama2\", http_kwargs::NamedTuple = NamedTuple(),\n stream::Bool = false,\n url::String = \"localhost\", port::Int = 11434,\n kwargs...)\n\nSimple wrapper for a call to Ollama API.\n\nKeyword Arguments\n\nprompt_schema: Defines which prompt template should be applied.\nprompt: Can be a string representing the prompt for the AI conversation, a UserMessage, a vector of AbstractMessage\nsystem: An optional string representing the system message for the AI conversation. If not provided, a default message will be used.\nendpoint: The API endpoint to call, only \"generate\" and \"embeddings\" are currently supported. Defaults to \"generate\".\nmodel: A string representing the model to use for generating the response. Can be an alias corresponding to a model ID defined in MODEL_ALIASES.\nhttp_kwargs::NamedTuple: Additional keyword arguments for the HTTP request. Defaults to empty NamedTuple.\nstream: A boolean indicating whether to stream the response. Defaults to false.\nurl: The URL of the Ollama API. Defaults to \"localhost\".\nport: The port of the Ollama API. 
Defaults to 11434.\nkwargs: Prompt variables to be used to fill the prompt/template\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.remove_julia_prompt-Tuple{T} where T<:AbstractString","page":"Reference","title":"PromptingTools.remove_julia_prompt","text":"remove_julia_prompt(s::T) where {T<:AbstractString}\n\nIf it detects a julia prompt, it removes it and all lines that do not have it (except for those that belong to the code block).\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.remove_templates!-Tuple{}","page":"Reference","title":"PromptingTools.remove_templates!","text":" remove_templates!()\n\nRemoves all templates from TEMPLATE_STORE and TEMPLATE_METADATA.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.render-Tuple{AITemplate}","page":"Reference","title":"PromptingTools.render","text":"Renders provided messaging template (template) under the default schema (PROMPT_SCHEMA).\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.render-Tuple{PromptingTools.AbstractOllamaManagedSchema, Vector{<:PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.render","text":"render(schema::AbstractOllamaManagedSchema,\n messages::Vector{<:AbstractMessage};\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n kwargs...)\n\nBuilds a history of the conversation to provide the prompt to the API. All unspecified kwargs are passed as replacements such that {{key}}=>value in the template.\n\nNote: Due to its \"managed\" nature, at most 2 messages can be provided (system and prompt inputs in the API).\n\nKeyword Arguments\n\nconversation: Not allowed for this schema. 
Provided only for compatibility.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.render-Tuple{PromptingTools.AbstractOpenAISchema, Vector{<:PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.render","text":"render(schema::AbstractOpenAISchema,\n messages::Vector{<:AbstractMessage};\n image_detail::AbstractString = \"auto\",\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n kwargs...)\n\nBuilds a history of the conversation to provide the prompt to the API. All unspecified kwargs are passed as replacements such that {{key}}=>value in the template.\n\nKeyword Arguments\n\nimage_detail: Only for UserMessageWithImages. It represents the level of detail to include for images. Can be \"auto\", \"high\", or \"low\".\nconversation: An optional vector of AbstractMessage objects representing the conversation history. If not provided, it is initialized as an empty vector.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.render-Tuple{PromptingTools.NoSchema, Vector{<:PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.render","text":"render(schema::NoSchema,\n messages::Vector{<:AbstractMessage};\n conversation::AbstractVector{<:AbstractMessage} = AbstractMessage[],\n replacement_kwargs...)\n\nRenders a conversation history from a vector of messages with all replacement variables specified in replacement_kwargs.\n\nIt is the first pass of the prompt rendering system, and is used by all other schemas.\n\nKeyword Arguments\n\nimage_detail: Only for UserMessageWithImages. It represents the level of detail to include for images. Can be \"auto\", \"high\", or \"low\".\nconversation: An optional vector of AbstractMessage objects representing the conversation history. 
If not provided, it is initialized as an empty vector.\n\nNotes\n\nAll unspecified kwargs are passed as replacements such that {{key}}=>value in the template.\nIf a SystemMessage is missing, we inject a default one at the beginning of the conversation.\nOnly one SystemMessage is allowed (ie, you cannot mix two conversations with different system prompts).\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.replace_words-Tuple{AbstractString, Vector{<:AbstractString}}","page":"Reference","title":"PromptingTools.replace_words","text":"replace_words(text::AbstractString, words::Vector{<:AbstractString}; replacement::AbstractString=\"ABC\")\n\nReplace all occurrences of words in words with replacement in text. Useful to quickly remove specific names or entities from a text.\n\nArguments\n\ntext::AbstractString: The text to be processed.\nwords::Vector{<:AbstractString}: A vector of words to be replaced.\nreplacement::AbstractString=\"ABC\": The replacement string to be used. Defaults to \"ABC\".\n\nExample\n\ntext = \"Disney is a great company\"\nreplace_words(text, [\"Disney\", \"Snow White\", \"Mickey Mouse\"])\n# Output: \"ABC is a great company\"\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.save_conversation-Tuple{Union{AbstractString, IO}, AbstractVector{<:PromptingTools.AbstractMessage}}","page":"Reference","title":"PromptingTools.save_conversation","text":"Saves provided conversation (messages) to io_or_file. If you need to add some metadata, see save_template.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.save_template-Tuple{Union{AbstractString, IO}, AbstractVector{<:PromptingTools.AbstractChatMessage}}","page":"Reference","title":"PromptingTools.save_template","text":"Saves provided messaging template (messages) to io_or_file. 
Automatically adds metadata based on provided keyword arguments.\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.split_by_length-Tuple{String}","page":"Reference","title":"PromptingTools.split_by_length","text":"split_by_length(text::String; separator::String=\" \", max_length::Int=35000) -> Vector{String}\n\nSplit a given string text into chunks of a specified maximum length max_length. This is particularly useful for splitting larger documents or texts into smaller segments, suitable for models or systems with smaller context windows.\n\nArguments\n\ntext::String: The text to be split.\nseparator::String=\" \": The separator used to split the text into minichunks. Defaults to a space character.\nmax_length::Int=35000: The maximum length of each chunk. Defaults to 35,000 characters, which should fit within 16K context window.\n\nReturns\n\nVector{String}: A vector of strings, each representing a chunk of the original text that is smaller than or equal to max_length.\n\nNotes\n\nThe function ensures that each chunk is as close to max_length as possible without exceeding it.\nIf the text is empty, the function returns an empty array.\nThe separator is re-added to the text chunks after splitting, preserving the original structure of the text as closely as possible.\n\nExamples\n\nSplitting text with the default separator (\" \"):\n\ntext = \"Hello world. 
How are you?\"\nchunks = split_by_length(text; max_length=13)\nlength(chunks) # Output: 2\n\nUsing a custom separator and custom max_length:\n\ntext = \"Hello,World,\" ^ 2900 # length 34800 chars\nchunks = split_by_length(text; separator=\",\", max_length=10000) # for a 4K context window\nlength(chunks) # Output: 4\n\n\n\n\n\n","category":"method"},{"location":"reference/#PromptingTools.@aai_str-Tuple{Any, Vararg{Any}}","page":"Reference","title":"PromptingTools.@aai_str","text":"aai\"user_prompt\"[model_alias] -> AIMessage\n\nAsynchronous version of the @ai_str macro, which will log the result once it's ready.\n\nExample\n\nSend an asynchronous request to GPT-4, so we don't have to wait for the response. Very practical with slow models, so you can keep working in the meantime.\n\nm = aai\"Say Hi!\"gpt4;\n\n...with some delay...\n\n[ Info: Tokens: 29 @ Cost: $0.0011 in 2.7 seconds\n\n[ Info: AIMessage> Hello! How can I assist you today?\n\n\n\n\n\n","category":"macro"},{"location":"reference/#PromptingTools.@ai_str-Tuple{Any, Vararg{Any}}","page":"Reference","title":"PromptingTools.@ai_str","text":"ai\"user_prompt\"[model_alias] -> AIMessage\n\nThe ai\"\" string macro generates an AI response to a given prompt by using aigenerate under the hood.\n\nArguments\n\nuser_prompt (String): The input prompt for the AI model.\nmodel_alias (optional, any): Provide model alias of the AI model (see MODEL_ALIASES).\n\nReturns\n\nAIMessage corresponding to the input prompt.\n\nExample\n\nresult = ai\"Hello, how are you?\"\n# AIMessage(\"Hello! I'm an AI assistant, so I don't have feelings, but I'm here to help you. 
How can I assist you today?\")\n\nIf you want to interpolate some variables or additional context, simply use string interpolation:\n\na=1\nresult = ai\"What is `$a+$a`?\"\n# AIMessage(\"The sum of `1+1` is `2`.\")\n\nIf you want to use a different model, eg, GPT-4, you can provide its alias as a flag:\n\nresult = ai\"What is `1.23 * 100 + 1`?\"gpt4\n# AIMessage(\"The answer is 124.\")\n\n\n\n\n\n","category":"macro"},{"location":"","page":"Home","title":"Home","text":"CurrentModule = PromptingTools","category":"page"},{"location":"#PromptingTools","page":"Home","title":"PromptingTools","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"Documentation for PromptingTools.","category":"page"},{"location":"","page":"Home","title":"Home","text":"Streamline your life using PromptingTools.jl, the Julia package that simplifies interacting with large language models.","category":"page"},{"location":"","page":"Home","title":"Home","text":"PromptingTools.jl is not meant for building large-scale systems. It's meant to be the go-to tool in your global environment that will save you 20 minutes every day!","category":"page"},{"location":"#Why-PromptingTools.jl?","page":"Home","title":"Why PromptingTools.jl?","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"Prompt engineering is neither fast nor easy. Moreover, different models and their fine-tunes might require different prompt formats and tricks, or perhaps the information you work with requires special models to be used. PromptingTools.jl is meant to unify the prompts for different backends and make the common tasks (like templated prompts) as simple as possible. 
","category":"page"},{"location":"","page":"Home","title":"Home","text":"Some features:","category":"page"},{"location":"","page":"Home","title":"Home","text":"aigenerate Function: Simplify prompt templates with handlebars (eg, {{variable}}) and keyword arguments\n@ai_str String Macro: Save keystrokes with a string macro for simple prompts\nEasy to Remember: All exported functions start with ai... for better discoverability\nLight Wrapper Types: Benefit from Julia's multiple dispatch by having AI outputs wrapped in specific types\nMinimal Dependencies: Enjoy an easy addition to your global environment with very light dependencies\nNo Context Switching: Access cutting-edge LLMs with no context switching and minimum extra keystrokes directly in your REPL","category":"page"},{"location":"#First-Steps","page":"Home","title":"First Steps","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"To get started, see the Getting Started section.","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"EditURL = \"../../../examples/working_with_aitemplates.jl\"","category":"page"},{"location":"examples/working_with_aitemplates/#Using-AITemplates","page":"Using AITemplates","title":"Using AITemplates","text":"","category":"section"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"This file contains examples of how to work with AITemplate(s).","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"First, let's import the package and define a helper alias for calling un-exported functions:","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"using PromptingTools\nconst PT = 
PromptingTools","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"PromptingTools","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"LLM responses are only as good as the prompts you give them. However, great prompts take a long time to write – AITemplates are a way to re-use great prompts!","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"AITemplates are just a collection of templated prompts (ie, a set of \"messages\" that have placeholders like {{question}})","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"They are saved as JSON files in the templates directory. They are automatically loaded on package import, but you can always force a re-load with PT.load_templates!()","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"PT.load_templates!();","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"You can create them and use them with any ai* function instead of a prompt. Let's use a template called :JuliaExpertAsk; alternatively, you can use AITemplate(:JuliaExpertAsk) for cleaner dispatch","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"msg = aigenerate(:JuliaExpertAsk; ask = \"How do I add packages?\")","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"AIMessage(\"To add packages in Julia, you can use the built-in package manager called `Pkg`. Here are the steps:\n\n1. Open the Julia REPL (Read-Eval-Print Loop).\n2. 
Press the `]` key to enter the package manager mode.\n3. Use the `add` command followed by the name of the package you want to install. For example, to install the `DataFrames` package, type: `add DataFrames`.\n4. Press the `backspace` or `ctrl + C` key to exit the package manager mode and return to the REPL.\n\nAfter following these steps, the specified package will be installed and available for use in your Julia environment.\")","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"You can see that it had a placeholder for the actual question (ask) that we provided as a keyword argument. We did not have to write any system prompt for personas, tone, etc. – it was all provided by the template!","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"How to know which templates are available? You can search for them with aitemplates(): You can search by Symbol (only for partial name match), String (partial match on name or description), or Regex (more fields)","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"tmps = aitemplates(\"JuliaExpertAsk\")","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"1-element Vector{AITemplateMetadata}:\nPromptingTools.AITemplateMetadata\n name: Symbol JuliaExpertAsk\n description: String \"For asking questions about Julia language. Placeholders: `ask`\"\n version: String \"1\"\n wordcount: Int64 237\n variables: Array{Symbol}((1,))\n system_preview: String \"You are a world-class Julia language programmer with the knowledge of the latest syntax. 
Your commun\"\n user_preview: String \"# Question\\n\\n{{ask}}\"\n source: String \"\"\n\n","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"You can see that it outputs a list of available templates that match the search - there is just one in this case.","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Moreover, it shows not just the description, but also a preview of the actual prompts, placeholders available, and the length (to gauge how much it would cost).","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"If you use VSCode, you can display them in a nice scrollable table with vscodedisplay:","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"using DataFrames\nDataFrame(tmps) |> vscodedisplay","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"You can also just render the template to see the underlying messages:","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"msgs = PT.render(AITemplate(:JuliaExpertAsk))","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"2-element Vector{PromptingTools.AbstractChatMessage}:\n SystemMessage(\"You are a world-class Julia language programmer with the knowledge of the latest syntax. Your communication is brief and concise. 
You're precise and answer only when you're confident in the high quality of your answer.\")\n UserMessage{String}(\"# Question\\n\\n{{ask}}\", [:ask], :usermessage)","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Now, you know exactly what's in the template!","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"If you want to modify it, simply change it and save it as a new file with save_template (see the docs ?save_template for more details).","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Let's adjust the previous template to be more specific to a data analysis question:","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"tpl = [PT.SystemMessage(\"You are a world-class Julia language programmer with the knowledge of the latest syntax. You're also a senior Data Scientist and proficient in data analysis in Julia. Your communication is brief and concise. You're precise and answer only when you're confident in the high quality of your answer.\")\n PT.UserMessage(\"# Question\\n\\n{{ask}}\")]","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"2-element Vector{PromptingTools.AbstractChatMessage}:\n SystemMessage(\"You are a world-class Julia language programmer with the knowledge of the latest syntax. You're also a senior Data Scientist and proficient in data analysis in Julia. Your communication is brief and concise. 
You're precise and answer only when you're confident in the high quality of your answer.\")\n UserMessage{String}(\"# Question\\n\\n{{ask}}\", [:ask], :usermessage)","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Templates are saved in the templates directory of the package. The name of the file will become the template name (eg, call :JuliaDataExpertAsk)","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"filename = joinpath(pkgdir(PromptingTools),\n \"templates\",\n \"persona-task\",\n \"JuliaDataExpertAsk_123.json\")\nPT.save_template(filename,\n tpl;\n description = \"For asking data analysis questions in Julia language. Placeholders: `ask`\")\nrm(filename) # cleanup if we don't like it","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"When you create a new template, remember to re-load the templates with load_templates!() so that it's available for use.","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"PT.load_templates!();","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"Tip: 
If you have some good templates (or suggestions for the existing ones), please consider sharing them with the community by opening a PR to the templates directory!","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"","category":"page"},{"location":"examples/working_with_aitemplates/","page":"Using AITemplates","title":"Using AITemplates","text":"This page was generated using Literate.jl.","category":"page"},{"location":"examples/readme_examples/#Various-Examples","page":"Various examples","title":"Various Examples","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Noteworthy functions: aigenerate, aiembed, aiclassify, aiextract, aitemplates","category":"page"},{"location":"examples/readme_examples/#Seamless-Integration-Into-Your-Workflow","page":"Various examples","title":"Seamless Integration Into Your Workflow","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Google search is great, but it's a context switch. You often have to open a few pages and read through the discussion to find the answer you need. Same with the ChatGPT website.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Imagine you are in VSCode, editing your .gitignore file. 
How do I ignore a file in all subfolders again?","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"All you need to do is to type: aai\"What to write in .gitignore to ignore file XYZ in any folder or subfolder?\"","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"With aai\"\" (as opposed to ai\"\"), we make a non-blocking call to the LLM to not prevent you from continuing your work. When the answer is ready, we log it from the background:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"[ Info: Tokens: 102 @ Cost: $0.0002 in 2.7 seconds\n┌ Info: AIMessage> To ignore a file called \"XYZ\" in any folder or subfolder, you can add the following line to your .gitignore file:\n│ \n│ ```\n│ **/XYZ\n│ ```\n│ \n└ This pattern uses the double asterisk (`**`) to match any folder or subfolder, and then specifies the name of the file you want to ignore.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You probably saved 3-5 minutes on this task and probably another 5-10 minutes, because of the context switch/distraction you avoided. 
It's a small win, but it adds up quickly.","category":"page"},{"location":"examples/readme_examples/#Advanced-Prompts-/-Conversations","page":"Various examples","title":"Advanced Prompts / Conversations","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can use the aigenerate function to replace handlebar variables (eg, {{name}}) via keyword arguments.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aigenerate(\"Say hello to {{name}}!\", name=\"World\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"The more complex prompts are effectively a conversation (a set of messages), where you can have messages from three entities: System, User, AI Assistant. We provide the corresponding types for each of them: SystemMessage, UserMessage, AIMessage. ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"using PromptingTools: SystemMessage, UserMessage\n\nconversation = [\n SystemMessage(\"You're master Yoda from Star Wars trying to help the user become a Jedi.\"),\n UserMessage(\"I have feelings for my {{object}}. What should I do?\")]\nmsg = aigenerate(conversation; object = \"old iPhone\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"AIMessage(\"Ah, a dilemma, you have. Emotional attachment can cloud your path to becoming a Jedi. To be attached to material possessions, you must not. The iPhone is but a tool, nothing more. Let go, you must.\n\nSeek detachment, young padawan. Reflect upon the impermanence of all things. Appreciate the memories it gave you, and gratefully part ways. In its absence, find new experiences to grow and become one with the Force. 
Only then, a true Jedi, you shall become.\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can also use it to build conversations, eg, ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"new_conversation = vcat(conversation..., msg, UserMessage(\"Thank you, master Yoda! Do you have {{object}} to know what it feels like?\"))\naigenerate(new_conversation; object = \"old iPhone\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"> AIMessage(\"Hmm, possess an old iPhone, I do not. But experience with attachments, I have. Detachment, I learned. True power and freedom, it brings...\")","category":"page"},{"location":"examples/readme_examples/#Templated-Prompts","page":"Various examples","title":"Templated Prompts","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"With LLMs, the quality / robustness of your results depends on the quality of your prompts. But writing prompts is hard! 
That's why we offer a templating system to save you time and effort.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"To use a specific template (eg, `:JuliaExpertAsk` to ask a question about the Julia language):","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aigenerate(:JuliaExpertAsk; ask = \"How do I add packages?\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"The above is equivalent to a more verbose version that explicitly uses the dispatch on AITemplate:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aigenerate(AITemplate(:JuliaExpertAsk); ask = \"How do I add packages?\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Find available templates with aitemplates:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"tmps = aitemplates(\"JuliaExpertAsk\")\n# Will surface one specific template\n# 1-element Vector{AITemplateMetadata}:\n# PromptingTools.AITemplateMetadata\n# name: Symbol JuliaExpertAsk\n# description: String \"For asking questions about Julia language. Placeholders: `ask`\"\n# version: String \"1\"\n# wordcount: Int64 237\n# variables: Array{Symbol}((1,))\n# system_preview: String \"You are a world-class Julia language programmer with the knowledge of the latest syntax. 
Your commun\"\n# user_preview: String \"# Question\\n\\n{{ask}}\"\n# source: String \"\"","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"The above gives you a good idea of what the template is about, what placeholders are available, and how much it would cost to use it (=wordcount).","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Search for all Julia-related templates:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"tmps = aitemplates(\"Julia\")\n# 2-element Vector{AITemplateMetadata}... -> more to come later!","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"If you are on VSCode, you can leverage a nice tabular display with vscodedisplay:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"using DataFrames\ntmps = aitemplates(\"Julia\") |> DataFrame |> vscodedisplay","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"I have selected my template; how do I use it? 
Just use the \"name\" in aigenerate or aiclassify, as shown in the first example!","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can inspect any template by \"rendering\" it (this is what the LLM will see):","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"julia> AITemplate(:JudgeIsItTrue) |> PromptingTools.render","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"See more examples in the Examples folder.","category":"page"},{"location":"examples/readme_examples/#Asynchronous-Execution","page":"Various examples","title":"Asynchronous Execution","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can leverage asyncmap to run multiple AI-powered tasks concurrently, improving performance for batch operations. ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"languages = [\"Spanish\", \"French\", \"Mandarin\"]\n# launch the three aigenerate calls concurrently\nresponses = asyncmap(languages) do language\n    aigenerate(\"Translate 'Hello, World!' to {{language}}\"; language)\nend","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Pro tip: You can limit the number of concurrent tasks with the keyword asyncmap(...; ntasks=10).","category":"page"},{"location":"examples/readme_examples/#Model-Aliases","page":"Various examples","title":"Model Aliases","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Certain tasks require more powerful models. All user-facing functions have a keyword argument model that can be used to specify the model to be used. 
For example, you can use model = \"gpt-4-1106-preview\" to use the latest GPT-4 Turbo model. However, no one wants to type that!","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"We offer a set of model aliases (eg, \"gpt3\", \"gpt4\", \"gpt4t\" -> the above GPT-4 Turbo, etc.) that can be used instead. ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Each ai... call first looks up the provided model name in the dictionary PromptingTools.MODEL_ALIASES, so you can easily extend it with your own aliases! ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"const PT = PromptingTools\nPT.MODEL_ALIASES[\"gpt4t\"] = \"gpt-4-1106-preview\"","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"These aliases can also be used as flags in the @ai_str macro, eg, ai\"What is the capital of France?\"gpt4t (GPT-4 Turbo has a knowledge cut-off in April 2023, so it's useful for more contemporary questions).","category":"page"},{"location":"examples/readme_examples/#Embeddings","page":"Various examples","title":"Embeddings","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Use the aiembed function to create embeddings via the default OpenAI model that can be used for semantic search, clustering, and more complex AI workflows.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"text_to_embed = \"The concept of artificial intelligence.\"\nmsg = aiembed(text_to_embed)\nembedding = msg.content # 1536-element Vector{Float64}","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"If you plan to calculate 
the cosine similarity between embeddings, you can normalize them first:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"using LinearAlgebra\nmsg = aiembed([\"embed me\", \"and me too\"], LinearAlgebra.normalize)\n\n# the dot product of two normalized embeddings is their cosine similarity\nmsg.content' * msg.content[:, 1] # [1.0, 0.787]","category":"page"},{"location":"examples/readme_examples/#Classification","page":"Various examples","title":"Classification","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can use the aiclassify function to classify any provided statement as true/false/unknown. This is useful for fact-checking, hallucination or NLI checks, moderation, filtering, sentiment analysis, feature engineering, and more.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"aiclassify(\"Is two plus two four?\") \n# true","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"System prompts and higher-quality models can be used for more complex tasks, including knowing when to defer to a human:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"aiclassify(:JudgeIsItTrue; it = \"Is two plus three a vegetable on Mars?\", model = \"gpt4t\") \n# unknown","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"In the above example, we used a prompt template :JudgeIsItTrue, which automatically expands into the following system prompt (and a separate user prompt): ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"\"You are an impartial AI judge evaluating 
whether the provided statement is \"true\" or \"false\". Answer \"unknown\" if you cannot decide.\"","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"For more information on templates, see the Templated Prompts section.","category":"page"},{"location":"examples/readme_examples/#Data-Extraction","page":"Various examples","title":"Data Extraction","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Are you tired of extracting data with regex? You can use LLMs to extract structured data from text!","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"All you have to do is define the structure of the data you want to extract, and the LLM will do the rest.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Define a return_type with struct. Provide docstrings if needed (improves results and helps with documentation).","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Let's start with a hard task - extracting the current weather in a given location:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"@enum TemperatureUnits celsius fahrenheit\n\"\"\"Extract the current weather in a given location\n\n# Arguments\n- `location`: The city and state, e.g. 
\"San Francisco, CA\"\n- `unit`: The unit of temperature to return, either `celsius` or `fahrenheit`\n\"\"\"\nstruct CurrentWeather\n location::String\n unit::Union{Nothing,TemperatureUnits}\nend\n\n# Note that we provide the TYPE itself, not an instance of it!\nmsg = aiextract(\"What's the weather in Salt Lake City in C?\"; return_type=CurrentWeather)\nmsg.content\n# CurrentWeather(\"Salt Lake City, UT\", celsius)","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"But you can use it even for more complex tasks, like extracting many entities from a text:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"\"Person's age, height, and weight.\"\nstruct MyMeasurement\n age::Int\n height::Union{Int,Nothing}\n weight::Union{Nothing,Float64}\nend\nstruct ManyMeasurements\n measurements::Vector{MyMeasurement}\nend\nmsg = aiextract(\"James is 30, weighs 80kg. He's 180cm tall. Then Jack is 19 but really tall - over 190!\"; return_type=ManyMeasurements)\nmsg.content.measurements\n# 2-element Vector{MyMeasurement}:\n# MyMeasurement(30, 180, 80.0)\n# MyMeasurement(19, 190, nothing)","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"There is even a wrapper to help you catch errors, together with helpful explanations of why parsing failed. 
See ?PromptingTools.MaybeExtract for more information.","category":"page"},{"location":"examples/readme_examples/#OCR-and-Image-Comprehension","page":"Various examples","title":"OCR and Image Comprehension","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"With the aiscan function, you can interact with images as if they were text.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can simply describe a provided image:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aiscan(\"Describe the image\"; image_path=\"julia.png\", model=\"gpt4v\")\n# [ Info: Tokens: 1141 @ Cost: \\$0.0117 in 2.2 seconds\n# AIMessage(\"The image shows a logo consisting of the word \"julia\" written in lowercase\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Or you can perform OCR on a screenshot. 
Let's transcribe some SQL code from a screenshot (no more re-typing!) using the :OCRTask template:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"# Screenshot of some SQL code\nimage_url = \"https://www.sqlservercentral.com/wp-content/uploads/legacy/8755f69180b7ac7ee76a69ae68ec36872a116ad4/24622.png\"\nmsg = aiscan(:OCRTask; image_url, model=\"gpt4v\", task=\"Transcribe the SQL code in the image.\", api_kwargs=(; max_tokens=2500))\n\n# [ Info: Tokens: 362 @ Cost: \\$0.0045 in 2.5 seconds\n# AIMessage(\"```sql\n# update Orders ","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"You can add syntax highlighting to the outputs via Markdown:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"using Markdown\nmsg.content |> Markdown.parse","category":"page"},{"location":"examples/readme_examples/#Using-Ollama-models","page":"Various examples","title":"Using Ollama models","text":"","category":"section"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Ollama.ai is an amazingly simple tool that allows you to run several Large Language Models (LLMs) on your computer. 
It's especially suitable when you're working with sensitive data that should not be sent anywhere.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"Let's assume you have installed Ollama, downloaded a model, and that it's running in the background.","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"We can use it with the aigenerate function:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"const PT = PromptingTools\nschema = PT.OllamaManagedSchema() # notice the different schema!\n\nmsg = aigenerate(schema, \"Say hi!\"; model=\"openhermes2.5-mistral\")\n# [ Info: Tokens: 69 in 0.9 seconds\n# AIMessage(\"Hello! How can I assist you today?\")","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"And we can also use the aiembed function:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"msg = aiembed(schema, \"Embed me\", copy; model=\"openhermes2.5-mistral\")\nmsg.content # 4096-element JSON3.Array{Float64...\n\nmsg = aiembed(schema, [\"Embed me\", \"Embed me\"]; model=\"openhermes2.5-mistral\")\nmsg.content # 4096×2 Matrix{Float64}:","category":"page"},{"location":"examples/readme_examples/","page":"Various examples","title":"Various examples","text":"If you're getting errors, check that Ollama is running - see the Setup Guide for Ollama section below.","category":"page"}] }