Update default configuration #32

Merged (2 commits) on Aug 18, 2024. Changes from all commits are shown below.
README.md (5 changes: 4 additions & 1 deletion)
@@ -202,7 +202,7 @@ We apply a Retrieval Augmented Generation (RAG) pattern, i.e.,

This ensures that the answers are not only based on general AI knowledge but are also specifically tailored to Julia's ecosystem and best practices.

The "knowledge packs" are sourced from documentation sites and then processed with DocsScraper.jl.
The "knowledge packs" are sourced from documentation sites and then processed with [DocsScraper.jl](https://github.com/JuliaGenAI/DocsScraper.jl).

> [!NOTE]
> If you would like to set up an automated process to create a new knowledge pack for some package/organization, let us know!
@@ -235,6 +235,9 @@ A: Tavily's API is used to search the best matching snippets from the documentation
**Q: Can we use Ollama (locally-hosted) models?**
A: Yes, see the Advanced section in the docs.

+ **Q: How can I build knowledge packs for my package(s)?**
+ A: Check out the [DocsScraper.jl](https://github.com/JuliaGenAI/DocsScraper.jl) package; it's what we use to build the knowledge packs loaded in this package!

## Future Directions

AIHelpMe is continuously evolving. Future updates may include:
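The new FAQ entry added above points to DocsScraper.jl for building custom knowledge packs. Below is a minimal sketch of what that could look like; the `make_knowledge_packs` entry point, its `index_name` keyword, and the URL/pack name are assumptions, so verify them against the DocsScraper.jl README before use.

```julia
using DocsScraper

# Hedged sketch: `make_knowledge_packs` and the `index_name` keyword are assumed;
# the documentation URL and pack name below are placeholders.
crawlable_urls = ["https://mypackage.github.io/MyPackage.jl/stable/"]  # placeholder docs site

index_path = make_knowledge_packs(crawlable_urls; index_name = "mypackage")
```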
src/pipeline_defaults.jl (3 changes: 1 addition & 2 deletions)
@@ -156,7 +156,7 @@ function update_pipeline!(
@warn "Invalid configuration for knowledge packs! For `nomic-embed-text`, `embedding_dimension` must be 0. See the available artifacts."
end
if model_embedding == "text-embedding-3-large" &&
- (embedding_dimension ∉ [1024, 0] || !isnothing(embedding_dimension))
+ !(embedding_dimension in [1024, 0] || isnothing(embedding_dimension))
@warn "Invalid configuration for knowledge packs! For `text-embedding-3-large`, `embedding_dimension` must be 0 or 1024. See the available artifacts."
end
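For context, here is a minimal sketch of the corrected guard above: it only warns when `embedding_dimension` is explicitly set to something other than 0 or 1024. The helper name is hypothetical and used only for illustration.

```julia
# Minimal sketch of the corrected condition for `text-embedding-3-large`
# (helper name is hypothetical; the expression mirrors the added line above).
invalid_3large_dimension(dim) = !(dim in [1024, 0] || isnothing(dim))

invalid_3large_dimension(1024)     # false -> no warning
invalid_3large_dimension(0)        # false -> no warning
invalid_3large_dimension(nothing)  # false -> no warning (dimension not specified)
invalid_3large_dimension(512)      # true  -> triggers the invalid-configuration warning
```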

@@ -184,7 +184,6 @@ function update_pipeline!(
## Update GLOBAL variables
MODEL_CHAT = model_chat
MODEL_EMBEDDING = model_embedding
- @info embedding_dimension
EMBEDDING_DIMENSION = embedding_dimension

## Set the options
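To show how the validated keywords fit together, here is a hedged usage sketch. The keyword names come from the diff above, but the full signature of `update_pipeline!` (including any positional arguments) and the chat-model alias are assumptions; consult the AIHelpMe docs for the exact call.

```julia
using AIHelpMe

# Hedged sketch: keyword names are taken from the diff above; whether a
# keyword-only call is accepted, and the chat-model alias, are assumptions.
AIHelpMe.update_pipeline!(;
    model_chat = "gpt4o",                        # assumed model alias
    model_embedding = "text-embedding-3-large",
    embedding_dimension = 1024)                  # 1024 (or 0) passes the corrected check
```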