InterSystems @ treehacks-2024

InterSystems is proud to be a sponsor of TreeHacks 2024!

We are excited to introduce the hordes of hackers gathered here to InterSystems IRIS, and we're offering great cash prizes for the brave souls who take up the InterSystems TreeHacks Challenge!

What is InterSystems IRIS and IRIS Vector Search?

InterSystems IRIS is a rock-solid, wicked-fast, Swiss-army-knife data platform that you can run anywhere: on-premises, in your cloud, or through our Cloud Services! We are launching new Vector Search capabilities to seamlessly power your unstructured data search and Gen AI Retrieval Augmented Generation (RAG) applications in the same platform where you manage the rest of your data: transactions, documents, analytics, and more!

InterSystems IRIS Vector Search Quickstart

  1. Clone the repo:

    git clone https://github.com/alvin-isc/treehacks-2024.git

    In the following steps, replace 'PATH-TO-REPO' with the path to the repo.

  2. Install IRIS (Community Edition) in a container (a quick connection check is sketched after this list):

    docker run -d --name iris-comm -p 1972:1972 -p 52773:52773 intersystemsdc/iris-community:2024.1-preview
    
  3. Create a Python environment (conda, venv, or however you wish). For example:

    conda create --name treehacks-iris python=3.10
    conda activate treehacks-iris
    
  4. Install packages for all demos:

    pip install -r requirements.txt
    
  5. For langchain_demo.ipynb and llama_demo.ipynb, you need an OpenAI API key. Create a .env file in this repo to store the key:

    OPENAI_API_KEY=xxxxxxxxx
    
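Once the container is running and the packages are installed, you can sanity-check the connection from Python before opening the notebooks. This is a minimal sketch, assuming the sqlalchemy-iris dialect is available (pip install sqlalchemy-iris if requirements.txt doesn't already pull it in) and the community container's default _SYSTEM/SYS credentials and USER namespace; adjust those to match your setup.

    # Minimal connection check (assumed credentials and namespace: adjust as needed).
    from sqlalchemy import create_engine, text

    engine = create_engine("iris://_SYSTEM:SYS@localhost:1972/USER")

    with engine.connect() as conn:
        # Any trivial query proves the container is reachable on port 1972.
        print(conn.execute(text("SELECT 1")).scalar())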

Basic Demos

IRIS SQL now supports vector search (with filters)! In this demo, we're searching a whiskey dataset for whiskeys that are priced < $100 and have a taste description similar to "earthy and creamy taste".
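
For a flavor of the SQL involved, here is a hedged sketch (the table and column names are hypothetical stand-ins, not the demo's exact schema): TO_VECTOR turns a comma-separated string of floats into a vector, VECTOR_DOT_PRODUCT ranks rows by similarity, and an ordinary WHERE clause applies the price filter.

    # Hypothetical schema: a 'whiskeys' table with a price column and a
    # description_vector column populated by the same embedding model used below.
    from sqlalchemy import create_engine, text

    engine = create_engine("iris://_SYSTEM:SYS@localhost:1972/USER")  # assumed credentials

    # In practice this comes from your embedding model; shown here as a stub.
    query_vec = ",".join(str(x) for x in [0.12, 0.03, 0.91])

    sql = text("""
        SELECT TOP 3 name, price, description
        FROM whiskeys
        WHERE price < 100
        ORDER BY VECTOR_DOT_PRODUCT(description_vector, TO_VECTOR(:qv, double)) DESC
    """)

    with engine.connect() as conn:
        for row in conn.execute(sql, {"qv": query_vec}):
            print(row)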

IRIS now has a langchain integration as a VectorDB! In this demo, we use the langchain framework with IRIS to ingest and search through a document.
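
Roughly, that flow looks like the sketch below. It assumes the langchain-iris package (which provides the IRISVector store), OpenAI embeddings, and the assumed _SYSTEM/SYS connection string from the quickstart; the file path and collection name are placeholders, and exact import paths vary across langchain releases, so treat langchain_demo.ipynb as the authoritative version.

    from dotenv import load_dotenv
    from langchain_community.document_loaders import TextLoader
    from langchain_openai import OpenAIEmbeddings
    from langchain.text_splitter import RecursiveCharacterTextSplitter
    from langchain_iris import IRISVector

    load_dotenv()  # picks up OPENAI_API_KEY from the .env file created in step 5

    # Load and chunk a document (the path is a placeholder).
    docs = TextLoader("data/sample.txt").load()
    chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

    # Embed the chunks and store them in IRIS as a vector collection.
    db = IRISVector.from_documents(
        documents=chunks,
        embedding=OpenAIEmbeddings(),
        collection_name="treehacks_docs",  # placeholder collection name
        connection_string="iris://_SYSTEM:SYS@localhost:1972/USER",  # assumed credentials
    )

    # Similarity search against the stored vectors.
    for doc in db.similarity_search("What is this document about?", k=3):
        print(doc.page_content[:200])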

IRIS now has a llama_index integration as a VectorDB! In this demo, we use the llama_index framework with IRIS to ingest and search through a document.
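
The llama_index version follows the same shape, sketched below under similar assumptions: the llama-iris package supplies the IRISVectorStore, the data directory and table name are placeholders, and import paths differ between llama_index releases (pre- and post-0.10), so defer to llama_demo.ipynb if anything doesn't line up.

    from dotenv import load_dotenv
    from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
    from llama_iris import IRISVectorStore

    load_dotenv()  # OPENAI_API_KEY for the default embedding model and LLM

    # 'data/' is a placeholder directory holding the document(s) to ingest.
    documents = SimpleDirectoryReader("data").load_data()

    # Store the embeddings in IRIS; embed_dim must match the embedding model
    # (1536 for OpenAI's text-embedding-ada-002 default).
    vector_store = IRISVectorStore.from_params(
        connection_string="iris://_SYSTEM:SYS@localhost:1972/USER",  # assumed credentials
        table_name="treehacks_docs",  # placeholder table name
        embed_dim=1536,
    )
    storage_context = StorageContext.from_defaults(vector_store=vector_store)
    index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

    # RAG-style query: retrieve similar chunks from IRIS, then answer with the LLM.
    print(index.as_query_engine().query("What is the document about?"))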

Which to use?

If you need to use hybrid search (similarity search with filters), use IRIS SQL.

If you're building a genAI app that uses a variety of tools (agents, chained reasoning, API calls), go for langchain.

If you're building a RAG app, go for llama_index.

Feel free to contact Alvin / Thomas if you have any questions!

More Demos / References:

Uses langchain-iris to search YouTube audio transcriptions

Original IRIS langchain demo, which runs the containerized IRIS in the notebook

Original IRIS llama_index demo, which runs the containerized IRIS in the notebook

Official page for InterSystems Documentation
