Use locally hosted or private cloud scaled up LLM to securely chat with your local documents

msouvikrepo/doc-chat-local


Chat securely with your documents using private LLM

Solution Architecture

[doc-chat architecture diagram]

Tech Stack

  1. Python 3.10 or higher
  2. Pinecone as the vector store
  3. Hugging Face Transformers for deploying the LLM locally (we use Google's Flan-T5 Base model with 248M parameters (~3 GB), so that a laptop doesn't die)
  4. LangChain for the conversation-chain wrapper around the LLM
  5. Streamlit for the UI
  6. AWS SageMaker for the scaled-up version
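At the core of this stack is vector retrieval: Pinecone stores one embedding per document chunk and returns the chunks closest to the query embedding. A dependency-free sketch of that idea, using toy 3-dimensional vectors in place of real model embeddings (`cosine` and `top_k` are illustrative helpers, not part of this repo):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query_vec, index, k=2):
    # index: list of (chunk_text, embedding) pairs, as a vector store holds them.
    scored = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in scored[:k]]

# Toy "embeddings" standing in for real model output.
index = [
    ("invoice terms", [0.9, 0.1, 0.0]),
    ("holiday policy", [0.0, 0.8, 0.2]),
    ("payment schedule", [0.8, 0.2, 0.1]),
]
print(top_k([1.0, 0.0, 0.0], index, k=2))  # → ['invoice terms', 'payment schedule']
```

Pinecone performs the same nearest-neighbour search at scale, with approximate indexing instead of a full sort.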

Running it locally

Create and activate a virtual environment:

virtualenv .venv
source .venv/bin/activate

Install requirements:

pip3 install -r requirements.txt

Keep your documents under doc_chat/assets

Index the documents to create embeddings and store them in Pinecone:

python3 indexing.py
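Indexing typically reads each file from doc_chat/assets, splits it into overlapping chunks, embeds each chunk, and upserts the vectors into Pinecone. A dependency-free sketch of the splitting step (`chunk_text` is illustrative, not the repo's actual helper; sizes are assumptions):

```python
def chunk_text(text, size=500, overlap=100):
    # Split text into overlapping windows so that content near a chunk
    # boundary still appears with context in the neighbouring chunk.
    # overlap must be smaller than size so each step makes progress.
    assert 0 <= overlap < size
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

# chunk_text("abcdefghij", size=4, overlap=2)
# → ['abcd', 'cdef', 'efgh', 'ghij', 'ij']
```

The overlap is what keeps a sentence that straddles a boundary retrievable: it is embedded whole in at least one of the two adjacent chunks.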

Chat with the documents:

streamlit run main.py
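Under the Streamlit UI, the conversation chain condenses the retrieved chunks and the chat history into a single prompt for the LLM. A minimal sketch of that assembly step, assuming a plain prompt template (`build_prompt` is hypothetical, not the repo's code; LangChain does this internally):

```python
def build_prompt(history, context_chunks, question):
    # Fold retrieved chunks and prior turns into one prompt string,
    # roughly what a conversational retrieval chain hands to the model.
    context = "\n".join(f"- {c}" for c in context_chunks)
    turns = "\n".join(f"{role}: {msg}" for role, msg in history)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Conversation so far:\n{turns}\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_prompt(
    history=[("user", "What is the notice period?"), ("assistant", "30 days.")],
    context_chunks=["Notice period is 30 days.", "Payment due net 15."],
    question="And the payment terms?",
)
print(prompt)
```

Keeping the history in the prompt is what lets follow-up questions like "And the payment terms?" resolve against earlier turns.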

Snapshots

Sample user Q&A:

Screenshot from 2023-10-01 08-32-31

Fetching query-relevant context from documents:

Screenshot from 2023-10-01 08-34-31
