jeffmaury/ai-lab-recipes

Examples for building and running LLM services and applications locally with Podman
Locallm

This repo contains artifacts for building and running LLM (Large Language Model) services locally on your Mac with Podman. These containerized LLM services help developers quickly prototype new LLM-based applications without relying on externally hosted services. Because the services are already containerized, they also make the move from prototype to production faster.
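As a rough sketch of the workflow the recipes follow, a service image is built and run with Podman along these lines. The image tag, Containerfile name, and port below are hypothetical; substitute the values from the recipe you are following.

```shell
# Build the service image from the recipe's Containerfile
# (names and port are placeholders, not the repo's actual values).
podman build -t locallm-service -f Containerfile .

# Run it locally, publishing the app's port to the host.
podman run -d -p 7860:7860 locallm-service
```

Once the container is up, the service is reachable from the host at the published port, with no external hosting involved.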

Current Locallm Services:

Chatbot

A simple chatbot using the Gradio UI. Learn how to build and run this model service here: Chatbot.
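In outline, a Gradio chatbot wraps a response function in a chat component. This is a minimal sketch, not the repo's actual code: `respond` is a stub standing in for the call to the local model service, and the exact wiring in the recipe may differ.

```python
# Minimal sketch of a Gradio chat app; `respond` is a placeholder for
# the call to the local containerized LLM service.

def respond(message, history):
    # Stub: echo the user's message until wired to a real model backend.
    return f"You said: {message}"

def build_demo():
    import gradio as gr  # assumed available inside the container image
    # ChatInterface calls respond(message, history) for each user turn.
    return gr.ChatInterface(respond)

# build_demo().launch(server_name="0.0.0.0")  # uncomment to serve the UI
```

Keeping the model call behind a single function like `respond` makes it easy to swap the stub for a request to whichever model service the container exposes.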

Text Summarization

An LLM app that can summarize arbitrarily long text inputs. Learn how to build and run this model service here: Text Summarization.
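Handling arbitrarily long inputs generally means splitting the text into model-sized pieces and condensing the partial summaries. The following is a sketch of that chunk-and-combine strategy under assumed parameters; the model call is stubbed out, and the repo's app may use a different chunking scheme.

```python
# Sketch of chunk-and-combine summarization for inputs longer than the
# model's context window. summarize_chunk is a placeholder for a request
# to the local model service.

def chunk_text(text, max_words=256):
    """Split text into pieces of at most max_words words."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def summarize_chunk(chunk):
    # Stub: a real implementation would query the containerized LLM.
    return chunk[:80]

def summarize_long_text(text, max_words=256):
    chunks = chunk_text(text, max_words)
    partial = " ".join(summarize_chunk(c) for c in chunks)
    # If the input spanned multiple chunks, condense the partial
    # summaries again until everything fits in a single chunk.
    return summarize_long_text(partial, max_words) if len(chunks) > 1 else partial
```

Because each pass shrinks the text, the recursion terminates once the combined partial summaries fit into one chunk.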

Fine Tuning

This application lets a user select a model and a dataset to fine-tune that model on. When the job finishes, it outputs a new fine-tuned model that the user can apply to the other LLM services. Learn how to build and run this model training job here: Fine-tuning.
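Conceptually, the training job runs as a one-shot container that reads the base model and dataset from mounted volumes and writes the fine-tuned model back out. The image name, mount paths, and environment variables below are hypothetical placeholders; the actual ones come from the Fine-tuning recipe.

```shell
# Hypothetical one-shot fine-tuning run (all names are placeholders):
# inputs are mounted read-only where possible, and the tuned model is
# written to the mounted models directory.
podman run --rm \
  -v ./data:/data:Z \
  -v ./models:/models:Z \
  -e BASE_MODEL=/models/base-model \
  -e DATASET=/data/train.jsonl \
  -e OUTPUT_DIR=/models/fine-tuned \
  locallm-finetune
```

Because the output lands on a mounted volume, the resulting model survives the container and can be handed to the chatbot or summarization services.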
