Alfred Ollama

Use Ollama for local Llama inference in Alfred.

Requirements

Ollama must be installed and running on your Mac. At least one model needs to be installed through the Ollama CLI.

Required model: llama2

ollama run llama2
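
If llama2 has not been downloaded yet, ollama run llama2 pulls it on first use. To verify that Ollama is serving requests and that the model is available, you can use the standard Ollama CLI and its local API (these are Ollama's own tools, not part of this workflow):

ollama list
curl http://localhost:11434/api/tags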

How to Use

Command: Chat With Ollama

The keyword is olla; type it in Alfred to chat with Ollama.
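
Under the hood, the workflow sends your prompt to Ollama's local HTTP API. As a minimal sketch of the kind of request involved, using Ollama's documented generate endpoint and default port (an illustration, not code taken from this workflow):

curl http://localhost:11434/api/generate -d '{"model": "llama2", "prompt": "Why is the sky blue?", "stream": false}'

With "stream": false, the reply arrives as a single JSON object whose response field contains the model's answer.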

WIP

  • Add support for multiple models (currently only one model, llama2, is supported)
