Ollama GUI: A Web Interface for chatting with your local LLMs via the ollama API

Ollama GUI is a web interface for ollama.ai, a tool that enables running Large Language Models (LLMs) on your local machine.

Installation

Prerequisites

  1. Download and install the ollama CLI, then pull a model and start the server:

    ollama pull <model-name>
    ollama serve
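
Once ollama serve is running, a web UI like this one can talk to it over plain HTTP on the default port 11434. The sketch below is not code from this repository; it is a minimal, hypothetical TypeScript example of calling ollama's /api/chat endpoint the way a chat GUI would (the helper names and the hard-coded URL are assumptions):

```typescript
// Minimal sketch of chatting with a local ollama server over its HTTP API.
// Assumes `ollama serve` is running on the default port 11434.
const OLLAMA_URL = "http://localhost:11434";

interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

// Build the JSON body for ollama's /api/chat endpoint.
// stream: false asks for a single complete response instead of chunks.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// Send the conversation so far and return the assistant's reply text.
async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  const data = await res.json();
  return data.message.content;
}
```

A streaming UI would instead set stream: true and read the response body incrementally, appending each chunk to the chat window as it arrives.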

Getting Started

  1. Clone the repository and start the development server.

    git clone git@github.com:HelgeSverre/ollama-gui.git
    cd ollama-gui
    yarn install
    yarn dev
  2. Or use the hosted web version by allowing its origin when starting the server (see the ollama docs):

    OLLAMA_ORIGINS=https://ollama-gui.vercel.app ollama serve

To-Do List

  • Properly format newlines in chat messages (the equivalent of PHP's nl2br)
  • Browse and download available models
  • Store chat history using IndexedDB
  • Ensure mobile responsiveness
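
The first to-do item above can be sketched in a few lines. This is a hypothetical helper, not code from this repository: it HTML-escapes the message first (so model output can't inject markup), then converts newlines to <br> tags, mirroring what PHP's nl2br does:

```typescript
// Escape the characters that are significant in HTML before rendering.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

// nl2br-style formatting: escape first, then turn every newline
// (\r\n, \n, or \r) into a <br> tag followed by a real newline.
function nl2br(text: string): string {
  return escapeHtml(text).replace(/\r\n|\n|\r/g, "<br>\n");
}
```

In a Vue component this would feed a v-html binding; escaping before inserting <br> is what keeps that binding safe.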

Built With


License

Licensed under the MIT License. See the LICENSE.md file for details.
