
Open WebUI - Extensible, Feature-Rich, and User-Friendly Self-Hosted WebUI

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs. Key features include:

- Effortless setup using Docker or Kubernetes, with support for both :ollama and :cuda tagged images.
- Integration with OpenAI-compatible APIs for versatile conversations alongside Ollama models (see the sketch below).
- Granular permissions and user groups for a secure multi-user environment.
- Responsive design for a seamless experience across devices, plus a Progressive Web App that provides offline access on mobile.
- Full Markdown and LaTeX support for enriched interaction.
- Hands-free voice and video call features for dynamic communication.
- A model builder for creating Ollama models directly from the Web UI.
- Native Python function calling tools for seamless integration with LLMs.
- Local RAG integration for chats grounded in your own documents.

Be sure to check out the Open WebUI Documentation for more information.
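For a concrete sense of what "OpenAI-compatible" means here: any standard OpenAI client can talk to such an endpoint, and Open WebUI can be pointed at one in its connection settings. Below is a minimal sketch using the openai Python package against a locally running Ollama endpoint; the base URL, model name, and placeholder API key are assumptions you would adjust to your own deployment.

```python
# Minimal sketch: query a local OpenAI-compatible endpoint (here, Ollama's).
# The base_url, api_key, and model name below are assumptions for illustration;
# adjust them to match your own setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local Ollama endpoint
    api_key="ollama",                      # placeholder; local runners typically ignore it
)

response = client.chat.completions.create(
    model="llama3",  # assumed model already pulled into Ollama
    messages=[{"role": "user", "content": "Summarize what Open WebUI does."}],
)
print(response.choices[0].message.content)
```

Because the interface is the same as OpenAI's, swapping between a hosted provider and a local model is mostly a matter of changing the base URL and model name.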