If you look up Open WebUI [1] and Ollama [2] on the internet, you’ll probably find examples that all use Docker. What about Podman? Well, here you go!

Installation

We’re using Homebrew, of course.

brew install ollama
brew install podman-desktop
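
If you’d rather not keep a terminal open for the Ollama server, the Homebrew formula can also run it as a background service:

brew services start ollama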

I’ll not go into the details of setting up Podman Desktop [3] and Ollama here because they both have pretty good online documentation available (see resources at the bottom).

Configuration details

Ollama

By default, Ollama will be listening on localhost:11434. Make sure you start it before running the container, although Open WebUI should pick it up afterwards anyway.
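
To confirm the server is up, you can hit the API root with curl (assuming the default port); it should answer with “Ollama is running”:

curl http://localhost:11434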

Open WebUI Container

The container will run in rootless mode under our user. We’ll use localhost:3000 in the browser; that port is forwarded to the container port 8080 internally. To keep the configuration data, e.g. user data and settings, we’ll create a directory in our user’s $HOME.

Setup and Start

Pick an LLM that suits your needs - I’m going with Llama 3 8B.

ollama run llama3:8b
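
Note that ollama run downloads the model on first use and then drops you into an interactive chat. If you only want to fetch the model for later, ollama pull does the download without starting a session:

ollama pull llama3:8b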

Start the Podman machine

podman machine start
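
On a fresh installation there won’t be a machine yet; in that case, create one first (a one-time step) before starting it:

podman machine init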

Create the data directory

mkdir -p ~/podman/open-webui

Run the Open WebUI Container

podman run -d \
  --name open-webui \
  -p 3000:8080 \
  --add-host localhost:127.0.0.1 \
  -v ~/podman/open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
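
If the UI comes up but can’t see your models, it may be that localhost inside the container isn’t reaching the host. A sketch of a common workaround, assuming Podman’s built-in host.containers.internal hostname resolves to the host from inside the container, is to point Open WebUI at Ollama explicitly via its OLLAMA_BASE_URL environment variable:

podman run -d \
  --name open-webui \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.containers.internal:11434 \
  -v ~/podman/open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main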

Upon accessing localhost:3000, you’ll be prompted to create an account.
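
If nothing loads, you can check that the container is running and tail its logs:

podman ps
podman logs -f open-webui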


Resources