Open-webui on a personal laptop

In 2024, Large Language Models (LLMs) and Generative AI (GenAI) exploded at an unimaginable rate. I didn't follow the trend. There is news every day about new models, and the explosion has reached a stage where a local MacBook can run a decent enough model. I wanted a local model with a clean user interface, through the web or the terminal.

I stumbled upon open-webui.

The project describes itself as: "Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs." For more information, check out the Open WebUI Documentation.

I had previously tried the llm Python package to run standalone models.

Installing open-webui

The llm package was set up using uv and Python 3.12. Adding open-webui to the existing environment failed because of a CTranslate2 version compatibility issue, so I had to run both the llm package and open-webui under Python 3.11. After installing open-webui, I expected it to pick up the Llama model from the llm package installation in ~/Library/Application\ Support/io.datasette.llm/. That didn't work, so I installed the Ollama mac package with the Llama 3.2 model.
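The workaround can be sketched as a few shell commands. This is a minimal sketch, assuming uv manages the Python version; the exact directory layout is up to you:

```shell
# Create a fresh virtual environment pinned to Python 3.11,
# since the CTranslate2 dependency had no compatible build
# under Python 3.12 in my setup.
uv venv --python 3.11
source .venv/bin/activate

# Install both packages into the same 3.11 environment.
uv pip install llm open-webui
```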

Then Open WebUI picked up the model (see the top left corner of the image) without any configuration changes. A simple uv run open-webui serve runs Open WebUI on the local machine.
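For reference, pulling the model and starting the server looked roughly like this (the llama3.2 model tag is what the Ollama registry uses for Llama 3.2):

```shell
# Fetch the Llama 3.2 model via the Ollama CLI
# (installed with the Ollama macOS package).
ollama pull llama3.2

# Start Open WebUI; it auto-detects the local Ollama
# instance and serves the web UI on localhost.
uv run open-webui serve
```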

Open WebUI running on a laptop

I tried a simple question: When did New Year become an important global festival? Explain the key historical events.

Here is the answer:

New year answer - Part 1

New year answer - Part 2

The interface looks similar to ChatGPT and is usable for long chats.

The voice-to-text transcription on the home page was sub-par. I asked, explain the beginning of New Year and major historical events around it. The transcription was outright wrong: it skipped the first part of the voice message and interpreted New Year as the Holi festival.

Holi