Install packages:

```shell
pip install open-webui mlx-lm
```
Start the Open WebUI server:

```shell
open-webui serve
```
Set up the MLX LM server connection:

- Click your profile icon -> Settings -> Connections
- Click the plus sign to add a new connection
- Enter the mlx-lm server address (e.g. http://127.0.0.1:8090/v1)
- Enter `none` for the API key
- Click Save
Start the mlx-lm server:

```shell
mlx_lm.server --port 8090
```
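Once the server is running you can also query it directly, since it exposes an OpenAI-compatible chat completions endpoint. A minimal sketch using only the standard library, assuming the port from the command above and an example model name:

```python
# Sketch: call the mlx-lm server's OpenAI-compatible
# /v1/chat/completions endpoint. The port (8090) matches the
# server command above; the model name is just an example.
import json
import urllib.request


def build_chat_request(base_url, model, prompt):
    """Build the (url, payload) pair for a chat completion call."""
    url = f"{base_url}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }
    return url, payload


if __name__ == "__main__":
    url, payload = build_chat_request(
        "http://127.0.0.1:8090/v1",
        "mlx-community/gemma-3-4b-it-qat-4bit",
        "Hi",
    )
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```

This is the same API Open WebUI talks to behind the scenes, so it is a quick way to check the connection settings before chatting in the UI.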
Choose an MLX LM model and start chatting.
Note that only models you have already downloaded show up in the list. If you don't have any downloaded models, use the mlx-lm CLI to download one first. For example:

```shell
mlx_lm.generate --model mlx-community/gemma-3-4b-it-qat-4bit --prompt "Hi"
```
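If you want to check which models are already downloaded, a small sketch that scans the local Hugging Face cache works. It assumes the default cache location (`~/.cache/huggingface/hub`) and the standard `models--{org}--{name}` directory naming; adjust the path if you have set `HF_HOME`:

```python
# Sketch: list models already in the local Hugging Face cache.
# Assumes the default cache layout (~/.cache/huggingface/hub with
# directories named models--{org}--{name}).
from pathlib import Path


def repo_id_from_cache_dir(name):
    """Convert a cache directory name back to a Hugging Face repo id."""
    return name.removeprefix("models--").replace("--", "/")


def list_cached_models(cache=Path.home() / ".cache" / "huggingface" / "hub"):
    """Return repo ids for every model directory found in the cache."""
    return [
        repo_id_from_cache_dir(p.name)
        for p in sorted(cache.glob("models--*"))
        if p.is_dir()
    ]


if __name__ == "__main__":
    for repo_id in list_cached_models():
        print(repo_id)
```

Any repo id printed here should also appear in Open WebUI's model list once the mlx-lm server connection is configured.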