
@awni
Created April 25, 2025 15:41
Open WebUI with MLX LM

Setup

Install packages:

pip install open-webui mlx-lm

Start Open WebUI server:

open-webui serve

Set up the connection to the MLX LM server:

  1. Click your profile icon -> Settings -> Connections
  2. Click the plus sign to add a new connection
  3. Enter the mlx-lm server address (e.g. http://127.0.0.1:8090/v1)
  4. Enter none for the API key
  5. Click Save

Start the mlx-lm server:

mlx_lm.server --port 8090
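
With the server running, you can confirm that the address you gave Open WebUI is reachable by listing the server's models through its OpenAI-compatible API. A minimal sketch using only the Python standard library (the port matches the command above; adjust it if you changed it):

```python
import json
import urllib.error
import urllib.request

def list_models(base_url):
    """Return the model list from an OpenAI-compatible server, or None if unreachable."""
    try:
        with urllib.request.urlopen(base_url + "/models", timeout=5) as resp:
            return json.load(resp)
    except (urllib.error.URLError, OSError):
        return None

models = list_models("http://127.0.0.1:8090/v1")
print(models if models is not None else "server not reachable")
```

If this prints "server not reachable", check that mlx_lm.server is still running and that the port matches the one you entered in Open WebUI.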

Choose an MLX LM model and start chatting.
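
Chat requests from Open WebUI go to the server's /v1/chat/completions endpoint, so any OpenAI-compatible client can talk to it as well. A sketch of the request shape using only the standard library (the model name is an example; substitute one you have downloaded):

```python
import json
import urllib.request

def build_chat_request(model, prompt, max_tokens=128):
    """Assemble an OpenAI-style chat completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def send_chat_request(base_url, payload):
    """POST the payload to the server and return the parsed JSON response."""
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request("mlx-community/gemma-3-4b-it-qat-4bit", "Hi")
print(json.dumps(payload, indent=2))

# With mlx_lm.server running on port 8090:
# response = send_chat_request("http://127.0.0.1:8090/v1", payload)
# print(response["choices"][0]["message"]["content"])
```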

Note: only models that have already been downloaded show up in the list. If you don't have any downloaded models yet, use the mlx-lm CLI to fetch one. For example:

mlx_lm.generate --model mlx-community/gemma-3-4b-it-qat-4bit --prompt "Hi"