Created
August 20, 2024 15:43
Meta Llama 3.1 with MLX LM and the MLX Python API as Context
```python
import os

import mlx.core as mx
from mlx_lm import load, generate

# Use the MLX Python API type stubs (which include the docstrings) as context
filename = os.path.join(os.path.dirname(mx.__file__), "core/__init__.pyi")
with open(filename, "r") as fid:
    prompt = fid.read()

prompt += "\nHow do you write a self-attention layer using the above API in MLX?"

model, tokenizer = load("mlx-community/meta-Llama-3.1-8B-Instruct-4bit")

# Wrap the prompt in the model's chat template before generating
messages = [{"role": "user", "content": prompt}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

generate(
    model,
    tokenizer,
    prompt,
    max_tokens=512,
    verbose=True,
    temp=0.0,
    max_kv_size=4096,
)
```
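For reference, the computation the prompt asks the model to produce follows the standard scaled dot-product attention pattern. Here is a minimal single-head sketch in plain NumPy rather than MLX, so it runs anywhere; the weight shapes and random inputs are purely illustrative:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention.

    x: (seq_len, dims) input; w_q, w_k, w_v: (dims, dims) projections.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Attention scores, scaled by sqrt of the feature dimension
    scores = q @ k.T / np.sqrt(x.shape[-1])
    # Row-wise softmax (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # (seq_len, dims)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
assert out.shape == (4, 8)
```

An MLX version would look nearly identical, swapping `numpy` for `mlx.core` (both use `@` for matmul and broadcast the same way), which is part of why the stub file works well as context.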
Have you or anyone else found that the method above (pulling the MLX API surface straight from the library) works as the best docs for porting code to MLX?
I haven't tried much there tbh. The API I use above includes the docstrings (from which a lot of the docs are autogenerated) so there would be substantial overlap between using that and using the actual docs.
I've been using the MLX .md docs and formatting them with structure via https://github.com/simonw/files-to-prompt
Using the docstrings is a good idea.
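The docstring approach can also be done programmatically instead of reading the `.pyi` stub. A hedged sketch (shown against the stdlib `json` module so it runs without MLX installed; `collect_docstrings` and its `limit` parameter are hypothetical names, not part of any library):

```python
import inspect
import json

def collect_docstrings(module, limit=5):
    """Gather 'name: first docstring line' for public functions in a module."""
    entries = []
    for name, obj in inspect.getmembers(module, inspect.isfunction):
        if name.startswith("_"):
            continue  # skip private helpers
        doc = inspect.getdoc(obj)
        if doc:
            entries.append(f"{name}: {doc.splitlines()[0]}")
        if len(entries) >= limit:
            break
    return "\n".join(entries)

# Build a compact API summary suitable for pasting into a prompt
context = collect_docstrings(json)
print(context)
```

The same call with `mlx.core` in place of `json` would yield a docstring summary of the MLX API, with substantial overlap with the autogenerated docs as noted above.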