
@mberman84
Created July 16, 2024 19:31
RouteLLM Script
import os

# Set the API keys before creating the Controller so RouteLLM can reach both providers.
os.environ["OPENAI_API_KEY"] = "XXX"
os.environ["GROQ_API_KEY"] = "YYY"

from routellm.controller import Controller

# Route between a strong (OpenAI) and a weak (Groq-hosted Llama 3 8B) model.
client = Controller(
    routers=["mf"],
    strong_model="gpt-4-1106-preview",
    weak_model="groq/llama3-8b-8192",
)

response = client.chat.completions.create(
    # This tells RouteLLM to use the MF router with a cost threshold of 0.11593
    model="router-mf-0.11593",
    messages=[
        {"role": "user", "content": "Write the game snake in python"}
    ],
)

message_content = response['choices'][0]['message']['content']
model_name = response['model']

print(f"Message content: {message_content}")
print(f"Model name: {model_name}")
@Dcamy

Dcamy commented Jul 23, 2024

I wonder how I could integrate this into this project:

https://github.com/Dcamy/iBot
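
One way this could plug into another project is to keep the gist's Controller in one small module and expose a single function the rest of the codebase imports. This is only a sketch: the module name routellm_helper, the route_chat function, and the 0.11593 default threshold are illustrative choices, not anything RouteLLM or iBot defines.

# routellm_helper.py - illustrative wrapper around the gist's client
import os
from routellm.controller import Controller

# Keys are expected to already be set in the environment.
_client = Controller(
    routers=["mf"],
    strong_model="gpt-4-1106-preview",
    weak_model="groq/llama3-8b-8192",
)

def route_chat(prompt: str, threshold: float = 0.11593) -> str:
    """Send one user prompt through the MF router and return the reply text."""
    response = _client.chat.completions.create(
        model=f"router-mf-{threshold}",
        messages=[{"role": "user", "content": prompt}],
    )
    return response['choices'][0]['message']['content']

# In the other project:
#   from routellm_helper import route_chat
#   reply = route_chat("Summarize the last user message")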

@rasingollam

Thank you, this is a good way to build agents.

@GaryOcean428

Grrr, I broke it. Or the commits broke it going from local to GitHub. Any tricks to avoid commit issues?
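
If the breakage is the common case of a hardcoded API key getting pushed (GitHub can reject pushes that contain secrets), one trick is to keep the keys out of the script entirely. The sketch below assumes python-dotenv and a .env file listed in .gitignore, neither of which the gist itself uses.

# Illustrative only: load keys from a local .env file instead of hardcoding them.
# pip install python-dotenv
import os
from dotenv import load_dotenv

load_dotenv()  # pulls OPENAI_API_KEY / GROQ_API_KEY from the .env file

assert os.environ.get("OPENAI_API_KEY"), "OPENAI_API_KEY missing from .env"
assert os.environ.get("GROQ_API_KEY"), "GROQ_API_KEY missing from .env"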

@duongtrung

It seems like the Controller only supports two models, strong_model and weak_model. How can we add more models and set different routing strategies?
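
As far as I can tell, RouteLLM routes between exactly one strong/weak pair per Controller, so strong_model and weak_model are the whole pair; what you can vary is the routing strategy per request, and, with extra Controller instances, the pair itself. A rough sketch follows; the "bert" router and the extra model names are assumptions, so check which routers and providers your install actually supports.

from routellm.controller import Controller

# One Controller can register several routing strategies; the strategy and
# cost threshold are then chosen per request through the model string.
client = Controller(
    routers=["mf", "bert"],
    strong_model="gpt-4-1106-preview",
    weak_model="groq/llama3-8b-8192",
)

mf_response = client.chat.completions.create(
    model="router-mf-0.11593",   # MF router at this threshold
    messages=[{"role": "user", "content": "Write the game snake in python"}],
)
bert_response = client.chat.completions.create(
    model="router-bert-0.5",     # same model pair, different routing strategy
    messages=[{"role": "user", "content": "Write the game snake in python"}],
)

# For more than one strong/weak pair, one workaround is a Controller per pair.
controllers = {
    "code": Controller(routers=["mf"],
                       strong_model="gpt-4-1106-preview",
                       weak_model="groq/llama3-8b-8192"),
    "chat": Controller(routers=["mf"],
                       strong_model="gpt-4o",
                       weak_model="groq/llama3-70b-8192"),
}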
