LLMMultiRoundRouter (inference)

LLMMultiRoundRouter is an inference-time, prompt-based multi-round router. It does not require training artifacts.

Notebook: https://github.com/ulab-uiuc/LLMRouter/blob/main/notebooks/llmmultiroundrouter/01_llmmultiroundrouter_inference.ipynb

Router docs: https://github.com/ulab-uiuc/LLMRouter/blob/main/llmrouter/models/llmmultiroundrouter/README.md

Config
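
The CLI commands below point at configs/model_config_test/llmmultiroundrouter.yaml. As a rough orientation, a config for this router might look like the sketch below; the field names (router_llm, candidate_models, max_rounds) are assumptions for illustration, not the authoritative schema — see the router README linked above for the actual fields.

router_llm:                # LLM used to make the routing decision (assumed field)
  model: gpt-4o-mini
  api_key: ${OPENAI_API_KEY}
candidate_models:          # pool of models the router can choose from (assumed field)
  - gpt-4o
  - llama-3.1-8b-instruct
max_rounds: 3              # maximum number of routing/refinement rounds (assumed field)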

Run (CLI)

Route-only (return the routing decision without generating a response):

llmrouter infer --router llmmultiroundrouter --config configs/model_config_test/llmmultiroundrouter.yaml --query "Summarize the pros and cons of RLHF." --route-only

Full inference (route the query and generate a response with the selected model):

llmrouter infer --router llmmultiroundrouter --config configs/model_config_test/llmmultiroundrouter.yaml --query "Summarize the pros and cons of RLHF."