Routers

Summary

Routers implement the routing interface and are addressed by name in the CLI. Some routers are trainable, while others are inference-only baselines or pretrained models.

Sources:

- Inference registry: https://github.com/ulab-uiuc/LLMRouter/blob/main/llmrouter/cli/router_inference.py
- Training registry: https://github.com/ulab-uiuc/LLMRouter/blob/main/llmrouter/cli/router_train.py

Key responsibilities

  • Provide route_single and route_batch
  • Return a model name in the routing output
  • Integrate with CLI registries for training and inference
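The responsibilities above can be sketched as a minimal custom router. This is a hypothetical illustration: the method names route_single and route_batch come from the interface described here, but the class shape and return types are assumptions, not the library's actual base class.

```python
# Hypothetical sketch of the router interface; only the method names
# route_single / route_batch are taken from the docs above.
class CheapestRouter:
    """A toy router that always picks the smallest model."""

    def route_single(self, query: str) -> dict:
        # Return a dict carrying one of the recognized model-name keys
        # (see the routing output contract below).
        return {"model_name": "smallest_llm"}

    def route_batch(self, queries: list[str]) -> list[dict]:
        # Simple batch behavior: route each query independently.
        return [self.route_single(q) for q in queries]
```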

Router table

| Router | Train | Infer | Docs | Notes |
| --- | --- | --- | --- | --- |
| knnrouter | yes | yes | README | baseline, fast |
| svmrouter | yes | yes | README | linear classifier |
| mlprouter | yes | yes | README | MLP classifier |
| mfrouter | yes | yes | README | matrix factorization |
| elorouter | yes | yes | README | pairwise ranking |
| dcrouter | yes | yes | README | alias: routerdc |
| automix | yes | yes | README | alias: automixrouter |
| hybrid_llm | yes | yes | README | alias: hybridllm, optional deps |
| graphrouter | yes | yes | README | alias: graph_router, optional deps |
| causallm_router | yes | yes | README | alias: causallmrouter, optional deps |
| gmtrouter | yes | yes | README | alias: gmt_router, optional deps |
| knnmultiroundrouter | yes | yes | README | multi-round |
| llmmultiroundrouter | no | yes | README | LLM-based multi-round |
| router_r1 | no | yes | README | alias: router-r1, requires api_base and api_key |
| smallest_llm | no | yes | README | baseline |
| largest_llm | no | yes | README | baseline |

The Train and Infer columns reflect the main-branch CLI registries, i.e. what llmrouter train and llmrouter infer support. If you installed a released version from PyPI, run llmrouter list-routers to see what is available in your environment.
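The alias column suggests name-based dispatch with alias resolution. The sketch below is a hypothetical illustration of that pattern; the ALIASES mapping, REGISTRY contents, and resolve() helper are all assumptions for demonstration, not the library's actual code.

```python
# Illustrative name + alias resolution, mirroring the alias column above.
# Mapping contents and helper names are assumptions, not LLMRouter internals.
ALIASES = {
    "routerdc": "dcrouter",
    "automixrouter": "automix",
    "graph_router": "graphrouter",
}

REGISTRY = {
    "dcrouter": "DCRouter",      # name -> implementation (strings as stand-ins)
    "automix": "AutoMix",
    "graphrouter": "GraphRouter",
}

def resolve(name: str):
    """Map an alias to its canonical name, then look it up in the registry."""
    canonical = ALIASES.get(name, name)
    try:
        return REGISTRY[canonical]
    except KeyError:
        raise ValueError(f"Unknown router '{name}'. Known: {sorted(REGISTRY)}")
```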

Routing output contract

Inference extracts the routed model name from one of these keys:

- model_name
- predicted_llm
- predicted_llm_name

Custom routers should return one of these keys to ensure compatibility.
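The key fallback described above can be sketched as follows. The three key names are the ones listed in this section; the helper function itself is a hypothetical illustration, not the library's extraction code.

```python
# Sketch of the key-fallback extraction for routing output.
# The three keys come from the contract above; the helper is illustrative.
ROUTED_MODEL_KEYS = ("model_name", "predicted_llm", "predicted_llm_name")

def extract_model_name(output: dict) -> str:
    """Return the routed model name from whichever recognized key is present."""
    for key in ROUTED_MODEL_KEYS:
        if key in output:
            return output[key]
    raise KeyError(f"Routing output has none of the keys {ROUTED_MODEL_KEYS}")
```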

Optional dependencies

Some routers are optional and may be unavailable if their dependencies are not installed. Use llmrouter list-routers to confirm availability.
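One common pattern for guarding optional routers is an import-availability check before registration. This is an illustrative sketch, not LLMRouter's actual mechanism, and the module names used are assumptions; llmrouter list-routers remains the authoritative source for what is available.

```python
# Illustrative guard for optional dependencies; module names are assumptions.
import importlib.util

def deps_available(*modules: str) -> bool:
    """Return True only if every named module can be imported."""
    return all(importlib.util.find_spec(m) is not None for m in modules)

# e.g. only register a graph-based router when its extra deps are installed
if deps_available("torch"):
    pass  # register the router here
```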

Example configs (on main)

The ready-to-run YAML configs live on the main branch:

- Train configs: https://github.com/ulab-uiuc/LLMRouter/tree/main/configs/model_config_train
- Test/inference configs: https://github.com/ulab-uiuc/LLMRouter/tree/main/configs/model_config_test