feat: add LiteLLM as AI gateway model backend #2441

Open

RheagalFire wants to merge 3 commits into open-compass:main
Conversation
Author
Motivation
OpenCompass currently requires a separate provider file for each LLM backend. Users who want to evaluate models across Azure, Bedrock, Vertex AI, or Groq need provider-specific code. LiteLLM provides a unified completion() interface that handles auth, formatting, and provider-specific quirks, enabling cross-provider evaluation with a single configuration change.
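For context, a minimal sketch of the unified interface this backend wraps. The model strings are illustrative, and provider credentials are assumed to be set in the usual environment variables (OPENAI_API_KEY, ANTHROPIC_API_KEY, GROQ_API_KEY):

```python
# Minimal sketch of LiteLLM's unified completion() interface -- only the
# `model` string changes between providers; auth and request formatting
# are handled by LiteLLM.
import litellm

messages = [{"role": "user", "content": "What is the capital of France?"}]

for model in ("gpt-4o-mini",                            # OpenAI
              "anthropic/claude-3-5-sonnet-20241022",   # Anthropic
              "groq/llama-3.1-70b-versatile"):          # Groq
    resp = litellm.completion(model=model, messages=messages)
    print(model, "->", resp.choices[0].message.content)
```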
Supersedes stale PR #202 (ishaan-jaff, Aug 2023) -- both review points from @gaotongxiao addressed: optional dep in requirements/api.txt (not core), lazy import inside _generate(). Also addresses provider flexibility mentioned in #2147 -- AI/ML API is accessible via LiteLLM as aiml/<model>.

Modification
- opencompass/models/litellm_api.py -- new LiteLLMAPI extending BaseAPIModel (236 lines). Handles all 3 input formats: plain str, CHATML-shaped dicts, and OpenCompass-native PromptList (HUMAN/BOT/SYSTEM); see the translation sketch after this list.
- opencompass/models/__init__.py -- import + registration in alphabetical order
- requirements/api.txt -- added litellm>=1.55,<1.85
- docs/en/user_guides/models.md -- added LiteLLM to supported API providers list
- docs/zh_cn/user_guides/models.md -- added LiteLLM to supported API providers list (Chinese)
- tests/models/test_litellm_api.py -- 18 unit tests covering init, message translation, generation, retry, error handling, registry
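Illustrative sketch (not the actual code in litellm_api.py) of the kind of translation involved, assuming PromptList items carry role/prompt keys as in OpenCompass's other API backends:

```python
# Sketch: normalise the three accepted input formats into OpenAI-style
# chat messages. Assumes PromptList items are dicts with 'role'/'prompt'
# keys; the real implementation may differ.
from typing import Union
from opencompass.utils.prompt import PromptList

ROLE_MAP = {'HUMAN': 'user', 'BOT': 'assistant', 'SYSTEM': 'system'}

def to_chat_messages(prompt: Union[str, PromptList]) -> list:
    if isinstance(prompt, str):
        # plain string -> single user turn
        return [{'role': 'user', 'content': prompt}]
    messages = []
    for item in prompt:
        if item.get('role') in ('user', 'assistant', 'system'):
            # already CHATML-shaped
            messages.append({'role': item['role'], 'content': item['content']})
        else:
            # OpenCompass-native HUMAN/BOT/SYSTEM roles
            messages.append({'role': ROLE_MAP[item['role']],
                             'content': item['prompt']})
    return messages
```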
Key details:

- drop_params=True by default -- silently drops provider-unsupported kwargs for cross-provider compatibility (see the sketch after this list)
- import litellm inside _generate() -- base install unaffected
- key= param or provider-specific env vars (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.)
- extra_body dict for forwarding provider-specific params (reasoning_effort, seed, etc.)
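A rough sketch of how these details combine inside _generate(); the method signature and attribute names beyond path, key, and extra_body are assumptions, not the PR's actual implementation:

```python
def _generate(self, input, max_out_len: int, temperature: float) -> str:
    import litellm  # lazy import: the base OpenCompass install stays litellm-free

    # to_chat_messages() is the helper sketched under Modification above
    kwargs = dict(
        model=self.path,                  # e.g. 'azure/...', 'bedrock/...', 'groq/...'
        messages=to_chat_messages(input),
        max_tokens=max_out_len,
        temperature=temperature,
        drop_params=True,                 # drop kwargs the target provider rejects
    )
    if self.key:                          # otherwise provider env vars are used
        kwargs['api_key'] = self.key
    if self.extra_body:                   # e.g. {'reasoning_effort': 'high', 'seed': 42}
        kwargs['extra_body'] = self.extra_body

    response = litellm.completion(**kwargs)
    return response.choices[0].message.content
```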
BC-breaking

None. Additive only -- existing providers untouched.
litellm is in requirements/api.txt (optional extra), not in the base requirements.txt.
Usage and Testing

Supported model strings include:
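As a hedged illustration of LiteLLM's provider-prefix scheme (examples, not an exhaustive list from this PR): azure/<deployment>, bedrock/<model-id>, vertex_ai/<model>, groq/<model>, aiml/<model>, or bare OpenAI names such as gpt-4o. A config entry might look roughly like this; constructor kwargs other than type, path, and extra_body are assumptions modelled on OpenCompass's other API backends:

```python
# Hypothetical OpenCompass config sketch -- kwargs are illustrative.
from opencompass.models import LiteLLMAPI

models = [
    dict(
        type=LiteLLMAPI,
        abbr='claude-3-5-sonnet-litellm',
        path='anthropic/claude-3-5-sonnet-20241022',  # any LiteLLM model string
        key='ENV',                     # or rely on ANTHROPIC_API_KEY etc. in the environment
        extra_body=dict(seed=42),      # provider-specific params forwarded verbatim
        max_out_len=2048,
        max_seq_len=4096,
        batch_size=8,
    ),
]
```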
Lint: flake8 --max-line-length 99 + isort --check -> all clean.
Checklist
Before PR:
After PR: