
feat: add MiniMax-M2.7 models and fix legacy MiniMax API config #9

Open

octo-patch wants to merge 1 commit into EvolvingLMMs-Lab:main from octo-patch:feature/add-minimax-m27-provider

Conversation

@octo-patch

Summary

  • Upgrade MiniMax models from legacy abab6.5s-chat / abab7-chat-preview to the latest MiniMax-M2.7 and MiniMax-M2.7-highspeed (204K context window)
  • Fix API base URL from deprecated api.minimax.chat to api.minimax.io for all MiniMax entries
  • Fix temperature from 0 to 0.01 — MiniMax API requires temperature in (0.0, 1.0]
  • Extend API key detection in OpenAIWrapper to match MiniMax-M model name pattern alongside legacy abab, with fallback to MINIMAX_API_KEY env var
  • Applied consistently to both VLMEvalKit/ and VLMEvalKit_ov/
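The key-detection change described above can be sketched roughly as follows. This is an illustrative helper, not the actual `OpenAIWrapper` code; `pick_minimax_key` and the exact prefix checks are assumptions based on the summary:

```python
import os
from typing import Optional

def pick_minimax_key(model_name: str) -> Optional[str]:
    """Return an API key for MiniMax models, or None for other models.

    Covers both the new 'MiniMax-M' prefix and the legacy 'abab' one,
    preferring MiniMax_API_KEY and falling back to MINIMAX_API_KEY.
    """
    if model_name.startswith("MiniMax-M") or model_name.startswith("abab"):
        return os.environ.get("MiniMax_API_KEY") or os.environ.get("MINIMAX_API_KEY")
    return None
```

With only `MINIMAX_API_KEY` set, both `MiniMax-M2.7` and legacy `abab6.5s-chat` resolve to that key, while non-MiniMax models fall through to their own key handling.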

Usage

# Set your MiniMax API key
export MiniMax_API_KEY=your-key-here
# or
export MINIMAX_API_KEY=your-key-here

# Run evaluation with MiniMax-M2.7
python run.py --model MiniMax-M2.7 --data MMBench_DEV_EN

# Run with highspeed variant
python run.py --model MiniMax-M2.7-highspeed --data MMBench_DEV_EN

Changes

File                                        Change
VLMEvalKit/vlmeval/config.py                Add M2.7 + M2.7-highspeed entries, update abab URL/temp
VLMEvalKit/vlmeval/api/gpt.py               Extend key detection for MiniMax-M pattern
VLMEvalKit_ov/vlmeval/config.py             Same config updates (mirror)
VLMEvalKit_ov/vlmeval/api/gpt.py            Same key detection update (mirror)
VLMEvalKit/tests/test_minimax_provider.py   17 unit + 3 integration tests
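
The config.py entries for the new models might look roughly like this. The dictionary keys, the `make_api_model` helper, and the endpoint path are assumptions for illustration; the summary only specifies the model names, the `api.minimax.io` host, and the 0.01 temperature:

```python
from functools import partial

# Hypothetical stand-in for VLMEvalKit's OpenAIWrapper-based API model class.
def make_api_model(model, api_base, temperature, max_tokens):
    return {"model": model, "api_base": api_base,
            "temperature": temperature, "max_tokens": max_tokens}

minimax_apis = {
    # New MiniMax-M2.7 models (204K context window);
    # api_base host per the PR, exact endpoint path omitted.
    "MiniMax-M2.7": partial(
        make_api_model, model="MiniMax-M2.7",
        api_base="https://api.minimax.io",
        temperature=0.01, max_tokens=2048),
    "MiniMax-M2.7-highspeed": partial(
        make_api_model, model="MiniMax-M2.7-highspeed",
        api_base="https://api.minimax.io",
        temperature=0.01, max_tokens=2048),
}
```

Registering the entries as `partial` objects mirrors the common VLMEvalKit pattern of deferring model construction until `run.py` selects a model by name.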

Test plan

  • 17 unit tests pass: config registry, API base URL, model names, temperature constraints, key detection
  • 3 integration tests pass: text generation and working() check against live MiniMax API
  • Legacy abab models remain registered and functional (backward compatible)
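
A unit test for the temperature constraint could look like the sketch below. The function and test names are illustrative, not the actual contents of test_minimax_provider.py:

```python
def check_minimax_temperature(temperature: float) -> None:
    # MiniMax API requires temperature strictly greater than 0 and at most 1,
    # which is why the default was changed from 0 to 0.01.
    if not (0.0 < temperature <= 1.0):
        raise ValueError(f"temperature must be in (0.0, 1.0], got {temperature}")

def test_temperature_constraint():
    check_minimax_temperature(0.01)   # the fixed default is accepted
    for bad in (0.0, -1.0, 1.5):      # old default and out-of-range values rejected
        try:
            check_minimax_temperature(bad)
        except ValueError:
            continue
        raise AssertionError(f"{bad} should have been rejected")

test_temperature_constraint()
```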

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed (204K context) to VLMEvalKit
  model registry as OpenAI-compatible API models
- Update API base URL from api.minimax.chat to api.minimax.io for all
  MiniMax entries (legacy abab models included)
- Fix temperature from 0 to 0.01 (MiniMax API requires strictly > 0)
- Extend API key detection in OpenAIWrapper to match 'MiniMax-M' model
  name pattern, with fallback to MINIMAX_API_KEY env var
- Apply same changes to VLMEvalKit_ov for consistency
- Add 17 unit tests and 3 integration tests
