This repository was archived by the owner on Apr 20, 2026. It is now read-only.

Using CrewAI with MiniMax - Example Project #363

@zhangshichun

Description


Problem:

litellm has issues with MiniMax:

  • Inaccurate model name mapping
  • Unstable API compatibility
  • Cannot properly handle MiniMax's special response formats (e.g., [TOOL_CALL] blocks and thinking blocks)

Solution:

I created an example project that directly uses the Anthropic SDK to call MiniMax's Anthropic-compatible endpoint, bypassing litellm:

👉 https://github.com/zhangshichun/crewai-use-minimax-examples

Key Implementation:

  1. Extend BaseLLM to implement a custom LLM
  2. Handle MiniMax's special format conversion:
    • [TOOL_CALL] blocks → ReAct format
    • thinking blocks → ignored
    • tool_use blocks → ReAct format
  3. CrewAI parses ReAct text and executes tools
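The conversion in step 2 can be sketched as a small pure function. This is an illustrative sketch, not the repo's exact implementation: it assumes the Anthropic-style response content is a list of blocks with a `type` field, and the `Action:` / `Action Input:` wording is the conventional ReAct form that CrewAI's parser expects.

```python
import json


def blocks_to_react(blocks):
    """Convert Anthropic-style content blocks from MiniMax into ReAct text.

    `blocks` is a list of dicts such as {"type": "text", "text": ...} or
    {"type": "tool_use", "name": ..., "input": {...}}. Thinking blocks are
    dropped; tool_use blocks become Action / Action Input lines that a
    ReAct parser can turn into tool executions.
    """
    parts = []
    for block in blocks:
        btype = block.get("type")
        if btype == "thinking":
            continue  # per the post: thinking blocks are ignored
        if btype == "text":
            parts.append(block["text"])
        elif btype == "tool_use":
            parts.append(
                f"Action: {block['name']}\n"
                f"Action Input: {json.dumps(block['input'])}"
            )
    return "\n".join(parts)


blocks = [
    {"type": "thinking", "thinking": "internal reasoning"},
    {"type": "text", "text": "I should look this up."},
    {"type": "tool_use", "name": "search", "input": {"query": "Paris"}},
]
print(blocks_to_react(blocks))
# → I should look this up.
# → Action: search
# → Action Input: {"query": "Paris"}
```

The same function can serve both the `[TOOL_CALL]` and `tool_use` paths once the raw blocks are parsed into this dict shape.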

Included Demos:

  • surprise_trip/ - Multi-agent collaboration example (travel planning)
  • starter_template/ - Reusable project template

MiniMax Anthropic Endpoint:

https://api.minimaxi.com/anthropic

Model: MiniMax-M2.7
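Wiring the Anthropic SDK to this endpoint can be sketched as below. The `base_url` and model name come from this post; the `MINIMAX_API_KEY` env var name and the exact `messages.create` parameters used are assumptions — see the linked repo for the authoritative setup.

```python
def build_minimax_request(messages, system=None, max_tokens=1024):
    """Build kwargs for an Anthropic-style messages.create call against
    MiniMax's Anthropic-compatible endpoint.

    With the official SDK this would be used roughly as:

        import os, anthropic
        client = anthropic.Anthropic(
            base_url="https://api.minimaxi.com/anthropic",  # from the post
            api_key=os.environ["MINIMAX_API_KEY"],  # assumed env var name
        )
        response = client.messages.create(**build_minimax_request(msgs))
    """
    kwargs = {
        "model": "MiniMax-M2.7",  # model name from the post
        "max_tokens": max_tokens,
        "messages": messages,
    }
    if system is not None:
        kwargs["system"] = system
    return kwargs


req = build_minimax_request(
    [{"role": "user", "content": "Plan a surprise trip."}],
    system="You are a travel planner.",
)
print(req["model"])
# → MiniMax-M2.7
```

A custom `BaseLLM.call()` would send this request, then run the response's content blocks through the ReAct conversion before returning text to CrewAI.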


Hope this helps other CrewAI users who want to use MiniMax!
