
feat: add MiniMax AI hello world tutorial in 04_hello_genai #29

Open
octo-patch wants to merge 1 commit into panaversity:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMax AI as a new GenAI provider in the 04_hello_genai tutorial series
  • New 03_hello_minimax directory: standalone hello world project using MiniMax's OpenAI-compatible API with M2.7 model (1M context, streaming support)
  • Update existing LiteLLM example to include MiniMax via openai/ prefix routing with custom api_base
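Because MiniMax exposes an OpenAI-compatible endpoint, the standalone example only needs the stock `openai` client pointed at a different base URL. A minimal sketch of that setup (the base URL, model id, and `MINIMAX_API_KEY` variable name are assumptions to verify against MiniMax's docs, not values confirmed in this PR):

```python
import os

# Assumed endpoint and model id -- verify against MiniMax's current docs.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"
MODEL = "MiniMax-M2.7"

def build_request(prompt: str) -> dict:
    # Standard chat-completions payload; nothing MiniMax-specific here.
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def make_client():
    # Lazy import so the module loads even before `uv add openai` has run.
    from openai import OpenAI
    return OpenAI(api_key=os.environ["MINIMAX_API_KEY"], base_url=MINIMAX_BASE_URL)

if __name__ == "__main__":
    client = make_client()
    resp = client.chat.completions.create(**build_request("Hello, MiniMax!"))
    print(resp.choices[0].message.content)
```

Swapping providers is then just a matter of changing `base_url`, the model id, and the key variable.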

Changes

New: 04_hello_genai/03_hello_minimax/

  • readme.md: Tutorial covering MiniMax API setup, key features, and getting started guide
  • myproject/src/myproject/hello.py: Two functions — minimax() for standard chat completion and minimax_stream() for streaming
  • myproject/pyproject.toml: Project config with openai dependency and script entry points
  • myproject/tests/test_minimax_unit.py: 10 unit tests (client config, model, temperature, streaming, LiteLLM integration)
  • myproject/tests/test_minimax_integration.py: 3 integration tests (chat completion, streaming, highspeed model)
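A plausible shape for the two functions in hello.py, sketched with the client passed in as a parameter so the logic can be exercised without network access (the real file presumably constructs the client itself; the function names follow the bullet above, everything else is an assumption):

```python
MODEL = "MiniMax-M2.7"  # model id as referenced in this PR; verify against MiniMax docs

def minimax(client, prompt: str = "Hello, MiniMax!") -> str:
    """Standard (non-streaming) chat completion."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def minimax_stream(client, prompt: str = "Hello, MiniMax!") -> str:
    """Streaming completion: prints deltas as they arrive and returns the full text."""
    stream = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # final chunks can carry a None delta
            print(delta, end="", flush=True)
            parts.append(delta)
    print()
    return "".join(parts)
```

Taking the client as a parameter is also what makes the "no real API calls" unit-test style below straightforward.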

Updated: 04_hello_genai/02_litellm/

  • Added minimax() function using LiteLLM's OpenAI-compatible routing
  • Updated pyproject.toml with minimax script entry
  • Updated readme.md with MiniMax API key setup and code example
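The LiteLLM side of the change can be sketched as follows: the openai/ prefix selects LiteLLM's OpenAI-compatible driver, and api_base redirects it from api.openai.com to MiniMax. The endpoint URL and model id here are assumptions, and the kwargs are split into a helper only to keep the sketch checkable without a key:

```python
import os

# Assumed endpoint; verify against MiniMax's documentation.
MINIMAX_API_BASE = "https://api.minimax.io/v1"

def minimax_kwargs(prompt: str) -> dict:
    # "openai/" routes the request through LiteLLM's OpenAI-compatible driver,
    # while api_base points that driver at MiniMax instead of api.openai.com.
    return {
        "model": "openai/MiniMax-M2.7",
        "messages": [{"role": "user", "content": prompt}],
        "api_base": MINIMAX_API_BASE,
        "api_key": os.environ.get("MINIMAX_API_KEY", ""),
    }

def minimax(prompt: str = "Hello from LiteLLM!") -> str:
    from litellm import completion  # lazy import; requires `uv add litellm`
    resp = completion(**minimax_kwargs(prompt))
    return resp.choices[0].message.content
```

This mirrors how the existing 02_litellm example reaches other providers: only the model prefix, api_base, and key change.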

Why MiniMax?

MiniMax offers powerful LLMs with an OpenAI-compatible API, making it easy for students to try an alternative provider. The M2.7 model features a 1M token context window and strong multilingual capabilities.

Test plan

  • All 10 unit tests pass
  • All 3 integration tests pass (real MiniMax API calls)
  • Verify `uv run minimax` and `uv run minimax_stream` work as documented
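For the unit tests to pass without real API calls, one workable pattern is stubbing the client object and asserting on what the function sends. A hedged sketch of that approach (the actual contents of test_minimax_unit.py are not shown in this PR view; the inline `minimax` here is a stand-in for the function under test):

```python
from types import SimpleNamespace

def minimax(client, prompt: str) -> str:
    # Minimal stand-in for the hello.py function under test (assumed shape).
    resp = client.chat.completions.create(
        model="MiniMax-M2.7",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.7,
    )
    return resp.choices[0].message.content

def test_minimax_sends_model_and_temperature():
    calls = []

    def create(**kwargs):
        # Record the outgoing request, return a canned response.
        calls.append(kwargs)
        return SimpleNamespace(
            choices=[SimpleNamespace(message=SimpleNamespace(content="ok"))]
        )

    client = SimpleNamespace(chat=SimpleNamespace(completions=SimpleNamespace(create=create)))
    assert minimax(client, "Hello") == "ok"
    assert calls[0]["model"] == "MiniMax-M2.7"
    assert calls[0]["temperature"] == 0.7

test_minimax_sends_model_and_temperature()
```

The integration tests, by contrast, hit the live endpoint and need MINIMAX_API_KEY set.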

- Add 04_hello_genai/03_hello_minimax: standalone MiniMax hello world
  project using OpenAI-compatible API (M2.7 model, streaming support)
- Update LiteLLM example to include MiniMax via openai/ prefix routing
- Include 10 unit tests + 3 integration tests
