
feat: add MiniMax provider support #37

Open

octo-patch wants to merge 1 commit into DeepMyst:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMaxProvider using the OpenAI-compatible HTTP API (https://api.minimax.io/v1) — same pattern as the existing LocalAIProvider
  • Supports MiniMax-M2.7 (default) and MiniMax-M2.7-highspeed models with SSE streaming
  • API key configured via mysti.minimaxApiKey setting or MINIMAX_API_KEY environment variable
  • Register the provider in ProviderRegistry, add 'minimax' to the ProviderType and AgentType union types, and add an AGENT_STYLES entry in BrainstormManager so MiniMax can participate in brainstorm mode
  • Add 6 new package.json settings: minimaxApiKey, minimaxBaseUrl, minimaxModel, minimaxTemperature, minimaxMaxTokens, minimaxRequestTimeout
  • Add 13 unit tests (all passing); full test suite (373 tests) and production build verified
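Following the LocalAIProvider pattern referenced above, the key-resolution and request-building logic might look roughly like this. All names below except MINIMAX_API_KEY, mysti.minimaxApiKey, the base URL, and the model IDs are illustrative, not taken from the PR's actual code:

```typescript
// Illustrative sketch of the OpenAI-compatible request the provider sends.
// Defaults come from the PR description; function/interface names are assumptions.

const DEFAULT_BASE_URL = "https://api.minimax.io/v1";
const DEFAULT_MODEL = "MiniMax-M2.7";

interface MiniMaxConfig {
  apiKey?: string;   // value of the mysti.minimaxApiKey setting
  baseUrl?: string;  // mysti.minimaxBaseUrl
  model?: string;    // mysti.minimaxModel
}

// The setting takes precedence; the MINIMAX_API_KEY env var is the fallback.
function resolveApiKey(config: MiniMaxConfig): string {
  const key = config.apiKey ?? process.env.MINIMAX_API_KEY;
  if (!key) {
    throw new Error(
      "MiniMax API key missing: set mysti.minimaxApiKey or MINIMAX_API_KEY"
    );
  }
  return key;
}

// Build the OpenAI-compatible chat request; stream: true requests SSE streaming.
function buildChatRequest(config: MiniMaxConfig, prompt: string) {
  return {
    url: `${config.baseUrl ?? DEFAULT_BASE_URL}/chat/completions`,
    headers: {
      Authorization: `Bearer ${resolveApiKey(config)}`,
      "Content-Type": "application/json",
    },
    body: {
      model: config.model ?? DEFAULT_MODEL,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    },
  };
}
```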

API Reference

Notes

  • MiniMax temperature range is (0.0, 1.0] — the setting enforces a minimum of 0.01 and maximum of 1.0
  • No Embedding support (MiniMax has no embedding models)
  • No TTS support added (Mysti does not have a TTS provider architecture)
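The temperature constraint in the first note could be enforced with a simple clamp. This is a sketch of the described bounds, not the PR's actual validation code (which may rely on the setting's min/max schema instead):

```typescript
// MiniMax accepts temperatures in (0.0, 1.0]; 0 itself is invalid,
// so the lower bound is 0.01 per the setting described in the PR.
function clampTemperature(value: number): number {
  const MIN = 0.01;
  const MAX = 1.0;
  if (Number.isNaN(value)) return MIN; // defensive default for bad input
  return Math.min(MAX, Math.max(MIN, value));
}
```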

- Add MiniMaxProvider using OpenAI-compatible HTTP API (api.minimax.io/v1)
- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed models
- Support MINIMAX_API_KEY env var and mysti.minimaxApiKey setting
- Register provider in ProviderRegistry and BrainstormManager AGENT_STYLES
- Add 'minimax' to ProviderType and AgentType unions
- Add package.json settings for API key, base URL, model, temperature, max tokens
- Add unit tests (13 tests, all passing)
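The six settings listed above would sit under `contributes.configuration` in package.json, roughly like this. The property shapes and the temperature bounds follow the PR description; the remaining schema details are a sketch, not the exact manifest from the PR:

```json
{
  "contributes": {
    "configuration": {
      "properties": {
        "mysti.minimaxApiKey": { "type": "string" },
        "mysti.minimaxBaseUrl": { "type": "string", "default": "https://api.minimax.io/v1" },
        "mysti.minimaxModel": { "type": "string", "default": "MiniMax-M2.7" },
        "mysti.minimaxTemperature": { "type": "number", "minimum": 0.01, "maximum": 1.0 },
        "mysti.minimaxMaxTokens": { "type": "number" },
        "mysti.minimaxRequestTimeout": { "type": "number" }
      }
    }
  }
}
```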
