feat: add MiniMaxChatGenerator component #11076
octo-patch wants to merge 1 commit into deepset-ai:main from
Conversation
- Add MiniMaxChatGenerator using MiniMax's OpenAI-compatible API
- Support MiniMax-M2.7 and MiniMax-M2.7-highspeed models
- Add MINIMAX_API_KEY environment variable support
- Add unit tests for initialization, serialization, and configuration
- Add release note
octo-patch does not appear to be a GitHub user. You need a GitHub account to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. Already signed the CLA but the status is still pending? Let us recheck it.
Hi @octo-patch, thank you for opening this pull request. Without the CLA we can't merge this PR. Either way, I recommend publishing and releasing this integration yourself using our repository template if you think it would be beneficial to users: https://github.com/deepset-ai/custom-component
Summary
This PR adds `MiniMaxChatGenerator`, a new Haystack component that integrates MiniMax LLMs via MiniMax's OpenAI-compatible API.

- Adds `MiniMaxChatGenerator` to `haystack/components/generators/chat/`
- Supports the `MiniMax-M2.7` (default) and `MiniMax-M2.7-highspeed` models
- Authenticates via the `MINIMAX_API_KEY` environment variable
- Defaults to the API base URL `https://api.minimax.io/v1`
- Mirrors `OpenAIChatGenerator` features: streaming, tool calling, serialization/deserialization, async support

Usage
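As a rough illustration of the "OpenAI-compatible API" point above, the sketch below shows how a chat-completions request against the endpoint described in this PR might be assembled. The `build_chat_request` helper is hypothetical and written for this example; only the base URL, default model name, and `MINIMAX_API_KEY` variable come from the PR description, and the component's actual code may differ.

```python
import os

# Base URL and default model as stated in the PR description.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def build_chat_request(messages, model="MiniMax-M2.7"):
    """Assemble an OpenAI-style chat-completions request payload.

    This is an illustrative helper, not part of the component itself.
    """
    return {
        "url": f"{MINIMAX_BASE_URL}/chat/completions",
        "headers": {
            # The PR reads the key from the MINIMAX_API_KEY env var.
            "Authorization": f"Bearer {os.environ.get('MINIMAX_API_KEY', '')}",
        },
        "json": {"model": model, "messages": messages},
    }

req = build_chat_request([{"role": "user", "content": "Hello"}])
print(req["url"])  # https://api.minimax.io/v1/chat/completions
```

Because the request shape matches OpenAI's chat-completions format, a component like this can reuse the same message, streaming, and tool-calling handling as `OpenAIChatGenerator`, swapping only the base URL, model names, and API-key variable.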
API Reference