feat: upgrade MiniMax default model to M2.7 in bonus notebook#109
Open
octo-patch wants to merge 2 commits into HandsOnLLM:main from
Conversation
Add a supplementary notebook showing how to use alternative OpenAI-compatible providers (MiniMax) with the book's code examples from Chapters 4 and 7. Update the README with a tip about OpenAI-compatible providers.
- Update default model from MiniMax-M2.5 to MiniMax-M2.7 across all examples
- Add MiniMax-M2.7-highspeed to provider summary table
- Update model description to reflect M2.7 enhanced reasoning capabilities
- Keep all previous patterns and API configuration unchanged
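Since the notebook only swaps the default model name while keeping the OpenAI-compatible API configuration unchanged, the change amounts to pointing the same chat-completions request at the new model. The sketch below shows the shape of such a request with Python's standard library; the base URL and API key are placeholders, not values from the notebook, and the model names come from this PR.

```python
import json
import urllib.request

# Placeholder endpoint and key: substitute your provider's actual
# OpenAI-compatible base URL and credentials.
BASE_URL = "https://api.example-provider.com/v1"
MODEL = "MiniMax-M2.7"  # new default per this PR; "MiniMax-M2.7-highspeed" is the faster variant


def build_chat_request(prompt, model=MODEL, base_url=BASE_URL, api_key="YOUR_KEY"):
    """Build an OpenAI-compatible /chat/completions request.

    Only the model name changes between M2.5 and M2.7; the payload
    shape and headers stay the same, which is why the notebook's
    API configuration is untouched by this upgrade.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


req = build_chat_request("Hello!")
```

Sending the request (e.g. via `urllib.request.urlopen`) would return an OpenAI-style JSON response; only `MODEL` needs to change to try the highspeed variant.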
Summary
Changes
bonus/Using Alternative LLM Providers.ipynb: Update all MiniMax-M2.5 references to MiniMax-M2.7, add MiniMax-M2.7-highspeed as alternative

Why
MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities, replacing M2.5 as the recommended default.
Testing