feat: add FuturMix AI Gateway as chat model provider #6286
FuturMix wants to merge 2 commits into FlowiseAI:main
Conversation
Code Review
This pull request introduces the FuturMix AI Gateway integration, adding a new credential type and a chat model node. The implementation wraps LangChain's ChatOpenAI class to provide access to various models through the FuturMix API. Feedback was given on the handling of the temperature parameter, which can produce NaN when the input is undefined or empty.
```typescript
const obj: ChatOpenAIFields = {
    temperature: parseFloat(temperature),
    modelName,
    openAIApiKey: futurmixApiKey,
    apiKey: futurmixApiKey,
    streaming: streaming ?? true
}
```
The temperature parameter is currently assigned directly via parseFloat(temperature). If temperature is undefined or an empty string, this yields NaN, which can break the underlying model request. It should be set conditionally, consistent with how maxTokens, topP, and the other optional parameters are handled later in the function.
Suggested change:

```diff
 const obj: ChatOpenAIFields = {
-    temperature: parseFloat(temperature),
     modelName,
     openAIApiKey: futurmixApiKey,
     apiKey: futurmixApiKey,
     streaming: streaming ?? true
 }
+if (temperature) obj.temperature = parseFloat(temperature)
```
Move temperature out of the ChatOpenAIFields initializer and apply it conditionally, matching the pattern used by maxTokens, topP, frequencyPenalty and presencePenalty. This prevents parseFloat from returning NaN when the temperature input is undefined or empty. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
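The failure mode the review describes can be shown in isolation. The snippet below is a minimal sketch (field names other than `temperature`/`modelName` are simplified, not the PR's exact types): `parseFloat` on a missing or empty input returns NaN, while the conditional assignment simply leaves the field unset.

```typescript
// Simplified stand-in for ChatOpenAIFields (illustrative, not the PR's type).
interface ModelFields {
    modelName: string
    temperature?: number
}

function buildFields(modelName: string, temperature?: string): ModelFields {
    const obj: ModelFields = { modelName }
    // Only parse when a non-empty string was provided; avoids NaN.
    if (temperature) obj.temperature = parseFloat(temperature)
    return obj
}

// The bug: parseFloat coerces undefined to the string "undefined" -> NaN.
console.log(Number.isNaN(parseFloat(undefined as unknown as string))) // true
console.log(buildFields('gpt-4o', '0.7').temperature) // 0.7
console.log(buildFields('gpt-4o').temperature) // undefined
```

An empty string is falsy in JavaScript, so the guard also covers the empty-input case without a separate check.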
Force-pushed from 8ce4ca1 to 132d978
Summary
Add FuturMix.ai as a new chat model provider, following the same pattern as ChatOpenRouter.
Files added:
- packages/components/credentials/FuturMixApi.credential.ts — API key credential
- packages/components/nodes/chatmodels/ChatFuturMix/ChatFuturMix.ts — Chat model component
- packages/components/nodes/chatmodels/ChatFuturMix/FlowiseChatFuturMix.ts — Multi-modal wrapper
- packages/components/nodes/chatmodels/ChatFuturMix/futurmix.svg — Provider icon

What is FuturMix?
FuturMix.ai is a unified AI gateway providing access to 22+ models (Claude, GPT, Gemini) through a single OpenAI-compatible API with a 99.99% SLA. Base URL: https://futurmix.ai/v1

Implementation:
- Extends ChatOpenAI with a custom baseURL (same pattern as ChatOpenRouter)
- Implements IVisionChatModal for multi-modal image upload support
- Base URL defaults to https://futurmix.ai/v1 (configurable)
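The custom-baseURL pattern above can be sketched as a plain configuration object. This is an illustrative sketch, not the PR's actual code: it assumes LangChain's ChatOpenAI accepts a `configuration.baseURL` override (the mechanism ChatOpenRouter uses), and the model id and env-var name are placeholders.

```typescript
// Hypothetical fields object for an OpenAI-compatible gateway.
// `new ChatOpenAI(fields)` would then route all requests through the
// gateway instead of api.openai.com.
const fields = {
    modelName: 'claude-sonnet-4', // gateway-side model id (placeholder)
    apiKey: process.env.FUTURMIX_API_KEY, // credential from FuturMixApi.credential.ts
    streaming: true,
    configuration: {
        baseURL: 'https://futurmix.ai/v1' // the gateway's OpenAI-compatible endpoint
    }
}

console.log(fields.configuration.baseURL)
```

Because the gateway speaks the OpenAI wire protocol, no request/response translation is needed; only the endpoint and credential differ from a stock ChatOpenAI setup.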