Commit 8225115

waleedlatif1 and claude committed
fix: handle vLLM models in store provider check
vLLM is a local model server like Ollama and should not require an API key. Add vllm to the store provider check as a safety net for models that may not have the vllm/ prefix.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
1 parent adfdf8f commit 8225115

File tree

2 files changed: +6 −1 lines changed


apps/sim/blocks/utils.test.ts

Lines changed: 5 additions & 0 deletions

```diff
@@ -172,6 +172,11 @@ describe('getApiKeyCondition / shouldRequireApiKeyForModel', () => {
     expect(evaluateCondition('claude-sonnet-4-5')).toBe(true)
   })

+  it('does not require API key when model is in the vLLM store bucket', () => {
+    mockProviders.value.vllm.models = ['my-custom-model']
+    expect(evaluateCondition('my-custom-model')).toBe(false)
+  })
+
   it('requires API key when model is in the fireworks store bucket', () => {
     mockProviders.value.fireworks.models = ['fireworks/llama-3']
     expect(evaluateCondition('fireworks/llama-3')).toBe(true)
```

apps/sim/blocks/utils.ts

Lines changed: 1 addition & 1 deletion

```diff
@@ -145,7 +145,7 @@ function shouldRequireApiKeyForModel(model: string): boolean {
   }

   const storeProvider = getProviderFromStore(normalizedModel)
-  if (storeProvider === 'ollama') return false
+  if (storeProvider === 'ollama' || storeProvider === 'vllm') return false
   if (storeProvider) return true

   if (isOllamaConfigured) {
```
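The patched check can be sketched in isolation. This is a minimal, hypothetical reconstruction: only `getProviderFromStore` and the two-line provider check come from the diff, while the `storeModels` table and the fallback behavior for unknown models are stand-ins for the real store, which is not shown in this commit.

```typescript
// Hypothetical stand-in for the provider store: maps each provider bucket
// to the model ids registered under it.
const storeModels: Record<string, string[]> = {
  ollama: ['llama3'],
  vllm: ['my-custom-model'],
  fireworks: ['fireworks/llama-3'],
}

// Look up which provider bucket (if any) a model id belongs to.
function getProviderFromStore(model: string): string | undefined {
  for (const [provider, models] of Object.entries(storeModels)) {
    if (models.includes(model)) return provider
  }
  return undefined
}

// Mirrors the patched logic: local model servers (Ollama, vLLM) never
// require an API key; any other store-registered provider does.
// Unknown models default to requiring a key in this sketch.
function shouldRequireApiKeyForModel(model: string): boolean {
  const storeProvider = getProviderFromStore(model)
  if (storeProvider === 'ollama' || storeProvider === 'vllm') return false
  if (storeProvider) return true
  return true
}

console.log(shouldRequireApiKeyForModel('my-custom-model')) // false: vLLM bucket
console.log(shouldRequireApiKeyForModel('fireworks/llama-3')) // true: hosted provider
```

The order of the two checks matters: the local-server exemption must run before the generic `if (storeProvider) return true`, otherwise a vLLM model found in the store would still be flagged as needing a key.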
