Add option to deploy vLLM Omni as backend #141

Closed — Alex-Welsh wants to merge 2 commits into main from vllm-omni

Conversation

@Alex-Welsh (Member) commented:

Still currently WIP, need to test changes

@Alex-Welsh force-pushed the vllm-omni branch 2 times, most recently from 026fbed to c4264bd on January 19, 2026 at 10:55
@Alex-Welsh (Member, Author) commented:

Closing in favor of #142

@Alex-Welsh closed this Jan 19, 2026