
Wrap vllm inputs to be compatible with vLLM >= 0.10.2 #1003

Open
JIElite wants to merge 2 commits into huggingface:main from JIElite:upgrade-vllm-to-0.10.2

Conversation

@JIElite
Contributor

JIElite commented Oct 2, 2025

I've implemented the code to support vLLM >= 0.10.2, as mentioned in issue #1002, so that we can evaluate the newest models with lighteval.

Please take a look.
Thank you.

@HuggingFaceDocBuilderDev
Collaborator

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@sihyeonn

Hey @JIElite — your analysis in #1002 was right on the money.

#1173 landed in the meantime and handled the bulk of the upgrade. There were still a couple of leftover wrapping patterns, though, so I put up #1191 to clean those out. It turns out PromptType accepts list[int] directly now, so the wrapping was just unnecessary overhead.
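For context, a minimal sketch of the wrapping pattern being removed. This uses a stand-in `TokensPrompt` TypedDict to mirror the shape of `vllm.inputs.TokensPrompt` (the exact vLLM types and the `wrap_tokens` helper here are illustrative assumptions, not code from this PR):

```python
from typing import TypedDict


# Stand-in mirroring vllm.inputs.TokensPrompt, which is a TypedDict
# keyed by "prompt_token_ids" (assumption for illustration).
class TokensPrompt(TypedDict):
    prompt_token_ids: list[int]


def wrap_tokens(token_ids: list[int]) -> TokensPrompt:
    # Older pattern: wrap pre-tokenized inputs before handing them
    # to the engine's generate() call.
    return TokensPrompt(prompt_token_ids=token_ids)


token_ids = [101, 2023, 102]
wrapped = wrap_tokens(token_ids)
# Per the discussion above, with vLLM >= 0.10.2 the PromptType union
# accepts a bare list[int], so the raw token_ids can be passed through
# unwrapped and helpers like wrap_tokens become dead weight.
```

The cleanup in #1191 then amounts to deleting the wrapping step and passing the token-id lists straight through.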

