refactor!: Make user_prompt required in LLM#11152
Conversation
julian-risch
left a comment
Looks good to me! Looking at dc-pipeline-templates, `required_variables` isn't set in all pipelines; for example, a quick search found https://github.com/deepset-ai/dc-pipeline-templates/pull/338/changes#diff-a5088c42299eda596ed218f47073d35f457cfd204e60e4475eed668511defb14R264
Coverage Report for CI Build 24665547890: coverage 92.857%. No uncovered changes found. Warning: no base build found for this commit, so coverage regressions could not be compared.
💛 - Coveralls
Just fixed it a few minutes ago: https://github.com/deepset-ai/dc-pipeline-templates/pull/354#pullrequestreview-4139750934
Related Issues
`messages` input parameter in Agent component #11147
Proposed Changes:
This came about from a discussion with @julian-risch about making the LLM component more reliable when scheduled in a pipeline. We decided to go forward with the breaking change because it's a relatively new component. We acknowledge this means it won't be possible to run this component with just a `messages` input, but from what we understand the desired pathway for platform use is the `user_prompt`.
The `LLM` component now requires `user_prompt` to be provided at initialization, and it must contain at least one Jinja2 template variable (e.g. `{{ variable_name }}`). This ensures the component always exposes at least one required input socket, which is necessary for correct pipeline scheduling. `required_variables` now defaults to `"*"` (all variables in `user_prompt` are required), and passing an empty list raises a `ValueError`.
How did you test it?
New tests and updated old ones
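To illustrate the validation rules described in the proposed changes, here is a minimal standalone sketch, not the actual Haystack implementation: the class name, the regex-based variable extraction, and the error messages are all hypothetical stand-ins for how the `LLM` component might enforce these constraints.

```python
import re

class LLMSketch:
    """Hypothetical sketch of the validation rules described above;
    not the real Haystack LLM component."""

    # Simplified pattern for Jinja2 variables like {{ variable_name }}
    _VAR_PATTERN = re.compile(r"\{\{\s*(\w+)\s*\}\}")

    def __init__(self, user_prompt: str, required_variables="*"):
        variables = self._VAR_PATTERN.findall(user_prompt)
        if not variables:
            # user_prompt must expose at least one required input socket
            raise ValueError(
                "user_prompt must contain at least one Jinja2 template "
                "variable, e.g. {{ variable_name }}"
            )
        if required_variables == "*":
            # The new default: every variable in user_prompt is required
            required_variables = variables
        elif not required_variables:
            # An empty list would leave no required input sockets
            raise ValueError("required_variables must not be an empty list")
        self.user_prompt = user_prompt
        self.required_variables = list(required_variables)

# Usage: a prompt with one template variable is accepted,
# and that variable becomes required by default.
llm = LLMSketch(user_prompt="Summarize: {{ document }}")
```

Under this sketch, a prompt with no template variables or an explicit empty `required_variables` list both raise `ValueError` at initialization, so scheduling problems surface before the pipeline runs rather than at runtime.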
Notes for the reviewer
Checklist
I have added a conventional commit prefix to the PR title (`fix:`, `feat:`, `build:`, `chore:`, `ci:`, `docs:`, `style:`, `refactor:`, `perf:`, `test:`) and added `!` because the PR includes breaking changes.