feat: upgrade to transformers v5 (#659)
dushyantbehl merged 21 commits into foundation-model-stack:main
Conversation
Thanks for making a pull request! 😃
…xtToText Signed-off-by: Yash Mehan <yashmehan@gmail.com>
…_tokens_seen
…d llama tokenizer
…k case
…ay, adding justification for what was renamed
Signed-off-by: Yash Mehan <71321431+yash4242@users.noreply.github.com>
dushyantbehl left a comment:
Delete the commented-out code; retain comments only in code files, not in test files.
Fix the pyproject file as per the review comments.
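A pyproject fix for an upgrade like this usually means raising the dependency floors. A minimal illustrative sketch: only the floors trl >= 0.27 and transformers >= 5.2 come from this PR; everything else here (the table layout, absence of upper bounds) is an assumption, not the project's actual file.

```toml
# Illustrative fragment only -- the real pyproject.toml of fms-hf-tuning
# contains many more fields and dependencies.
[project]
dependencies = [
    "transformers>=5.2",  # floor stated in this PR
    "trl>=0.27",          # floor stated in this PR
]
```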
dushyantbehl left a comment:
Good progress @yash4242. Minor changes requested, along with updates to the Dockerfile to match the incoming changes.
/build
Build failed for
Please have a look at the PR. trl >= 0.27 and transformers >= 5.2 are supported. It should pass the lint check as well as the other checks.
/build
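The version floors stated above (trl >= 0.27, transformers >= 5.2) can be sanity-checked at runtime. A minimal sketch, assuming a naive numeric comparison; the helper names are illustrative and not part of fms-hf-tuning, and production code should prefer `packaging.version` for pre-release handling:

```python
from importlib.metadata import PackageNotFoundError, version

def version_tuple(v: str) -> tuple:
    """Convert '5.2.0' -> (5, 2, 0); skips non-numeric parts (naive,
    so pre-release suffixes like '5.2rc1' lose their second component)."""
    return tuple(int(p) for p in v.split(".") if p.isdigit())

def meets_minimum(installed: str, minimum: str) -> bool:
    """True if installed >= minimum under naive numeric tuple comparison."""
    return version_tuple(installed) >= version_tuple(minimum)

def check_package(pkg: str, minimum: str) -> bool:
    """Return True only if pkg is installed at or above the given floor."""
    try:
        return meets_minimum(version(pkg), minimum)
    except PackageNotFoundError:
        return False
```

Usage would look like `check_package("transformers", "5.2")` and `check_package("trl", "0.27")` before constructing the trainer.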
dushyantbehl left a comment:
Looks good to me. Good work @yash4242!
Build succeeded for
This reverts commit 09810e3.
Description of the change
Makes fms-hf-tuning compatible with transformers v5.
Related issue number
NA
How to verify the PR
Run the tests. As of now, some tests fail, most due to a tensor size mismatch; the error surfaces inside the trainer.train() call.
Was the PR tested
Yes.
Test status is as above. I verified with 9 pytest runs, as follows: