Conversation

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs remain available until 30 days after the last update.
```python
SUPPORTED_ARCHITECTURES += ("data2vec-text", "flaubert", "xlm")
```

```python
if is_transformers_version("!=", "4.52"):
    SUPPORTED_ARCHITECTURES += ("convbert",)
```
This is failing in v5.2 (https://github.com/huggingface/transformers/blob/v5.2.0/src/transformers/modeling_utils.py#L2315) and fixed in v5.3 by huggingface/transformers@a64996e.
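For context, the gate above follows the usual version-check pattern. A minimal standalone sketch of that pattern (the real `is_transformers_version` helper lives in optimum's import utilities and reads the installed transformers version; here the installed version is passed explicitly, so names and signatures are illustrative only):

```python
import operator

# Minimal sketch of a version-gate helper, modeled on the
# is_transformers_version(op, version) pattern used above.
_OPS = {
    "<": operator.lt, "<=": operator.le, "==": operator.eq,
    "!=": operator.ne, ">=": operator.ge, ">": operator.gt,
}

def _parse(version: str) -> tuple:
    # "4.52.0" -> (4, 52, 0); only handles plain dotted-integer versions
    return tuple(int(part) for part in version.split("."))

def version_matches(installed: str, op: str, target: str) -> bool:
    a, b = _parse(installed), _parse(target)
    # pad to equal length so "4.52" and "4.52.0" compare equal
    n = max(len(a), len(b))
    a += (0,) * (n - len(a))
    b += (0,) * (n - len(b))
    return _OPS[op](a, b)

# Gating a test-architecture tuple on the running version:
SUPPORTED_ARCHITECTURES = ("data2vec-text", "flaubert", "xlm")
if version_matches("4.53.0", "!=", "4.52"):  # hypothetical installed version
    SUPPORTED_ARCHITECTURES += ("convbert",)
```

This sidesteps a dependency on `packaging.version` for the sketch; real code should use a proper version parser for pre-release and post-release tags.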
```diff
-"optimum-onnx@git+https://github.com/huggingface/optimum-onnx.git@transformers-v5",
-"transformers>=4.45,<5.1",
+"optimum-onnx@git+https://github.com/huggingface/optimum-onnx.git@xadupre/transformers5",
+"transformers>=4.57,<5.5",
```
Setting the minimum transformers version to v4.57; will replicate this everywhere once validated. cc @rkazants
```diff
 "torch>=2.1",
-"optimum-onnx@git+https://github.com/huggingface/optimum-onnx.git@transformers-v5",
-"transformers>=4.45,<5.1",
+"optimum-onnx@git+https://github.com/huggingface/optimum-onnx.git@transformers-v5.5",
```
Suggested change:

```diff
-"optimum-onnx@git+https://github.com/huggingface/optimum-onnx.git@transformers-v5.5",
+"optimum@git+https://github.com/huggingface/optimum",
```
This should be updated before merging (once #1690 is merged, and huggingface/optimum#2430 for v5.4).
```python
# Add the attention_mask inputs when needed
if "attention_mask" in self.input_names:
    if attention_mask is None:
```
Before v5.2, the attention_mask was created in generate by calling _prepare_attention_mask_for_generation (https://github.com/huggingface/transformers/blob/v5.1.0/src/transformers/generation/utils.py#L2530). Since v5.2, this is no longer the case for encoder-decoder models (https://github.com/huggingface/transformers/blob/v5.2.0/src/transformers/generation/utils.py#L2555).
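The fallback the quoted snippet guards is essentially "derive the mask from the pad token when the caller did not pass one". A pure-Python stand-in for that tensor logic (the function name, list-based shapes, and pad-token convention are illustrative, not the actual modeling code):

```python
# Build attention_mask from input_ids when generate() no longer supplies
# one (encoder-decoder models since v5.2, per the comment above).
# 1 = attend to this position, 0 = masked-out padding.
def prepare_attention_mask(input_ids, pad_token_id=None):
    if pad_token_id is None:
        # no pad token known: attend to every position
        return [[1] * len(row) for row in input_ids]
    return [[0 if token == pad_token_id else 1 for token in row] for row in input_ids]

batch = [[5, 7, 9, 0, 0], [5, 7, 9, 9, 9]]  # 0 plays the pad token here
mask = prepare_attention_mask(batch, pad_token_id=0)
```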
```python
    SUPPORTED_ARCHITECTURES += ("llava_next_video",)
else:
    UNSUPPORTED_ARCHITECTURES.update({"got_ocr2", "idefics3", "llama4", "llava_next_video", "smolvlm"})

_is_model_supported = {
```
TODO: extend to all architectures (waiting for validation, the minimum required transformers version is set to 4.57); this can be done in a follow-up PR if needed.
```python
# known issues with marian on OpenVINO 2025.3.x and 2025.4.x
# TODO: add fix for v5 and update MAX_TRANSFORMERS_VERSION accordingly (mt5)
_is_model_supported = {
    "m2m_100": is_transformers_version("!=", "5.4"),
```
The issue comes from a mismatch in https://github.com/huggingface/transformers/blob/v5.4.0/src/transformers/models/m2m_100/configuration_m2m_100.py#L68 (introduced in v5.4, fixed in v5.5).