Inconsistencies in model paths on Hugging Face #431

@msluszniak

Description

@msluszniak

As mentioned in #382, I'm moving this into a separate issue.
Inconsistencies in naming across the Hugging Face repository:

  • Only llama3_2-3B_qat_lora.pte uses - between the name and the parameter count; all the others use _.
  • For WHISPER_TINY_DECODER, en is joined to the first part of the URL with . rather than with - as in the other examples.
  • The word all appears in efficientnet_v2_s_coreml_all and in no other model.
  • For some models, the repository name and the file name use the same separator (- in both phi-4-mini and phi-4-mini_bf16.pte), while others mix - and _ (e.g. qwen-2.5 vs. qwen2_5_3b_8da4w.pte).
  • Some model names put the backend info at the beginning (xnnpack_whisper_encoder.pte), others at or near the end (style_transfer_udnie_xnnpack.pte).
  • In ssdlite320-mobilenet-v3-large, v3 is joined with -, but not in the file name: ssdlite320-mobilenetv3-large.pte.
  • For the CRAFT detector, the number denotes the size of the input tensor, while for the other models a numeric suffix denotes model size.
  • For some models, the B/M (billions/millions of parameters) is capitalized in both the URL and the file name (smolLm-2-135M and smolLm2_135M_bf16.pte), while for others it is not (qwen-2.5-1.5B and qwen2_5_0_5b_8da4w.pte).

This should be unified in a separate task.
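
One possible direction for the unification task (purely illustrative, not an agreed convention): pick a single canonical pattern and lint every artifact name against it. A minimal sketch in Python, assuming a lowercase snake_case scheme ending in `.pte` — the scheme itself is a hypothetical example, not the one the maintainers will choose:

```python
import re

# Hypothetical canonical scheme (an assumption, not an agreed convention):
# lowercase snake_case segments joined with "_", ending in ".pte",
# e.g. qwen2_5_3b_8da4w.pte
CANONICAL = re.compile(r"^[a-z0-9]+(_[a-z0-9]+)*\.pte$")

def check_name(name: str) -> list[str]:
    """Return a list of problems found in a model file name (empty = conforms)."""
    problems = []
    if not CANONICAL.match(name):
        problems.append("not lowercase snake_case ending in .pte")
    if "-" in name:
        problems.append("uses '-' instead of '_'")
    if any(c.isupper() for c in name):
        problems.append("contains uppercase characters")
    return problems

# Names from this issue: the first two already fit, the third does not.
print(check_name("qwen2_5_3b_8da4w.pte"))       # []
print(check_name("xnnpack_whisper_encoder.pte")) # []
print(check_name("llama3_2-3B_qat_lora.pte"))    # three problems flagged
```

A check like this could run in CI over the repository listing, so any upload that deviates from the chosen scheme is caught before it lands.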

Labels: huggingface (Issues and tasks related to HuggingFace repository)