.. toctree::
   :maxdepth: 2
   :caption: Contents

   What is a Pretrained Model? <pretrain_model_description>
   Subword Tokenization for Pretraining <subword>
   ELMo <ELMo>
   Transformer <transformer>
   Transformer-XL <Transformer-XL>
   Longformer <longformer>
   GPT <GPT>
   XLNet <XLNet>
   BERT <bert>
   SpanBERT <SpanBERT>
   RoBERTa <RoBERTa>
   ERNIE <erine>
   ERNIE 3.0 <ERNIE3>
   ERNIE-Gram <ERNIE-Gram>
   ERNIE-Doc <ERNIE-Doc>
   THU-ERNIE <THU-ERNIE>
   SKEP <skep>
   KBERT <KBERT>
   ALBERT <ALBERT>
   ELECTRA <ELECTRA>
   Performer <Performer>