
BertLMHeadModel.from_pretrained("bert-base-uncased", config=encoder_config) #47

@baipeng1

Description


In https://github.com/OpenMOSS/AnyGPT/blob/main/seed2/seed_qformer/blip2.py, the line

Qformer = BertLMHeadModel.from_pretrained("bert-base-uncased", config=encoder_config)

loads pretrained BERT. Because my server cannot reach Hugging Face, I downloaded
https://huggingface.co/google-bert/bert-base-uncased/tree/main in advance,
but then I get this error:

Some weights of BertLMHeadModel were not initialized from the model checkpoint at <local path> and are newly initialized: xx

Is the bert-base-uncased I downloaded wrong?
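
For reference, a minimal sketch of the local-load pattern described above. The directory path is hypothetical, and encoder_config here is just a plain BertConfig standing in for the modified config that blip2.py builds before this call:

from transformers import BertConfig, BertLMHeadModel

# Hypothetical path to the pre-downloaded bert-base-uncased files
local_bert_path = "/path/to/bert-base-uncased"

# Plain config load; blip2.py customizes this config before passing it in
encoder_config = BertConfig.from_pretrained(local_bert_path)
Qformer = BertLMHeadModel.from_pretrained(local_bert_path, config=encoder_config)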
