Our paper: https://doi.org/10.1007/978-3-031-17189-5_4
This project uses the pre-trained Chinese BERT (BERT-wwm-ext) released by the Harbin Institute of Technology. The model's GitHub repository is: https://github.com/ymcui/Chinese-BERT-wwm
You can download the pre-trained BERT weights from the following link:
https://drive.google.com/file/d/1iNeYFhCBJWeUsIlnW_2K6SMwXkM4gLb_/view
After downloading, place the pre-trained BERT weights in the pretrained_model/chinese_roberta_wwm_base_ext_pytorch directory.
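Before training, you can check that the weights load correctly. The following is a minimal sketch that assumes the Hugging Face transformers library and the directory layout described above; it is only an illustration, not part of the project's own code.

from transformers import BertTokenizer, BertModel

# Directory where the downloaded weights were placed (see above)
model_dir = "pretrained_model/chinese_roberta_wwm_base_ext_pytorch"

# RoBERTa-wwm-ext weights are loaded with the BERT classes
tokenizer = BertTokenizer.from_pretrained(model_dir)
model = BertModel.from_pretrained(model_dir)

# Encode a short sentence and run a forward pass as a sanity check
inputs = tokenizer("你好,世界", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # expected hidden size: 768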
Then run:
python run.py
If you encounter any issues while using this project, feel free to reach out by email: [email protected]
Thank you!