FuDFEND


Our paper: https://doi.org/10.1007/978-3-031-17189-5_4

This project uses the pre-trained Chinese BERT (BERT-wwm-ext) released by Harbin Institute of Technology as its BERT weights. The GitHub repository for those weights is: https://github.com/ymcui/Chinese-BERT-wwm

Downloading Pre-trained Weights

You can download the BERT pre-trained weights from the following link:

Weight link: https://drive.google.com/file/d/1iNeYFhCBJWeUsIlnW_2K6SMwXkM4gLb_/view

After downloading, please place the BERT pre-trained weights in the pretrained_model/chinese_roberta_wwm_base_ext_pytorch directory.

Usage

python run.py

Contact Information

If you encounter any issues while using this project, please do not hesitate to reach out to me via email.

My email address: [email protected]

Thank you!

