
Commit

add coling paper link
jiesutd committed Jun 13, 2018
1 parent 5933f6e commit 733a04a
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -2,7 +2,7 @@ NCRF++: An Open-source Neural Sequence Labeling Toolkit
======
Sequence labeling models are quite popular in many NLP tasks, such as Named Entity Recognition (NER), part-of-speech (POS) tagging and word segmentation. State-of-the-art sequence labeling models mostly utilize the CRF structure with input word features. LSTM (or bidirectional LSTM) is a popular deep-learning-based feature extractor in sequence labeling tasks, and CNN can also be used due to its faster computation. Besides, features within words are also useful for representing words; these can be captured by a character LSTM, a character CNN, or human-defined neural features.

- NCRF++ is a PyTorch based framework with flexible choices of input features and output structures. The design of neural sequence labeling models with NCRF++ is fully configurable through a configuration file, which does not require any code work. NCRF++ is a neural version of [CRF++](http://taku910.github.io/crfpp/), a famous statistical CRF framework. The detailed experiment report using NCRF++ has been accepted at COLING 2018.
+ NCRF++ is a PyTorch based framework with flexible choices of input features and output structures. The design of neural sequence labeling models with NCRF++ is fully configurable through a configuration file, which does not require any code work. NCRF++ is a neural version of [CRF++](http://taku910.github.io/crfpp/), a famous statistical CRF framework. The detailed experiment report using NCRF++ has been accepted at [COLING 2018](https://arxiv.org/abs/1806.04470).

NCRF++ supports different structure combinations on three levels: character sequence representation, word sequence representation and inference layer.
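Such a combination can be selected entirely in the configuration file. The sketch below is illustrative only; the parameter names are assumed to follow the style of NCRF++'s demo configs and may differ from the actual file:

```
# Hypothetical NCRF++-style config sketch (parameter names assumed, not verified)
char_seq_feature=CNN     # character-level representation: CNN or LSTM
word_seq_feature=LSTM    # word-level representation: LSTM or CNN
use_char=True            # enable character features
use_crf=True             # inference layer: CRF instead of softmax
```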

@@ -72,7 +72,7 @@ By default, `LSTM` is a bidirectional LSTM.
| 3 | WordCNN | 88.56 | 90.46 | 90.30 |
| 4 | WordCNN+CRF | 88.90 | 90.70 | 90.43 |

- We have compared twelve neural sequence labeling models (`{charLSTM, charCNN, None} x {wordLSTM, wordCNN} x {softmax, CRF}`) on three benchmarks (POS, Chunking, NER) under statistical experiments; detailed results and comparisons can be found in our COLING 2018 paper (coming soon).
+ We have compared twelve neural sequence labeling models (`{charLSTM, charCNN, None} x {wordLSTM, wordCNN} x {softmax, CRF}`) on three benchmarks (POS, Chunking, NER) under statistical experiments; detailed results and comparisons can be found in our COLING 2018 paper [Design Challenges and Misconceptions in Neural Sequence Labeling](https://arxiv.org/abs/1806.04470).
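The twelve models above come from the Cartesian product of the three design choices; a quick sketch of the enumeration (the model names here are just labels, not NCRF++ identifiers):

```python
from itertools import product

# Three design axes: character representation, word representation, inference layer
char_layers = ["charLSTM", "charCNN", "None"]
word_layers = ["wordLSTM", "wordCNN"]
inference = ["softmax", "CRF"]

# 3 x 2 x 2 = 12 model variants
models = ["+".join(combo) for combo in product(char_layers, word_layers, inference)]
print(len(models))  # 12
```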


4. External feature defining
