
Gated attention with Open NMT

Introduction

My main modification is in the global attention module (`onmt/modules/global_attention.py`), where I combine the gated-attention idea from the paper 'Not All Attention Is Needed' with the Luong global attention methods from the paper 'Effective Approaches to Attention-based Neural Machine Translation'. I also modified the interface between the OpenNMT source code (the encoder and decoder files) and the newly added code. The changes to the original source are kept small, which makes it easier to compare against other models later.
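
The sketch below illustrates the general combination, not the repository's actual code: Luong-style 'general' alignment scores are computed as usual, and a learned sigmoid gate then scales how much each source position contributes before the weights are renormalized. The class name, gate design, and all variable names are illustrative assumptions (the paper's gate is a hard binary gate; a soft gate is used here for simplicity).

```python
# Minimal sketch of gated Luong global attention (assumptions, not the repo's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedLuongAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.linear_in = nn.Linear(dim, dim, bias=False)    # Luong "general" score
        self.gate = nn.Linear(dim * 2, 1)                   # per-position soft gate
        self.linear_out = nn.Linear(dim * 2, dim, bias=False)

    def forward(self, query, memory):
        # query:  (batch, tgt_len, dim)  decoder hidden states
        # memory: (batch, src_len, dim)  encoder hidden states
        # Luong "general" alignment: score(h_t, h_s) = h_t^T W h_s
        scores = torch.bmm(self.linear_in(query), memory.transpose(1, 2))
        align = F.softmax(scores, dim=-1)                   # (batch, tgt_len, src_len)

        # Gate in [0, 1] for each (target, source) pair, conditioned on the
        # decoder state and the encoder state it would attend to.
        b, t, s = scores.shape
        q_exp = query.unsqueeze(2).expand(b, t, s, query.size(-1))
        m_exp = memory.unsqueeze(1).expand(b, t, s, memory.size(-1))
        g = torch.sigmoid(self.gate(torch.cat([q_exp, m_exp], dim=-1))).squeeze(-1)

        # Suppress gated-out positions and renormalize the attention weights.
        gated = align * g
        gated = gated / (gated.sum(dim=-1, keepdim=True) + 1e-8)

        # Context vector and Luong's attentional hidden state.
        context = torch.bmm(gated, memory)                  # (batch, tgt_len, dim)
        out = torch.tanh(self.linear_out(torch.cat([context, query], dim=-1)))
        return out, gated
```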
