Mirror of https://github.com/zebrajr/pytorch.git, synced 2026-01-15 12:15:51 +00:00
Summary: Implement dot attention as described in https://arxiv.org/abs/1508.04025. This saves the computation of weighted encoder outputs in `rnn_cell.py`. When the encoder and decoder dimensions differ, we apply an FC layer, which corresponds to the "general" score function below Figure 2 of the paper. Also refactored the unit tests.

Reviewed By: jhcross

Differential Revision: D5486976

fbshipit-source-id: f9e9aea675b3b072fbe631bc004199b90a9d95cb
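The commit message describes Luong-style dot attention, with a projection added when encoder and decoder dimensions differ. The actual diff is not shown here, so as a minimal, hypothetical NumPy sketch (function name `dot_attention` and the `proj` argument are illustrative, not the repo's API), the idea looks roughly like this:

```python
import numpy as np

def dot_attention(decoder_state, encoder_outputs, proj=None):
    """Dot attention in the style of https://arxiv.org/abs/1508.04025.

    decoder_state:   (batch, decoder_dim)
    encoder_outputs: (batch, src_len, encoder_dim)
    proj:            optional (decoder_dim, encoder_dim) matrix; applying it
                     corresponds to the "general" score used when the
                     encoder and decoder dimensions differ.
    """
    # Project the decoder state into the encoder space if needed.
    query = decoder_state if proj is None else decoder_state @ proj

    # score(h_t, h_s) = h_t . h_s  ->  (batch, src_len)
    scores = np.einsum('bse,be->bs', encoder_outputs, query)

    # Softmax over source positions (numerically stabilized).
    exp = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights = exp / exp.sum(axis=1, keepdims=True)

    # Weighted sum of encoder outputs  ->  (batch, encoder_dim)
    context = np.einsum('bs,bse->be', weights, encoder_outputs)
    return context, weights
```

Because the score is a plain dot product, no per-step transformation of the encoder outputs is required, which is the saving over attention variants that precompute weighted encoder outputs.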