attention-mechanism
Here are 612 public repositories matching this topic...


Need help with retraining and cross-validation to check whether the ROUGE scores match (or exceed) the numbers reported in the paper.
I trained for 500k iterations (batch size 8) with pointer generation enabled and coverage loss disabled, then for another 100k iterations (batch size 8) with both pointer generation and coverage loss enabled.
It would be great if someone could help re-run this.
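For anyone picking this up, here is a minimal sketch of the coverage loss term that gets switched on in the second training phase. This follows the standard formulation from the pointer-generator paper (penalize attention mass that lands on already-attended source tokens); the function name and the NumPy-based shape conventions are my own assumptions, not taken from this repo's code.

```python
import numpy as np

def coverage_loss(attn_dists):
    """Coverage loss for a single example (See et al., 2017 formulation).

    attn_dists: array of shape (T, src_len), where row t is the decoder's
    attention distribution over source tokens at step t.

    At each step, the penalty is sum_i min(attn_t[i], coverage_t[i]),
    where coverage_t is the sum of attention from all previous steps.
    Re-attending to already-covered tokens is penalized; attending to
    fresh tokens is free. Returns the mean penalty over decoder steps.
    """
    coverage = np.zeros(attn_dists.shape[1])  # nothing covered yet
    loss = 0.0
    for attn in attn_dists:
        loss += np.minimum(attn, coverage).sum()
        coverage += attn  # accumulate attention into the coverage vector
    return loss / len(attn_dists)
```

In the two-phase schedule described above, this term is simply added to the negative log-likelihood loss (weighted by a hyperparameter, typically 1.0) for the final 100k iterations, after the model has already converged with pointer generation alone.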