This repository contains several implementations of the attention mechanism widely used in deep learning. The goal is to build intuition for how the attention module works, in the context of NLP.
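As a minimal sketch of the core idea (not necessarily the code in this repository), scaled dot-product attention maps a set of queries, keys, and values to outputs: each output is a value-weighted average, with weights given by a softmax over query-key similarities. The function name and shapes below are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention.

    Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values.
    Returns an (n_q, d_v) array of value-weighted averages.
    """
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d_k) for stability
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax (subtract the max for numerical stability)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value rows
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))   # 2 queries of dimension 4
K = rng.standard_normal((3, 4))   # 3 keys of dimension 4
V = rng.standard_normal((3, 4))   # 3 values of dimension 4
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

In self-attention, Q, K, and V are all linear projections of the same input sequence, so every token can attend to every other token.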
# OttoVintola/attention
## About
Implementation, explanation, and intuition of the attention module proposed by Google Brain in 2017.