OttoVintola/attention

Attention

This repository contains several implementations of the attention mechanism widely used in deep learning. The aim is to build intuition for the attention module in the domain of NLP.
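
For a quick sense of what the mechanism computes, below is a minimal NumPy sketch of scaled dot-product attention as defined in the 2017 paper; the function names, shapes, and the toy example are illustrative assumptions and are not taken from this repository's code.

```python
# Minimal sketch of scaled dot-product attention (Vaswani et al., 2017).
# Shapes and names are illustrative, not this repository's actual code.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row-wise max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v) -> output: (n_q, d_v)."""
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Each row of the attention weights sums to 1 over the keys.
    weights = softmax(scores, axis=-1)
    # The output is a weighted average of the value vectors.
    return weights @ V

# Toy usage: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 16))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 16)
```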

About

Implementation, explanation, and intuition of the attention module proposed by Google Brain in "Attention Is All You Need" (Vaswani et al., 2017).
