Tristan-Chow/PositionEncoding

In Google's 2017 paper "Attention Is All You Need", the input to the encoder stack is the sum of word embeddings and positional encodings; the positional encodings inject information about the order of tokens in the sequence. This project implements that positional-encoding component of the paper.
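For reference, below is a minimal NumPy sketch of the sinusoidal position encoding defined in the paper, added to the word embeddings as the encoder input. The function name, shapes, and the example `d_model`/sequence length are illustrative assumptions, not this repository's actual API.

```python
import numpy as np

def sinusoidal_position_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Return a (max_len, d_model) matrix of sinusoidal position encodings.

    Paper formulas:
        PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
        PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(max_len)[:, np.newaxis]            # (max_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]           # (1, d_model/2)
    angle_rates = 1.0 / np.power(10000.0, dims / d_model)    # (1, d_model/2)
    angles = positions * angle_rates                         # (max_len, d_model/2)

    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)   # even dimensions use sine
    pe[:, 1::2] = np.cos(angles)   # odd dimensions use cosine
    return pe

# Hypothetical usage: the encoder input is the element-wise sum of
# the word embeddings and the position encodings for each position.
word_embeddings = np.random.randn(10, 512)   # (seq_len, d_model), illustrative values
encoder_input = word_embeddings + sinusoidal_position_encoding(10, 512)
```

Because each dimension is a sinusoid of a different wavelength, relative positions can be expressed as linear functions of the encodings, which is the motivation given in the paper for this choice.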
About

The positional-embedding component from Google's paper "Attention Is All You Need".