src/layers/rotary.jl: Add reference to original source.
mashu committed Nov 14, 2024
1 parent d887d93 commit 5abe0e8
Showing 1 changed file with 20 additions and 1 deletion.
21 changes: 20 additions & 1 deletion src/layers/rotary.jl
@@ -1,3 +1,22 @@
"""
Rotary Position Embeddings (RoPE)
This is a port of the RoPE implementation from NeuralAttentionlib.jl, which is an implementation of
the Rotary Position Embeddings (RoPE) described in the RoFormer paper.
Original sources:
- Paper: "RoFormer: Enhanced Transformer with Rotary Position Embedding"
Authors: Jianlin Su, Yu Lu, Shengfeng Pan, Ahmed Murtadha, Bo Wen
URL: https://arxiv.org/abs/2104.09864
- Code: NeuralAttentionlib.jl
Author: chengchingwen
Repository: https://github.com/chengchingwen/NeuralAttentionlib.jl
RoPE encodes absolute positional information with a rotation matrix that naturally
incorporates explicit relative position dependency in self-attention formulation.
"""

"""
Calculate position-dependent frequency.
"""
@@ -135,4 +154,4 @@ function ChainRulesCore.rrule(::typeof(with_rotary_position_embedding), x::Abstr
end

return y, rotary_pullback
end
end
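The commit only adds documentation, but the rotation the docstring describes is compact enough to sketch. The following is an illustrative, self-contained version, not this package's actual API: the names `rope_frequencies` and `apply_rope` are hypothetical. Each consecutive pair of feature dimensions is rotated by a position-dependent angle with frequencies `base^(-2i/d)`, as in the RoFormer paper.

```julia
# Hypothetical minimal RoPE sketch; not the NeuralAttentionlib.jl implementation.
# x is a (dim, seq_len) matrix of features; dim is assumed even.

function rope_frequencies(dim::Int, len::Int; base=10_000.0)
    # One frequency per feature pair: base^(-2i/dim) for i = 0, 1, ..., dim/2 - 1
    inv_freq = base .^ (-(0:2:dim-2) ./ dim)
    # Angle matrix of size (dim ÷ 2, len): angle = position * frequency
    return (0:len-1)' .* inv_freq
end

function apply_rope(x::AbstractMatrix)
    dim, len = size(x)
    θ = rope_frequencies(dim, len)
    s, c = sin.(θ), cos.(θ)
    x1 = x[1:2:end, :]   # even-indexed features (first of each pair)
    x2 = x[2:2:end, :]   # odd-indexed features (second of each pair)
    y = similar(x)
    # Standard 2D rotation applied independently to each feature pair
    y[1:2:end, :] = x1 .* c .- x2 .* s
    y[2:2:end, :] = x1 .* s .+ x2 .* c
    return y
end
```

Because each pair is only rotated, the transform is norm-preserving, and position 0 gets the identity rotation; these are cheap sanity checks for any RoPE port.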
