A Julia package providing various positional embedding implementations for enriching sequence data with position information.
- **Absolute Positional Embeddings (`AbsolutePE`)**: the sinusoidal position embeddings from *Attention Is All You Need*
- **Rotary Position Embeddings (`RoPE`)**: the rotary position embeddings from *RoFormer: Enhanced Transformer with Rotary Position Embedding*
```julia
using Pkg
Pkg.add("PositionalEmbeddings")
```
```julia
using PositionalEmbeddings

# Absolute Positional Embeddings
pe = AbsolutePE(512, 1024)         # embedding_size=512, max_length=1024
x  = randn(Float32, 100, 512, 32)  # (seq_len, channels, batch)
x_with_pos = pe(x)
```
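For intuition, the sinusoidal scheme can be computed directly from the formula in *Attention Is All You Need*: each position gets interleaved sine/cosine values at geometrically spaced frequencies. The sketch below is illustrative plain Julia, not this package's internals; the function name `sinusoidal_pe` is an assumption for the example.

```julia
# Minimal sketch of sinusoidal embeddings (illustrative, not the package's API):
# PE[pos, 2i-1] = sin(pos / 10000^((2i-2)/d)), PE[pos, 2i] = cos(same angle),
# using 1-based positions shifted so the first position corresponds to pos = 0.
function sinusoidal_pe(seq_len::Int, d::Int)
    pe = zeros(Float32, seq_len, d)
    for pos in 1:seq_len, i in 1:2:d
        freq = 1f0 / 10000f0^((i - 1) / d)   # frequency for this dimension pair
        pe[pos, i] = sin((pos - 1) * freq)
        if i + 1 <= d
            pe[pos, i + 1] = cos((pos - 1) * freq)
        end
    end
    return pe
end

pe_table = sinusoidal_pe(100, 512)  # one (seq_len, d) table, added to the input
```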
```julia
# Rotary Position Embeddings
rope = RoPE(512, 1024)                # head_dim=512, max_length=1024
x = randn(Float32, 512, 100, 2 * 32)  # (head_dim, seq_len, nheads * batch_size)
x_with_pos = rope(x)
```
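Conceptually, RoPE treats consecutive feature pairs as 2D points and rotates each pair by a position-dependent angle, so query/key dot products end up depending only on relative position. A self-contained sketch of that rotation (illustrative, assuming an even `head_dim`; not the package's internal implementation):

```julia
# Rotate consecutive feature pairs (x[i], x[i+1]) at each position by
# θ = pos * 10000^(-(i-1)/d), as in the RoFormer paper (illustrative sketch).
function rope_rotate(x::AbstractMatrix)  # x: (head_dim, seq_len), head_dim even
    d, n = size(x)
    out = similar(x)
    for pos in 1:n, i in 1:2:d
        θ = (pos - 1) * 10000f0^(-(i - 1) / d)
        c, s = cos(θ), sin(θ)
        out[i, pos]     = c * x[i, pos] - s * x[i + 1, pos]
        out[i + 1, pos] = s * x[i, pos] + c * x[i + 1, pos]
    end
    return out
end
```

Because the rotation at the first position uses angle zero, that column passes through unchanged; later positions are rotated progressively more, and the rotation preserves each pair's norm.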
For a complete example of a `RoPEMultiHeadAttention` implementation, please see the documentation. To keep dependencies minimal, trainable parameters must be specified explicitly by the user.
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.