
PositionalEmbeddings.jl


A Julia package providing various positional embedding implementations for enriching sequence data with position information.

Features

- AbsolutePE: absolute positional embeddings added to an input sequence
- RoPE: Rotary Position Embeddings applied to query and key tensors

Installation

using Pkg
Pkg.add("PositionalEmbeddings")
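
Equivalently, from the Pkg REPL (press ] at the julia> prompt):

pkg> add PositionalEmbeddings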

Quick Start

using PositionalEmbeddings

# Absolute Positional Embeddings
pe = AbsolutePE(512, 1024)  # embedding_size=512, max_length=1024
x = randn(Float32, 100, 512, 32)  # (seq_len, channels, batch)
x_with_pos = pe(x)

# Rotary Position Embeddings
rope = RoPE(512, 1024)  # head_dim=512, max_length=1024
x = randn(Float32, 512, 100, 2*32)  # (head_dim, seq_len, (nhead*batch_size))
x_with_pos = rope(x)
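
The RoPE call above expects the head and batch dimensions merged into one. Below is a minimal sketch (not part of the package's API) of one way to flatten a per-head 4-D tensor into that layout and restore it afterwards, assuming rope returns an array of the same shape as its input:

head_dim, seq_len, nhead, batch = 512, 100, 2, 32
q = randn(Float32, head_dim, seq_len, nhead, batch)        # per-head layout
q_flat = reshape(q, head_dim, seq_len, nhead * batch)      # merge head and batch dims
q_rot  = rope(q_flat)                                      # apply rotary embeddings
q_back = reshape(q_rot, head_dim, seq_len, nhead, batch)   # restore the per-head layout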

For a complete example of a RoPEMultiHeadAttention implementation, please see the documentation; a minimal standalone sketch is also shown below. To keep dependencies minimal, trainable parameters must be specified by the user.
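
This sketch is illustrative only: the names and shapes are assumptions based on the Quick Start above, not the package's exported API. It applies RoPE to queries and keys inside an unmasked, single-head scaled dot-product attention:

using PositionalEmbeddings

# Row-wise softmax over the keys dimension; a tiny stand-in to avoid extra dependencies.
function rowsoftmax(A)
    E = exp.(A .- maximum(A; dims=2))
    return E ./ sum(E; dims=2)
end

head_dim, seq_len, batch = 64, 100, 8
rope = RoPE(head_dim, 1024)

q = randn(Float32, head_dim, seq_len, batch)   # queries
k = randn(Float32, head_dim, seq_len, batch)   # keys
v = randn(Float32, head_dim, seq_len, batch)   # values

q, k = rope(q), rope(k)                        # rotate queries and keys only

out = similar(v)
for b in 1:batch
    scores = (q[:, :, b]' * k[:, :, b]) ./ sqrt(Float32(head_dim))  # (seq_len, seq_len)
    attn   = rowsoftmax(scores)                # attention weights, no masking shown
    out[:, :, b] = v[:, :, b] * attn'          # weighted sum of values per query
end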

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.