# aft-pytorch

Unofficial PyTorch implementation of Attention Free Transformer (AFT) layers by Zhai et al. [abs, pdf] from Apple Inc.
## Installation
You can install `aft-pytorch` via `pip`:

```
pip install aft-pytorch
```
## Usage
You can import the AFT-Full or AFT-Simple layer (as described in the paper) from the package like so:
### AFTFull
```python
import torch
from aft_pytorch import AFTFull

layer = AFTFull(
    max_seqlen=20,
    dim=512,
    hidden_dim=64
)

# a batch of 32 sequences, each with 10 timesteps of dimension 512
x = torch.rand(32, 10, 512)
y = layer(x) # [32, 10, 512]
```
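For context, AFT-Full is intended as a drop-in replacement for multi-head attention, so it would typically sit inside a Transformer-style block. Below is a minimal sketch of one such block; `AFTBlock`, its pre-norm layout, and the feedforward sizes are illustrative assumptions, not part of this package:

```python
import torch
from torch import nn
from aft_pytorch import AFTFull

class AFTBlock(nn.Module):
    """Hypothetical pre-norm block: an AFT layer plus a position-wise
    feedforward, each wrapped in a residual connection."""
    def __init__(self, max_seqlen, dim, hidden_dim):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.aft = AFTFull(max_seqlen=max_seqlen, dim=dim, hidden_dim=hidden_dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(
            nn.Linear(dim, 4 * dim),
            nn.GELU(),
            nn.Linear(4 * dim, dim),
        )

    def forward(self, x):
        x = x + self.aft(self.norm1(x))  # AFT stands in for self-attention here
        x = x + self.ff(self.norm2(x))
        return x

block = AFTBlock(max_seqlen=20, dim=512, hidden_dim=64)
x = torch.rand(32, 10, 512)
y = block(x)  # [32, 10, 512]
```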
### AFTSimple
```python
import torch
from aft_pytorch import AFTSimple

layer = AFTSimple(
    max_seqlen=20,
    dim=512,
    hidden_dim=64
)

# a batch of 32 sequences, each with 10 timesteps of dimension 512
x = torch.rand(32, 10, 512)
y = layer(x) # [32, 10, 512]
```
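For intuition: with the pairwise position biases set to zero, the paper's AFT-Simple reduces to an element-wise softmax-weighted sum of values over the sequence, gated by `sigmoid(Q)`. The sketch below mirrors that equation on hypothetical pre-projected `q`, `k`, `v` tensors; it is not the package's internal implementation:

```python
import torch

# hypothetical projected tensors: batch=32, seqlen=10, dim=512
q = torch.rand(32, 10, 512)
k = torch.rand(32, 10, 512)
v = torch.rand(32, 10, 512)

# AFT-Simple: Y_t = sigmoid(Q_t) * sum_t'( softmax(K)_{t'} * V_{t'} )
weights = torch.softmax(k, dim=1)                 # softmax over the sequence axis
context = (weights * v).sum(dim=1, keepdim=True)  # [32, 1, 512], shared by all positions
y = torch.sigmoid(q) * context                    # broadcasts to [32, 10, 512]
```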