Provides helper functions for the Self-Attention algorithm, together with demonstration vignettes of increasing depth that show how to construct the Self-Attention algorithm.
The package can be installed from CRAN using:
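A minimal sketch, assuming the CRAN package name matches the GitHub repository name, attention:

```r
# install the released version from CRAN
install.packages("attention")
```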
The development version, to be used at your own peril, can be installed from GitHub using the remotes package:
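A sketch of the development install, using remotes::install_github() and the repository linked below:

```r
# install the remotes package if it is not already available
install.packages("remotes")

# install the development version from GitHub
remotes::install_github("bquast/attention")
```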
Development takes place on the GitHub page:
https://github.com/bquast/attention
Bugs can be filed on the issues page on GitHub.