Title: Implementation of Transformer Deep Neural Network with Vignettes
Version: 0.2.0
Description: Transformer is a Deep Neural Network Architecture based, among other things, on the Attention mechanism (Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>).
License: MIT + file LICENSE
Encoding: UTF-8
RoxygenNote: 7.2.3
Imports: attention (≥ 0.4.0)
Suggests: covr, testthat (≥ 3.0.0)
Config/testthat/edition: 3
NeedsCompilation: no
Packaged: 2023-11-10 12:04:15 UTC; bquast
Author: Bastiaan Quast [aut, cre]
Maintainer: Bastiaan Quast <bquast@gmail.com>
Repository: CRAN
Date/Publication: 2023-11-10 12:30:02 UTC

Feed Forward Layer

Description

Feed Forward Layer

Usage

feed_forward(x, dff, d_model)

Arguments

x

inputs

dff

dimension of the inner feed-forward layer

d_model

dimension of the model

Value

output of the feed-forward layer
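A short usage sketch. The input is assumed to be a matrix with `d_model` columns; the token count (4), `d_model` (8), and `dff` (32) are illustrative values, not defaults:

```r
library(transformer)

# toy input: 4 positions (rows), d_model = 8 columns -- sizes are illustrative
x <- matrix(rnorm(4 * 8), nrow = 4, ncol = 8)

# pass through the feed-forward layer with an assumed inner dimension dff = 32
out <- feed_forward(x, dff = 32, d_model = 8)

# the result is expected to have the same shape as the input
dim(out)
```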


Layer Normalization

Description

Layer Normalization

Usage

layer_norm(x, epsilon = 1e-06)

Arguments

x

inputs

epsilon

small constant added to the variance for numerical stability

Value

outputs of layer normalization
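A minimal sketch, assuming `x` is a numeric matrix normalized row-wise (the 4 x 8 size is illustrative); after normalization each row should have roughly zero mean:

```r
library(transformer)

# toy input: 4 rows, 8 columns (sizes are illustrative)
x <- matrix(rnorm(4 * 8), 4, 8)

# normalize; epsilon keeps the variance term numerically stable
normed <- layer_norm(x, epsilon = 1e-6)

# row means of the output should be close to zero
rowMeans(normed)
</imports>
```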


Multi-Headed Attention

Description

Multi-Headed Attention

Usage

multi_head(Q, K, V, d_model, num_heads, mask = NULL)

Arguments

Q

queries

K

keys

V

values

d_model

dimension of the model

num_heads

number of heads

mask

optional mask

Value

multi-headed attention outputs
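A self-attention sketch, assuming `Q`, `K`, and `V` are matrices with `d_model` columns and that `num_heads` divides `d_model` evenly; all sizes here are illustrative:

```r
library(transformer)

d_model   <- 8   # model dimension (illustrative)
num_heads <- 2   # assumed to divide d_model evenly

# self-attention: queries, keys, and values all come from the same 4 x 8 input
Q <- K <- V <- matrix(rnorm(4 * d_model), 4, d_model)

# no mask supplied, so every position may attend to every other position
out <- multi_head(Q, K, V, d_model, num_heads)
```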


Objects exported from other packages

Description

These objects are imported from other packages. Follow the links below to see their documentation.

attention: attention, SoftMax


Row Means

Description

Row Means

Usage

row_means(x)

Arguments

x

matrix

Value

vector with the mean of each row of the input matrix

Examples

row_means(t(matrix(1:5)))

Row Variances

Description

Row Variances

Usage

row_vars(x)

Arguments

x

matrix

Value

vector with the variance of each row of the input matrix

Examples

row_vars(t(matrix(1:5)))

Transformer

Description

Transformer

Usage

transformer(x, d_model, num_heads, dff, mask = NULL)

Arguments

x

inputs

d_model

dimension of the model

num_heads

number of heads

dff

dimension of the inner feed-forward layer

mask

optional mask

Value

output of the transformer layer

Examples

# toy input: 50 positions, each a vector of dimension 512
x <- matrix(rnorm(50 * 512), 50, 512)
d_model <- 512
num_heads <- 8
dff <- 2048

output <- transformer(x, d_model, num_heads, dff)