Wraps the AttentionLayer3D layer adapted from the Python implementations referenced in Details.

layer_attention_3d(
  object,
  numberOfChannels,
  doGoogleBrainVersion = TRUE,
  trainable = TRUE
)

Arguments

object

Object to compose the layer with. This is either a keras::keras_model_sequential to add the layer to, or another Layer which this layer will call.

numberOfChannels

integer. Number of channels in the input tensor, e.g., the number of filters produced by the preceding convolutional layer.

doGoogleBrainVersion

boolean. If TRUE, use the Google Brain variant described at the second URL in Details.

trainable

Whether the layer weights will be updated during training.

Value

A Keras layer tensor.

Details

https://stackoverflow.com/questions/50819931/self-attention-gan-in-keras

https://github.com/taki0112/Self-Attention-GAN-Tensorflow

Based on the following paper (Zhang et al., "Self-Attention Generative Adversarial Networks"):

https://arxiv.org/abs/1805.08318
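
As a rough sketch of the self-attention mechanism from that paper (not necessarily the exact ANTsRNet implementation), with learned 1x1x1 convolutional embeddings $f$, $g$, and $h$ applied to the flattened feature map $x$:

$$\beta_{j,i} = \mathrm{softmax}_i\big( f(x_i)^\top g(x_j) \big), \qquad o_j = \sum_i \beta_{j,i} \, h(x_i), \qquad y_j = \gamma \, o_j + x_j$$

where $\beta_{j,i}$ is the attention paid to location $i$ when synthesizing location $j$, and $\gamma$ is a learned scalar initialized to zero, so the layer starts as an identity mapping and gradually learns to attend.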

Examples

if (FALSE) {
  library( keras )
  library( ANTsRNet )

  inputShape <- c( 100, 100, 100, 3 )
  input <- layer_input( shape = inputShape )

  numberOfFilters <- 64
  outputs <- input %>% layer_conv_3d( filters = numberOfFilters, kernel_size = 2 )
  outputs <- outputs %>% layer_attention_3d( numberOfFilters )

  model <- keras_model( inputs = input, outputs = outputs )
}
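
A minimal sketch of the same model using the non-Google-Brain variant, assuming the objects defined in the example above; only the doGoogleBrainVersion argument changes, and the names outputsAlt and modelAlt are illustrative:

if (FALSE) {
  # doGoogleBrainVersion = FALSE selects the variant without the
  # Google Brain modifications (see the second URL in Details)
  outputsAlt <- input %>%
    layer_conv_3d( filters = numberOfFilters, kernel_size = 2 ) %>%
    layer_attention_3d( numberOfFilters, doGoogleBrainVersion = FALSE )
  modelAlt <- keras_model( inputs = input, outputs = outputsAlt )
}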