Creates a 2-D attention layer, wrapping the AttentionLayer2D taken from the Python implementations referenced below.
layer_attention_2d( object, numberOfChannels, doGoogleBrainVersion = TRUE, trainable = TRUE )
Argument | Description
---|---
object | Object to compose the layer with. This is either a keras::keras_model_sequential to add the layer to, or another layer which this layer will call (see the sequential sketch after this table).
numberOfChannels | integer. The number of channels in the input feature map.
doGoogleBrainVersion | boolean. If TRUE, use the variant described at the second URL below.
trainable | Whether the layer weights will be updated during training.
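Since object can be a keras::keras_model_sequential, the layer also composes directly in a pipe chain. A minimal sketch, assuming keras and ANTsRNet are attached (the filter count of 64 and the input shape are arbitrary illustration values):

library( keras )
library( ANTsRNet )

# Sequential composition: the attention layer is appended to the
# model itself rather than called on a tensor.
model <- keras_model_sequential() %>%
  layer_conv_2d( filters = 64, kernel_size = 2,
    input_shape = c( 100, 100, 3 ) ) %>%
  layer_attention_2d( numberOfChannels = 64 )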
Returns a Keras layer tensor.
Python implementations:
https://stackoverflow.com/questions/50819931/self-attention-gan-in-keras
https://github.com/taki0112/Self-Attention-GAN-Tensorflow

Based on the following paper: Zhang et al., "Self-Attention Generative Adversarial Networks," https://arxiv.org/abs/1805.08318
if (FALSE) {
library( keras )
library( ANTsRNet )

inputShape <- c( 100, 100, 3 )
input <- layer_input( shape = inputShape )

# Convolve the input, then apply self-attention over the resulting
# feature maps; the channel count must match the convolution filters.
numberOfFilters <- 64
outputs <- input %>% layer_conv_2d( filters = numberOfFilters, kernel_size = 2 )
outputs <- outputs %>% layer_attention_2d( numberOfFilters )

model <- keras_model( inputs = input, outputs = outputs )
}
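Toggling doGoogleBrainVersion selects between the two referenced implementations (it defaults to TRUE). A sketch reusing the tensors from the example above:

# Fall back to the non-Google-Brain formulation of the attention layer.
outputs <- outputs %>%
  layer_attention_2d( numberOfFilters, doGoogleBrainVersion = FALSE )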