How to make a tied autoencoder with Keras and TensorFlow 2


I want to create an autoencoder using Keras in which the weight matrix of the decoder is the transpose of the encoder's. I tried this code:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import backend as K

class Tied(keras.layers.Layer):
    def __init__(self, units=6040, tied_to=None):
        super().__init__()
        self.units = units
        self.tied_to = tied_to

    def build(self, input_shape):
        if self.tied_to is not None:
            self.w = K.transpose(self.tied_to.get_weights()[0])
            self.b = K.transpose(self.tied_to.get_weights()[1])

    def call(self, inputs):
        return tf.matmul(inputs, self.w) + self.b

#encoder
input_layer = Input(shape=(6040,))
encoder_layer = Dense(500, activation='sigmoid')
encoded = encoder_layer(input_layer)
#decoder
decoder_layer = Tied(num_users, tied_to=encoder_layer)
decoded = decoder_layer(encoded)  

but it gives me this error:

Exception encountered when calling layer "tied" (type Tied).

in user code:

    File "<ipython-input-13-f2a1a26e090e>", line 13, in call  *
         return tf.matmul(inputs, self.w) + self.b

    ValueError: Dimensions must be equal, but are 6040 and 500 for 
    '{{node tied/add}} = AddV2[T=DT_FLOAT](tied/MatMul, tied/add/y)' 
    with input shapes: [?,6040], [500].


Call arguments received by layer "tied" (type Tied):
         • inputs=tf.Tensor(shape=(None, 500), dtype=float32)

The error occurs in the line `decoded = decoder_layer(encoded)`. Additionally, during training of the model I do not want the decoder's weights to be updated.
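A minimal sketch of one way this can work (not from the original post; the class name `TiedDense` and the lazy use of `encoder.kernel` are my assumptions): the shape error above comes from reusing the encoder's bias, whose shape `[500]` cannot be added to the `[?, 6040]` output of the matmul. A tied decoder instead needs its own bias of size 6040 and should reuse only the encoder's kernel, transposed at call time so the tie holds as the encoder trains. Marking the bias non-trainable keeps the decoder free of trainable weights.

```python
# Sketch of a tied decoder layer; `TiedDense` is an illustrative name.
import numpy as np
import tensorflow as tf
from tensorflow import keras

class TiedDense(keras.layers.Layer):
    def __init__(self, tied_to, **kwargs):
        super().__init__(**kwargs)
        self.tied_to = tied_to  # the encoder Dense layer to share weights with

    def build(self, input_shape):
        # Decoder output size equals the encoder's input size, which is the
        # first dimension of the encoder kernel. Only a bias is created here;
        # the kernel itself is reused, so the decoder adds no trainable weights.
        out_dim = self.tied_to.kernel.shape[0]
        self.bias = self.add_weight(
            name="bias", shape=(out_dim,), initializer="zeros", trainable=False
        )

    def call(self, inputs):
        # Transpose the encoder kernel at call time, so the decoder always
        # uses the encoder's current weights rather than a stale copy.
        return tf.matmul(inputs, self.tied_to.kernel, transpose_b=True) + self.bias

# Encoder
inputs = keras.Input(shape=(6040,))
encoder_layer = keras.layers.Dense(500, activation="sigmoid")
encoded = encoder_layer(inputs)
# Decoder tied to the encoder
decoded = TiedDense(tied_to=encoder_layer)(encoded)
model = keras.Model(inputs, decoded)
```

Reading `self.tied_to.kernel` inside `call` (rather than copying `get_weights()` in `build`) also avoids the original problem of snapshotting the encoder's weights before or independently of training.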

