Unable to run tensorflow distributed - No gradients provided for any variable


I am trying to use MirroredStrategy to train a TensorFlow model in parallel, but I am getting an error that says: Error reported to Coordinator: No gradients provided for any variable

This is my model

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()

dataset = tf.data.Dataset.from_tensor_slices(data)
dataset = dataset.batch(batch_size=32)

with strategy.scope():
    model = build_model()
    model.compile(optimizer='adam', loss='mse')
    model.fit(dataset, epochs=10)

I am using TensorFlow 2.12. What mistake in this code is causing the "No gradients provided for any variable" error reported by the coordinator?
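For context, Keras computes the loss (and therefore gradients) only when the dataset yields (inputs, targets) pairs; a Dataset built from tensor slices of the inputs alone gives the mse loss nothing to compare against. Below is a minimal runnable sketch of that pattern, assuming a hypothetical build_model() (a single Dense layer) and random stand-in data in place of the original, unspecified data variable:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for build_model(): one Dense layer on 4 features.
def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(4,)),
        tf.keras.layers.Dense(1),
    ])

# Stand-in data: 64 samples of 4 features, with matching targets.
features = np.random.rand(64, 4).astype("float32")
targets = np.random.rand(64, 1).astype("float32")

# Slicing a (features, targets) tuple makes the dataset yield pairs,
# so the mse loss has targets to differentiate against.
dataset = tf.data.Dataset.from_tensor_slices((features, targets))
dataset = dataset.batch(batch_size=32)

strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = build_model()
    model.compile(optimizer="adam", loss="mse")

history = model.fit(dataset, epochs=1, verbose=0)
```

With only one visible device, MirroredStrategy runs with a single replica, so this sketch is testable on a plain CPU machine.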


There are 0 answers