Hi,

Thank you for the benchmark. I am seeing strange behavior when running the experiment on a synthetic dataset: a simple network built with tf.layers.dense does not produce accurate predictions, but when I replace tf.layers.dense with the explicit matmul below, I get values much closer to the ground truth.

Original logits calculation:
logits = tf.layers.dense(features, self.num_classes, tf.nn.relu)

Modified logits calculation:
logits = tf.nn.relu(tf.matmul(features, weights, name="MatMulLogits") + biases)

Input: [-0.9564917406583653, 0.6703038000763276, -0.8226291466995398, -1.0770311337470495, -0.785290071358798, 0.3809045777819794, -1.3688283052201289, -0.653962343061565, -0.7081657613676491, 1.3065677857572335]

LEAF output using tf.layers.dense: [[0.24843943, 0.25128123, 0.24843943, 0.24843943, 0.24843943]]
LEAF output using tf.nn.relu(tf.matmul(features, weights, name="MatMulLogits") + biases): [[0.2490078 0.2490078 0.2490078 0.2490078 0.2490078]]
Ground truth, i.e. the actual multiplication result (input * weights + bias): [[0.249007804 0.249007804 0.249007804 0.249007804 0.249007804]]

As you can see, tf.matmul produces values matching the ground truth, while tf.layers.dense deviates from them in the third decimal place. Could you please advise what the issue is and how we can get the logits closer to the ground truth?

Thank you.
Best Regards,
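For reference, the two formulations compute the same function when they share the same weights: a dense layer with a ReLU activation is exactly relu(x @ W + b). Note, however, that tf.layers.dense creates and initializes its own kernel variable internally, so a manual matmul built from separately created weights will generally run with different parameters. A minimal NumPy sketch (all shapes and values here are hypothetical, not LEAF's actual data) illustrating the equivalence under shared weights:

```python
import numpy as np

# Hypothetical sketch: with the SAME weights, the dense-layer formulation
# and the explicit matmul formulation are identical to float precision.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 10))        # one input row of 10 features
W = rng.normal(size=(10, 5)) * 0.1  # shared weight matrix (10 -> 5 classes)
b = np.zeros(5)                     # shared bias

def relu(z):
    return np.maximum(z, 0.0)

dense_out = relu(x @ W + b)             # what a dense layer computes
matmul_out = relu(np.matmul(x, W) + b)  # the explicit matmul formulation

# Identical weights -> identical outputs:
assert np.allclose(dense_out, matmul_out)
```

So a small discrepancy between the two code paths is consistent with the two paths simply holding different weight values rather than a numerical problem in either op.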