Boolean inputs get mapped to F32 when converting to TypedModel
#1161
Replies: 3 comments 3 replies
-
All bools, no?
-
Hmmm, that's correct. I've traced the issue back to a patch I added a few months ago so that this network could be analyzed: I had to override the InferenceFact for tract to be able to analyse the graph in full (which is why everything gets overwritten with F32 types and explicitly specified tensor shapes). If I don't override this, the analysis fails with:

```
┏ 0 Source input
┃   ━━━ batch_size,3,5,5,F32
┣┻┻ 3 Resize /upsample/Resize
    ━━━ batch_size,3,10,10,F32
[2023-08-24T12:35:25.886682000Z ERROR tract] Error at stage analyse
Caused by:
    0: Failed analyse for node #3 "/upsample/Resize" Resize
    1: Failed analyse for node #3 "/upsample/Resize" Resize
    2: Infering facts
    3: Applying rule Given2Rule { (inputs[0].shape, inputs[2]) }
    4: Undetermined symbol in expression: batch_size
```
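For reference, a minimal sketch of the "explicitly specified shapes" part of that workaround, done at the ONNX level by pinning the symbolic batch dimension before tract sees the file (the file names and the fixed value of 1 are assumptions, not what the actual patch does):

```python
import onnx

# Pin the symbolic batch_size dimension of the first graph input to a
# concrete value so analysis no longer meets an undetermined symbol.
# This is only an ONNX-level stand-in for overriding the InferenceFact.
model = onnx.load("circuit.onnx")
dim0 = model.graph.input[0].type.tensor_type.shape.dim[0]
dim0.ClearField("dim_param")
dim0.dim_value = 1
onnx.save(model, "circuit_fixed_batch.onnx")
```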
I'm pretty sure this graph is correctly specified, so this might be an issue? Not sure. It seems that because batch_size isn't determined at analysis time, tract is unable to apply the expansion along dim=0, which is just 1 in this case. The PyTorch module that created this is nothing special either:

```python
import torch.nn as nn

class Circuit(nn.Module):
    def __init__(self, inplace=False):
        super(Circuit, self).__init__()
        self.upsample = nn.Upsample(scale_factor=2, mode='nearest')

    def forward(self, x):
        y = self.upsample(x)
        return y
```
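An export along these lines reproduces the symbolic dimension; it is the dynamic_axes entry that turns the batch axis into the batch_size symbol the analysis complains about (input/output names and the file path are illustrative):

```python
import torch
import torch.onnx

model = Circuit()
dummy = torch.randn(1, 3, 5, 5)

# Declaring axis 0 as dynamic is what puts the symbolic "batch_size"
# dimension into the ONNX graph that tract then analyses.
torch.onnx.export(
    model,
    dummy,
    "circuit.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch_size"}, "output": {0: "batch_size"}},
    opset_version=13,
)
```

Dropping dynamic_axes (i.e. exporting with a fixed batch of 1) removes the symbol altogether, which is essentially what the fact override achieves on the tract side.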
-
I would not be surprised. The Resize implementation is pretty weak and incomplete. I will have a look.
-
Boolean inputs in ONNX get mapped to F32 TypedSource nodes in tract. I was wondering if this was intentional behaviour.
To replicate this, you can run the tract CLI on the following graph, composed solely of boolean inputs and AND | OR | XOR | EQUALS operations:
boolean.onnx.zip
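For anyone who would rather regenerate the graph than download the attachment, a minimal sketch of an equivalent all-boolean graph built with the onnx Python helpers (the node wiring and names are just an example, not necessarily the attached file):

```python
import onnx
from onnx import helper, TensorProto

# Two boolean inputs wired through And/Or/Xor/Equal; every tensor is bool.
a = helper.make_tensor_value_info("a", TensorProto.BOOL, [4])
b = helper.make_tensor_value_info("b", TensorProto.BOOL, [4])
out = helper.make_tensor_value_info("out", TensorProto.BOOL, [4])

nodes = [
    helper.make_node("And", ["a", "b"], ["and_ab"]),
    helper.make_node("Or", ["a", "b"], ["or_ab"]),
    helper.make_node("Xor", ["and_ab", "or_ab"], ["xor_ab"]),
    helper.make_node("Equal", ["xor_ab", "b"], ["out"]),
]

graph = helper.make_graph(nodes, "boolean_ops", [a, b], [out])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
onnx.checker.check_model(model)
onnx.save(model, "boolean.onnx")
```

Dumping the result with the tract CLI (e.g. `tract boolean.onnx dump`) should then show whether the boolean inputs come out as F32 TypedSource nodes.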