
Multiple ONNX optimizer tests fail on big-endian #8

Open
JonTriebenbach opened this issue Sep 1, 2020 · 4 comments

Comments

@JonTriebenbach

These tests were run on s390x, which is a big-endian architecture.

Example failure log from pytest:

_________________________________________________ TestOptimizer.test_fuse_pad_into_conv_1d _________________________________________________

self = <optimizer_test.TestOptimizer testMethod=test_fuse_pad_into_conv_1d>

    def test_fuse_pad_into_conv_1d(self):  # type: () -> None
        pad = helper.make_node(
            "Pad",
            ["X", "Pads"],
            ["P"],
            mode="constant"
        )
        conv = helper.make_node("Conv", ["P", "Y"], ["Z"])
        graph = helper.make_graph(
            [pad, conv],
            "test",
            [helper.make_tensor_value_info("X", TensorProto.FLOAT, (1, 5, 30)),
             helper.make_tensor_value_info("Pads", TensorProto.INT64, (6,)),
             helper.make_tensor_value_info("Y", TensorProto.FLOAT, (16, 5, 32))],
            [helper.make_tensor_value_info("Z", TensorProto.FLOAT, (1, 16, 1))],
            [helper.make_tensor("Pads", TensorProto.INT64,
             dims=(6,),
             vals=np.array([0, 0, 1, 0, 0, 1]).astype(np.int64).tobytes(),
             raw=True)])
        optimized_model = self._optimized(graph, ["fuse_pad_into_conv"])
    
        assert len(list(optimized_model.graph.node)) == 1
        assert optimized_model.graph.node[0].op_type == "Conv"
        assert optimized_model.graph.node[0].attribute[0].name == "pads"
>       assert list(optimized_model.graph.node[0].attribute[0].ints) == [1, 1]
E       AssertionError: assert [720575940379...7594037927936] == [1, 1]
E         At index 0 diff: 72057594037927936 != 1
E         Use -v to get the full diff

optimizer_test.py:1038: AssertionError

Notice the large integers: 72057594037927936 is 0x0100000000000000 in hex, i.e. the value 1 with its bytes swapped, which points to an endianness issue.
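
For illustration (not part of the original log), the number can be reproduced in a Python shell by decoding the big-endian byte layout of an int64 value 1 as little-endian:

    >>> raw = (1).to_bytes(8, "big")     # how an int64 value 1 is laid out on s390x
    >>> raw
    b'\x00\x00\x00\x00\x00\x00\x00\x01'
    >>> int.from_bytes(raw, "little")    # ONNX parses raw_data as little-endian
    72057594037927936
    >>> hex(72057594037927936)
    '0x100000000000000'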

Attached is a patch file that fixes the issue in the test-case helper.py code. helper.py needs to account for generating the raw data on a big-endian architecture, while the ONNX internals always expect little-endian values.
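
A minimal sketch of the kind of change involved (an assumption for illustration, not the attached patch): force the values to a little-endian NumPy dtype before serializing, so tobytes() produces the byte order ONNX expects on any host.

    import numpy as np

    # Sketch only (not the attached patch): ONNX raw tensor data is always
    # little-endian, so force the byte order instead of relying on the host's.
    vals = np.array([0, 0, 1, 0, 0, 1], dtype=np.int64)
    raw = vals.astype("<i8").tobytes()   # "<i8" = little-endian int64 on any host
    # On a big-endian host this produces the same bytes as vals.byteswap().tobytes().

The resulting bytes can then be passed to helper.make_tensor(..., vals=raw, raw=True) as in the test above.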

A pull request can be submitted if needed.

Attaching corrected patch below.

@JonTriebenbach
Author

There is an issue with this helper.py change. Additional investigation is underway.

@JonTriebenbach
Author

Corrected patch attached.

onnx-optimizer-test.patch.zip

@askhade

askhade commented Sep 11, 2020

@JonTriebenbach: Can you send a PR instead?
The ONNX optimizers are being moved to a standalone repo under the onnx tree: https://github.com/onnx/optimizer
If the change is specific to the optimizers, please open a PR there; otherwise, open a PR in this repo.

@askhade

askhade commented Sep 22, 2020

Transferring this issue to the onnx/optimizer repo.
