ONNX Integration #32

Open
jafioti opened this issue Mar 3, 2024 · 2 comments
Labels
good first issue Good for newcomers

Comments

@jafioti
Owner

jafioti commented Mar 3, 2024

It should be possible to convert ONNX graphs into luminal primgraphs. The only tricky part is getting GraphTensors out once the graph is converted, for feeding in data and reading out results.

This can probably just be implemented in a separate crate, luminal_onnx, with a single function like onnx_graph(file_name) -> Graph. Once the primgraph has been created from the ONNX file, normal luminal compilers can be run on it.
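A minimal sketch of what the proposed crate surface could look like. Note that `Graph`, its fields, and the loader body here are placeholders invented for illustration; only the `onnx_graph(file_name) -> Graph` signature comes from the suggestion above, and a real implementation would deserialize the ONNX protobuf and map each op onto luminal primitive ops.

```rust
use std::path::Path;

// Placeholder for luminal's real graph type.
#[derive(Debug, Default)]
struct Graph {
    nodes: Vec<String>, // stand-in for primgraph ops
}

/// Hypothetical entry point: parse an ONNX file into a luminal primgraph.
fn onnx_graph(file_name: &Path) -> Graph {
    // A real implementation would deserialize the ONNX protobuf here
    // and translate each ONNX op into luminal primitive ops.
    let _ = file_name;
    Graph::default()
}

fn main() {
    let graph = onnx_graph(Path::new("model.onnx"));
    println!("{} nodes", graph.nodes.len());
}
```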

Currently I'm bandwidth-constrained, so if someone wants to pick this up: it's fairly isolated from the rest of the codebase, so it should be a decent first issue.

@jafioti jafioti added the good first issue Good for newcomers label Mar 3, 2024
@skewballfox

It may be helpful to check out burn-import. I'm pretty familiar with it after doing a major refactor a month or so ago. Essentially the process is this:

  1. The model is deserialized into generated protobuf code.
  2. All inputs, outputs, and initializers are loaded into hashmaps, where the key is the original name in the model and the value is a struct more useful for burn's purposes. All IO is wrapped in a struct to make sure remapping happens seamlessly.
  3. The deserialized node data is then processed per node. Transformations of the outputs and Nodes happen here.
  4. This is then turned into a burn graph, which is where my understanding gets a bit murkier.
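Step 2 above can be sketched roughly as follows. This is an illustration only, assuming invented types: `Value`, its fields, and `register_values` are not burn-import's actual API, just a stand-in for the original-name-to-internal-struct mapping described above.

```rust
use std::collections::HashMap;

// Stand-in for the "struct more useful for burn's purposes".
#[derive(Debug, Clone)]
struct Value {
    original_name: String, // name as it appears in the ONNX model
    internal_name: String, // stable name used after remapping
}

// Load IO values into a hashmap keyed by their original model names,
// assigning each a remapped internal name.
fn register_values(names: &[&str]) -> HashMap<String, Value> {
    names
        .iter()
        .enumerate()
        .map(|(i, n)| {
            (
                n.to_string(),
                Value {
                    original_name: n.to_string(),
                    internal_name: format!("value_{i}"),
                },
            )
        })
        .collect()
}

fn main() {
    let io = register_values(&["input.1", "conv_out", "output"]);
    // Later passes look values up by their original ONNX name.
    println!("{}", io["conv_out"].internal_name);
}
```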

Everything (at least as of the last time I worked on it) happens at compile time. ONNX supports some runtime behavior that can't really be handled at the moment, like runtime inference of an output tensor's dimensions, though I can't say whether that would also be the case for luminal.

It would be awesome if there were some generalized solution for ONNX support that is independent of any specific library.

That being said, I can answer questions and help out in a limited capacity, but I'm also a bit bandwidth constrained at the moment, and likely will be for some time. Maybe in a month or two if no one else picks it up.

@jafioti
Owner Author

jafioti commented Mar 24, 2024

Will do, thanks for the writeup! I'm thinking this will be pretty tightly integrated with converting ONNX graphs to luminal primgraphs, but once the initial version is working I can definitely see spinning out the luminal-independent bits into a new crate.

We can also handle runtime inferred dimensions through the dynamic dimensions supported in luminal graphs.
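One way to picture handling runtime-inferred dimensions via symbolic dims, sketched with invented types (`Dim`, `resolve`, and the bindings map are hypothetical; luminal's actual symbolic-shape machinery differs): a shape may carry symbols that are only bound to concrete sizes when data arrives.

```rust
use std::collections::HashMap;

// Hypothetical symbolic dimension: known at graph-build time,
// or a symbol resolved at runtime (e.g. a dynamic batch size).
#[derive(Debug, Clone, PartialEq)]
enum Dim {
    Known(usize),
    Symbol(char),
}

// Resolve a symbolic shape against runtime bindings.
// Returns None if any symbol is unbound.
fn resolve(shape: &[Dim], bindings: &HashMap<char, usize>) -> Option<Vec<usize>> {
    shape
        .iter()
        .map(|d| match d {
            Dim::Known(n) => Some(*n),
            Dim::Symbol(s) => bindings.get(s).copied(),
        })
        .collect()
}

fn main() {
    let shape = [Dim::Symbol('b'), Dim::Known(512)];
    let bindings = HashMap::from([('b', 4)]);
    println!("{:?}", resolve(&shape, &bindings));
}
```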
