
Early-Exit dev #15

Open
wants to merge 9 commits into master

Conversation

AlexMontgomerie
Owner

No description provided.

return 1.0

def streams_in(self):
def streams_in(self, port_index):
Collaborator

Are these streams distinct from the separate ports?

Owner Author

Yeah, so streams are per port: port 0 could have a different number of parallel streams compared to port 1.
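
A rough sketch of that idea (class and attribute names here are assumptions for illustration, not the actual fpgaConvNet code): each port carries its own coarse factor, and the stream count is just looked up per port.

```python
# Hypothetical sketch: per-port stream counts, one coarse factor per port.
class MultiPortLayer:
    def __init__(self, coarse_in, coarse_out, ports_in=1, ports_out=1):
        # e.g. coarse_in = [4, 2] -> port 0 has 4 parallel streams, port 1 has 2
        self.coarse_in  = coarse_in
        self.coarse_out = coarse_out
        self.ports_in   = ports_in
        self.ports_out  = ports_out

    def streams_in(self, port_index):
        # streams are per port, so different ports can differ
        return self.coarse_in[port_index]

    def streams_out(self, port_index):
        return self.coarse_out[port_index]
```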


def workload_in(self, index):
def workload_in(self, port_index):
Collaborator

In the case that you have multiple inputs (ports) into a node, would you want the workload in to be the sum of all of those? Or treat the workload of each port independently?

Owner Author

Yeah, it would be the work done by each port independently. So say for a Concat layer, one input port might have to do more work (stream in more data) than the other.
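
To illustrate (a hedged sketch with assumed names and shapes, not the actual ConcatLayer implementation), each input port's workload is just the number of words that port alone has to stream in:

```python
# Hypothetical Concat-style layer: workload is computed per input port.
class ConcatLayerSketch:
    def __init__(self, rows, cols, channels_in):
        # channels_in has one entry per input port, e.g. [64, 32]
        self.rows        = rows
        self.cols        = cols
        self.channels_in = channels_in

    def workload_in(self, port_index):
        # words streamed in on this port alone; the port carrying more
        # channels does more work than the other
        return self.rows * self.cols * self.channels_in[port_index]
```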

coarse_in: list[int],
coarse_out: list[int],
ports_in=1,
ports_out=1,
data_width
Collaborator

Not sure if you get the same error but is the data_width non-default arg allowed to follow the default port args?

Owner Author

Oh yeah, good point; tbh I haven't tested this. I think this weekend I'm going to get started on the testing. But for now, you can default it to 16. There will be someone working on quantisation soon as well, so all the bitwidth stuff will be changing soon.

Collaborator

I tried changing it to 16 and got a weird error 😅 I'll try and figure it out

Owner Author

Also remove all of the : list[int] annotations as well; I think that's what's breaking it.

In the future I think it would be good to give type hints, but let's leave it for now.
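
For context, a likely culprit (a hedged guess, not confirmed in this thread): on Python 3.8 and earlier, a built-in generic like list[int] in an annotation raises TypeError: 'type' object is not subscriptable at function-definition time. Dropping the hints, or using typing.List instead, avoids it:

```python
from typing import List

# Plain version with no annotations, data_width defaulted to 16 as discussed:
def __init__(self, coarse_in, coarse_out, ports_in=1, ports_out=1, data_width=16):
    pass

# Or, if type hints are wanted later, typing.List also works on pre-3.9 interpreters:
def __init__(self, coarse_in: List[int], coarse_out: List[int],
             ports_in: int = 1, ports_out: int = 1, data_width: int = 16):
    pass
```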

@biggsbenjamin
Collaborator

biggsbenjamin commented Feb 3, 2021

In Layer.py, layer_info() won't work anymore because of the old function definitions it's using, but it doesn't seem to be used in the other layers, so it probably doesn't matter.

Putting indices everywhere might not be the best plan; I can change this to adding single-port versions of the functions in those layers if that would be better.

@AlexMontgomerie
Owner Author

Nice work!

I think we should keep the functions with port_index the same even for single input/output layers, because the tools.matrix module should be able to apply the same functions to each layer regardless of whether it's single input/output or not. I think in general this will help in removing special conditions for each layer type.
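
A loose illustration of that point (the real tools.matrix code isn't shown in this thread, so the helper below is an assumption): with a uniform port_index interface, generic tooling can loop over every layer's ports without special-casing single-port layers.

```python
# Hypothetical helper: the same port-indexed calls work for every layer,
# whether it has one input port or several.
def build_streams_table(layers):
    table = []
    for layer in layers:
        for port_index in range(layer.ports_in):
            table.append((layer, port_index, layer.streams_in(port_index)))
    return table
```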

@@ -26,7 +28,7 @@ def __init__(
sa =0.5,
sa_out =0.5
):
Layer.__init__(self,dim,coarse_in,coarse_out,data_width)
Layer.__init__(self, [rows], [cols], [channels], [coarse_in], [coarse_out], data_width)
Collaborator

I didn't add the port information! Whoops, will change now.

biggsbenjamin changed the title from "started adding multi-ports to layers" to "Early-Exit dev" on May 20, 2021
biggsbenjamin force-pushed the dev-multi-exit branch 2 times, most recently from bf5fda6 to 3697d47, on July 6, 2021 at 17:10
AlexMontgomerie and others added 7 commits August 26, 2021 16:06
* started split layer

* working with main branch now (for current tests)

* added test for optimiser

* working on improving resource modelling

* updated resource models

* changes to split layer

* fixed merge conflict

* updated visualiser and fn model for splitlayer

Co-authored-by: AlexMontgomerie <[email protected]>
Co-authored-by: AlexMontgomerie <[email protected]>
* ee parser work bringup

* started updating parser, temp save

* expanded subgraphs, updated explicit edges of subnodes

* added early exit dataflow edges to output of If operation/layer

* adding splitlayers to branching connections, removed extra nodes

* adding buffer layer, reworking ctrl edges

* updated parsing of layers

* added Buffer and BufferLayer for hw optimiser

* ignoring egg dir, adding custom setup for recompilation ease

* updated Buffer layer/mod, added Exit layers, updated init

* updated add_hardware with new layers, linking control signals, fixing graph and ctrl edges

* updating add_dimensions function - savepoint

* fixing additional conflicts after rebase

* init hw for split layer, fixed comment typos

* working parser for branchnet onnx graph (somewhat verified)