
update #22

Open · dreadlord1984 wants to merge 81 commits into gh-pages
Conversation

dreadlord1984

No description provided.

… list; Add dimensions to graph output; Add detailed table of network layers; Add summarized table
fix summarization bug (detailed table modified)
recognizes data>transform_param>crop_size as input-size param
moved calculated parameters/dimensions to node.analysis instead of node.dim
added LRN, INNERPRODUCT, BATCHNORM, FLATTEN layers
Convolution parameters separate for W, H (stride, size, ...)
Comp.Complexity / Memory Req. still missing!
highlight color for LRN nodes
update javascript (fully-connected output dimensions; see the sketch after this commit list)
add computational complexity + memory requirements
add TOTAL row in summary table
pretty-print large numbers
add dynamic page title
join batchnorm/relu to inline in inceptionv3
Add FCN-16s prototxt (fixed up for net. Need name=top in names!)
fix FCN-16s net (false implicit layers removed)
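The fully-connected commit above changes how output dimensions are derived. As a point of reference, here is a minimal sketch of that arithmetic in plain JavaScript; the function and field names are hypothetical, not the actual netscope code. A Caffe InnerProduct layer flattens everything after the batch dimension and emits num_output values, so both the parameter count and the MACC count scale with the flattened input size.

```javascript
// Hypothetical sketch, not the actual netscope source: dimension and
// parameter arithmetic for a Caffe InnerProduct (fully-connected) layer.
function innerProductAnalysis(input, numOutput, hasBias = true) {
  // input: { n, c, h, w } blob dimensions
  const flattened = input.c * input.h * input.w;        // everything after the batch dim
  const out = { n: input.n, c: numOutput, h: 1, w: 1 }; // spatial size collapses to 1x1
  const params = numOutput * flattened + (hasBias ? numOutput : 0);
  const macc = input.n * numOutput * flattened;         // one MACC per weight per sample
  return { out, params, macc };
}

// Example: a 1x256x6x6 input feeding a 4096-way fully-connected layer
console.log(innerProductAnalysis({ n: 1, c: 256, h: 6, w: 6 }, 4096));
// -> out 1x4096x1x1, params 37752832, macc 37748736
```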
dgschwend and others added 23 commits July 28, 2016 15:18
add zynqnet to presets
Add support for batch > 1
Add support for permute, rpn.proposal_layer, roipooling + priorbox layers
assumed to be 1 MACC per input pixel...
* take group into consideration for convolution

When there is a group parameter, the MACC count and the number of parameters were computed incorrectly (see the sketch after this commit).

* support dilation as well
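The two commits above correct the convolution analysis for grouped and dilated convolutions. Below is a minimal sketch of that arithmetic with hypothetical names, not the real parser code: dilation stretches the effective kernel extent used for the output size, and group divides the input channels each filter touches, which shrinks both the parameter count and the MACC count by the same factor.

```javascript
// Hypothetical sketch of the corrected convolution analysis; names and
// structure are illustrative, not the real netscope parser code.
function convolutionAnalysis(input, p) {
  // input: { n, c, h, w }; p: { numOutput, kernel, stride, pad, group, dilation, hasBias }
  const group = p.group || 1;
  const dilation = p.dilation || 1;
  // Dilation spreads the kernel taps apart, enlarging the effective extent.
  const kEff = dilation * (p.kernel - 1) + 1;
  const outH = Math.floor((input.h + 2 * p.pad - kEff) / p.stride) + 1;
  const outW = Math.floor((input.w + 2 * p.pad - kEff) / p.stride) + 1;
  // With grouping, each output channel only sees input.c / group input channels.
  const channelsPerGroup = input.c / group;
  const params = p.numOutput * channelsPerGroup * p.kernel * p.kernel
               + (p.hasBias ? p.numOutput : 0);
  const macc = input.n * p.numOutput * outH * outW
             * channelsPerGroup * p.kernel * p.kernel;
  return { out: { n: input.n, c: p.numOutput, h: outH, w: outW }, params, macc };
}

// AlexNet-conv2-style example: 96 -> 256 channels, 5x5 kernel, pad 2, group 2
console.log(convolutionAnalysis(
  { n: 1, c: 96, h: 27, w: 27 },
  { numOutput: 256, kernel: 5, stride: 1, pad: 2, group: 2, dilation: 1, hasBias: true }
));
// -> out 1x256x27x27, params 307456, macc 223948800
```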
@dreadlord1984 (Author)

update

@rlnx (Contributor) commented Jan 14, 2018

Hi @dgschwend, could you please provide any description for your changes?

@dgschwend

Hi @RuslanIsrafilov! My fork contains code for CNN network analysis (computational complexity and memory requirements for each layer). However, I did not open this pull request; I think there is no need to pull any changes back to the original netscope repo. @dreadlord1984, what's your intention with this PR?
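To make the comment above concrete, here is a rough sketch of what a per-layer memory estimate can look like, split into activations and learned parameters and summed into a network total like the TOTAL row of the summary table. The structure and names are invented for illustration (this is not dgschwend's actual code), and float32 storage is assumed.

```javascript
// Hypothetical sketch, not dgschwend's actual code: per-layer memory estimate
// split into activations and learned parameters, summed into a network total.
// float32 storage (4 bytes per element) is assumed.
function memoryRequirements(layers, bytesPerElement = 4) {
  let activationElems = 0;
  let paramElems = 0;
  for (const layer of layers) {
    // layer.out: { n, c, h, w } output blob; layer.params: learned weight count
    activationElems += layer.out.n * layer.out.c * layer.out.h * layer.out.w;
    paramElems += layer.params;
  }
  return {
    activationBytes: activationElems * bytesPerElement,
    paramBytes: paramElems * bytesPerElement,
  };
}

// Two made-up layers: a 7x7x3x64 convolution and a parameter-free pooling layer
console.log(memoryRequirements([
  { out: { n: 1, c: 64, h: 112, w: 112 }, params: 9408 },
  { out: { n: 1, c: 64, h: 56, w: 56 }, params: 0 },
]));
```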

sovrasov and others added 4 commits January 19, 2018 15:12
* Add prelu and elu support

* Recompile coffee files
* Add Input data layer

* Recompile coffee scripts
* add support for relu6 (see the note after this commit list)

* fix relu6
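For reference, the standard definitions of the activations these commits add support for; the actual change is presumably confined to recognizing the layer types during parsing, and the math is shown only to make clear that these are cheap element-wise operations.

```javascript
// Standard element-wise definitions, shown only for reference.
const relu6 = (x) => Math.min(Math.max(x, 0), 6);
const prelu = (x, a) => (x > 0 ? x : a * x);                 // a: learned negative slope
const elu   = (x, a) => (x > 0 ? x : a * (Math.exp(x) - 1)); // a: alpha parameter
```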
6 participants