This repository has been archived by the owner on Jan 9, 2018. It is now read-only.

Is padding implemented in the layers? #1

Open
manjunaths opened this issue May 27, 2016 · 3 comments

Comments

@manjunaths

Hello,
Is padding implemented in ConvolutionLayer and the other layer calls?

Is there an example that demonstrates this?

Thanks.

@bchandle
Member

I just added an example here:

https://github.com/hpe-cct/cct-nn/blob/master/src/test/scala/toolkit/neuralnetwork/examples/AlexNet.scala

The border argument on ConvolutionLayer specifies how padding should be handled. The convolution sizes in this example are exactly consistent with the Caffe reference implementation:

https://github.com/BVLC/caffe/blob/master/models/bvlc_alexnet/deploy.prototxt

A BorderValid convolution produces an output whose size in each dimension equals the input size minus (kernel size - 1). A BorderZero convolution zero-pads the input so that the output has the same dimensions as the input.
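The size arithmetic for the two border modes can be sketched as a small helper. This is plain Scala for illustration, not part of the cct-nn API, and it assumes stride 1 as in the description above:

```scala
// Per-dimension output size of a convolution, stride 1 (illustrative sketch).
object ConvSize {
  // BorderValid: the kernel must fit entirely inside the input,
  // so the output shrinks by (kernel - 1) in each dimension.
  def valid(input: Int, kernel: Int): Int = input - (kernel - 1)

  // BorderZero: zero-padding keeps the output the same size as the input.
  def zero(input: Int, kernel: Int): Int = input

  def main(args: Array[String]): Unit = {
    // e.g. a 227-wide input convolved with an 11-wide kernel:
    println(valid(227, 11)) // shrinks to 217
    println(zero(227, 11))  // stays 227
  }
}
```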

If you want a different padding configuration, the ZeroPad function will let you do that. This combination of manual padding and BorderValid convolution is equivalent to BorderZero convolution:

val c1 = ConvolutionLayer(ZeroPad(data, 5), Shape(11, 11), 96, BorderValid, lr, stride = 4, impl = Space)

@manjunaths
Author

Thank you for this. I will try running it and check.

@manjunaths
Author

What is the best way to benchmark this AlexNet network?
