Layer | Type | Parameter/Method | Value/Approach
---|---|---|---
1 | Image input layer | Data augmentation | Random crop
 | | Data normalization | Zero center
2 | Convolution layer | Stride | [1]
 | | Padding | 0
 | | Learn-rate factor (weights) | 1
 | | Learn-rate factor (bias) | 1
 | | L2 regularization factor (weights) | 1
 | | L2 regularization factor (bias) | 1
3 | Activation layer | Method | ReLU
4 | Normalization layer | Alpha | 1 × 10⁻³
 | | Beta | 0.75
 | | K | 2
5 | Pooling layer | Method | Max pooling
 | | Pool size | 2 × 2
 | | Stride | [2]
 | | Padding | 0
6 | Fully-connected layer | Learn-rate factor (weights) | 1
 | | Learn-rate factor (bias) | 1
 | | L2 regularization factor (weights) | 1
 | | L2 regularization factor (bias) | 1
7 | Dropout layer | Probability | 0.5
8 | Classification layer | Loss | Softmax with cross-entropy
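The layer stack above can be sketched in PyTorch as follows. Note that the table does not specify the input size, number of filters, kernel size, or the window size of the cross-channel normalization; the values used below (32 × 32 RGB input, 16 filters of size 3 × 3, LRN window of 5, 10 output classes) are placeholder assumptions for illustration only. The learn-rate and L2 factors of 1 in the table are per-layer multipliers of the global settings, so they need no special handling here.

```python
import torch
import torch.nn as nn

class SketchCNN(nn.Module):
    """Illustrative sketch of the tabled architecture; sizes are assumed."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            # Layer 2: convolution, stride [1], padding 0
            # (16 filters of size 3x3 are an assumption, not from the table)
            nn.Conv2d(3, 16, kernel_size=3, stride=1, padding=0),
            # Layer 3: ReLU activation
            nn.ReLU(),
            # Layer 4: cross-channel (local response) normalization with
            # alpha = 1e-3, beta = 0.75, k = 2 from the table; the window
            # size of 5 is an assumption
            nn.LocalResponseNorm(size=5, alpha=1e-3, beta=0.75, k=2.0),
            # Layer 5: 2x2 max pooling, stride [2], padding 0
            nn.MaxPool2d(kernel_size=2, stride=2, padding=0),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            # Layer 6: fully-connected layer; in-features follow from the
            # assumed 32x32 input: conv 32 -> 30, pool 30 -> 15
            nn.Linear(16 * 15 * 15, num_classes),
            # Layer 7: dropout with probability 0.5
            nn.Dropout(p=0.5),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Layer 8: softmax + cross-entropy. nn.CrossEntropyLoss applies
# log-softmax internally, so the model emits raw logits.
model = SketchCNN()
loss_fn = nn.CrossEntropyLoss()
x = torch.randn(4, 3, 32, 32)  # batch of 4 zero-centered images
logits = model(x)
loss = loss_fn(logits, torch.tensor([0, 1, 2, 3]))
```

The random-crop augmentation and zero-center normalization of layer 1 would live in the data pipeline (e.g., `torchvision.transforms`) rather than in the model itself.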