I found this in your code:
def add_transition(name, l):
    shape = l.get_shape().as_list()
    in_channel = shape[3]
    with tf.variable_scope(name) as scope:
        l = BatchNorm('bn1', l)
        l = tf.nn.relu(l)
        l = Conv2D('conv1', l, in_channel, 1, stride=1, use_bias=False, nl=tf.nn.relu)
        l = AvgPooling('pool', l, 2)
    return l
After BN and ReLU, there is a 1×1 conv layer. However, you also pass nl=tf.nn.relu, so do you mean that another ReLU is still needed after the conv layer?
In the Caffe version of DenseNet, the transition layer is configured differently from what you have here (see the sketch below for the variant I am comparing against).
Can you explain it to me?
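To make the comparison concrete, here is a rough sketch of what I understand the Caffe-style transition layer to be, written with the same tensorpack layers your example uses. This is my own illustration, not code from your repo; the function name add_transition_no_relu is made up, and the only change from your add_transition is nl=tf.identity, so the 1×1 conv has no activation after it:

import tensorflow as tf
from tensorpack import BatchNorm, Conv2D, AvgPooling

def add_transition_no_relu(name, l):
    # Same structure as add_transition, but the 1x1 conv stays linear:
    # BN -> ReLU -> 1x1 conv (no activation) -> 2x2 average pooling.
    in_channel = l.get_shape().as_list()[3]
    with tf.variable_scope(name):
        l = BatchNorm('bn1', l)
        l = tf.nn.relu(l)
        l = Conv2D('conv1', l, in_channel, 1, stride=1, use_bias=False,
                   nl=tf.identity)
        l = AvgPooling('pool', l, 2)
    return l

So the question is whether the extra ReLU on the conv output (nl=tf.nn.relu) is intentional, or whether the linear version above is what was meant.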
Thanks.