hello! how to use the GhostModule in my net to replace the normal conv2d layer? #15
@iamhankai Hi, thanks for doing such great work. My current conv helper is:

```python
def conv(in_channels, out_channels, kernel_size=3, padding=1, bn=True, dilation=1, stride=1, relu=True, bias=True):
    modules = [nn.Conv2d(in_channels, out_channels, kernel_size, stride, padding, dilation, bias=bias)]
    if bn:
        modules.append(nn.BatchNorm2d(out_channels))
    if relu:
        modules.append(nn.ReLU(inplace=True))
    return nn.Sequential(*modules)
```

Do I just need to change it into this?

```python
def conv(in_channels, out_channels, kernel_size=3, padding=1, bn=True, dilation=1, stride=1, relu=True, bias=True):
    modules = [GhostModule(in_channels, out_channels, kernel_size, stride, padding, dilation, bias=bias)]
    if bn:
        modules.append(nn.BatchNorm2d(out_channels))
    if relu:
        modules.append(nn.ReLU(inplace=True))
    return nn.Sequential(*modules)
```

Is that OK? Hope to get your help, thanks a lot!
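(For context: the GhostModule constructor in the repository's pytorch/ghostnet.py does not mirror nn.Conv2d's argument list, so passing padding, dilation, and bias as in the snippet above would raise a TypeError. Its signature is roughly:)

```python
# GhostModule signature in pytorch/ghostnet.py -- padding is derived
# internally as kernel_size // 2, and there are no dilation/bias arguments:
def __init__(self, inp, oup, kernel_size=1, ratio=2, dw_size=3, stride=1, relu=True):
    ...
```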
I have the same question. Did you solve the problem?
@LexTran Just change it to this (GhostModule is defined at Line 55 in 2c90e67):

```python
def conv(in_channels, out_channels, kernel_size=3, padding=1, bn=True, dilation=1, stride=1, relu=True, bias=True):
    modules = [GhostModule(in_channels, out_channels, kernel_size, stride=stride)]
    if bn:
        modules.append(nn.BatchNorm2d(out_channels))
    if relu:
        modules.append(nn.ReLU(inplace=True))
    return nn.Sequential(*modules)
```
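A self-contained sketch of the swap, assuming GhostModule has been copied from the repository's pytorch/ghostnet.py (the `ghostnet` import path below is an assumption; adjust it to wherever the module lives in your project). The shape check at the end confirms the ghost version behaves like a same-padded 3x3 conv:

```python
import torch
import torch.nn as nn
from ghostnet import GhostModule  # assumed import path; copy GhostModule from pytorch/ghostnet.py

def conv(in_channels, out_channels, kernel_size=3, padding=1, bn=True,
         dilation=1, stride=1, relu=True, bias=True):
    # GhostModule pads internally (kernel_size // 2) and exposes no
    # dilation/bias arguments, so only kernel_size and stride carry over;
    # padding, dilation, and bias are accepted here but ignored.
    modules = [GhostModule(in_channels, out_channels, kernel_size, stride=stride)]
    if bn:
        modules.append(nn.BatchNorm2d(out_channels))
    if relu:
        modules.append(nn.ReLU(inplace=True))
    return nn.Sequential(*modules)

# Shape check: with stride 1 and an odd kernel, spatial dims are preserved.
layer = conv(16, 32, kernel_size=3, stride=1)
x = torch.randn(1, 16, 56, 56)
print(layer(x).shape)  # expected: torch.Size([1, 32, 56, 56])
```

One thing to note: GhostModule already applies BatchNorm and ReLU inside both of its internal branches, so the extra `bn`/`relu` appended here act on the concatenated output; dropping them is a reasonable simplification depending on your network.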