
ResNet uses residual connections to overcome the vanishing-gradient and degradation problems of deep networks, making very deep networks trainable.

Core Idea

ResNet (Residual Network), proposed by Kaiming He et al. in 2015, implements residual learning through skip connections.

Mathematical Form of the Residual Block

$$y = F(x) + x$$

where:

  • $x$: the input features
  • $F(x)$: the residual mapping learned by the stacked layers (Conv + BN + ReLU)
  • $F(x) + x$: the block output, with $x$ added back through the skip connection
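
This sum also explains why gradients survive depth: differentiating the block output with respect to its input gives

$$\frac{\partial y}{\partial x} = \frac{\partial F(x)}{\partial x} + I$$

so even when $\partial F(x)/\partial x$ is small, the identity term $I$ keeps an unobstructed gradient path back to earlier layers.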

Residual Block Types

Basic Block (ResNet-18/34)

Used in shallower networks; it stacks two 3×3 convolutions:

  1. 3×3 conv + BN + ReLU
  2. 3×3 conv + BN
  3. Shortcut connection: add $x$ to $F(x)$, then apply ReLU

When the channel count or spatial size differs between the two paths, a 1×1 convolution on the shortcut matches the shapes, as the sketch below shows.
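
A quick shape check of such a projection shortcut (a minimal sketch; the tensor sizes are just an example):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 64, 56, 56)  # example input: 64 channels, 56x56 spatial size
proj = nn.Conv2d(64, 128, kernel_size=1, stride=2, bias=False)
print(proj(x).shape)            # torch.Size([1, 128, 28, 28])
```

With stride 2 and 128 output channels, the shortcut output now matches the shape of $F(x)$ produced by a downsampling block, so the two can be added element-wise.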

Bottleneck Block (ResNet-50/101/152)

Used in deeper networks; 1×1 convolutions first shrink and then restore the channel dimension (a code sketch follows the list):

  1. 1×1 conv (reduce channels)
  2. 3×3 conv (extract features)
  3. 1×1 conv (restore channels)
  4. Shortcut connection
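
A minimal sketch of such a block, following the same conventions as the BasicBlock code later in this post (the 4x channel expansion matches the original design):

```python
import torch.nn as nn

class Bottleneck(nn.Module):
    expansion = 4  # output channels = mid_channels * expansion

    def __init__(self, in_channels, mid_channels, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, mid_channels, 1, bias=False)  # 1x1 reduce
        self.bn1 = nn.BatchNorm2d(mid_channels)
        self.conv2 = nn.Conv2d(mid_channels, mid_channels, 3, stride, 1, bias=False)  # 3x3 extract
        self.bn2 = nn.BatchNorm2d(mid_channels)
        self.conv3 = nn.Conv2d(mid_channels, mid_channels * self.expansion, 1, bias=False)  # 1x1 expand
        self.bn3 = nn.BatchNorm2d(mid_channels * self.expansion)
        self.relu = nn.ReLU(inplace=True)

        # Projection shortcut when the shape changes, identity otherwise.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != mid_channels * self.expansion:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, mid_channels * self.expansion, 1, stride, bias=False),
                nn.BatchNorm2d(mid_channels * self.expansion),
            )

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.relu(self.bn2(self.conv2(out)))
        out = self.bn3(self.conv3(out))
        return self.relu(out + self.shortcut(x))  # F(x) + x, then ReLU
```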

ResNet Variants

| Version    | Layers | Block type       | Params (M) |
|------------|--------|------------------|------------|
| ResNet-18  | 18     | Basic Block      | 11.7       |
| ResNet-34  | 34     | Basic Block      | 21.8       |
| ResNet-50  | 50     | Bottleneck Block | 25.6       |
| ResNet-101 | 101    | Bottleneck Block | 44.5       |
| ResNet-152 | 152    | Bottleneck Block | 60.2       |
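
These counts are easy to verify against the reference models in torchvision (assuming torchvision >= 0.13 is installed; exact counts can vary slightly across versions):

```python
from torchvision import models

model = models.resnet50(weights=None)  # random init, no download required
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M")        # ~25.6M
```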

Key Improvements

| Improvement            | Effect                                                         |
|------------------------|----------------------------------------------------------------|
| Residual connections   | Avoid vanishing gradients; make very deep networks trainable   |
| 1×1 convolutions       | Bottleneck structure reduces computation                       |
| Batch Normalization    | Speeds up training convergence                                 |
| Global average pooling | Replaces the bulky fully connected layers, cutting parameters  |
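
The last row is worth quantifying. A rough comparison of classifier heads on a 512-channel 7×7 feature map (the FC head is a hypothetical VGG-style example, not part of ResNet):

```python
import torch.nn as nn

# VGG-style head: flatten 512*7*7 features into a 4096-unit FC layer
fc_head = nn.Linear(512 * 7 * 7, 4096)
# ResNet-style head: global average pooling, then one small classifier
gap_head = nn.Sequential(
    nn.AdaptiveAvgPool2d(1),   # 512x7x7 -> 512x1x1, zero parameters
    nn.Flatten(),
    nn.Linear(512, 1000),
)
print(sum(p.numel() for p in fc_head.parameters()))   # 102764544 (~102.8M)
print(sum(p.numel() for p in gap_head.parameters()))  # 513000 (~0.5M)
```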

PyTorch Implementation

A minimal ResNet-18-style network built from BasicBlocks:

```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    """Basic residual block (ResNet-18/34): two 3x3 convs plus a shortcut."""

    def __init__(self, in_channels, out_channels, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, 3, stride, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = nn.Conv2d(out_channels, out_channels, 3, 1, 1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)

        # Identity shortcut by default; 1x1 projection when the shape changes.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, 1, stride, bias=False),
                nn.BatchNorm2d(out_channels)
            )

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out += self.shortcut(x)  # residual addition: F(x) + x
        return self.relu(out)

class ResNet(nn.Module):
    """ResNet-18: 7x7 stem, four stages of two BasicBlocks, GAP, classifier."""

    def __init__(self, num_classes=1000):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 64, 7, 2, 3, bias=False)  # 7x7 stem, stride 2
        self.bn1 = nn.BatchNorm2d(64)
        self.relu = nn.ReLU(inplace=True)
        self.maxpool = nn.MaxPool2d(3, 2, 1)
        self.layer1 = self._make_layer(64, 64, 2)
        self.layer2 = self._make_layer(64, 128, 2, 2)   # stride 2: halve resolution
        self.layer3 = self._make_layer(128, 256, 2, 2)
        self.layer4 = self._make_layer(256, 512, 2, 2)
        self.avgpool = nn.AdaptiveAvgPool2d((1, 1))     # global average pooling
        self.fc = nn.Linear(512, num_classes)

    def _make_layer(self, in_ch, out_ch, blocks, stride=1):
        # The first block may downsample; the remaining blocks keep the shape.
        layers = [BasicBlock(in_ch, out_ch, stride)]
        for _ in range(1, blocks):
            layers.append(BasicBlock(out_ch, out_ch))
        return nn.Sequential(*layers)

    def forward(self, x):
        x = self.maxpool(self.relu(self.bn1(self.conv1(x))))
        x = self.layer4(self.layer3(self.layer2(self.layer1(x))))
        x = self.fc(torch.flatten(self.avgpool(x), 1))
        return x
```
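
A quick smoke test of the model above:

```python
import torch

model = ResNet(num_classes=1000)  # the ResNet-18 defined above
x = torch.randn(1, 3, 224, 224)   # a standard ImageNet-sized input
print(model(x).shape)             # torch.Size([1, 1000])
```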

Impact

ResNet is widely used across computer vision tasks such as classification, detection, segmentation, and recognition, and it inspired a series of successors:

  • ResNeXt: grouped convolutions
  • DenseNet: dense feature reuse
  • EfficientNet: automated search for an optimal architecture
