This repository provides links to the custom-improved BResNet16 architecture variants.
BResNet16 variants are a set of architectures inspired by ResNet but designed with efficiency in mind. Unlike conventional ResNet models, which use basic residual blocks (ResNet-18 and ResNet-34) or bottleneck residual blocks (ResNet-50 and deeper), BResNet16 variants are optimized for lightweight performance, making them well suited to edge devices and performance-critical applications.
In traditional ResNet architectures:
- Basic residual blocks stack two 3x3 convolutional layers on the main path; the shortcut path is an identity connection, or a single 1x1 convolution when the spatial size or channel count changes.
- Bottleneck residual blocks stack three convolutional layers on the main path, with the first and last being 1x1 convolutions that reduce and then restore the channel count (the "bottleneck") to cut computation. A minimal sketch of both block types follows this list.
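The following PyTorch-style sketch illustrates the two block types described above. The class names, channel arguments, and expansion factor are illustrative choices made for this README, not code taken from the BResNet16 repositories.

```python
# Minimal sketch of the two standard ResNet block types (illustrative only).
import torch.nn as nn

class BasicBlock(nn.Module):
    """Two 3x3 convs on the main path; identity or 1x1-conv projection shortcut."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)
        # Projection shortcut only when the shape changes; otherwise identity.
        self.shortcut = nn.Identity()
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + self.shortcut(x))


class BottleneckBlock(nn.Module):
    """1x1 reduce -> 3x3 -> 1x1 expand on the main path (expansion factor 4)."""
    expansion = 4

    def __init__(self, in_ch, mid_ch, stride=1):
        super().__init__()
        out_ch = mid_ch * self.expansion
        self.conv1 = nn.Conv2d(in_ch, mid_ch, 1, bias=False)   # 1x1: reduce channels
        self.bn1 = nn.BatchNorm2d(mid_ch)
        self.conv2 = nn.Conv2d(mid_ch, mid_ch, 3, stride=stride, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(mid_ch)
        self.conv3 = nn.Conv2d(mid_ch, out_ch, 1, bias=False)  # 1x1: restore/expand channels
        self.bn3 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)
        self.shortcut = nn.Identity()
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.relu(self.bn2(self.conv2(out)))
        out = self.bn3(self.conv3(out))
        return self.relu(out + self.shortcut(x))
```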
A conventional ResNet model has an input stem, four stages, and an output layer. Each stage typically contains at least two residual blocks, which makes it impossible to build the standard 18- and 34-layer variants out of bottleneck blocks alone. The closest possible variant is 16, hence the name BResNet16 (Bottleneck Residual Network 16).
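As a rough illustration of the depth arithmetic, the snippet below counts layers the way the standard ResNet names do: the stem convolution, every convolution on the blocks' main paths, and the final fully connected layer. The counting convention and the helper function are assumptions made here for illustration; the exact counting behind the BResNet16 depth is documented in the variant repositories.

```python
# Rough depth arithmetic for conventional ResNets, assuming depth counts
# the stem conv, the convs on each block's main path, and the final FC layer.

def resnet_depth(blocks_per_stage, convs_per_block):
    return 1 + convs_per_block * sum(blocks_per_stage) + 1  # stem + blocks + FC

print(resnet_depth([2, 2, 2, 2], 2))   # 18 -> ResNet-18 (basic blocks)
print(resnet_depth([3, 4, 6, 3], 3))   # 50 -> ResNet-50 (bottleneck blocks)

# With bottleneck blocks and at least two blocks per stage, the smallest
# reachable depth is 2 + 3 * 8 = 26, and deeper options grow in steps of 3,
# so neither 18 nor 34 can be matched exactly.
print(resnet_depth([2, 2, 2, 2], 3))   # 26
```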
The modifications build upon techniques from the paper "Bag of Tricks for Image Classification with Convolutional Neural Networks", along with additional optimizations and personal refinements.
Each variant has specific modifications tailored to its use case. Details about these improvements can be found in their respective repositories.
The "D" in the variant names originates from ResNet-D of the referenced paper, reflecting the inspirations and architectural changes applied in these models.