User:Iandola/SqueezeNet

SqueezeNet is the name of a convolutional neural network that was released in 2016. SqueezeNet was developed by researchers at DeepScale, University of California, Berkeley, and Stanford University. In designing SqueezeNet, the authors' goal was to create a smaller neural network with fewer parameters, one that could more easily fit into computer memory and be transmitted over a computer network.

History
Prior to 2016, most research on deep learning focused on improving accuracy. SqueezeNet was one of the first papers to focus specifically on reducing the resource utilization of deep neural networks.

Network Design
The SqueezeNet architecture is built from "Fire" modules. Each Fire module consists of a squeeze layer of 1×1 convolution filters, which reduces the number of channels, followed by an expand layer containing a mix of 1×1 and 3×3 convolution filters. Because the squeeze layer feeds the 3×3 filters fewer input channels, a Fire module has substantially fewer parameters than a plain convolution layer of the same output width.
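The parameter savings of a Fire module can be illustrated with back-of-the-envelope arithmetic. This is a minimal sketch, assuming hyperparameters resembling the paper's fire2 configuration (16 squeeze filters and 64 + 64 expand filters on 96 input channels) and ignoring bias terms:

```python
def fire_params(c_in, s1x1, e1x1, e3x3):
    """Weight count of a Fire module (biases ignored).

    c_in -- input channels
    s1x1 -- 1x1 filters in the squeeze layer
    e1x1 -- 1x1 filters in the expand layer
    e3x3 -- 3x3 filters in the expand layer
    """
    squeeze = c_in * s1x1                       # 1x1 convolutions
    expand = s1x1 * e1x1 + s1x1 * e3x3 * 9      # 1x1 and 3x3 convolutions
    return squeeze + expand

def plain_conv_params(c_in, c_out, k=3):
    """Weight count of an ordinary k-by-k convolution layer (biases ignored)."""
    return c_in * c_out * k * k

fire = fire_params(96, 16, 64, 64)    # fire2-like hyperparameters -> 11,776 weights
plain = plain_conv_params(96, 128)    # plain 3x3 layer, same output width -> 110,592
print(fire, plain, round(plain / fire, 1))  # the Fire module is roughly 9x smaller
```

The ratio grows with the aggressiveness of the squeeze: the fewer squeeze filters relative to expand filters, the cheaper the 3×3 convolutions become.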

Unlike many earlier CNNs such as AlexNet and VGG, SqueezeNet contains no fully connected layers. Instead, the final convolution layer produces one channel per class, and global average pooling reduces each channel to a single class score. Because fully connected layers typically account for the majority of a classification network's parameters, omitting them further shrinks the model.
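The effect of dropping fully connected layers can also be shown with simple arithmetic. This sketch compares a hypothetical fully connected classifier head against a 1×1 convolution followed by global average pooling; the 13×13×512 feature-map size is an assumption for illustration, not a figure from the paper:

```python
def fc_head_params(h, w, c, num_classes):
    """FC layer mapping a flattened h*w*c feature map to class scores (biases ignored)."""
    return h * w * c * num_classes

def conv_gap_head_params(c, num_classes):
    """1x1 conv to num_classes channels; global average pooling adds no weights."""
    return c * num_classes

# Assumed 13x13x512 feature map, 1000 ImageNet classes.
fc = fc_head_params(13, 13, 512, 1000)    # ~86.5 million weights
conv = conv_gap_head_params(512, 1000)    # 512,000 weights
print(fc, conv, fc // conv)
```

Under these assumptions the fully connected head alone would contain more weights than many entire compact networks, which is why the convolution-plus-pooling head is the larger of the two savings.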

Adoption and Impact
By 2017, SqueezeNet had become one of the standard neural network architectures released as part of deep learning frameworks such as Caffe2, Apache MXNet, and Apple CoreML. Companies including Baidu, Imagination Technologies, Synopsys, and Xilinx have demonstrated SqueezeNet running on low-power processing platforms.

Descendants
- SqueezeDet
- SqueezeSeg
- ShiftNet
- SqueezeNeXt