pengsun/MatlabCNN: Matlab codes for 2D Convolutional Neural Network


Open source project name:

pengsun/MatlabCNN

Open source project URL:

https://github.com/pengsun/MatlabCNN

Open source language:

MATLAB 63.4%

Open source introduction:

MatlabCNN

Matlab codes for 2D Convolutional Neural Network

Inspired by https://github.com/rasmusbergpalm/DeepLearnToolbox and https://github.com/vlfeat/matconvnet, but aimed at educational purposes. Provides a carefully designed Matlab class hierarchy that helps one understand the workflow of a Convolutional Neural Network (CNN) and a Multi-Layer Perceptron (MLP) simply by reading the code.
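
The class hierarchy can be pictured as follows: each layer is an object exposing a forward method and a backward method, and a network is just an ordered list of such layers, traversed front-to-back for prediction and back-to-front for gradients. Below is a minimal sketch of that workflow, assuming hypothetical names (SketchFullConnLayer, fprop, bprop) that are not the repository's actual classes or methods.

    % Minimal sketch of the layer abstraction (hypothetical names, not the
    % repository's actual classes). Lives in its own file SketchFullConnLayer.m.
    classdef SketchFullConnLayer < handle
      properties
        W          % weight matrix
        b          % bias vector
        in         % input cached during fprop, needed by bprop
        dW         % gradient w.r.t. W
        db         % gradient w.r.t. b
      end
      methods
        function obj = SketchFullConnLayer(nIn, nOut)
          obj.W = 0.01 * randn(nOut, nIn);
          obj.b = zeros(nOut, 1);
        end
        function out = fprop(obj, in)
          obj.in = in;                              % cache input for the backward pass
          out = bsxfun(@plus, obj.W * in, obj.b);   % affine (full connection) transform
        end
        function dIn = bprop(obj, dOut)
          obj.dW = dOut * obj.in';                  % gradient w.r.t. weights
          obj.db = sum(dOut, 2);                    % gradient w.r.t. bias
          dIn = obj.W' * dOut;                      % gradient passed to the previous layer
        end
      end
    end

    % In a separate script: a "network" is an ordered list of layers, and the
    % forward/backward workflow is two loops.
    layers = {SketchFullConnLayer(784, 100), SketchFullConnLayer(100, 10)};
    x = randn(784, 32);                             % a mini-batch of 32 inputs
    for k = 1:numel(layers), x = layers{k}.fprop(x); end
    d = x;                                          % stand-in for the loss gradient
    for k = numel(layers):-1:1, d = layers{k}.bprop(d); end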

Summary:

  • Basic layer (M-to-N transform):
    • Full connection, Convolutional [1]
    • Average Pooling, Max Pooling
  • Auxiliary layer: Local Response Normalization [2]
  • Activation layer (point-wise transform): Sigmoid, ReLU [2]
  • Regularization: Dropout (implemented as a point-wise transform), Max-norm constraint [3]
  • Parameter updating: Stochastic Gradient Descent (mini-batch) with Momentum and Weight Decay [3] (see the sketch after this list)
  • Loss: Least Squares (for classification/regression), Softmax/Cross-entropy/Logistic loss (for classification)
  • Visualization: class model and class saliency map [4]
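
As a concrete illustration of the parameter-updating item above, here is a minimal, self-contained sketch of mini-batch SGD with momentum and weight decay plus a max-norm constraint [3], run on a toy least-squares problem. All names and hyperparameter values are assumptions for illustration, not the repository's training code.

    % Sketch: mini-batch SGD with momentum, weight decay and a max-norm
    % constraint [3], on a toy linear least-squares problem.
    rng(0);
    X = randn(8, 100);   Y = randn(4, 100);   % toy data: 100 samples
    W = 0.01 * randn(4, 8);                   % parameters to learn
    lr  = 0.1;                                % learning rate
    mom = 0.9;                                % momentum coefficient
    wd  = 1e-4;                               % weight decay (L2) coefficient
    maxNorm = 3;                              % max-norm radius per output unit
    vW = zeros(size(W));                      % momentum buffer
    batchSize = 20;
    for epoch = 1:10
      for i = 1:batchSize:size(X, 2)
        idx = i : i + batchSize - 1;
        Xb = X(:, idx);   Yb = Y(:, idx);
        dW = (W * Xb - Yb) * Xb' / batchSize; % gradient of 0.5*||W*Xb - Yb||^2 / batchSize
        vW = mom * vW - lr * (dW + wd * W);   % momentum + weight decay
        W  = W + vW;                          % parameter update
        rowNorm = sqrt(sum(W .^ 2, 2));       % max-norm: shrink rows exceeding maxNorm
        W = bsxfun(@times, W, min(1, maxNorm ./ max(rowNorm, eps)));
      end
    end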

Caution: Feel free to use the code, but it is primarily for my personal experimentation and development is ongoing, so there is no guarantee it is bug-free :)

TODO

code

  • myCNN
    • Display loss
    • Continue training from loaded model
  • trans
    • Convolutional layer picking a random subset of input feature maps
    • Maxout?

doc

  • A note giving more mathematical details than those in "Notes on Convolutional Neural Networks" by Jake Bouvrie
    • Multi-dimensional array calculus
    • Derivative of Convolution; Convolution as a sum of convolutions of pulses
  • A design manual explaining the rationale behind the "atomic layer" abstraction (see the sketch of the relevant relations after this list)
    • "Atomic layer" as a transformation: M-to-N transform and M-to-M (point-wise) transform
    • Chain rule as Back Propagation
    • Dropout as a point-wise transform
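
As a preview of the relations such a design manual would cover (generic notation and my own framing, not the repository's documentation): treating a layer as a transform y = f(x; w), back propagation is the chain rule applied layer by layer, and a point-wise (M-to-M) transform such as dropout has a diagonal Jacobian, so its backward pass is an element-wise product with the same mask:

    \[
    \frac{\partial \ell}{\partial x}
      = \Big(\frac{\partial y}{\partial x}\Big)^{\!\top} \frac{\partial \ell}{\partial y},
    \qquad
    \frac{\partial \ell}{\partial w}
      = \Big(\frac{\partial y}{\partial w}\Big)^{\!\top} \frac{\partial \ell}{\partial y}
    \]

    \[
    y_i = m_i \, x_i,\; m_i \sim \mathrm{Bernoulli}(p)
    \quad\Longrightarrow\quad
    \frac{\partial \ell}{\partial x_i} = m_i \, \frac{\partial \ell}{\partial y_i}
    \]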

References

[1] LeCun, Yann, et al. "Gradient-based learning applied to document recognition." Proceedings of the IEEE 86.11 (1998): 2278-2324.
[2] Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. "Imagenet classification with deep convolutional neural networks." Advances in neural information processing systems. 2012.
[3] Hinton, Geoffrey E., et al. "Improving neural networks by preventing co-adaptation of feature detectors." arXiv preprint arXiv:1207.0580 (2012).
[4] Simonyan, Karen, Andrea Vedaldi, and Andrew Zisserman. "Deep Inside Convolutional Networks: Visualising Image Classification Models and Saliency Maps." arXiv preprint arXiv:1312.6034 (2013).


