# jdeng/rbm-mnist

Repository: https://github.com/jdeng/rbm-mnist
Language: C++ (100.0%)

## Source Code

The deep learning algorithm is based on the Matlab code provided by Geoff Hinton et al. at http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html. The Conjugate Gradient implementation is based on (translated and simplified from) Carl Edward Rasmussen's Matlab code at http://learning.eng.cam.ac.uk/carl/code/minimize/minimize.m.
It was done over two weekends, so there will be bugs and defects.

## Visualization

It is not trivial to monitor an optimization procedure with 1 million parameters or more. One way is to map the weights to colors (assuming most of the weights are within [-1,1]) and show them together as an image. Input is on the Y-axis (rows) and output is on the X-axis (columns). There are 4 RBMs in the sample image (rbm-131.png): 784->300, 300->300, 300->500, 500->10. Check the images periodically and you can get a rough idea of whether the parameters look right.

## Building

C++11 is used extensively, and currently only clang 3.1 has been tested for building. GraphicsMagick is used to generate the weight images and is the only dependency.

Under Mac OS X, the GraphicsMagick installed by brew probably won't work due to libc++/libstdc++ subtleties, but you can use brew to install the header files and dependent libraries such as libjpeg, libpng, etc. Then you can build a private static GraphicsMagick library from source.
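A minimal sketch of such a static build, assuming a GraphicsMagick source tarball and the standard autoconf toolchain (the version, prefix, and configure options here are illustrative, not the project's original commands):

```sh
# Build GraphicsMagick as static libraries into a private prefix.
# ASSUMPTION: version, prefix, and options are illustrative; adjust to your setup.
tar xf GraphicsMagick-1.3.x.tar.xz && cd GraphicsMagick-1.3.x
./configure --prefix=$HOME/graphicsmagick \
            --disable-shared --enable-static \
            --with-magick-plus-plus=yes
make && make install
```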
After that you can build dbn with something like below (replace the path to GraphicsMagick accordingly).
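For example (a sketch only; the source file, optimization flags, and extra image libraries are assumptions and must match your checkout and the prefix used above):

```sh
GM_PREFIX=$HOME/graphicsmagick   # the private GraphicsMagick prefix built above
clang++ -std=c++11 -stdlib=libc++ -O2 \
        -I$GM_PREFIX/include/GraphicsMagick \
        demo.cc -o dbn \
        -L$GM_PREFIX/lib -lGraphicsMagick++ -lGraphicsMagick \
        -ljpeg -lpng -lz
```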
Under Ubuntu it is much easier thanks to the newer version of libstdc++. Once you have clang 3.1 installed (here is a source: http://askubuntu.com/questions/141597/llvm-clang-3-1-libc), you can build the software with the command below.
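A sketch, assuming the GraphicsMagick development package is installed (so that GraphicsMagick++-config is on the PATH) and that demo.cc is the entry point:

```sh
# GraphicsMagick++-config supplies the include and linker flags for GraphicsMagick++.
clang++ -std=c++11 -O2 demo.cc -o dbn \
        $(GraphicsMagick++-config --cppflags --ldflags --libs)
```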
You'll also need to install GraphicsMagick, probably with apt-get, before that.

## Train and Test

The data files are available at http://yann.lecun.com/exdb/mnist/. The command line looks like `./dbn <command>`, where `<command>` could be "train", "train-simple", "test", "test-simple", "train-encoder", etc. It is highly recommended that you read through demo.cc before running tests. There are 3 types of topology: a simple DBN, a fine-tuned DBN, and an Autoencoder.
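A rough sketch of a session, assuming dbn reads the unpacked MNIST files from the current directory (check demo.cc for the file names and paths it actually expects):

```sh
# Fetch and unpack the MNIST data files (names as published on the MNIST page).
# ASSUMPTION: dbn looks for them in the working directory -- verify in demo.cc.
for f in train-images-idx3-ubyte train-labels-idx1-ubyte \
         t10k-images-idx3-ubyte t10k-labels-idx1-ubyte; do
  curl -O http://yann.lecun.com/exdb/mnist/$f.gz && gunzip $f.gz
done

# Train a fine-tuned DBN, then evaluate it on the test set.
./dbn train
./dbn test
```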
The default monitoring function (progress in demo.cc) periodically generates a snapshot of the DBN as a pair of numbered rbm-*.png and rbm-*.dat files (e.g., rbm-131.png). The png file shows the weight changes in a straightforward way. The .dat file can be used for testing after renaming it to the expected name (e.g., dbn.dat).

## Performance

There are no extensive testing results yet. Below are some initial numbers for your information, based on training with the first half of the 10k testing dataset for a few epochs (< 10). Testing is carried out on the whole 10k dataset. Training takes about an hour; testing is fast.
The improvement from the CG fine-tuning is obvious. It would not be difficult to tune the model and reproduce the results in Hinton's papers. During the tests it turned out that the topology (the numbers of hidden/visible units) does not have much impact on performance.