A Study on the Application Technology of Deep Learning
Cite this article: MAO Yonghua, LI Qian, GUI Xiaolin, HE Xingshi. A Study on the Application Technology of Deep Learning[J]. Application Research of Computers, 2016, 33(11).
Authors: MAO Yonghua, LI Qian, GUI Xiaolin, HE Xingshi
Affiliations: MAO Yonghua (School of Science, Xi'an Polytechnic University; School of Electronic and Information Engineering, Xi'an Jiaotong University); LI Qian (School of Science, Xi'an Polytechnic University); GUI Xiaolin (School of Electronic and Information Engineering, Xi'an Jiaotong University); HE Xingshi (School of Science, Xi'an Polytechnic University)
Funding: National Natural Science Foundation of China (61472316, 61172090); National Science and Technology Major Project (2012ZX03002001); Research Fund for the Doctoral Program of Higher Education (20120201110013); Natural Science Foundation of Shaanxi Province (2014JM1006, 2014KRM28-01); Fundamental Research Funds for the Central Universities (XKJC2014008); Shaanxi Natural Science Innovation Program (2013SZS16-Z01/P01/K01)
Abstract: This paper presents a survey of deep learning application techniques. It details the greedy layer-wise training method, which pre-trains each RBM (Restricted Boltzmann Machine) layer and then fine-tunes the whole network with BP (back-propagation). It compares the three gradient-descent modes used in the BP algorithm and suggests stochastic gradient descent for online learning systems and stochastic mini-batch gradient descent for static, offline learning systems. It summarizes the structural characteristics of deep architectures and recommends the currently most popular five-layer deep network design. It also analyzes why feed-forward neural networks require nonlinear activation functions, reviews the advantages of the common activation functions, and recommends the ReLU (rectified linear unit) activation function. Finally, it briefly summarizes the characteristics and application scenarios of newer deep networks such as deep CNNs (convolutional neural networks), deep RNNs (recurrent neural networks) and LSTM (long short-term memory) networks, and outlines possible directions for future deep learning research.

Keywords: RBM; DNN; gradient descent; validation set; supervised learning; greedy layer-wise training; deep learning; deep learning network architecture
Received: 2015-11-23
Revised: 2016-09-27
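
The abstract above recommends stochastic gradient descent for online learning systems and stochastic mini-batch gradient descent for static offline systems. The following Python/NumPy sketch contrasts the three gradient-descent modes on a toy least-squares problem; the loss function, learning rates, batch size and synthetic data are illustrative assumptions and are not taken from the paper.

import numpy as np

def gradient(w, X, y):
    # Gradient of the mean squared error 0.5 * ||Xw - y||^2 / n with respect to w.
    return X.T @ (X @ w - y) / len(y)

def batch_gd(w, X, y, lr=0.1, epochs=100):
    # Full-batch mode: one update per pass over the entire (static) data set.
    for _ in range(epochs):
        w -= lr * gradient(w, X, y)
    return w

def stochastic_gd(w, X, y, lr=0.01, epochs=100):
    # Stochastic mode: one update per sample; suited to online learning,
    # where examples arrive one at a time.
    for _ in range(epochs):
        for i in np.random.permutation(len(y)):
            w -= lr * gradient(w, X[i:i + 1], y[i:i + 1])
    return w

def minibatch_gd(w, X, y, lr=0.05, epochs=100, batch_size=32):
    # Stochastic mini-batch mode: one update per small random batch;
    # the mode recommended above for static offline training.
    for _ in range(epochs):
        idx = np.random.permutation(len(y))
        for start in range(0, len(y), batch_size):
            b = idx[start:start + batch_size]
            w -= lr * gradient(w, X[b], y[b])
    return w

# Tiny synthetic example.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 5))
y = X @ rng.normal(size=5) + 0.01 * rng.normal(size=256)
w = minibatch_gd(np.zeros(5), X, y)

The batch size trades gradient noise against per-update cost, which is why the mini-batch mode is usually preferred when the full training set is available offline.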

A Study on the Application Technology of Deep Learning
Yonghua MAO, Qian LI, Xiaolin GUI and Xingshi HE. A Study on the Application Technology of Deep Learning[J]. Application Research of Computers, 2016, 33(11).
Authors: Yonghua MAO, Qian LI, Xiaolin GUI, Xingshi HE
Abstract: This paper is a review of deep learning algorithms and their applications. It elaborates the greedy layer-wise training algorithm, which applies back-propagation (BP) fine-tuning after layer-wise pre-training of each Restricted Boltzmann Machine (RBM) layer. After comparing and analyzing the three modes of gradient descent in the BP algorithm, we suggest applying stochastic gradient descent in online learning and adopting stochastic mini-batch gradient descent in static offline learning. We summarize the characteristics of deep network structures and recommend the currently most popular five-layer network architecture design. We also analyze the necessity of nonlinear activation functions in feedforward neural networks and the advantages of the common activation functions, and recommend the ReLU activation function. Finally, the paper provides a brief summary of the features and application scenarios of emerging deep neural networks such as deep CNNs, deep RNNs and LSTMs, as well as potential directions for future deep learning applications and research.
Keywords: RBM; DNN; gradient descent; training set; supervised learning; greedy layer-wise training; deep learning; deep learning network architecture
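
To make the recommendations on ReLU activations and a five-layer architecture concrete, here is a minimal forward-pass sketch in Python/NumPy of a feed-forward network with an input layer, three ReLU hidden layers and a softmax output layer. The layer widths and the He-style weight initialization are assumptions for illustration, not specifications from the paper.

import numpy as np

def relu(x):
    # ReLU keeps the network nonlinear while avoiding the saturation of
    # sigmoid/tanh and producing sparse activations.
    return np.maximum(0.0, x)

def softmax(z):
    # Numerically stable softmax over the class dimension.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def init_layers(sizes, rng):
    # sizes = [input, hidden1, hidden2, hidden3, output] -> five layers.
    # He-style initialization (assumed here) scales weights by sqrt(2/fan_in).
    return [(rng.normal(scale=np.sqrt(2.0 / m), size=(m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x, layers):
    # ReLU on the hidden layers, softmax on the output layer.
    for W, b in layers[:-1]:
        x = relu(x @ W + b)
    W, b = layers[-1]
    return softmax(x @ W + b)

rng = np.random.default_rng(0)
layers = init_layers([784, 512, 256, 128, 10], rng)
probs = forward(rng.normal(size=(4, 784)), layers)  # (4, 10) class probabilities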