Organized Session 1-8, Presentation 8

This is the mobile page for the 2015 Korean Statistical Society Fall Conference.

Title: Dropout regularization method for recurrent neural networks
Authors: Taesup Moon (Assistant Professor, Department of Information and Communication Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST)), Heeyoul Choi (Samsung Advanced Institute of Technology), Hoshik Lee (Samsung Advanced Institute of Technology), Inchul Song (Samsung Advanced Institute of Technology)
Abstract: Regularization is essential for any statistical learning algorithm that must perform well not only on the training data but also on unseen test data. In this talk, I will first give a brief overview of several regularization methods used in deep learning. I will then introduce dropout, a recently developed and powerful regularization technique for deep feedforward neural networks. Next, I will explain why conventional dropout cannot be directly applied to recurrent neural networks, and I will introduce RNNDrop, a novel dropout technique devised particularly for deep recurrent neural networks. I will conclude with promising empirical results showing that RNNDrop achieves state-of-the-art performance on representative speech recognition benchmarks such as TIMIT and WSJ.
Related terms:
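As a companion to the abstract above, here is a minimal NumPy sketch of the distinction the talk draws: conventional dropout resamples a mask at every time step, whereas an RNNDrop-style mask is sampled once per sequence and applied to the LSTM memory cell at every step. The weight shapes, single-layer gate layout, and helper names (lstm_step, run_sequence) are illustrative assumptions, not the speakers' implementation.

```python
# Minimal sketch contrasting per-step dropout with RNNDrop-style dropout
# on an LSTM memory cell. Illustrative assumptions throughout; training-time
# masks only (test-time rescaling / inverted dropout is omitted for brevity).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; W, U, b hold the four gates stacked row-wise."""
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def run_sequence(xs, params, p_drop=0.5, rnndrop=True):
    W, U, b = params
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    if rnndrop:
        # RNNDrop-style: sample ONE mask per sequence and reuse it at every
        # step, applied to the memory cell c_t.
        mask = (rng.random(H) > p_drop).astype(float)
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
        if rnndrop:
            c = c * mask          # same mask across all time steps
        else:
            # Conventional dropout: a fresh mask at every step, which the
            # talk argues disrupts the memory carried through time.
            c = c * (rng.random(H) > p_drop).astype(float)
    return h

# Toy usage: input size 3, hidden size 4, sequence length 5.
D, H, T = 3, 4, 5
params = (rng.standard_normal((4 * H, D)),
          rng.standard_normal((4 * H, H)),
          np.zeros(4 * H))
xs = rng.standard_normal((T, D))
print(run_sequence(xs, params, rnndrop=True))
print(run_sequence(xs, params, rnndrop=False))
```

Sampling the mask once per sequence keeps the same coordinates of the memory cell dropped throughout the sequence, so the information carried across time is preserved; a fresh per-step mask repeatedly perturbs that memory, which is the failure mode the abstract attributes to applying conventional dropout directly to recurrent networks.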