Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization - week1


Normalizing input

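The course's recipe can be sketched as follows: compute the per-feature mean and standard deviation on the training set, then shift and scale every feature to mean 0 and variance 1. This is a minimal sketch assuming NumPy; the example data matrix is hypothetical.

```python
import numpy as np

# Hypothetical training set: 1000 examples, 3 features on very different scales.
rng = np.random.default_rng(0)
X = rng.random((1000, 3)) * np.array([1.0, 100.0, 0.01])

# Normalize: subtract the per-feature mean, divide by the per-feature std,
# so every feature ends up with mean 0 and variance 1.
mu = X.mean(axis=0)
sigma = X.std(axis=0)
X_norm = (X - mu) / sigma
```

One point the lectures stress: the test set must be normalized with the same `mu` and `sigma` computed on the training set, not with its own statistics, so that train and test go through the identical transformation.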

Vanishing/Exploding gradients

Deep neural networks suffer from these issues: activations and gradients can shrink or grow exponentially with depth, which is a huge barrier to training very deep networks.

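The compounding effect is easy to see in a toy linear network where every layer applies the same weight matrix `W = scale * I` (a simplified sketch, not the course's exact setup): with `scale` just below 1 the activations shrink like `scale**depth`, and just above 1 they blow up like `scale**depth`.

```python
import numpy as np

depth = 50          # number of layers
a0 = np.ones(4)     # input activations

# Identical linear layers W = scale * I, so the forward pass multiplies
# the activations by scale once per layer: |a_L| = scale**depth * |a_0|.
results = {}
for scale in (0.5, 1.5):
    a = a0.copy()
    W = scale * np.eye(4)
    for _ in range(depth):
        a = W @ a
    results[scale] = abs(a[0])
    print(f"scale={scale}: activation magnitude after {depth} layers = {a[0]:.3e}")
```

With `scale=0.5` the magnitude is on the order of 1e-16 (vanishing), and with `scale=1.5` it is on the order of 1e8 (exploding); the gradients in backpropagation compound in exactly the same multiplicative way.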

There is a partial solution that does not eliminate the problem but helps a lot: a careful choice of how you initialize the weights. The main goal is to keep each weight matrix W[l] from being much larger or much smaller than 1, so that when the effect of many layers compounds multiplicatively, the result neither vanishes nor explodes.

Weight Initialization for Deep Networks

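The initialization the course recommends for ReLU layers scales each weight by sqrt(2 / n[l-1]), where n[l-1] is the number of inputs to the layer (He initialization); for tanh, Xavier initialization uses sqrt(1 / n[l-1]) instead. A minimal sketch, assuming NumPy; the function name and `layer_dims` list are my own illustration:

```python
import numpy as np

def initialize_weights(layer_dims, seed=0):
    """He initialization for ReLU layers: W[l] ~ N(0, 2 / n[l-1]).

    layer_dims is a hypothetical list [n_x, n_h1, ..., n_y] of layer sizes.
    For tanh, swap the factor for np.sqrt(1.0 / layer_dims[l - 1]) (Xavier).
    """
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        # Variance 2 / n[l-1] keeps the scale of z[l] = W[l] a[l-1] + b[l]
        # roughly constant from layer to layer.
        params[f"W{l}"] = (rng.standard_normal((layer_dims[l], layer_dims[l - 1]))
                           * np.sqrt(2.0 / layer_dims[l - 1]))
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

params = initialize_weights([1024, 256, 64, 1])
```

Because each layer's weights have variance tuned to its fan-in, the variance of the activations stays roughly constant with depth, which is exactly what keeps the forward and backward signals from vanishing or exploding.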

Ref:

1. Coursera, Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization, week 1