Neural Network Transfer Learning
阿新 • Published: 2018-12-28
Holding certain layers of a network frozen while training is effectively the same as training on a transformed version of the input, where the transformed version consists of the intermediate activations at the boundary of the frozen layers. This process of extracting features from the input data will be referred to as "featurizing" in this document. On large, pretrained networks, the forward pass needed to featurize the input can be time consuming. For this reason DL4J also provides a TransferLearningHelper class: when running multiple epochs, it saves computation time because the expensive forward pass through the frozen layers/vertices only has to be conducted once.
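The idea can be sketched in plain Java without any DL4J dependency. The "frozen" part of the network is modeled as a fixed linear transform that is applied to the data exactly once (the featurize step); the trainable head is then fit for many epochs on the cached features, so the frozen forward pass is never repeated. The matrices, learning rate, and the toy regression task below are all hypothetical illustration, not DL4J's API.

```java
// Sketch: featurize once with a frozen layer, then train only the head
// on the cached features across many epochs.
public class FeaturizeDemo {
    public static void main(String[] args) {
        double[][] inputs = {{1, 2}, {2, 1}, {3, 0}, {0, 3}};
        double[] targets = {3, 3, 3, 3}; // toy task: target = x0 + x1

        // "Frozen" layer: a fixed transform, applied ONCE ("featurizing").
        double[][] frozenW = {{1, 0}, {0, 1}}; // identity, for illustration
        double[][] features = new double[inputs.length][2];
        for (int i = 0; i < inputs.length; i++)
            for (int j = 0; j < 2; j++)
                for (int k = 0; k < 2; k++)
                    features[i][j] += frozenW[j][k] * inputs[i][k];

        // Trainable head: trained for many epochs on the cached features;
        // the frozen forward pass above is never run again.
        double[] w = {0.0, 0.0};
        double lr = 0.05;
        double firstLoss = loss(features, targets, w);
        for (int epoch = 0; epoch < 200; epoch++) {
            for (int i = 0; i < features.length; i++) {
                double pred = w[0] * features[i][0] + w[1] * features[i][1];
                double err = pred - targets[i];
                w[0] -= lr * err * features[i][0];
                w[1] -= lr * err * features[i][1];
            }
        }
        double finalLoss = loss(features, targets, w);
        System.out.println("loss decreased: " + (finalLoss < firstLoss));
    }

    static double loss(double[][] X, double[] y, double[] w) {
        double s = 0;
        for (int i = 0; i < X.length; i++) {
            double p = w[0] * X[i][0] + w[1] * X[i][1];
            s += (p - y[i]) * (p - y[i]);
        }
        return s / X.length;
    }
}
```

In DL4J itself the same pattern is exposed by TransferLearningHelper, whose featurize and fitFeaturized methods separate the one-time frozen forward pass from the per-epoch training of the unfrozen layers.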