neural network vs. support vector machine

In 1969, Marvin Minsky and Seymour Papert showed in their book Perceptrons that these networks could not model even the simple XOR function. For many years the book's influence kept progress in the ANN area very limited. Only in the 1980s did the field resurge into active research, and in 2012 Geoffrey Hinton's group demonstrated the power of multi-layer neural nets trained with the generalized backpropagation algorithm in the ImageNet challenge, which revolutionized the field of deep learning.

Growth in DL usage should also be attributed to the enabling fields. Data processing saw groundbreaking changes in the mid-2010s: the Hadoop distributed ecosystem changed the way data is processed and stored. Single-core processing power has increased manifold compared to processors of the 1980s, and the emergence of the Internet of Things made large-scale data collection possible, providing the much-needed training data for neural nets. Graphics Processing Units perform far better at matrix multiplication than multi-core CPUs, and neural nets depend heavily on matrix operations for their calculations. Acknowledgments to all the gamers across the world: because of them, neural nets can now be trained much faster on GPUs. Without your relentless effort and resolve, there would be no better GPUs in this world.
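To make the "matrix operations" point concrete, here is a minimal sketch (the layer sizes and random values are purely illustrative) showing that the forward pass of one fully connected layer is essentially a single matrix multiplication plus a bias, which is exactly the kind of workload GPUs accelerate well:

```python
import numpy as np

# Illustrative sizes: a batch of 32 inputs, 784 features in, 128 units out.
batch_size, n_inputs, n_outputs = 32, 784, 128

X = np.random.randn(batch_size, n_inputs)   # a batch of input vectors
W = np.random.randn(n_inputs, n_outputs)    # layer weights
b = np.zeros(n_outputs)                     # layer biases

layer_output = X @ W + b                    # one matrix multiply per layer
print(layer_output.shape)                   # (32, 128)
```

Stacking many such layers means many large matrix multiplications per training step, which is why moving this work onto GPUs sped up training so dramatically.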

The fundamental unit of a neural net is a single neuron, loosely modeled after the neurons in a biological brain. Each neuron in a given layer (e.g., layer 1) is connected to all or many of the neurons in the next layer (e.g., layer 2). The connections between neurons mimic the synapses in the biological brain. A neuron fires an output signal only if it has received enough input signal from its predecessors, i.e., enough in magnitude to cross a set threshold.
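The firing behavior described above can be sketched in a few lines. The weights and threshold below are made-up illustrative values, not taken from any particular model: the neuron sums its weighted inputs and "fires" (outputs 1) only when that sum crosses the threshold.

```python
import numpy as np

def neuron(inputs, weights, threshold):
    """A single threshold neuron: fire only if the weighted input is large enough."""
    activation = np.dot(inputs, weights)        # total input signal from predecessors
    return 1 if activation >= threshold else 0  # fire (1) or stay silent (0)

inputs = np.array([0.9, 0.3, 0.5])    # signals arriving from neurons in the previous layer
weights = np.array([0.4, 0.7, 0.2])   # strength of each connection (the "synapses")
print(neuron(inputs, weights, threshold=0.5))   # 0.67 >= 0.5, so the neuron fires: 1
```

In practice the hard threshold is replaced by a smooth activation function (such as a sigmoid or ReLU) so the network can be trained with backpropagation, but the basic idea of weighted inputs driving an output is the same.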