Dropout (neural network)
Dropout is a regularization technique, patented by Google, for reducing overfitting in deep neural networks (neural networks with multiple hidden layers) that have a large number of parameters, by preventing co-adaptation between nodes. It is implemented by randomly dropping nodes of the neural network (along with their connections) during training, producing a series of “thinned” networks. These “thinned” networks can then be approximated by a single “unthinned” network whose weights are scaled down relative to the trained weights, which approximates averaging the predictions of the corresponding nodes across all “thinned” networks. Multiple studies have shown that dropout, as a regularization method, significantly reduces overfitting and improves the performance of deep neural networks on supervised tasks in areas such as computer vision, natural language processing, and computational biology.
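
As an illustration of the train-time masking and test-time weight scaling described above, the following is a minimal NumPy sketch; the function name and parameters are hypothetical, not taken from any particular library.

    import numpy as np

    def dropout_forward(x, p_drop=0.5, training=True, rng=None):
        """Apply standard dropout to a layer's activations x.

        Training: each unit is zeroed independently with probability
        p_drop, yielding a randomly "thinned" network on each pass.
        Test: no units are dropped; activations are instead scaled by
        the retain probability (1 - p_drop), which approximates
        averaging the predictions of all the thinned networks.
        """
        rng = rng or np.random.default_rng()
        if training:
            # Sample a binary mask: True = keep the unit, False = drop it.
            mask = rng.random(x.shape) >= p_drop
            return x * mask
        # Scale so the expected activation matches training time.
        return x * (1.0 - p_drop)

    # Hypothetical usage on a batch of hidden-layer activations.
    h = np.random.randn(4, 8)                              # 4 examples, 8 units
    h_train = dropout_forward(h, p_drop=0.5, training=True)
    h_test = dropout_forward(h, p_drop=0.5, training=False)

Because each unit is kept with probability 1 − p_drop during training, multiplying by that same factor at test time keeps the expected input to the next layer consistent between the two phases.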