Lecture 8: Neural networks, word embeddings and deep learning
Motivation. The perceptron. Input encoding, sum and activation functions; objective function. Linearity of the perceptron. Neural networks. Training. Backpropagation. Connection to Maximum Entropy. Connection to language. Vector representations. NN for the bigram language model. Word2vec: CBOW and skip-gram. Word embeddings. Deep learning. Language modeling with NN. The big picture.
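As a concrete illustration of the first items in the outline (input encoding, weighted sum, activation function, and training), here is a minimal perceptron sketch. The details are assumptions for illustration, not from the lecture itself: a step activation, a fixed learning rate, and logical AND as a toy linearly separable dataset.

```python
# Minimal perceptron sketch (illustrative assumptions: step activation,
# learning rate 0.1, logical AND as toy training data).

def step(z):
    # Step activation: fire 1 if the weighted sum is above the threshold.
    return 1 if z > 0 else 0

def predict(weights, bias, x):
    # Weighted sum of the inputs plus bias, passed through the activation.
    return step(sum(w * xi for w, xi in zip(weights, x)) + bias)

def train(samples, epochs=20, lr=0.1):
    # Perceptron learning rule: nudge weights toward each misclassified example.
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Logical AND is linearly separable, so the perceptron converges on it;
# XOR, by contrast, is not, which is the classic limit of a single linear unit.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
```

The linearity mentioned in the outline is visible here: the decision boundary is the line where the weighted sum equals the threshold, which is why a single perceptron can learn AND but not XOR.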
Original post: https://nlpapplicative.blogspot.com/2015/05/lecture-8-neural-networks-word.html