Jack Dermody

Sequence to Sequence with LSTM

For example, when building neural networks that learn to translate between languages, it is unlikely that a translated sentence will contain the same number of words as the input sentence, so the network needs to produce output sequences whose length differs from the input.
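
To make the shape of the problem concrete, here is a minimal sketch of an encoder-decoder pair, written in PyTorch rather than Bright Wire (all vocabulary sizes, dimensions and names below are invented for illustration). The encoder compresses a five-word input into a fixed-size state, and the decoder then unrolls from that state for a different number of output words:

```python
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, EMB, HIDDEN = 1000, 1200, 32, 64  # illustrative sizes

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HIDDEN, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) -> final (hidden, cell) state summarising the input
        _, state = self.lstm(self.embed(src))
        return state

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, TGT_VOCAB)

    def forward(self, tgt, state):
        # tgt: (batch, tgt_len) - tgt_len is free to differ from src_len
        h, _ = self.lstm(self.embed(tgt), state)
        return self.out(h)

encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, SRC_VOCAB, (1, 5))  # a five-word input sentence
tgt = torch.randint(0, TGT_VOCAB, (1, 8))  # an eight-word translation
logits = decoder(tgt, encoder(src))
print(logits.shape)  # torch.Size([1, 8, 1200]): eight outputs from five inputs
```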

A single output label such as “positive” might apply to an entire sentence (which is itself a sequence of words).
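
This is the many-to-one arrangement: the network reads a whole sequence but emits a single prediction. In a minimal sketch (again PyTorch, with invented sizes), the LSTM consumes every word, but only its final hidden state is passed to the classifier, so the sentence as a whole yields one label:

```python
import torch
import torch.nn as nn

VOCAB, EMB, HIDDEN = 1000, 32, 64  # illustrative sizes

class SentenceClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, 2)  # two classes: negative / positive

    def forward(self, words):
        # words: (batch, seq_len); only the final hidden state is classified
        _, (hidden, _) = self.lstm(self.embed(words))
        return self.out(hidden[-1])

model = SentenceClassifier()
sentence = torch.randint(0, VOCAB, (1, 7))  # a seven-word sentence
print(model(sentence).shape)  # torch.Size([1, 2]): one label for the sentence
```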

In the same way, we might build a single embedding from a sequence of words (the document) for the purposes of document comparison, as sketched below.
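
One simple way to do this (a conceptual sketch, not Bright Wire's own API) is to take the LSTM's final hidden state as the document embedding. Documents of different lengths then map to vectors of the same size, which can be compared with cosine similarity:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HIDDEN = 1000, 32, 64  # illustrative sizes
embed = nn.Embedding(VOCAB, EMB)
lstm = nn.LSTM(EMB, HIDDEN, batch_first=True)

def document_embedding(word_ids):
    # Encode a variable-length word sequence into one fixed-size vector
    # by keeping only the LSTM's final hidden state.
    _, (hidden, _) = lstm(embed(word_ids))
    return hidden[-1]

doc_a = torch.randint(0, VOCAB, (1, 12))  # a 12-word document
doc_b = torch.randint(0, VOCAB, (1, 30))  # a 30-word document
emb_a, emb_b = document_embedding(doc_a), document_embedding(doc_b)
print(emb_a.shape, emb_b.shape)           # both torch.Size([1, 64])
print(F.cosine_similarity(emb_a, emb_b))  # one similarity score per pair
```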

Introducing Bright Wire

One of the many hard tasks in NLP is assigning part-of-speech tags to words (this word is a noun, that word is a verb, etc.).
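
Tagging is a many-to-many problem: each input word should produce exactly one output tag. In a minimal sketch of that shape (PyTorch rather than Bright Wire's API; the 17 tags correspond to the Universal Dependencies tagset, and the other sizes are invented), the LSTM's output at every timestep is kept, giving one tag distribution per word:

```python
import torch
import torch.nn as nn

VOCAB, TAGS, EMB, HIDDEN = 1000, 17, 32, 64  # 17 universal POS tags

class PosTagger(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, TAGS)

    def forward(self, words):
        # words: (batch, seq_len); every timestep's output is kept,
        # so each input word gets its own tag distribution
        h, _ = self.lstm(self.embed(words))
        return self.out(h)

tagger = PosTagger()
sentence = torch.randint(0, VOCAB, (1, 6))  # a six-word sentence
print(tagger(sentence).shape)  # torch.Size([1, 6, 17]): one tag vector per word
```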