Jack Dermody


Sequence to Sequence with LSTM

Along the same lines as the many-to-one example above, the following code creates a vector that summarises a sequence, together with the sequence itself encoded as one-hot vectors.
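The original Bright Wire (C#) snippet is not reproduced here; as a rough illustration of the idea, the Python sketch below one-hot encodes a sequence and pairs it with a simple summary vector. The summary used here (symbol presence) is a stand-in: in a real sequence-to-sequence model the summary would be the LSTM encoder's final hidden state.

```python
import numpy as np

def one_hot_encode(sequence, alphabet):
    """Encode each symbol in the sequence as a one-hot row vector."""
    index = {symbol: i for i, symbol in enumerate(alphabet)}
    encoded = np.zeros((len(sequence), len(alphabet)), dtype=np.float32)
    for row, symbol in enumerate(sequence):
        encoded[row, index[symbol]] = 1.0
    return encoded

alphabet = ['a', 'b', 'c', 'd']
sequence = ['b', 'a', 'd']
encoded = one_hot_encode(sequence, alphabet)

# A toy summary vector: which symbols occur anywhere in the sequence.
# (An LSTM encoder would instead produce its hidden state as the summary.)
summary = encoded.max(axis=0)
```

The training pairs for the decoder would then be (summary, encoded sequence): the decoder learns to reconstruct the one-hot sequence from the summary alone.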

Convolutional Neural Networks

This function is invoked with the following code.
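The invocation itself is not shown here; to make the convolution step concrete, here is a minimal, assumption-laden NumPy sketch of the core operation a convolutional layer performs (a "valid" sliding-window product, as deep-learning libraries implement it), rather than Bright Wire's actual API.

```python
import numpy as np

def convolve2d_valid(image, kernel):
    """Naive 'valid' 2D convolution (strictly, cross-correlation, which is
    what most deep-learning frameworks compute): slide the kernel over the
    image and sum the elementwise products in each window."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1), dtype=np.float32)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

image = np.arange(16, dtype=np.float32).reshape(4, 4)
# A simple horizontal-edge kernel (hypothetical choice for illustration).
edge_kernel = np.array([[1.0, -1.0],
                        [1.0, -1.0]], dtype=np.float32)
features = convolve2d_valid(image, edge_kernel)
```

A real convolutional layer repeats this for many learned kernels, producing one feature map per kernel.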

Extending Bright Wire: Custom Activation Function
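Bright Wire's own extension interface is in C# and is not reproduced here; as a language-neutral sketch of what a custom activation function has to supply, the Python class below (a hypothetical leaky-ReLU) pairs a forward pass with its derivative, which is all backpropagation needs from a new activation.

```python
import numpy as np

class LeakyRelu:
    """A custom activation: a forward pass plus its derivative with
    respect to the input, the two pieces a framework needs to slot a
    new activation into backpropagation."""

    def __init__(self, slope=0.01):
        self.slope = slope

    def forward(self, x):
        # pass positive values through, scale negative values down
        return np.where(x > 0, x, self.slope * x)

    def backward(self, x):
        # elementwise derivative of forward(), used during backprop
        return np.where(x > 0, 1.0, self.slope)

act = LeakyRelu()
x = np.array([-2.0, 0.5, 3.0])
y = act.forward(x)
grad = act.backward(x)
```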

We add a batch normalization layer before each SELU activation in the code below.
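The Bright Wire layer definitions are not shown here; the NumPy sketch below illustrates the two operations being composed, batch normalisation followed by SELU (with SELU's standard constants), under the simplifying assumption of fixed scale and shift parameters rather than learned ones.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalise each feature column over the batch, then scale and shift.
    gamma and beta are fixed here; in a real layer they are learned."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    """SELU activation with its standard constants."""
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

batch = np.array([[1.0, -1.0],
                  [3.0,  2.0],
                  [5.0,  0.0]])

# batch normalisation first, SELU activation second
normalised = batch_norm(batch)
activated = selu(normalised)
```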