LTM July 2017 Talk

Specs

  • Python 3.5
  • TensorFlow 1.2

Overview

This repository contains the code accompanying the LTM talk on text classification with RNNs.

It is intended to demonstrate how to go about implementing many of the recently developed architectures in Deep NLP. There are many more complex combinations of these tools you could use - mixing and matching should be quite straightforward.

This is not intended to be a best-practice guide, nor is it especially cutting edge or efficient. It's meant to be easy to follow and to encourage people to get stuck in!

Basic RNN cell

  • BasicRNN - A simple RNN classifier using BasicRNNCell (a minimal sketch follows this list)
  • BasicBidirectionalRNN - A bidirectional RNN classifier using BasicRNNCell
  • BasicBidirectionalRNN-MeanPooling - A bidirectional RNN with mean pooling to aggregate hidden states
  • BasicBidirectionalRNN-MaxPooling - A bidirectional RNN with max pooling to aggregate hidden states
  • BasicRNNAttention - A unidirectional RNN with an attention mechanism
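
To give a flavour of what the BasicRNN notebook does, here is a minimal sketch of a unidirectional RNN classifier built around dynamic_rnn. The placeholder names and sizes are illustrative assumptions, not lifted from the notebook:

    import tensorflow as tf

    # Hypothetical sizes - not taken from the notebooks.
    vocab_size, embed_dim, hidden_dim, num_classes = 10000, 64, 128, 2

    # A batch of padded word-id sequences, their true lengths, and class labels.
    tokens = tf.placeholder(tf.int32, [None, None], name="tokens")    # [batch, time]
    lengths = tf.placeholder(tf.int32, [None], name="lengths")        # [batch]
    labels = tf.placeholder(tf.int32, [None], name="labels")          # [batch]

    # Embed the word ids.
    embeddings = tf.get_variable("embeddings", [vocab_size, embed_dim])
    embedded = tf.nn.embedding_lookup(embeddings, tokens)             # [batch, time, embed_dim]

    # Run a single RNN layer over each sequence; dynamic_rnn handles variable lengths.
    cell = tf.contrib.rnn.BasicRNNCell(hidden_dim)
    outputs, final_state = tf.nn.dynamic_rnn(
        cell, embedded, sequence_length=lengths, dtype=tf.float32)

    # Classify from the final hidden state.
    logits = tf.layers.dense(final_state, num_classes)
    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)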

GRU cell

  • GRURNN - A GRU RNN classifier using GRUCell
  • GRUBidirectionalRNN-MeanPooling - A bidirectional GRU RNN with mean pooling to aggregate hidden states
  • GRUBidirectionalRNN-MaxPooling - A bidirectional GRU RNN with max pooling to aggregate hidden states (a pooling sketch follows this list)
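
The bidirectional notebooks concatenate the forward and backward hidden states and then pool them over time. A rough sketch of that idea, assuming embedded and lengths are built as in the BasicRNN sketch above (the masking details here are illustrative, not copied from the notebooks):

    import tensorflow as tf

    # Hypothetical size; `embedded` ([batch, time, embed_dim]) and `lengths`
    # are assumed to be built as in the BasicRNN sketch above.
    hidden_dim, num_classes = 128, 2

    # Run independent forward and backward GRUs over the sequence.
    cell_fw = tf.contrib.rnn.GRUCell(hidden_dim)
    cell_bw = tf.contrib.rnn.GRUCell(hidden_dim)
    (out_fw, out_bw), _ = tf.nn.bidirectional_dynamic_rnn(
        cell_fw, cell_bw, embedded, sequence_length=lengths, dtype=tf.float32)

    # Concatenate the two directions at every time step.
    outputs = tf.concat([out_fw, out_bw], axis=2)                 # [batch, time, 2*hidden_dim]

    # Mask padded positions so they do not contribute to the pooling.
    mask = tf.expand_dims(
        tf.sequence_mask(lengths, tf.shape(outputs)[1], dtype=tf.float32), 2)

    # Mean pooling: average the hidden states over the real time steps only.
    mean_pooled = tf.reduce_sum(outputs * mask, axis=1) / tf.cast(
        tf.expand_dims(lengths, 1), tf.float32)

    # Max pooling: element-wise maximum over time, with padding pushed to -1e9.
    max_pooled = tf.reduce_max(outputs + (mask - 1.0) * 1e9, axis=1)

    # Either pooled vector can feed the classifier.
    logits = tf.layers.dense(mean_pooled, num_classes)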

LSTM cell

  • LSTMRNN - An RNN classifier using LSTMCell (a note on the LSTM state follows below)
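
Swapping in an LSTM is mostly a one-line change, with one wrinkle worth noting. A rough sketch, again reusing the assumed embedded and lengths tensors from the earlier sketch:

    import tensorflow as tf

    # `embedded` and `lengths` as in the BasicRNN sketch above; sizes are illustrative.
    cell = tf.contrib.rnn.LSTMCell(128)
    outputs, final_state = tf.nn.dynamic_rnn(
        cell, embedded, sequence_length=lengths, dtype=tf.float32)

    # Unlike BasicRNNCell and GRUCell, LSTMCell returns its final state as an
    # LSTMStateTuple (cell state c, hidden state h) - classify from the h part.
    logits = tf.layers.dense(final_state.h, 2)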

Suggested Syllabus

Lesson 1

  • Lecture 8 of CS224n Slides
  • First part of the WildML blog series on RNNs Blog
  • WildML blog on RNNs in TensorFlow Blog - some features/locations may have changed between TensorFlow versions.
  • Read the documentation for dynamic_rnn and the cell types on the TensorFlow website: dynamic_rnn, BasicRNNCell

Exercises

  • BasicRNN

Lesson 2

  • Lecture 9 CS224n Slides
  • Colah's blog post on LSTMs Blog
  • Read the documentation for the LSTM and GRU cells on the TensorFlow website: LSTM, GRU

Exercises

  • GRURNN
  • LSTMRNN

Lesson 3

Exercises

  • BasicBidirectionalRNN
  • Modify this code to run bidirectional LSTM and GRU networks.

Lesson 4

  • Oxford Deep NLP Conditional Language Modelling with attention Slides

Exercises

  • BasicBidirectionalRNN-MeanPooling
  • BasicBidirectionalRNN-MaxPooling
  • GRUBidirectionalRNN-MeanPooling
  • GRUBidirectionalRNN-MaxPooling

Lesson 5

  • Oxford Deep NLP Conditional Language Modelling with attention Slides
  • Wild ML Post on attention Blog
  • Hierarchical Attention Networks - Yang et al., 2016 Paper

Exercises

  • BasicRNNAttention
  • Modify this code to run attention over an LSTM network (a generic attention sketch follows this list).
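
For the attention exercise, one common recipe is to score each hidden state, softmax the scores over time, and classify from the weighted sum. A rough sketch of that recipe (the variable names and the additive scoring form are assumptions, not the notebook's exact implementation):

    import tensorflow as tf

    # `outputs` is the [batch, time, hidden_dim] tensor of RNN hidden states and
    # `lengths` the true sequence lengths, as in the earlier sketches.
    hidden_dim, attn_dim, num_classes = 128, 64, 2

    # Score every hidden state with a small tanh network (additive attention).
    W = tf.get_variable("attn_W", [hidden_dim, attn_dim])
    v = tf.get_variable("attn_v", [attn_dim, 1])
    flat = tf.reshape(outputs, [-1, hidden_dim])                  # [batch*time, hidden_dim]
    scores = tf.matmul(tf.tanh(tf.matmul(flat, W)), v)            # [batch*time, 1]
    scores = tf.reshape(scores, [-1, tf.shape(outputs)[1], 1])    # [batch, time, 1]

    # Mask padded positions, normalise over time, and take the weighted sum.
    mask = tf.expand_dims(
        tf.sequence_mask(lengths, tf.shape(outputs)[1], dtype=tf.float32), 2)
    weights = tf.nn.softmax(scores + (mask - 1.0) * 1e9, dim=1)   # [batch, time, 1]
    context = tf.reduce_sum(weights * outputs, axis=1)            # [batch, hidden_dim]

    # Classify from the attention-weighted summary of the sequence.
    logits = tf.layers.dense(context, num_classes)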

Lesson 6+

Apply these techniques to other datasets. Some examples:

Exercises

  • Yelp dataset Site
  • Hierarchical text classification Kaggle
  • Reuters dataset Site

This is a work in progress - please raise an issue if you find any errors.
