Table of Contents

Week 1: Text Classification with Naive Bayes

Week 2: Semi-Supervised Learning with Naive Bayes and Expectation Maximization

Week 3: Text Classification with Maximum Entropy

Week 4: Feature Selection

Week 5: Feature Selection in the Learning Loop

Week 6: Feature Selection as Word Clustering

Week 7: Text Classification with Support Vector Machines


Moving on to text clustering …

Weeks 8 & 9: Clustering with Naive Bayes

Week 10: Bayesian Smoothing

Week 11: Going Beyond Naive Bayes


Extra reading:

Clustering Email

Shorter version: PDF

PDF by Arun C. Surendran, John C. Platt, and Erin Renshaw, Conference on Email and Anti-Spam, Stanford University, 21-22 July 2005.