Machine Learning: Unsupervised Learning by Georgia Tech
Free!
Skills Covered: Randomized Optimization, Clustering, Feature Selection, Feature Transformation, Information Theory
ABOUT THIS COURSE
Ever wonder how Netflix can predict what movies you'll like? Or how eBay or Amazon knows what you want to buy before you do? The answer can be found in Unsupervised Learning!
Unsupervised Learning is closely related to pattern recognition and is an effective tool for identifying structure in data. This course teaches Unsupervised Learning approaches, including randomized optimization, clustering, and feature selection and transformation, to find structure in unlabeled data.
The course is taught as an engaging dialogue between two eminent Machine Learning professors and friends: Professor Charles Isbell (Georgia Tech) and Professor Michael Littman (Brown University).
WHAT YOU WILL LEARN
Randomized optimization
- Optimization
- Randomized
- Hill climbing
- Random restart hill climbing
- Simulated annealing
- Annealing algorithm
- Properties of simulated annealing
- Genetic algorithms
- GA skeleton
- Crossover example
- MIMIC
- MIMIC: A probability model
- MIMIC: Pseudo code
- MIMIC: Estimating distributions
- Finding dependency trees
- Probability distribution
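The items above are the course's own lecture segments; as a rough, course-independent illustration of simulated annealing, here is a minimal Python sketch. The toy objective, the neighbor rule, and the geometric cooling schedule are all illustrative assumptions, not material from the course.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.1, t0=1.0, cooling=0.995, iters=5000):
    """Maximize `objective` by randomly perturbing x, occasionally accepting
    downhill moves; the acceptance chance shrinks as the temperature cools."""
    x, fx, t = x0, objective(x0), t0
    for _ in range(iters):
        x_new = x + random.uniform(-step, step)   # random neighbor of x
        f_new = objective(x_new)
        # Always accept improvements; accept worse points with
        # probability exp((f_new - fx) / t) (the Metropolis criterion).
        if f_new >= fx or random.random() < math.exp((f_new - fx) / t):
            x, fx = x_new, f_new
        t *= cooling                              # geometric cooling schedule
    return x, fx

# Toy multimodal objective: plain hill climbing can get stuck in local optima.
f = lambda x: math.sin(5 * x) * (1 - x * x)
best_x, best_f = simulated_annealing(f, x0=random.uniform(-1, 1))
print(f"best x = {best_x:.3f}, f(x) = {best_f:.3f}")
```

At high temperature the search behaves like a random walk; as the temperature falls it behaves more like hill climbing, which is why annealing can escape local optima that random-restart hill climbing must retry from scratch.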
Clustering
- Clustering and expectation maximization
- Basic clustering problem
- Single linkage clustering (SLC)
- Running time of SLC
- Issues with SLC
- K-means clustering
- K-means in Euclidean space
- K-means as optimization
- Soft clustering
- Maximum likelihood Gaussian
- Expectation Maximization (EM)
- Impossibility theorem
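As a rough companion to the k-means lectures above, here is a minimal NumPy sketch of Lloyd's algorithm, alternating the assignment and update steps. The synthetic two-blob data and the choice of k = 2 are illustrative assumptions.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Lloyd's algorithm: alternate assigning points to the nearest
    center and moving each center to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]  # random initial centers
    for _ in range(iters):
        # Assignment step: nearest center by squared Euclidean distance.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Update step: each center becomes the mean of its cluster.
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break                                      # converged
        centers = new_centers
    return labels, centers

# Two well-separated Gaussian blobs as toy data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
labels, centers = kmeans(X, k=2)
print(centers)
```

K-means is the "hard" limit of the soft-clustering view covered later in this unit: Expectation Maximization with Gaussian components replaces the all-or-nothing assignment step with probabilistic cluster memberships.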
Feature Selection
- Algorithms
- Filtering and Wrapping
- Speed
- Searching
- Relevance
- Relevance vs. Usefulness
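To make the filtering-versus-wrapping distinction concrete, here is a minimal filter-style selector sketch in NumPy that scores each feature by its absolute correlation with the label. The scoring rule, synthetic data, and top-k cutoff are illustrative assumptions, not the course's own method.

```python
import numpy as np

def filter_select(X, y, k):
    """Filter approach: score each feature independently of any learner
    (here, |Pearson correlation with the label|) and keep the top k.
    A wrapper would instead search over feature subsets, retraining a
    learner and scoring each subset on held-out performance."""
    Xc = X - X.mean(axis=0)                      # center features
    yc = y - y.mean()                            # center labels
    scores = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    return np.argsort(scores)[::-1][:k]          # indices of top-k features

# Toy data: feature 0 drives the label, features 1-3 are pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] > 0).astype(float)
print(filter_select(X, y, k=2))                  # feature 0 should rank first
```

This also hints at the relevance-versus-usefulness point: a filter scores features in isolation and is fast, while a wrapper measures usefulness to a specific learner at a much higher search cost.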
Feature Transformation
- Feature Transformation
- Words like Tesla
- Principal Components Analysis
- Independent Components Analysis
- Cocktail Party Problem
- Matrix
- Alternatives
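As a rough sketch of Principal Components Analysis from the list above, the following NumPy code finds the directions of maximum variance as eigenvectors of the sample covariance matrix. The toy data and the single-component choice are illustrative assumptions.

```python
import numpy as np

def pca(X, n_components):
    """Project X onto the top eigenvectors of its covariance matrix,
    i.e., the orthogonal directions of maximum variance."""
    Xc = X - X.mean(axis=0)                     # center the data
    cov = Xc.T @ Xc / (len(X) - 1)              # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    top = eigvecs[:, ::-1][:, :n_components]    # largest-variance directions first
    return Xc @ top, top

# Toy data: 2-D points that mostly vary along the line y = x.
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
X = np.hstack([t, t]) + rng.normal(scale=0.1, size=(200, 2))
Z, components = pca(X, n_components=1)
print(components.ravel())   # roughly [0.707, 0.707], the y = x direction
```

Where PCA finds uncorrelated directions of maximum variance, Independent Components Analysis (as in the Cocktail Party Problem) instead seeks statistically independent sources, which is a strictly stronger requirement.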
Information Theory
- History: Sending a Message
- The expected size of the message
- Information between two variables
- Mutual information
- Two Independent Coins
- Two Dependent Coins
- Kullback Leibler Divergence
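To ground the coin examples above, here is a small NumPy sketch computing entropy, mutual information, and Kullback-Leibler divergence from joint probability tables; the specific joint distributions are illustrative assumptions.

```python
import numpy as np

def entropy(p):
    """H(p) = -sum p log2 p, skipping zero-probability outcomes."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability table."""
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

def kl_divergence(p, q):
    """D(p || q) = sum p log2(p / q); assumes q > 0 wherever p > 0."""
    mask = p > 0
    return (p[mask] * np.log2(p[mask] / q[mask])).sum()

# Two independent fair coins: the joint factorizes, so I(X;Y) = 0 bits.
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])
# Two fully dependent coins (always show the same face): I(X;Y) = 1 bit.
dependent = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
print(mutual_information(independent))  # 0.0
print(mutual_information(dependent))    # 1.0
# Mutual information equals the KL divergence between the joint
# distribution and the product of its marginals.
px, py = dependent.sum(axis=1), dependent.sum(axis=0)
print(kl_divergence(dependent.ravel(), np.outer(px, py).ravel()))  # 1.0
```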