Linear regression models the relationship between a dependent variable and one or more explanatory variables, fitted via the normal equations or the gradient descent algorithm.
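As a minimal sketch of the normal-equation fit (for a single feature the closed form reduces to slope = cov(x, y)/var(x); the data values are hypothetical):

```python
# Ordinary least squares for y = w0 + w1*x via the normal equations.
# With one feature, the closed-form solution reduces to:
#   w1 = cov(x, y) / var(x),   w0 = mean(y) - w1 * mean(x)

def fit_linear(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)          # variance numerator
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance numerator
    w1 = sxy / sxx
    w0 = my - w1 * mx
    return w0, w1

# Toy data generated from y = 1 + 2x exactly:
w0, w1 = fit_linear([0, 1, 2, 3], [1, 3, 5, 7])
```

On noise-free data the recovered intercept and slope match the generating line exactly.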
Logistic classification measures the relationship between a categorical dependent variable (a class label) and one or more independent variables (features).
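A minimal sketch of fitting such a classifier by gradient descent on the log-loss; the one-dimensional toy data and learning rate are hypothetical:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.5, epochs=500):
    """Stochastic gradient descent on the log-loss for one feature plus bias."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)   # predicted probability of class 1
            w -= lr * (p - y) * x    # gradient of log-loss w.r.t. w
            b -= lr * (p - y)       # gradient of log-loss w.r.t. b
    return w, b

# Toy data: class 0 at small x, class 1 at large x.
xs, ys = [0.0, 1.0, 3.0, 4.0], [0, 0, 1, 1]
w, b = train_logistic(xs, ys)
```

After training, the predicted probability crosses 0.5 between the two classes.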
Support Vector Machines are supervised learning models with associated learning algorithms that recognize patterns. A kernel, such as the Gaussian (RBF) kernel, enables non-linear classification.
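A full SVM trainer is beyond a few lines, but the Gaussian kernel at the heart of the non-linear classifier is simple; a sketch (bandwidth `sigma` is a hypothetical choice):

```python
import math

def rbf_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel: k(x, z) = exp(-||x - z||^2 / (2 sigma^2)).
    A kernel SVM's decision function is f(x) = sum_i alpha_i y_i k(x_i, x) + b,
    so the classifier only ever touches the data through k."""
    sq = sum((a - c) ** 2 for a, c in zip(x, z))
    return math.exp(-sq / (2.0 * sigma ** 2))
```

The kernel equals 1 when the points coincide and decays with distance, which is what makes nearby support vectors dominate the decision.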
Reinforcement learning - solves the autonomous inverted pendulum by learning an MDP model and searching for the optimal policy via Bellman backups. Contact the author for the demo.
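The Bellman search can be sketched with value iteration on a toy MDP (a hypothetical three-state chain, not the pendulum itself): repeatedly apply V(s) ← max_a [R(s,a) + γ Σ P(s'|s,a) V(s')] and read the greedy policy off the converged values.

```python
# Value iteration on a tiny deterministic 3-state chain MDP:
# states 0,1,2; actions "left"/"right"; arriving in state 2 pays reward 1.
GAMMA = 0.9
STATES = [0, 1, 2]
ACTIONS = ["left", "right"]

def step(s, a):
    """Deterministic transition and reward for the toy chain."""
    s2 = max(0, s - 1) if a == "left" else min(2, s + 1)
    return s2, (1.0 if s2 == 2 else 0.0)

def value_iteration(iters=50):
    V = {s: 0.0 for s in STATES}
    for _ in range(iters):  # Bellman backup on every state
        V = {s: max(step(s, a)[1] + GAMMA * V[step(s, a)[0]] for a in ACTIONS)
             for s in STATES}
    return V

V = value_iteration()
policy = {s: max(ACTIONS, key=lambda a: step(s, a)[1] + GAMMA * V[step(s, a)[0]])
          for s in STATES}
```

The greedy policy pushes every state toward the rewarding end of the chain.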
A Gaussian Process uses a multivariate Gaussian as a prior for Bayesian inference, computing the posterior. GP predictions have high confidence near the training data and low confidence elsewhere. Bayesian optimization updates the GP via an acquisition function that trades off exploration and exploitation.
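The confidence behavior can be seen in a minimal GP regression sketch with two hypothetical training points and a squared-exponential covariance (the 2x2 kernel matrix is inverted in closed form to keep it dependency-free):

```python
import math

def k(a, b, ell=1.0):
    """Squared-exponential covariance."""
    return math.exp(-0.5 * ((a - b) / ell) ** 2)

# Two hypothetical observations: f(0) = 0, f(2) = 1, near-noiseless.
X, y = [0.0, 2.0], [0.0, 1.0]
noise = 1e-6

# Build K + noise*I and invert the 2x2 matrix directly.
k11 = k(X[0], X[0]) + noise
k12 = k(X[0], X[1])
k22 = k(X[1], X[1]) + noise
det = k11 * k22 - k12 * k12
inv = [[k22 / det, -k12 / det], [-k12 / det, k11 / det]]

def gp_predict(x):
    """Posterior mean and variance at a test input x."""
    ks = [k(x, X[0]), k(x, X[1])]
    alpha = [inv[0][0] * y[0] + inv[0][1] * y[1],   # alpha = K^{-1} y
             inv[1][0] * y[0] + inv[1][1] * y[1]]
    mean = ks[0] * alpha[0] + ks[1] * alpha[1]      # k_*^T K^{-1} y
    v = [inv[0][0] * ks[0] + inv[0][1] * ks[1],
         inv[1][0] * ks[0] + inv[1][1] * ks[1]]
    var = k(x, x) - (ks[0] * v[0] + ks[1] * v[1])   # k_** - k_*^T K^{-1} k_*
    return mean, var
```

At a training input the posterior variance collapses to roughly the noise level, while far from the data it returns to the prior variance of 1, which is exactly the confidence pattern an acquisition function exploits.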
Random forest is an ensemble learning method for classification (and regression) that operates by training a multitude of decision trees on bootstrap samples (bagging) and outputting the class that is the mode of the individual trees' predictions. E.g. a forest aggregates 9 trees' outputs into one vote.
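The aggregation step is just a majority vote; a sketch with 9 hypothetical tree outputs:

```python
from collections import Counter

def forest_predict(tree_votes):
    """Return the modal class among the individual trees' predictions."""
    return Counter(tree_votes).most_common(1)[0][0]

# Hypothetical outputs of 9 trees voting on one example:
votes = ["cat", "dog", "cat", "cat", "dog", "cat", "cat", "dog", "cat"]
label = forest_predict(votes)
```

Six of the nine trees say "cat", so the forest outputs "cat".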
Panorama stitching matches SIFT keypoints across multiple images and fits an affine transformation via RANSAC to build a single panoramic image.
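The RANSAC loop can be sketched with a deliberately simplified pure-translation model in place of the full affine fit (the correspondences and the outliers are synthetic): hypothesize a transform from a random correspondence, count inliers, and keep the best hypothesis.

```python
import random

def ransac_translation(pairs, iters=200, tol=0.5, seed=0):
    """RANSAC with a translation-only motion model (a simplified stand-in
    for the affine fit). Each pair is ((x, y), (u, v)): a keypoint match."""
    rng = random.Random(seed)
    best_shift, best_inliers = None, -1
    for _ in range(iters):
        (x, y), (u, v) = rng.choice(pairs)   # minimal sample: 1 match
        dx, dy = u - x, v - y               # hypothesized shift
        inliers = sum(
            1 for (x2, y2), (u2, v2) in pairs
            if abs(u2 - (x2 + dx)) < tol and abs(v2 - (y2 + dy)) < tol)
        if inliers > best_inliers:
            best_shift, best_inliers = (dx, dy), inliers
    return best_shift, best_inliers

# Four true matches shifted by (5, -3), plus two gross outliers.
pairs = [((x, y), (x + 5, y - 3)) for x, y in [(0, 0), (1, 2), (3, 1), (4, 4)]]
pairs += [((0, 0), (40, 40)), ((2, 2), (-30, 7))]
shift, n_in = ransac_translation(pairs)
```

The outlier matches can never explain more than themselves, so the consensus shift wins.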
Spatial Pyramid Histogram matching accurately locates the Starbucks logo in an image.
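A spatial pyramid histogram concatenates per-cell feature histograms over successively finer grids (1x1, then 2x2, ...), so the descriptor keeps coarse spatial layout. A minimal sketch on a toy grayscale image (the image, grid depth, and bin count are all hypothetical):

```python
def spatial_pyramid_histogram(img, levels=2, bins=2):
    """Concatenate per-cell intensity histograms over a pyramid of grids
    for a small grayscale image given as a list of rows, values in [0, 256)."""
    h, w = len(img), len(img[0])
    feats = []
    for lvl in range(levels):
        cells = 2 ** lvl                      # grid is cells x cells
        for ci in range(cells):
            for cj in range(cells):
                hist = [0] * bins
                for i in range(ci * h // cells, (ci + 1) * h // cells):
                    for j in range(cj * w // cells, (cj + 1) * w // cells):
                        hist[img[i][j] * bins // 256] += 1
                feats.extend(hist)
    return feats

# 4x4 toy image: dark left half, bright right half.
img = [[10, 10, 200, 200]] * 4
feat = spatial_pyramid_histogram(img)
```

The level-0 histogram sees the whole image (8 dark, 8 bright pixels), while the level-1 cells separate the dark and bright halves; that spatial separation is what a plain bag-of-features histogram throws away.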
Latent semantic indexing constructs a semantic space wherein related terms and documents are closely associated with one another. It uses SVD (as in PCA) to construct pseudo-documents that identify the latent concepts. Documents are then clustered into two groups in the semantic space.
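The SVD step can be sketched with power iteration for the dominant right singular vector of a tiny hand-made term-document matrix (the vocabulary and counts are hypothetical): iterating v ← normalize(AᵀAv) converges to the top latent concept, and each document's coordinate on it gives its position in the semantic space.

```python
import math

A = [  # 4 terms x 4 documents, hypothetical counts:
    [2, 2, 0, 0],   # "latte"    (coffee docs 0, 1)
    [1, 2, 0, 0],   # "espresso"
    [0, 0, 2, 1],   # "kernel"   (ML docs 2, 3)
    [0, 0, 1, 2],   # "gradient"
]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

def top_right_singular_vector(M, iters=100):
    """Power iteration on A^T A converges to the top right singular vector."""
    At = transpose(M)
    v = [1.0] * len(M[0])
    for _ in range(iters):
        v = matvec(At, matvec(M, v))
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    return v

v = top_right_singular_vector(A)
coords = [abs(x) for x in v]  # each document's weight on the top concept
```

Here the coffee documents load heavily on the dominant concept while the ML documents sit near zero, so projecting onto the top few singular vectors already separates the two clusters.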
PageRank - uses PMTK sample pages to construct a link graph, introduces a random perturbation so the Markov chain is ergodic and converges to the stationary PageRank distribution. An inverted-index search then returns pages ordered by their rank.
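A minimal PageRank sketch by power iteration on a small hand-made link graph (not the PMTK pages themselves); the damping factor d plays the role of the random perturbation that makes the chain ergodic:

```python
def pagerank(links, d=0.85, iters=100):
    """Power iteration for PageRank. links[i] lists the pages page i links to."""
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - d) / n] * n            # teleportation / perturbation term
        for src, outs in enumerate(links):
            if outs:
                share = d * rank[src] / len(outs)
                for dst in outs:
                    new[dst] += share
            else:                             # dangling page: spread uniformly
                for dst in range(n):
                    new[dst] += d * rank[src] / n
        rank = new
    return rank

# Pages 0, 1, and 3 link to page 2; page 2 links back to page 0.
links = [[2], [2], [0], [2]]
rank = pagerank(links)
```

The ranks form a probability distribution, and the heavily cited page 2 comes out on top; a search front end would then sort its hits by these scores.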