

layout: post
title: "Machine Learning Notes, Week 02"
date: "2016-01-27 04:02:41"
categories: Computer Science
excerpt: "This semester I am taking Machine Learning. The course has a free version on Udacity, along with detailed notes. The notes in this post are a super-condensed version, for my own use. Week..."

auth: conge

This semester I am taking Machine Learning. A free version of the course is available on Udacity, along with detailed notes. The notes below are a super-condensed version, intended for my own use.

Week 02 tasks:

* Lectures: Instance Based Learning, Ensemble B&B, and Kernel Methods & SVMs.

SL4: Instance Based Learning

Instance Based Learning Before

Cost of the House example

K-NN

Quiz 1: compute my neighbors
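A minimal sketch of how the neighbors in a quiz like this can be computed. The data, the query point, and k are illustrative assumptions, not the actual quiz values; if I recall, the quiz compares Euclidean and Manhattan distances, so both are supported below.

```python
import numpy as np

def knn_predict(X_train, y_train, query, k=3, metric="euclidean"):
    """Predict for `query` by averaging the labels of its k nearest neighbors."""
    diffs = X_train - query
    if metric == "euclidean":
        dists = np.sqrt((diffs ** 2).sum(axis=1))
    else:                                    # "manhattan"
        dists = np.abs(diffs).sum(axis=1)
    nearest = np.argsort(dists)[:k]          # indices of the k closest points
    return y_train[nearest].mean()           # regression: average the neighbors

# Illustrative data and query (not the actual quiz values).
X = np.array([[1.0, 6.0], [2.0, 4.0], [3.0, 7.0], [6.0, 8.0], [7.0, 1.0]])
y = np.array([7.0, 8.0, 16.0, 44.0, 50.0])
print(knn_predict(X, y, query=np.array([4.0, 2.0]), k=3))
print(knn_predict(X, y, query=np.array([4.0, 2.0]), k=3, metric="manhattan"))
```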

Quiz 2: Domain K-NNowledge

K-NN Bias

Curse of Dimensionality
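The curse: as the number of features grows, the amount of data needed to cover the instance space grows exponentially. A quick illustrative demo of one symptom (all numbers here are arbitrary choices of mine): with a fixed sample size, points spread out as the dimension grows, so the nearest neighbor of any query drifts farther away.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500                                          # fixed sample size (arbitrary)
for d in (1, 2, 5, 10, 50):
    X = rng.random((n, d))                       # n points, uniform in [0,1]^d
    q = rng.random(d)                            # one random query point
    dists = np.sqrt(((X - q) ** 2).sum(axis=1))  # Euclidean distances to q
    print(f"d={d:2d}  nearest-neighbor distance = {dists.min():.3f}")
```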

Some Other Stuff

Wrap up


SL5: Ensemble Learning Boosting


Ensemble learning algorithm

algorithm:

  1. learn rules from subsets of the training data, and
  2. combine the rules (a sketch follows this list).
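A minimal sketch of these two steps under the simplest choices (uniformly random subsets; combine by averaging), which is essentially bagging for regression. The polynomial "rule", the data, and every parameter here are illustrative assumptions, not values from the course.

```python
import numpy as np

def bagged_fit_predict(X, y, x_query, n_rules=10, subset_frac=0.5, seed=0):
    """Step 1: learn a simple rule on each random subset; step 2: average them."""
    rng = np.random.default_rng(seed)
    m = max(1, int(len(X) * subset_frac))
    preds = []
    for _ in range(n_rules):
        idx = rng.choice(len(X), size=m, replace=True)   # a random subset
        coeffs = np.polyfit(X[idx], y[idx], deg=2)       # the "rule": a quadratic fit
        preds.append(np.polyval(coeffs, x_query))
    return np.mean(preds)                                # combine: average the rules

# Illustrative data: a noisy sine curve.
X = np.linspace(0, 10, 40)
y = np.sin(X) + 0.3 * np.random.default_rng(1).standard_normal(40)
print(bagged_fit_predict(X, y, x_query=5.0))
```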

Ensemble learning algorithm, continued

For each step of the ensemble learning algorithm, there are multiple ways of doing things.

Quiz 1: Ensemble Learning Outputs

Example 1:

What's the ensemble output? A: the average of the N data points.

Now, another example:

Ensemble learning example

Ensemble Boosting

New subsetting and combination method:

Quiz 2: Error

Quiz 3: Error when instances have different probabilities
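As I understand these two quizzes, the underlying definitions are the standard ones: with uniform weighting, the error is the fraction of mismatched instances; when instances carry different probabilities under a distribution D, the error is the probability mass of the mismatches.

```latex
% Uniform weights: error = fraction of mismatched instances
\mathrm{error}(h) = \frac{\left|\{\, i : h(x_i) \neq c(x_i) \,\}\right|}{N}

% Instances drawn from a distribution D: weight mismatches by probability
\mathrm{error}_D(h) = \Pr_{x \sim D}\left[\, h(x) \neq c(x) \,\right]
```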

Weak Learner
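The definition, as I recall it from the lecture: a weak learner is one that, no matter what distribution D is placed over the data, does better than chance by at least some ε.

```latex
% h is a weak learner if, for every distribution D over the data,
% it beats chance by at least some epsilon > 0:
\forall D: \quad \Pr_{x \sim D}\left[\, h(x) \neq c(x) \,\right] \;\le\; \frac{1}{2} - \epsilon
```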

Quiz 4: Weak Learning

Quiz 4 answer; my answer above also passed the grader.

Boosting in code
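The slide here is the boosting pseudocode. Below is a hedged, runnable translation of standard AdaBoost; the decision-stump weak learner and the toy data are my own illustrative choices, with labels in {−1, +1}.

```python
import numpy as np

def stump_fit(X, y, D):
    """Weak learner: the best single-feature threshold classifier under weights D."""
    best_err, best_params = np.inf, None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] >= thr, sign, -sign)
                err = D[pred != y].sum()          # weighted training error
                if err < best_err:
                    best_err, best_params = err, (j, thr, sign)
    return best_err, best_params

def stump_predict(X, params):
    j, thr, sign = params
    return np.where(X[:, j] >= thr, sign, -sign)

def adaboost(X, y, T=10):
    n = len(y)
    D = np.full(n, 1.0 / n)                       # D_1: uniform over examples
    models = []
    for _ in range(T):
        eps, params = stump_fit(X, y, D)          # epsilon_t: weighted error of h_t
        alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-10))
        pred = stump_predict(X, params)
        D = D * np.exp(-alpha * y * pred)         # shrink weight where h_t is right, grow where wrong
        D = D / D.sum()                           # renormalize (the Z_t step)
        models.append((alpha, params))
    return models

def predict(models, X):
    """Final hypothesis: sign of the alpha-weighted vote of the weak learners."""
    score = sum(alpha * stump_predict(X, params) for alpha, params in models)
    return np.sign(score)

# Toy 1-D data (illustrative): no single stump separates it, but the boosted vote can.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([1, 1, -1, -1, 1, 1])
models = adaboost(X, y, T=10)
print(predict(models, X))  # compare with y: [ 1  1 -1 -1  1  1]
```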

Quiz: what will happen when D agrees

Final Hypothesis
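Written out (this is the standard AdaBoost form, which should match the lecture slide): the final hypothesis is the sign of the α-weighted vote of the weak learners, where each α_t comes from that round's weighted error ε_t.

```latex
H_{\mathrm{final}}(x) = \operatorname{sgn}\!\left( \sum_{t=1}^{T} \alpha_t\, h_t(x) \right),
\qquad \alpha_t = \frac{1}{2}\ln\frac{1 - \epsilon_t}{\epsilon_t}
```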

Boosting example

Quiz 6 answer

Intuition for why boosting converges to a good answer?

I am lost here, and need to read the proof!

Recap

In practice, boosting might not have an overfitting problem (where testing error goes back up).

2016-01-24: SL4 finished.
2016-01-25: SL5 first draft finished.