

---
layout: post
title: "Machine Learning Notes, Week 03"
date: "2016-01-28 00:19:15"
categories: Computer Science
excerpt: "Week 03 tasks: Lectures: Kernel Methods & SVMs as well as Computational..."
auth: conge
---

Week 03 tasks:

SL6: Kernel Methods and SVMs

Quiz 1: select the best line

How to find the best line

Quiz 2: how to measure the distance between the two grey planes
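
For reference, the two grey planes are w·x + b = 1 and w·x + b = -1, and the distance between them (the margin) is 2/||w||. A quick numeric check in Python (the weight vector below is made up, not from the lecture):

```python
import numpy as np

# The two grey planes: w.x + b = 1 and w.x + b = -1.
# The distance between them (the margin) is 2 / ||w||.
w = np.array([3.0, 4.0])              # made-up weight vector
b = 0.0
margin = 2.0 / np.linalg.norm(w)      # 2 / 5 = 0.4

# Sanity check: start on the "+1" plane and move along -w/||w||
# by the margin; we should land exactly on the "-1" plane.
x_plus = np.array([0.0, (1.0 - b) / w[1]])        # w.x_plus + b == 1
x_minus = x_plus - margin * w / np.linalg.norm(w)
assert np.isclose(w @ x_minus + b, -1.0)
print(margin)                         # 0.4
```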

SVM

Quiz 3: Optimal Separator

Quiz 4: Linearly Married

Kernel
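
The kernel trick in one small example: for 2-D points, the quadratic kernel K(x, y) = (x·y)<sup>2</sup> equals a plain dot product in the feature space φ(x) = (x<sub>1</sub><sup>2</sup>, √2·x<sub>1</sub>x<sub>2</sub>, x<sub>2</sub><sup>2</sup>), so the mapping never has to be computed explicitly. A minimal sketch (the example points are mine, not the lecture's):

```python
import numpy as np

def phi(x):
    """Explicit quadratic feature map for a 2-D point."""
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

def K(x, y):
    """Quadratic kernel: the same inner product, without computing phi."""
    return (x @ y) ** 2

x = np.array([1.0, 2.0])    # made-up points
y = np.array([3.0, -1.0])

assert np.isclose(K(x, y), phi(x) @ phi(y))   # the kernel trick
print(K(x, y))              # 1.0, since (1*3 + 2*(-1))^2 = 1
```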

recap


Why Boosting tends not to overfit

Quiz 5: Boosting tends to overfit when ___.


SL7: Computational Learning Theory

Quiz 1: how was the region labelled?

Learning Theory

Quiz 2: resources in machine learning

Defining inductive learning

Three ways of selecting training samples

Quiz 3: Teaching Via 20 questions

Quiz 4: Teaching Via 20 questions

Teacher With Constrained Queries

Quiz 5: Reconstructing Hypothesis

Quiz 5: answer

If the teacher with constrained queries is the one asking the questions, k+2 questions are enough to figure out the hypothesis, i.e., the number of questions is linear in k (see the sketch below).
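
A sketch of that k+2 strategy for conjunctions of literals, as I understand it from the lecture (the concrete hypothesis and variable count below are made up):

```python
# Hypothesis h: a conjunction over n boolean variables, encoded as
# {index: required_value}; variables not listed are irrelevant.
n = 5
h = {0: 1, 2: 0}                        # x0 AND (NOT x2); k = 2 literals

def label(x):
    """The true concept: does example x satisfy the conjunction?"""
    return all(x[i] == v for i, v in h.items())

# Questions 1-2: two positive examples that differ on every irrelevant bit.
# Bits that agree across both must be the relevant ones.
pos1 = [h.get(i, 0) for i in range(n)]
pos2 = [h.get(i, 1) for i in range(n)]
relevant = [i for i in range(n) if pos1[i] == pos2[i]]

# Questions 3..k+2: flip one relevant bit at a time; the label turning
# negative proves that literal (with its value in pos1) is required.
recovered = {}
for i in relevant:
    neg = list(pos1)
    neg[i] = 1 - neg[i]
    if not label(neg):
        recovered[i] = pos1[i]

assert recovered == h
print(2 + len(relevant))                # k + 2 = 4 questions in total
```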

Learner With Constrained Queries

Learner with Mistake Bounds 1

Learner with Mistake Bounds 2

Some definitions

Version Spaces definition

Quiz 6: Terminology

Error of h

PAC Learning

Quiz 7: is h PAC learnable?

Quiz 8: Epsilon Exhausted

Haussler Theorem

Haussler Theorem (continued)

  1. High true error: error<sub>D</sub>(h) is higher than ε.
  2. Then the probability that such a candidate hypothesis h is consistent with the true concept c on a single random sample is less than or equal to 1-ε.
  3. Then if we draw m samples, the probability that h is consistent with c on all m samples is at most (1-ε)<sup>m</sup>.
  4. Then the probability that at least one such h is consistent with c on all m samples is <= K(1-ε)<sup>m</sup> <= |H|(1-ε)<sup>m</sup>, where K is the number of high-error hypotheses.
  5. Because -ε >= ln(1-ε), we get (1-ε)<sup>m</sup> <= e<sup>-εm</sup>, so in step 4 the probability of at least one such h consistent with c is <= |H|e<sup>-εm</sup> <= 𝛿 (𝛿 is the failure probability).
  6. Given all the above, ln|H| - εm <= ln𝛿, so m >= (1/ε)(ln|H| + ln(1/𝛿)).
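
Plugging numbers into the bound in step 6 makes it concrete. A minimal calculator, with illustrative values for |H|, ε, and 𝛿 (not from the lecture):

```python
import math

def haussler_m(H_size, epsilon, delta):
    """Smallest m satisfying m >= (1/eps) * (ln|H| + ln(1/delta))."""
    return math.ceil((math.log(H_size) + math.log(1.0 / delta)) / epsilon)

# Illustrative: conjunctions over 10 boolean variables, |H| = 3**10
# (each variable is positive, negated, or absent), eps = 0.1, delta = 0.05.
print(haussler_m(3 ** 10, epsilon=0.1, delta=0.05))   # 140 samples
```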

Quiz 9: PAC Learnable Example

Recap

2016-01-26 Finished SL6
2016-01-27 First draft completed