Backpropagation and the Multilayer Perceptron

林嶔 (Lin, Chin)

Lesson 2

Limitations of Logistic Regression (1)

LOGISTIC_PLOT <- function (x1, x2, y, fomula) {
  
  require(scales)
  require(plot3D)
  
  # Fit a logistic regression with the supplied formula
  # (the argument name 'fomula' is kept as-is to match the calls below).
  model = glm(fomula, family = 'binomial')
  
  # Build a 100 x 100 grid over the observed range of both predictors
  x1_seq = seq(min(x1), max(x1), length.out = 100)
  x2_seq = seq(min(x2), max(x2), length.out = 100)
  
  # Predicted probability at each grid point:
  # apply the inverse logit (sigmoid) to the linear predictor
  z_matrix = sapply(x2_seq, function(x) {1 / (1 + exp(-predict(model, data.frame(x1 = x1_seq, x2 = x))))})
  
  # Draw the probability surface (red toward class 0, blue toward class 1)
  image2D(z = z_matrix,
          x = x1_seq, xlab = 'x1',
          y = x2_seq, ylab = 'x2',
          shade = 0.2, rasterImage = TRUE,
          col = colorRampPalette(c("#FFA0A0", "#FFFFFF", "#A0A0FF"))(100))
  
  # Overlay the observations, coloured by class
  points(x1, x2, col = (y + 1) * 2, pch = 19, cex = 0.5)
  
}

set.seed(0)
x1 = rnorm(50, sd = 1) 
x2 = rnorm(50, sd = 1) 
lr1 = 1 + 0.5 * x1 + 2 * x2
y = lr1 > 0

LOGISTIC_PLOT(x1 = x1, x2 = x2, y = y, fomula = y ~ x1 + x2)
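As a quick check (not part of the original slides), we can refit the same model on this linearly separable data and confirm that a plain logistic regression classifies it essentially perfectly. Note that `glm` may warn about fitted probabilities of 0 or 1, which is expected under perfect separation.

```r
# Regenerate the linearly separable data from above
set.seed(0)
x1 = rnorm(50, sd = 1)
x2 = rnorm(50, sd = 1)
y = (1 + 0.5 * x1 + 2 * x2) > 0

# A plain logistic regression handles this case well
model = glm(y ~ x1 + x2, family = 'binomial')
acc = mean((predict(model, type = 'response') > 0.5) == y)
acc
```

Because the decision boundary of the generating process is itself a straight line, the linear logit is exactly the right model class here; the limitations only appear in the next examples.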

Limitations of Logistic Regression (2)

set.seed(0)
x1 = rnorm(50, sd = 1) 
x2 = rnorm(50, sd = 1) 
lr1 = 1 + 0.5 * x1 + 0.7 * x2 + 3 * x1 * x2
y = lr1 > 0

LOGISTIC_PLOT(x1 = x1, x2 = x2, y = y, fomula = y ~ x1 + x2)

LOGISTIC_PLOT(x1 = x1, x2 = x2, y = y, fomula = y ~ x1 + x2 + x1:x2)
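To see numerically what the two plots show, a quick sketch (not from the original slides) compares in-sample accuracy of the model without and with the interaction term. The model that matches the generating process should separate the classes essentially perfectly.

```r
# Regenerate the interaction data from above
set.seed(0)
x1 = rnorm(50, sd = 1)
x2 = rnorm(50, sd = 1)
y = (1 + 0.5 * x1 + 0.7 * x2 + 3 * x1 * x2) > 0

m_linear = glm(y ~ x1 + x2, family = 'binomial')
m_inter  = glm(y ~ x1 + x2 + x1:x2, family = 'binomial')

# In-sample classification accuracy of each model
acc = function(m) mean((predict(m, type = 'response') > 0.5) == y)
c(linear = acc(m_linear), interaction = acc(m_inter))
```

The purely linear model cannot bend its boundary to follow the x1·x2 structure, while adding `x1:x2` recovers the generating model.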

Limitations of Logistic Regression (3)

set.seed(0)
x1 = rnorm(50, sd = 1) 
x2 = rnorm(50, sd = 1) 
lr1 = - 0.5 + 0.5 * x1^2 + 0.3 * x2^2 + 0.4 * x1 * x2
y = lr1 > 0

LOGISTIC_PLOT(x1 = x1, x2 = x2, y = y, fomula = y ~ x1 + x2 + x1:x2)

LOGISTIC_PLOT(x1 = x1, x2 = x2, y = y, fomula = y ~ poly(x1, 2) + poly(x2, 2) + x1:x2)
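Again as a numerical check (an addition, not from the original slides): for this quadratic data, the interaction-only model is no longer enough, but supplying the squared terms as well recovers the generating model. `I()` is used so the squares are interpreted arithmetically inside the formula.

```r
# Regenerate the quadratic data from above
set.seed(0)
x1 = rnorm(50, sd = 1)
x2 = rnorm(50, sd = 1)
y = (-0.5 + 0.5 * x1^2 + 0.3 * x2^2 + 0.4 * x1 * x2) > 0

m_inter = glm(y ~ x1 + x2 + x1:x2, family = 'binomial')
m_quad  = glm(y ~ I(x1^2) + I(x2^2) + x1:x2, family = 'binomial')

acc = function(m) mean((predict(m, type = 'response') > 0.5) == y)
c(interaction = acc(m_inter), quadratic = acc(m_quad))
```

The general lesson of these three examples: logistic regression only succeeds when we hand-craft the right feature transformations. Neural networks, introduced next, learn such transformations automatically.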

Introduction to Neural Networks (1)

– Linear regression

\[\hat{y} = f(x) = b_{0} + b_{1}x_1 + b_{2}x_2\]

– Logistic regression

\[\log(\frac{p}{1-p}) = b_{0} + b_{1}x_1 + b_{2}x_2\]
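The logit on the left maps a probability to the whole real line; inverting it gives the sigmoid, which is exactly the transformation used inside `LOGISTIC_PLOT` above. A minimal illustration:

```r
# logit: probability -> real line; sigmoid: its inverse
logit   = function(p)   log(p / (1 - p))
sigmoid = function(eta) 1 / (1 + exp(-eta))

# Round trip: applying sigmoid after logit recovers the probability
sigmoid(logit(0.8))
```

This is why `1 / (1 + exp(-predict(model, ...)))` in the plotting function converts the linear predictor b0 + b1*x1 + b2*x2 back into a probability.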

Introduction to Neural Networks (2)

(Figure F2_1)

(Figure F2_2)

Introduction to Neural Networks (3)

(Figure F2_3)

– The structure of a neuron is shown above. Regardless of its type, every neuron can be divided into four zones: a receptive zone, a trigger zone, a conducting zone, and an output zone.
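The artificial analogue of this cell can be sketched in a few lines (an illustration added here, with hypothetical names): inputs are weighted and summed (receptive zone), the sum passes through an activation function (trigger zone), and the result is emitted as the output. With a sigmoid activation, a single such unit is exactly the logistic regression model above.

```r
# A single artificial neuron with sigmoid activation:
# weighted sum of inputs plus bias, passed through the inverse logit.
# 'neuron' and its arguments are illustrative names, not a standard API.
neuron = function(x, w, b) {
  eta = sum(w * x) + b        # receptive zone: aggregate weighted inputs
  1 / (1 + exp(-eta))         # trigger zone: sigmoid activation -> output
}

# With w = (0.5, 2) and b = 1 this reproduces the logistic regression
# p = sigmoid(1 + 0.5*x1 + 2*x2) from the first example
neuron(c(1, 1), w = c(0.5, 2), b = 1)
```

Stacking many such units, and feeding their outputs into further units, yields the multilayer perceptron developed in the rest of this lesson.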