Chapter 7: LDA on Iris
Red shows input; black shows output.
library(MASS)   # lda(), predict.lda() and plot.lda() live in MASS
data(iris3)
Iris <- data.frame(rbind(iris3[,,1], iris3[,,2], iris3[,,3]),
                   Sp = rep(c("s","c","v"), rep(50,3)))
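Before splitting, it can help to confirm the assembled data frame has the expected shape. This is a small sketch, not part of the original session:
str(Iris)       # 150 obs. of 5 variables: four measurements plus Sp
head(Iris, 3)   # first three rows, all from species "s"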
train <- sample(1:150, 75)   # draw 75 row indices at random for the training sample
table(Iris$Sp[train])
 c  s  v 
27 22 26 
## your answer may differ, e.g.
##  c  s  v 
## 22 23 30
## (either way the counts sum to 75)
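Because sample() is random, the split (and everything downstream) changes from run to run. For a reproducible session one could fix the seed before sampling; this is a sketch, and the seed value 123 is arbitrary:
set.seed(123)                # any fixed seed makes the split repeatable
train <- sample(1:150, 75)
table(Iris$Sp[train])        # the counts are now the same on every run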
z <- lda(Sp ~ ., Iris, prior = c(1,1,1)/3, subset = train)   # fit LDA with equal priors on the training rows
predict(z, Iris[-train, ])$class   # classify the 75 held-out rows
[1] s s s s s s s s s s s s s s s s s s s s s s s s s s s s c c c c c c
[35] c c c c v c c c c c c c c c c c c v v v v v v v v v v v v v v v v v
[69] v v v v v v v
Levels: c s v
## your answer may differ, e.g.
## [1] s s s s s s s s s s s s s s s s s s s s s s s s s s s c c c
## [31] c c c c c c c v c c c c v c c c c c c c c c c c c v v v v v
## [61] v v v v v v v v v v v v v v v
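The printed classes are hard to judge by eye; a quick cross-tabulation against the true test labels summarises the errors. This is a sketch, not part of the original session:
pred <- predict(z, Iris[-train, ])$class
table(predicted = pred, true = Iris$Sp[-train])   # confusion matrix on the 75 test rows
mean(pred == Iris$Sp[-train])                     # overall test-set accuracy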
(z1 <- update(z, . ~ . - Petal.W.))   # refit without Petal.W.; the outer parentheses print the result
Call:
lda(Sp ~ Sepal.L. + Sepal.W. + Petal.L., data = Iris,
    prior = c(1, 1, 1)/3, subset = train)

Prior probabilities of groups:
        c         s         v 
0.3333333 0.3333333 0.3333333 

Group means:
  Sepal.L. Sepal.W. Petal.L.
c 6.044444 2.751852 4.396296
s 4.954545 3.459091 1.422727
v 6.607692 2.942308 5.557692

Coefficients of linear discriminants:
               LD1       LD2
Sepal.L.  1.145946 -1.329531
Sepal.W.  0.823182  3.501668
Petal.L. -3.259627  1.048042

Proportion of trace:
   LD1    LD2 
0.9913 0.0087 
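Whether dropping Petal.W. costs any accuracy can be checked in the same way as before; a sketch using the same held-out rows:
pred1 <- predict(z1, Iris[-train, ])$class
table(predicted = pred1, true = Iris$Sp[-train])  # confusion matrix for the reduced model
mean(pred1 == Iris$Sp[-train])                    # accuracy without Petal.W.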
plot(z, dimen=1)                    # histograms of the first discriminant, one per group
plot(z, type="density", dimen=1)    # kernel density estimates instead of histograms
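The same plot methods apply to the reduced fit; a sketch of two further views:
plot(z1)                              # scatter of the cases on LD1 vs LD2 for the reduced model
plot(z1, dimen=1, type="both")        # histograms overlaid with density estimates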