CHAPTER 3
MACHINE LEARNING
5. (B) Bottom-up hierarchical clustering begins with each observation being its own
cluster.
Explanation
Agglomerative (bottom-up) hierarchical clustering begins with each observation as
its own cluster. The algorithm then finds the two closest clusters and combines
them into a new, larger cluster, repeating until all observations belong to a
single cluster (a minimal sketch of this procedure follows this item).
Hierarchical clustering is an unsupervised, iterative algorithm. Divisive
(top-down) hierarchical clustering works in the opposite direction, progressively
partitioning clusters into smaller clusters until each cluster contains only one
observation.
(Module 3.3, LOS 3.d)
Related Material
SchweserNotes - Book 1
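
To make the agglomerative procedure concrete, here is a minimal Python sketch
(an illustration, not curriculum material) that starts with each observation as
its own cluster and repeatedly merges the two closest clusters. The sample data
and the choice of single-linkage (closest-pair) distance are assumptions made
for the example.

import numpy as np

# Hypothetical sample data: five observations with two features each.
X = np.array([[1.0, 2.0],
              [1.2, 1.9],
              [5.0, 5.1],
              [5.2, 4.8],
              [9.0, 9.0]])

# Start with each observation as its own cluster.
clusters = [[i] for i in range(len(X))]

def cluster_distance(c1, c2):
    # Single-linkage: distance between the closest pair of points.
    return min(np.linalg.norm(X[i] - X[j]) for i in c1 for j in c2)

# Repeatedly merge the two closest clusters until one cluster remains.
while len(clusters) > 1:
    best = None
    for a in range(len(clusters)):
        for b in range(a + 1, len(clusters)):
            d = cluster_distance(clusters[a], clusters[b])
            if best is None or d < best[0]:
                best = (d, a, b)
    d, a, b = best
    print(f"merge {clusters[a]} and {clusters[b]} at distance {d:.2f}")
    merged = clusters[a] + clusters[b]
    clusters = [c for k, c in enumerate(clusters) if k not in (a, b)] + [merged]

Running the loop prints each merge, showing the five singleton clusters being
combined step by step into one all-inclusive cluster.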
13. (C) bias error plus variance error plus base error.
Explanation
Out-of-sample error equals bias error plus variance error plus base error. Bias
error reflects how well a model fits the training data; a high bias error results
from underfitting. Variance error describes the degree to which a model's results
change in response to new data from validation and test samples; a high variance
error results from overfitting the training data. Base error comes from
randomness in the data and cannot be reduced by the model (a numerical
illustration follows this item).
(Module 3.1, LOS 3.b)
Related Material
SchweserNotes - Book 1
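
The decomposition can be illustrated numerically. The sketch below is an
illustration only: it assumes a hypothetical true relationship y = sin(x) plus
Gaussian noise, fits a low-complexity and a high-complexity polynomial on many
simulated training samples, and estimates the squared bias, variance, and base
(irreducible noise) components of expected out-of-sample error.

import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(x)            # assumed true relationship (illustrative)

noise_sd = 0.3                  # source of base (irreducible) error
x_test = np.linspace(0, 3, 50)  # fixed evaluation points

def fit_and_predict(degree):
    # Fit a polynomial of the given degree on a fresh training sample
    # and return its predictions at the test points.
    x_tr = rng.uniform(0, 3, 30)
    y_tr = true_f(x_tr) + rng.normal(0, noise_sd, 30)
    coefs = np.polyfit(x_tr, y_tr, degree)
    return np.polyval(coefs, x_test)

for degree in (1, 8):           # low vs. high model complexity
    preds = np.array([fit_and_predict(degree) for _ in range(200)])
    bias_sq = np.mean((preds.mean(axis=0) - true_f(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    base = noise_sd ** 2        # variance of the noise term
    print(f"degree {degree}: bias^2={bias_sq:.3f} "
          f"variance={variance:.3f} base={base:.3f} "
          f"expected out-of-sample error={bias_sq + variance + base:.3f}")

The low-degree fit shows the larger bias term (underfitting), while the
high-degree fit shows the larger variance term (overfitting); the base error is
the same for both because it comes only from the noise in the data.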
20. (B) Typical data analytics tasks for supervised learning include classification and
prediction.
Explanation
Supervised learning utilizes labeled training data to guide the ML program but does
not need “human intervention.” Typical data analytics tasks for supervised learning
include classification and prediction (see the brief example after this item).
(Module 3.1, LOS 3.a)
Related Material
SchweserNotes - Book 1
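
As an illustration of these two task types, the sketch below uses scikit-learn
(a library choice assumed here, not specified by the curriculum) with simulated
labeled data: a classification model that predicts a categorical target and a
regression model that predicts a continuous target.

import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical labeled training data: two features per observation.
X = rng.normal(size=(100, 2))

# Classification task: the label (target) is a category (0 or 1).
y_class = (X[:, 0] + X[:, 1] > 0).astype(int)
clf = LogisticRegression().fit(X, y_class)
print("predicted classes:", clf.predict(X[:5]))

# Prediction (regression) task: the label is a continuous value.
y_cont = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.1, size=100)
reg = LinearRegression().fit(X, y_cont)
print("predicted values:", reg.predict(X[:5]).round(2))

In both cases the labels in the training data supervise the fit; the algorithm
needs no further human intervention once the labeled data are supplied.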