Chapter 07.03: Benchmarking Trees, Forests, and Bagging k-NN
We compare the performance of random forests with that of (bagged) CART and (bagged) \(k\)-NN.
Lecture video
Lecture slides
Quiz
---
shuffle_questions: false
---
## Which statements are true?
- [x] The OOB error is similar in spirit to a cross-validation estimate. It can also be used for quicker model selection.
- [x] In random forests for classification, a good rule of thumb is to set $\text{mtry} = \sqrt{p}$.
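The benchmark described above can be sketched in a few lines. This is a minimal illustration, not the lecture's actual experiment: it uses scikit-learn and a synthetic dataset as stand-ins, compares a random forest (with the $\sqrt{p}$ rule via `max_features="sqrt"`) against bagged CART and bagged \(k\)-NN via cross-validation, and shows the OOB score as a cheap alternative error estimate.

```python
# Hedged sketch: benchmark random forest vs. bagged CART vs. bagged k-NN.
# Dataset, sample sizes, and hyperparameters are illustrative assumptions,
# not the settings used in the lecture.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=16, random_state=1)

models = {
    # mtry = sqrt(p) rule of thumb -> max_features="sqrt"
    "random forest": RandomForestClassifier(
        n_estimators=100, max_features="sqrt", random_state=1
    ),
    "bagged CART": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=100, random_state=1
    ),
    "bagged k-NN": BaggingClassifier(
        KNeighborsClassifier(), n_estimators=100, random_state=1
    ),
}

# Compare learners by 5-fold cross-validated accuracy.
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")

# The OOB error plays a similar role to cross-validation, but comes
# almost for free from the bootstrap samples of a single fit.
rf = RandomForestClassifier(
    n_estimators=100, oob_score=True, random_state=1
).fit(X, y)
print(f"random forest OOB accuracy: {rf.oob_score_:.3f}")
```

Note that the OOB estimate requires only one model fit, which is why it is attractive for quicker model selection than full cross-validation.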