Ensemble Models (Random Forest)

Ensemble learning using multiple decision trees with bootstrap sampling and majority voting

[Interactive scatter plot: data points labeled Class 0 / Class 1, with learned decision regions Region 0 / Region 1]

Ensemble Parameters

  • Number of Trees: 1 (Single) to 50 (Ensemble)
  • Tree Depth: 1 (Simple) to 8 (Complex)
  • Sample Ratio: 50% (High Diversity) to 100% (Low Diversity)

About Random Forest

Time Complexity:

  • Training: O(n × m × log n × k)
  • Prediction: O(k × log n)

Space Complexity: O(n × k)

where n is the number of samples, m the number of features, and k the number of trees.

Robust to noise and overfitting: Yes

How It Works

Random Forest combines multiple decision trees trained on different random subsets of data (bootstrap sampling). Each tree votes on the final prediction.
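The train-then-vote loop described above can be sketched from scratch. The function names `fit_forest` and `predict_forest` are illustrative, and scikit-learn's `DecisionTreeClassifier` is assumed as the base learner; a production forest would also randomize the features considered at each split.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_forest(X, y, n_trees=50, max_depth=8, sample_ratio=1.0, seed=0):
    """Train n_trees decision trees, each on its own bootstrap sample."""
    rng = np.random.default_rng(seed)
    n = len(X)
    trees = []
    for _ in range(n_trees):
        # Bootstrap sampling: draw indices WITH replacement
        idx = rng.integers(0, n, size=int(n * sample_ratio))
        tree = DecisionTreeClassifier(max_depth=max_depth)
        tree.fit(X[idx], y[idx])
        trees.append(tree)
    return trees

def predict_forest(trees, X):
    """Majority vote: each tree predicts, the most common label wins."""
    votes = np.stack([t.predict(X) for t in trees])  # shape (n_trees, n_samples)
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), 0, votes
    )
```

Because each tree sees a different sample, their individual errors differ, and the majority vote tends to be more accurate than any single tree.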

Bagging: Each tree is trained on a random sample (with replacement), creating diverse trees.
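A quick way to see what sampling with replacement does, using NumPy (the variable names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
data = np.arange(10)

# Bootstrap sample: same size as the original, drawn WITH replacement,
# so some points appear multiple times while others are left out.
sample = rng.choice(data, size=len(data), replace=True)

# On average a bootstrap sample contains ~63% of the unique points:
# the chance a given point appears at least once is 1 - (1 - 1/n)^n → 1 - 1/e.
unique_fraction = len(np.unique(sample)) / len(data)
```

The left-out points differ from tree to tree, which is exactly what makes the trees diverse.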

Diversity: Shows how much the trees disagree with one another. Higher diversity generally improves ensemble accuracy, because uncorrelated errors tend to cancel out in the majority vote.

💡 Tip: Try adjusting the number of trees and sample ratio to see how ensemble diversity affects accuracy!