- Boosting, Additive Training
- Y = M(x) + error --- (1)
- error = G(x) + error2 --- (2), where error > error2
- error2 = H(x) + error3 --- (3), where error2 > error3
- Substituting (3) into (2), then (2) into (1): Y = M(x) + G(x) + H(x) + error3 --- (4)
- With fitted weights: Y = alpha * M(x) + beta * G(x) + gamma * H(x) + error4 --- (5), where error4 <= error3
- A greedy algorithm searches for appropriate M, G, H and their weight parameters very efficiently. Each individual learner is called a tree, and the final ensemble of trees is called a forest.
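The additive steps above can be sketched in a few lines. This is a minimal illustration, not XGBoost's actual algorithm: `fit_stump` and `boost` are hypothetical helper names, and depth-1 trees (stumps) stand in for the general trees the notes mention. Each round fits a new learner to the previous round's error, exactly as in equations (1)-(4):

```python
import numpy as np

def fit_stump(x, residual):
    """Greedily pick the threshold on x that best splits the residual
    (a depth-1 regression tree, i.e. a decision stump)."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        # squared error after predicting each side's mean
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lmean, rmean = best
    return lambda z: np.where(z <= t, lmean, rmean)

def boost(x, y, rounds=3, lr=1.0):
    """Additive training: each new stump fits the previous round's error."""
    learners, pred = [], np.zeros_like(y, dtype=float)
    for _ in range(rounds):
        residual = y - pred          # error, error2, error3, ...
        h = fit_stump(x, residual)
        learners.append(h)
        pred += lr * h(x)            # Y ~ M(x) + G(x) + H(x) + ...
    return learners, pred
```

Each round shrinks the residual, matching the chain error > error2 > error3 in the notes; the learning rate `lr` plays a role similar to the weights in equation (5).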
- XGBoost is a very powerful tool for checking feature importance.
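A quick sketch of reading feature importance from a boosted tree ensemble. To keep it self-contained, scikit-learn's `GradientBoostingRegressor` is used here as a stand-in; XGBoost's sklearn-style estimators expose the same `feature_importances_` attribute. The synthetic data is an assumption made for illustration:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
# y depends strongly on column 0, weakly on column 1, not at all on column 2
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

model = GradientBoostingRegressor(n_estimators=100, random_state=0).fit(X, y)
importances = model.feature_importances_
print(importances)  # column 0 should dominate
```

The importances reflect how often (and how profitably) each feature is chosen for splits across the trees, which is why the strongly predictive column comes out on top.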