
Gradient-Boosting anything (alert: high performance)

Posted on October 5, 2024 by T. Moudiki in R bloggers

[This article was first published on T. Moudiki's Webpage - R, and kindly contributed to R-bloggers.]

We've always been told that decision trees are the best base learners for Gradient Boosting in Machine Learning. I've always wanted to see for myself. AdaBoostClassifier works well, but it is relatively slow (by my own standards). A few days ago, I noticed that my Cython implementation of LSBoost in the Python package mlsauce was already quite generic (I had never noticed this before), and I decided to adapt it to any machine learning model with fit and predict methods. It's worth mentioning that only regression algorithms are accepted as base learners, and that classification is regression-based. The results are promising indeed; I'll let you see for yourself below. All the algorithms, including xgboost and...
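
To make the recipe concrete, here is a minimal sketch of the underlying idea: ordinary gradient boosting on residuals, with any regression model exposing fit and predict plugged in as the base learner. This is not mlsauce's Cython implementation; the class name ResidualBoostingRegressor and its parameters are illustrative assumptions, and a classifier would be built on top of such a regressor, e.g. by regressing on one-hot encoded labels.

```python
# Minimal illustrative sketch (NOT mlsauce's Cython implementation): gradient
# boosting on residuals with an arbitrary scikit-learn regressor as base learner.
# The class name ResidualBoostingRegressor and its parameters are hypothetical.
import numpy as np
from sklearn.base import clone
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split


class ResidualBoostingRegressor:
    def __init__(self, base_learner, n_estimators=100, learning_rate=0.1):
        self.base_learner = base_learner
        self.n_estimators = n_estimators
        self.learning_rate = learning_rate

    def fit(self, X, y):
        # Start from a constant prediction, then repeatedly fit the base
        # learner to the current residuals (negative gradient of squared-error loss).
        self.init_ = float(np.mean(y))
        residuals = y - self.init_
        self.estimators_ = []
        for _ in range(self.n_estimators):
            est = clone(self.base_learner)
            est.fit(X, residuals)
            residuals = residuals - self.learning_rate * est.predict(X)
            self.estimators_.append(est)
        return self

    def predict(self, X):
        # Sum the shrunken contributions of all fitted base learners.
        pred = np.full(X.shape[0], self.init_)
        for est in self.estimators_:
            pred += self.learning_rate * est.predict(X)
        return pred


# Usage: boost a plain Ridge regression on the diabetes dataset.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
model = ResidualBoostingRegressor(Ridge(), n_estimators=200, learning_rate=0.1)
model.fit(X_train, y_train)
rmse = np.sqrt(np.mean((model.predict(X_test) - y_test) ** 2))
print(f"Test RMSE: {rmse:.2f}")
```

Any regressor with fit and predict (ridge, extra trees, kernel methods, ...) can be swapped in for Ridge above, which is the point of making the boosting loop generic rather than tree-specific.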
