SVMs versus Logistic Regression

Like logistic regression (LR), support vector machines (SVMs) can be generalised to categorical output variables that take more than two values. Conversely, the kernel trick can also be employed for LR (so-called kernel LR). While LR, like linear regression, makes use of all data points, points far from the margin have much less influence because of the logit transform, so even though the underlying mathematics differs, LR often ends up giving results similar to an SVM's.
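To illustrate the kernel-LR idea, here is a minimal sketch using scikit-learn: an RBF kernel feature map (approximated with `Nystroem`) followed by an ordinary linear logistic regression, compared with an RBF-kernel SVM on the same nonlinear data. The dataset and parameter values are illustrative choices, not taken from the article.

```python
# Kernel LR vs RBF-kernel SVM on a nonlinearly separable toy dataset.
from sklearn.datasets import make_moons
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# "Kernel LR": map inputs through an (approximate) RBF kernel feature
# space, then fit a plain linear logistic regression on top.
kernel_lr = make_pipeline(
    Nystroem(kernel="rbf", gamma=1.0, random_state=0),
    LogisticRegression(max_iter=1000),
)
kernel_lr.fit(X_tr, y_tr)

# RBF-kernel SVM on the same data for comparison.
svm = SVC(kernel="rbf", gamma=1.0).fit(X_tr, y_tr)

print(f"kernel LR accuracy: {kernel_lr.score(X_te, y_te):.2f}")
print(f"RBF SVM accuracy:   {svm.score(X_te, y_te):.2f}")
```

On data like this, the two models typically reach very similar accuracy, which is the point made above: once both are kernelised, the practical difference is often small.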

As for the choice between SVMs and LR, it often makes sense to try both. SVMs sometimes give a better fit and are computationally more efficient: LR uses all data points and only afterwards discounts values far from the margin, whereas an SVM works only with the support vectors to begin with. However, an SVM is a bit of a “black box” in terms of interpretability. In LR, by contrast, the contribution of individual variables to the final fit is easier to understand, and in back-fitting of the data, the outputs can be directly interpreted as probabilities.
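The interpretability point can be sketched concretely. In the hypothetical example below (the dataset and settings are mine, not the article's), LR exposes per-variable coefficients as log-odds contributions and returns probabilities in [0, 1], while a linear SVM's `decision_function` returns signed margins that are not probabilities.

```python
# LR interpretability vs SVM margins, sketched with scikit-learn.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = load_breast_cancer(return_X_y=True)
lr = make_pipeline(StandardScaler(),
                   LogisticRegression(max_iter=1000)).fit(X, y)
svm = make_pipeline(StandardScaler(), LinearSVC(max_iter=5000)).fit(X, y)

# LR: each coefficient is the change in log-odds per (scaled) unit of
# the corresponding variable, and predict_proba gives probabilities.
coefs = lr.named_steps["logisticregression"].coef_[0]
probs = lr.predict_proba(X[:3])[:, 1]

# SVM: decision_function gives signed distances to the separating
# hyperplane -- useful for classifying, but not probabilities.
margins = svm.named_steps["linearsvc"].decision_function(
    svm.named_steps["standardscaler"].transform(X[:3]))

print("largest |coefficient|:", round(float(np.abs(coefs).max()), 2))
print("LR probabilities:", np.round(probs, 3))
print("SVM margins:", np.round(margins, 3))
```

Ranking variables by |coefficient| (on standardised inputs) is one simple way to read off which inputs drive the LR fit, something the SVM margins do not offer directly.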

Continue reading https://doi.org/10.1016/B978-0-12-803130-8.00004-X

Malome Tebatso Khomo


5y

Your last sentence seems to point to a way of using LR to estimate the meaning attached to inputs via an SVM. If you can establish the conditions under which those outputs indeed act as probability distributions, then you're done. If I remember vaguely, it would have to have a central moment, and its sum over +/- infinity would have to be finite for a given input-space domain. Interesting!

More articles by Joseph Sefara
