
Update on Polynomial Regression in Lieu of Neural Nets

source link: https://www.tuicool.com/articles/hit/RjAFRzi

There was quite a reaction to our paper, “Polynomial Regression as an Alternative to Neural Nets” (by Cheng, Khomtchouk, Matloff and Mohanty), leading to discussions/debates on Twitter, Reddit, Hacker News and so on. Accordingly, we have posted a revised version of the paper. Some of the new features:

  • Though we had originally made the disclaimer that we had not yet done any experiments with image classification, there were comments along the lines of “If the authors had included even one example of image classification, even the MNIST data, I would have been more receptive.” So our revision does exactly that, with the result that polynomial regression does well on MNIST even with only very primitive preprocessing (plain PCA). A rough sketch of this kind of setup appears after this list.
  • We’ve elaborated on some of the theory (still quite informal, but could be made rigorous).
  • We’ve expanded the discussion of other aspects as well, e.g. overfitting.
  • We’ve added a section titled, “What This Paper Is NOT.” Hopefully those who wish to comment without reading the paper (!) this time will at least read this section.
  • We’ve updated and expanded the results of our data experiments, including more details on how they were conducted.
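
For readers who want to try the general recipe described in the first bullet, here is a minimal sketch in base R plus the recommended nnet package (not the authors' polyreg code): plain PCA on the pixel columns, a low-degree polynomial expansion of the leading components, and a multinomial fit. The file name, column layout, number of retained components, polynomial degree, and iteration settings are assumptions chosen purely for illustration.

```r
## Minimal sketch: PCA + degree-2 polynomial terms + multinomial (softmax) fit.
## File name and column layout are assumptions for illustration only.
library(nnet)

mnist <- read.csv("mnist_train.csv")   # assumed: label in column 1, 784 pixel columns after it
y <- factor(mnist[, 1])
x <- as.matrix(mnist[, -1]) / 255      # scale pixel intensities to [0, 1]

pca <- prcomp(x, center = TRUE)        # "plain PCA" preprocessing
npc <- 15                              # number of components retained (illustrative)
pcs <- pca$x[, 1:npc]

## Degree-2 polynomial expansion of the retained components, including interaction terms
feats <- poly(pcs, degree = 2, raw = TRUE)
dat <- data.frame(feats, y = y)

## Multinomial logit on the polynomial features (MaxNWts raised to allow the extra weights)
fit <- multinom(y ~ ., data = dat, MaxNWts = 5000, maxit = 200)
pred <- predict(fit, dat)
mean(pred == y)                        # training-set accuracy
```

On the full 60,000-image training set the PCA and multinom steps can take a while; subsampling the rows is an easy way to exercise the pipeline quickly.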

We are continuing to add features to our associated R package, polyreg. More news on that to come.

Thanks for the interest. Comments welcome!

