

TabNet: a new neural-network architecture for tabular data
source link: http://www.donghao.org/2020/10/08/tabnet-a-new-neural-network-architecture-for-tabular-data/

Neural networks seem to be used mostly for Computer Vision and Natural Language Processing scenarios, while tree models like GBDT are mainly used for tabular data.
But why?
Although this article tries to give an explanation for this, it hasn't been very convincing to me. In my humble opinion, neural networks could eventually surpass, or at least be competitive with, GBDT models.
For example, the paper "TabNet: Attentive Interpretable Tabular Learning" describes a Transformer-like model that simulates a tree model. The PyTorch implementation is here. I have used it on our own data, and it finally reached 90% accuracy (the accuracy of LightGBM is 93%). In spite of the lower accuracy, this is the first neural model to reach 90% accuracy on our private data. The author has already done a great job.
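To give a sense of how little code this takes, here is a minimal sketch using the pytorch-tabnet package; the arrays, labels, and hyperparameters below are placeholders, not the exact setup used on our private data.

```python
# Hypothetical sketch of training a TabNet classifier with pytorch-tabnet
# (https://github.com/dreamquark-ai/tabnet). Replace the random placeholder
# arrays with your own tabular features and labels as NumPy arrays.
import numpy as np
from pytorch_tabnet.tab_model import TabNetClassifier

# Placeholder data: 20 numeric features, binary labels.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(800, 20)).astype(np.float32)
y_train = rng.integers(0, 2, 800)
X_valid = rng.normal(size=(200, 20)).astype(np.float32)
y_valid = rng.integers(0, 2, 200)

clf = TabNetClassifier()          # default architecture; n_d, n_a, n_steps can be tuned
clf.fit(
    X_train, y_train,
    eval_set=[(X_valid, y_valid)],
    eval_metric=["accuracy"],
    max_epochs=100,
    patience=20,                  # early stopping on the validation metric
)

preds = clf.predict(X_valid)
print("validation accuracy:", (preds == y_valid).mean())
```

In practice the defaults usually need some tuning (n_d, n_a, n_steps, learning rate, batch size) before the model gets anywhere near LightGBM's accuracy on the same data.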