Summary: How about a deep learning technique based on decision trees that outperforms CNNs and RNNs, runs on your ordinary desktop, and trains with relatively small datasets? This could be a major disruptor for AI.
Suppose I told you that there is an algorithm that regularly beats the performance of CNNs and RNNs at image and text classification.
- That requires only a fraction of the training data.
- That you can run on your desktop CPU, with no need for GPUs.
- That trains just as rapidly, and in many cases even more rapidly, and lends itself to distributed processing.
- That has far fewer hyperparameters and performs well on the default settings.
- And relies on easily understood random forests instead of completely opaque deep neural nets.
Well, there is: it was just announced by researchers Zhi-Hua Zhou and Ji Feng of the National Key Lab for Novel Software Technology, Nanjing University, Nanjing, China. And it’s called gcForest.
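To give a feel for the core idea, here is a minimal sketch of a cascade of forests in the spirit of gcForest's cascade stage, written against scikit-learn. Everything here is my own illustrative assumption, not the authors' code: the class name SimpleCascadeForest, the fixed three layers, the two forests per layer (the paper uses more and grows layers adaptively), and the 3-fold cross-validation. The paper's multi-grained scanning stage is omitted entirely.

```python
# Illustrative sketch only: a simplified cascade of forests in the spirit of
# gcForest's cascade stage. Not the authors' implementation; multi-grained
# scanning is omitted and the layer count is fixed rather than grown adaptively.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.metrics import accuracy_score


class SimpleCascadeForest:
    """Grows layers of forests; each layer's class-probability outputs are
    appended to the original features and fed to the next layer."""

    def __init__(self, n_layers=3, n_estimators=100, random_state=0):
        self.n_layers = n_layers
        self.n_estimators = n_estimators
        self.random_state = random_state
        self.layers = []

    def _new_layer(self):
        # Mix ordinary random forests with extremely randomized trees to get
        # some diversity between the forests in a layer.
        return [
            RandomForestClassifier(n_estimators=self.n_estimators,
                                   random_state=self.random_state),
            ExtraTreesClassifier(n_estimators=self.n_estimators,
                                 random_state=self.random_state),
        ]

    def fit(self, X, y):
        features = X
        for _ in range(self.n_layers):
            layer = self._new_layer()
            augmented = [features]
            for forest in layer:
                # Out-of-fold probabilities avoid leaking training labels
                # into the features passed to the next layer.
                proba = cross_val_predict(forest, features, y, cv=3,
                                          method="predict_proba")
                forest.fit(features, y)
                augmented.append(proba)
            self.layers.append(layer)
            features = np.hstack(augmented)
        return self

    def predict(self, X):
        features = X
        for layer in self.layers:
            probas = [forest.predict_proba(features) for forest in layer]
            features = np.hstack([features] + probas)
        # Average the last layer's class probabilities for the final call.
        final = np.mean(probas, axis=0)
        return np.argmax(final, axis=1)


if __name__ == "__main__":
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)
    model = SimpleCascadeForest().fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Note how the sketch leans on the default tree settings throughout, which is the point of the "far fewer hyperparameters" claim above; the real gcForest adds multi-grained scanning and decides the cascade depth automatically from validation performance.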