In this paper, a novel, automated process for constructing and initializing deep feedforward neural networks based on decision trees is presented. The proposed algorithm maps a collection of decision trees trained on the data into a collection of initialized neural networks with the structures of the networks determined by the structures of the trees. The tree-informed initialization acts as a warm-start to the neural network training process, resulting in efficiently trained, accurate networks. These models, referred to as "deep jointly informed neural networks" (DJINN), demonstrate high predictive performance for a variety of regression and classification data sets and display comparable performance to Bayesian hyperparameter optimization at a lower computational cost. By combining the user-friendly features of decision tree models with the flexibility and scalability of deep neural networks, DJINN is an attractive algorithm for training predictive models on a wide range of complex data sets.
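The core idea is that a trained decision tree supplies both an architecture (tree depth sets the number of hidden layers) and a starting point for the weights. The following is a minimal, hypothetical sketch of that structural mapping using scikit-learn and PyTorch; the function name, the width rule, and the choice of libraries are illustrative assumptions and do not reproduce the paper's published weight-initialization scheme.

```python
# Hypothetical sketch: derive an MLP architecture from a fitted decision tree.
# The width rule below is an assumption for illustration, not DJINN's formula.
import numpy as np
import torch.nn as nn
from sklearn.tree import DecisionTreeRegressor


def tree_informed_mlp(X, y, max_depth=4):
    """Fit a decision tree, then build an MLP whose depth matches the tree."""
    tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, y)
    n_hidden_layers = tree.get_depth()   # one hidden layer per tree level
    n_in, n_out = X.shape[1], 1

    # Illustrative rule: grow layer widths from the input size toward the
    # number of leaves in the tree.
    widths = np.linspace(n_in, tree.get_n_leaves(), n_hidden_layers + 1)

    layers, prev = [], n_in
    for w in widths[1:].astype(int):
        layers += [nn.Linear(prev, int(w)), nn.ReLU()]
        prev = int(w)
    layers.append(nn.Linear(prev, n_out))
    return tree, nn.Sequential(*layers)


# Usage: the returned network is then trained as usual (e.g., Adam + MSE loss),
# with the tree-derived architecture acting as the warm start.
X, y = np.random.rand(200, 5), np.random.rand(200)
tree, net = tree_informed_mlp(X, y)
```

In the paper itself, the mapping goes further than this sketch: the split structure of each tree is also used to place nonzero initial weights, so each tree in an ensemble yields one initialized network.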
DOI: http://dx.doi.org/10.1109/TNNLS.2018.2869694