Functional Gradient Descent for n-Tuple Regression
Published in 29th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, 2021
Recommended citation: Katopodis, R. F.; Lima, P. M. V.; França, F. M. G. Functional Gradient Descent for n-Tuple Regression. In: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, 2021, Bruges. Proc. of ESANN 2021. Brussels: i6doc.com, 2021. p. 505-511.
Abstract: n-tuple neural networks have in the past been applied to a wide range of learning domains. However, in the particular area of regression, existing systems have displayed two shortcomings: little flexibility in the objective function being optimized and an inability to handle nonstationarity in an online learning setting. A novel n-tuple system is proposed to address these issues. The new architecture leverages the idea of functional gradient descent, drawing inspiration from its use in kernel methods. Furthermore, its capabilities are showcased in two reinforcement learning tasks, which involve both nonstationary online learning and task-specific objective functions.
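To give a flavor of the inspiration mentioned in the abstract, here is a minimal sketch of online functional gradient descent in a kernel method (a NORMA-style update for squared loss). This is illustrative only and is not the paper's n-tuple architecture; the class name, the RBF kernel choice, and the hyperparameters `eta` and `gamma` are all assumptions for the example.

```python
import math

def rbf(x, c, gamma=1.0):
    # Gaussian (RBF) kernel between scalar inputs x and c.
    return math.exp(-gamma * (x - c) ** 2)

class OnlineKernelRegressor:
    """Illustrative functional gradient descent for squared loss.

    The model is f(x) = sum_i a_i * k(x_i, x); each online update adds
    one kernel term by stepping against the loss gradient in function space.
    """

    def __init__(self, eta=0.5, gamma=1.0):
        self.eta, self.gamma = eta, gamma   # step size, kernel width
        self.centers, self.coefs = [], []

    def predict(self, x):
        return sum(a * rbf(x, c, self.gamma)
                   for a, c in zip(self.coefs, self.centers))

    def update(self, x, y):
        # For squared loss L = (f(x) - y)^2 / 2, the functional gradient
        # at x is (f(x) - y) * k(x, .), so the descent step appends a new
        # kernel term centered at x with coefficient eta * (y - f(x)).
        residual = y - self.predict(x)
        self.centers.append(x)
        self.coefs.append(self.eta * residual)
```

Repeatedly calling `update` on a stream of samples shrinks the residual at each visited point, which is what makes this style of update natural for online (and, with coefficient decay, nonstationary) settings.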
