Approximation Spaces of Deep Neural Networks
Term
4th term
Publication year
2024
Submitted on
2024-05-30
Pages
121
Abstract
This master's thesis explores the relationship between the structure of deep neural networks and their expressivity, that is, their ability to approximate functions from certain function classes. Approximation spaces serve as the theoretical framework for expressivity, and the complexity of a neural network is measured by either its number of connections or its number of neurons. The thesis proceeds in several stages. First, neural networks and their activation functions are introduced, and some elementary properties are established. B-splines are then introduced and connected to neural networks by proving that B-splines can be realized or approximated by sufficiently complex networks. These results are combined with classical B-spline approximation theory to construct a neural network whose approximation behavior is equivalent to that of B-spline approximation, and the performance of this network is examined in an experiment with simulated target functions. Lastly, approximation spaces for neural networks are established using general approximation theory; the spaces arising from the ReLU activation function are discussed, and topics for further analysis are highlighted.
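As a concrete illustration of the B-spline connection described in the abstract (a minimal sketch, not code from the thesis): the linear cardinal B-spline, i.e. the hat function on [0, 2], is realized exactly by a ReLU network with one hidden layer of three neurons, since max(0, 1 - |x - 1|) = ReLU(x) - 2 ReLU(x - 1) + ReLU(x - 2). The function names below are illustrative.

```python
import numpy as np

def relu(x):
    # Rectified linear unit, applied elementwise.
    return np.maximum(x, 0.0)

def bspline_as_relu_network(x):
    # One hidden layer with three ReLU neurons (biases 0, -1, -2);
    # the output layer forms the combination 1*h1 - 2*h2 + 1*h3.
    return relu(x) - 2.0 * relu(x - 1.0) + relu(x - 2.0)

def hat_bspline(x):
    # Cardinal linear B-spline supported on [0, 2], peaking at x = 1.
    return np.maximum(0.0, 1.0 - np.abs(x - 1.0))

x = np.linspace(-1.0, 3.0, 401)
assert np.allclose(bspline_as_relu_network(x), hat_bspline(x))
```

Higher-order B-splines are not piecewise linear and hence cannot be realized exactly by ReLU networks, which is why the thesis distinguishes between realizing and approximating B-splines.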