# Yury Korolev - Two-layer neural networks with values in a Banach space

| Dates | 17 May 2022 |
|---|---|
| Times | 14:00 – 15:00 |
| What is it | Seminar |
| Organiser | Department of Mathematics |
| Who is it for | University staff, External researchers, Current University students |
| Speaker | Dr Yury Korolev |

Join us for this research seminar, part of the SQUIDS (Statistics, quantification of uncertainty, inverse problems and data science) seminar series.

Abstract: Approximation properties of infinitely wide neural networks have been studied by several authors in recent years. New function spaces have been introduced consisting of functions that can be approximated efficiently (i.e., with dimension-independent rates) by neural networks of finite width. These functions are typically assumed to act between Euclidean spaces, usually with a high-dimensional input space and a lower-dimensional output space. As neural networks gain popularity in inherently infinite-dimensional settings such as inverse problems and imaging, it becomes necessary to analyse their properties as nonlinear operators acting between infinite-dimensional spaces. In this talk, I will present Monte-Carlo rates for neural networks acting between Banach spaces with a partial order (vector lattices, a.k.a. Riesz spaces), where the ReLU nonlinearity is interpreted as the lattice operation of taking the positive part. I will also consider the problem of finding the optimal representation of such functions via a Radon measure on the latent space from a finite number of samples, and I will obtain convergence rates for this representing measure in a Bregman distance.
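To fix ideas, the objects mentioned in the abstract can be sketched as follows; the notation here is illustrative and not taken from the talk itself:

```latex
% Hedged sketch of the setting (symbols are assumptions, not from the talk).
% Let Y be a Banach lattice (Riesz space). The ReLU of y \in Y is interpreted
% as the lattice operation of taking the positive part:
\operatorname{ReLU}(y) \;=\; y_+ \;=\; y \vee 0 .

% An infinitely wide two-layer network with latent (parameter) space \Theta
% and a representing Radon measure \mu on \Theta:
f(x) \;=\; \int_{\Theta} \bigl( A_\theta\, x + b_\theta \bigr)_+ \,\mathrm{d}\mu(\theta),

% and a width-n discretisation obtained by Monte-Carlo sampling
% \theta_1,\dots,\theta_n \sim \mu, with the dimension-independent rate
f_n(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} \bigl( A_{\theta_i}\, x + b_{\theta_i} \bigr)_+ ,
\qquad \bigl\| f - f_n \bigr\| \;\lesssim\; n^{-1/2}.
```

In this reading, the "optimal representation via a Radon measure" refers to recovering a measure such as \(\mu\) above from finitely many samples of \(f\).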