BEGIN:VCALENDAR
PRODID:-//Columba Systems Ltd//NONSGML CPNG/SpringViewer/ICal Output/3.3-
 M3//EN
VERSION:2.0
CALSCALE:GREGORIAN
METHOD:PUBLISH
BEGIN:VEVENT
DTSTAMP:20220510T111327Z
DTSTART:20220517T130000Z
DTEND:20220517T140000Z
SUMMARY:Yury Korolev - Two-layer neural networks with values in a Banac
 h space
UID:{http://www.columbasystems.com/customers/uom/gpp/eventid/}e1kg-l28gxl
 f5-thyr4
DESCRIPTION:Join us for this research seminar\, part of the SQUIDS (Stati
 stics\, quantification of uncertainty\, inverse problems and data scienc
 e) seminar series.\n\nAbstract: Approximation properties of infinitely w
 ide neural networks have been studied by several authors in the last fe
 w years. New function spaces have been introduced that consist of functi
 ons that can be efficiently (i.e.\, with dimension-independent rates) ap
 proximated by neural networks of finite width. These functions are typic
 ally assumed to act between Euclidean spaces\, usually with a high-dimen
 sional input space and a lower-dimensional output space. As neural netwo
 rks gain popularity in inherently infinite-dimensional settings such as i
 nverse problems and imaging\, it becomes necessary to analyse the proper
 ties of neural networks as nonlinear operators acting between infinite-d
 imensional spaces. In this talk\, I will present Monte-Carlo rates for n
 eural networks acting between Banach spaces with a partial order (vecto
 r lattices\, a.k.a. Riesz spaces)\, where the ReLU nonlinearity will be i
 nterpreted as the lattice operation of taking the positive part. I will a
 lso consider the problem of finding the optimal representation of such f
 unctions via a Radon measure on the latent space from a finite number of s
 amples and obtain convergence rates for this representing measure in a Br
 egman distance.
STATUS:TENTATIVE
TRANSP:TRANSPARENT
CLASS:PUBLIC
LOCATION:Frank Adams 1\, Alan Turing Building\, Manchester
END:VEVENT
END:VCALENDAR