Deterministic error bounds for kernel-based learning techniques under bounded noise

dc.contributor.author: Maddalena, Emilio T.
dc.contributor.author: Scharnhorst, Paul
dc.contributor.author: Jones, Colin N.
dc.date.accessioned: 2024-02-20T10:11:10Z
dc.date.available: 2024-02-20T10:11:10Z
dc.date.issued: 2021-12
dc.description.abstract: We consider the problem of reconstructing a function from a finite set of noise-corrupted samples. Two kernel algorithms are analyzed, namely kernel ridge regression and ε-support vector regression. By assuming that the ground-truth function belongs to the reproducing kernel Hilbert space of the chosen kernel, and that the measurement noise affecting the dataset is bounded, we adopt an approximation-theory viewpoint to establish deterministic, finite-sample error bounds for the two models. Finally, we discuss their connection with Gaussian processes and provide two numerical examples. In establishing our inequalities, we hope to help bring the fields of non-parametric kernel learning and system identification for robust control closer to each other.
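As context for the abstract, kernel ridge regression, one of the two models analyzed, admits a simple closed form. The sketch below is illustrative only and is not drawn from the paper: the Gaussian (RBF) kernel, the lengthscale, the regularization weight, and the bounded-noise data are all assumptions made here for the example.

```python
import numpy as np

def gaussian_kernel(A, B, lengthscale=0.1):
    # Squared-exponential (RBF) kernel matrix between row-sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * lengthscale ** 2))

def krr_fit(X, y, lam=1e-2):
    # Closed-form kernel ridge regression: alpha = (K + lam * I)^{-1} y.
    K = gaussian_kernel(X, X)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new):
    # Predictor f(x) = sum_i alpha_i * k(x, x_i).
    return gaussian_kernel(X_new, X_train) @ alpha

# Samples of a ground-truth function corrupted by noise bounded by 0.05,
# matching the paper's bounded-noise setting (data here is synthetic).
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel() + rng.uniform(-0.05, 0.05, 20)

alpha = krr_fit(X, y)
pred = krr_predict(X, alpha, X)
```

The paper's contribution is not the estimator itself but deterministic bounds on |f(x) - prediction| that hold for every x, given the RKHS-membership and bounded-noise assumptions.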
dc.identifier.citation: Deterministic error bounds for kernel-based learning techniques under bounded noise, Automatica, Volume 134, 2021, 109896.
dc.identifier.doi: 10.1016/j.automatica.2021.109896
dc.identifier.issn: 0005-1098
dc.identifier.uri: https://hdl.handle.net/20.500.12839/1351
dc.language.iso: en
dc.title: Deterministic error bounds for kernel-based learning techniques under bounded noise
dc.type: Journal Article
dc.type.csemdivisions: BU-V
dc.type.csemresearchareas: Digital Energy