A representer theorem for deep kernel learning
dc.contributor.author | Bohn, Bastian | |
dc.contributor.author | Griebel, Michael | |
dc.contributor.author | Rieger, Christian | |
dc.date.accessioned | 2024-08-13T14:18:07Z | |
dc.date.available | 2024-08-13T14:18:07Z | |
dc.date.issued | 2019-05 | |
dc.identifier.uri | https://hdl.handle.net/20.500.11811/11831 | |
dc.description.abstract | In this paper we provide a finite-sample and an infinite-sample representer theorem for the concatenation of (linear combinations of) kernel functions of reproducing kernel Hilbert spaces. These results serve as a mathematical foundation for the analysis of machine learning algorithms based on compositions of functions. As a direct consequence in the finite-sample case, the corresponding infinite-dimensional minimization problems can be recast as (nonlinear) finite-dimensional minimization problems, which can be tackled with nonlinear optimization algorithms. Moreover, we show how concatenated machine learning problems can be reformulated as neural networks and how our representer theorem applies to a broad class of state-of-the-art deep learning methods. | en |
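The finite-sample consequence stated in the abstract — recasting an infinite-dimensional minimization over an RKHS as a finite-dimensional problem in the coefficients — can be illustrated by the classical single-layer case. The sketch below (not the paper's algorithm; kernel choice, data, and regularization parameter are illustrative assumptions) uses kernel ridge regression, where the representer theorem reduces the problem to a linear system; in the concatenated (deep) setting studied in the paper, the analogous finite-dimensional problem becomes nonlinear in the coefficients.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||X[i] - Y[j]||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_kernel_ridge(X, y, lam=1e-3, sigma=1.0):
    """By the classical representer theorem, the minimizer of the regularized
    empirical risk has the form f(x) = sum_i alpha_i k(x_i, x); for the
    squared loss, alpha solves the linear system (K + n*lam*I) alpha = y."""
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def predict(X_train, alpha, X_new, sigma=1.0):
    """Evaluate f(x) = sum_i alpha_i k(x_i, x) at new points."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy regression problem (illustrative data, not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 1))
y = np.sin(3.0 * X[:, 0])

alpha = fit_kernel_ridge(X, y, lam=1e-6, sigma=0.5)
train_err = np.max(np.abs(predict(X, alpha, X, sigma=0.5) - y))
print(train_err)  # near-interpolation of the training data
```

For a concatenation f2 ∘ f1, each layer would carry its own coefficient vector and Gram matrix built on the images of the data under the previous layer, and the resulting joint objective would require a nonlinear solver rather than a single linear system.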
dc.format.extent | 29 | |
dc.language.iso | eng | |
dc.relation.ispartofseries | INS Preprints ; 1714 | |
dc.rights | In Copyright | |
dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | |
dc.subject.ddc | 510 Mathematics | |
dc.subject.ddc | 518 Numerical analysis | |
dc.title | A representer theorem for deep kernel learning | |
dc.type | Preprint | |
dc.publisher.name | Institut für Numerische Simulation (INS) | |
dc.publisher.location | Bonn | |
dc.rights.accessRights | openAccess | |
dc.relation.doi | https://doi.org/10.48550/arXiv.1709.10441 | |
ulbbn.pubtype | Zweitveröffentlichung (secondary publication) | |
dc.version | updatedVersion | |
dcterms.bibliographicCitation.url | https://ins.uni-bonn.de/publication/preprints |