A reformulation of additive models
Format: Thesis - Doctor of Philosophy
Language: English
Published: Nanyang Technological University, 2022
Online Access: https://hdl.handle.net/10356/163311
Institution: Nanyang Technological University
Summary: Additive models and their fitting algorithms play a pivotal role in the history and development of applied mathematics, machine learning, statistics, and science. Yet the traditional methodology lacks the means to explicitly incorporate prior knowledge into the fit of an additive model, which is of great practical importance in regression tasks, especially in the low-training-example regime, where entirely data-driven learning is not always feasible. In contradistinction, we introduce and develop the stacked methodology, wherein we regard prior predictions from a base regression model as a knowledge transfer mechanism, i.e., an inductive bias. Moreover, we endow additive models and their fitting algorithms with the ability to leverage this inductive bias and thereby additively and adaptively improve upon the base regression model, in a way that generalizes the data analysis methods of twicing, thricing, and reroughing.
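For orientation (this example is not from the thesis), twicing in Tukey's sense fits a base model, refits a second model to the residuals of the first, and adds the two predictions; thricing and reroughing iterate this residual refit. A minimal Python sketch with hypothetical toy data and arbitrary model choices:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# Hypothetical toy regression data.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.3 * X[:, 0] + rng.normal(scale=0.1, size=200)

# Twicing: fit a base model, refit a second model to its residuals,
# and add the two predictions ("smooth the rough").
base = LinearRegression().fit(X, y)
rough = y - base.predict(X)
refit = DecisionTreeRegressor(max_depth=3).fit(X, rough)

y_hat = base.predict(X) + refit.predict(X)
```

The stacked methodology described above presumably recovers this pattern when the second stage is a stacked additive model fit on top of what the base regression model leaves behind.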
To this end, we introduce and develop the stacked additive model, whereby we supplant the image of an instance under a constant function in the traditional additive model with the image of an instance under a coordinate functional, which allows us to naturally stack the base regression and additive models into an ensemble model. Correspondingly, we show how to modify either a batch or a sequential fitting algorithm, thereby providing a template for doing so with other fitting algorithms. Furthermore, we introduce a variant of multi-target stacking, which enables us to handle either single-target or multi-target regression tasks automatically within a single framework, where the latter requires a multi-target loss function that decomposes over the single targets, as the multi-target squared error does.
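One plausible reading of this construction, stated as an assumption rather than as the thesis's definition: the intercept of a traditional additive model (the image of an instance under a constant function) is replaced by the base model's prediction, extracted from the stacked instance by a coordinate functional, so the ensemble predicts the base prediction plus a sum of univariate component functions. A sketch of a correspondingly modified batch (backfitting-style) fit, where the polynomial smoother and all names are illustrative choices of ours:

```python
import numpy as np

def smooth_1d(xj, r, degree=3):
    # Stand-in univariate smoother: low-degree polynomial fit,
    # evaluated back at the training points.
    return np.polyval(np.polyfit(xj, r, degree), xj)

def stacked_backfit(X, y, base_pred, n_iter=20):
    """Backfitting with the usual constant offset (mean of y) replaced
    by the base regression model's predictions, so the additive
    components adapt to what the base model misses."""
    n, p = X.shape
    f = np.zeros((p, n))                      # component fits f_j(x_j)
    for _ in range(n_iter):
        for j in range(p):
            partial = y - base_pred - f.sum(axis=0) + f[j]
            f[j] = smooth_1d(X[:, j], partial)
            f[j] -= f[j].mean()               # keep components centered
    return base_pred + f.sum(axis=0)          # stacked ensemble fit
```

Here `base_pred` would be the base regression model's predictions on the training inputs `X`; a sequential variant would update the components one example at a time instead.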
Numerical experiments on real-world, superconducting quantum device calibration, and Taylor series data sets demonstrate the viability of the stacked methodology, as it outperforms the state-of-the-art superconducting quantum device calibration model and the most widely applied gradient boosting libraries, which invoke the traditional methodology to solve regression tasks.
Also, on a different front, we introduce the charged string picture language, whereby we establish a link between the theory of planar para algebras, which combines insights from the mathematics of subfactor theory and planar algebras with the mathematical physics of parafermions, and the science of quantum information and computation. As a result, we provide a bidirectional dictionary between quantum circuits and charged string pictures, which enables translation between an algebraic and a topological approach to the science of quantum information and computation.