

Product Description
Normal with unknown variance (NUV) priors are a central idea of sparse Bayesian learning and allow variational representations of non-Gaussian priors. More specifically, such variational representations can be seen as parameterized Gaussians, wherein the parameters are generally unknown. The advantage is apparent: for fixed parameters, NUV priors are Gaussian, and hence computationally compatible with Gaussian models. Moreover, working with (linear-)Gaussian models is particularly attractive since the Gaussian distribution is closed under affine transformations, marginalization, and conditioning. Interestingly, the variational representation proves to be universal rather than restrictive: many common sparsity-promoting priors (among them, in particular, the Laplace prior) can be represented in this manner.

In estimation problems, parameters or variables of the underlying model are often subject to constraints (e.g., discrete-level constraints). Such constraints cannot be adequately represented by linear-Gaussian models and generally require special treatment. To handle such constraints within a linear-Gaussian setting, we extend the idea of NUV priors beyond its original use for sparsity. In particular, we study compositions of existing NUV priors, referred to as composite NUV priors, and show that many commonly used model constraints can be represented in this way.
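As a minimal numerical sketch of the idea behind the description above: the standard variational identity min_{s>0} x²/(2s) + s/2 = |x| (attained at s = |x|) shows how the L1 cost of a Laplace prior arises from a zero-mean Gaussian with unknown variance s. The snippet below checks this identity on a grid; the function and variable names are illustrative, not taken from the book.

```python
import numpy as np

def nuv_cost(x, s):
    # Negative log of a zero-mean Gaussian in x with variance s
    # (up to constants), plus a linear regularizer on the variance.
    return x**2 / (2.0 * s) + s / 2.0

x = 1.7
s_grid = np.linspace(1e-6, 10.0, 2_000_000)  # candidate variances s > 0

# Minimizing over the unknown variance recovers the L1 (Laplace) penalty:
approx = nuv_cost(x, s_grid).min()
print(approx)          # close to |x| = 1.7
print(abs(approx - abs(x)) < 1e-3)
```

For fixed s the cost is quadratic in x, i.e., Gaussian, which is exactly what makes such priors computationally compatible with linear-Gaussian models.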
About the Author
Raphael Keusch was born in Muri (AG), Switzerland, in 1989 and grew up in Buttwil (AG), Switzerland. He received his diploma as an electronics technician from Roche Diagnostics Ltd., Rotkreuz, Switzerland, in 2009. Subsequently, he enrolled in the electrical engineering and information technology program at ETH Zurich, Switzerland, from which he received his BSc and MSc degrees in 2014 and 2016, respectively. During his master's degree, he spent a semester as an exchange student at the KTH Royal Institute of Technology, Stockholm, Sweden. After graduation, he worked as a signal processing engineer for Sensirion AG, Stäfa, Switzerland. Since 2018, he has been a PhD candidate and a full research assistant at the Signal and Information Processing Laboratory (ISI) at ETH Zurich under the supervision of Prof. Hans-Andrea Loeliger. His research interests include statistical signal processing, control, machine learning, and electronics.