Project

Project data

Shallow priors and deep learning: The potential of Bayesian statistics as an agent for deep Gaussian mixture models

Initiative: "Experiment!" (completed)
Call: Exploratory Phase
Allocation: Dec 19, 2019
Period of funding: 1 year 6 months

Project information

Despite significant overlap and synergy, machine learning and statistical science have developed largely in parallel. Deep Gaussian mixture models, a recently introduced model class in machine learning, address the unsupervised tasks of density estimation and high-dimensional clustering, which underpin pattern recognition in many applied areas. To avoid over-parameterized solutions, dimension reduction via factor models can be applied at each layer of the architecture. Choosing the architecture, however, amounts to a model choice problem: in principle, every candidate model satisfying the constraints must be fitted and compared. The authors propose a much simpler approach: only one large model is trained, and superfluous components empty out during estimation. The idea of assigning prior distributions to the parameters is highly unorthodox in machine learning, yet extremely simple, and it brings together two disciplines, namely machine learning and Bayesian statistics.
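The "empty-out" principle can be illustrated with a minimal sketch (not the authors' implementation): an over-specified Bayesian Gaussian mixture in which a sparse Dirichlet prior on the mixture weights shrinks the weights of unnecessary components toward zero, so that only as many components as the data support remain active. The example below uses scikit-learn's `BayesianGaussianMixture` as a stand-in for the general idea; the data set and all parameter values are hypothetical.

```python
# Sketch: shrinkage priors emptying out superfluous mixture components.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated data: three well-separated Gaussian clusters in 2-D.
X = np.vstack([
    rng.normal(loc=center, scale=0.3, size=(200, 2))
    for center in ([0.0, 0.0], [5.0, 5.0], [0.0, 5.0])
])

# Deliberately over-specify the model with 10 components. The small
# weight_concentration_prior (a sparse symmetric Dirichlet prior on the
# mixture weights) pushes the weights of unneeded components toward zero.
bgm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_distribution",
    weight_concentration_prior=1e-3,
    max_iter=500,
    random_state=0,
).fit(X)

# Count components that retain a non-negligible share of the data.
effective = int((bgm.weights_ > 0.05).sum())
print(f"components with non-negligible weight: {effective}")
```

Only one large model is fitted; there is no need to train and compare a separate model for every admissible number of components.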

Project participants

  • Prof. Dr. Nadja Klein

    Humboldt-Universität zu Berlin
    Wirtschaftswissenschaftliche Fakultät
    Statistik
    Berlin