Architectural Constraints of Normalizing Flows

  • Tuesday, 15 October 2024, 16:00
  • Speaker: Felix Matthias Draxler
  • Address

    Mathematikon
    Seminar Room 11 (Room 5.102)

We consider Normalizing Flows, a class of models that leverage neural networks to represent probability distributions, enabling efficient sampling and density estimation. The focus of our work is to develop versatile normalizing flows, that is, flexible methods readily applicable to arbitrary problems. We therefore theoretically examine the expressivity of existing architectures: we find that volume-preserving flows are fundamentally biased and identify a fix, and we improve on universality guarantees for coupling-based flows, showing that well-conditioned affine coupling flows are universal and scale more favorably with dimension than Gaussianization flows.

We then lift architectural restrictions from normalizing flows altogether by introducing Free-Form Flows, a framework that trains arbitrary neural network architectures as normalizing flows. Among other things, this enables for the first time cheap rotation-equivariant normalizing flows, normalizing flows on arbitrary Riemannian manifolds, and injective flows based on feed-forward autoencoders. The resulting models adapt far more flexibly to novel problems, perform comparably to or better than existing normalizing flows, and are competitive with methods that rely on iterative inference, such as diffusion models and flow matching.
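For readers unfamiliar with the coupling construction named in the abstract, the following is a minimal sketch of a single affine coupling block in PyTorch. The class, the conditioner size, and the tanh bound on the log-scale are illustrative choices rather than the architecture from the talk; the bound merely hints at what "well-conditioned" can mean in practice.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One affine coupling block: transforms the second half of the
    dimensions conditioned on the first half. The Jacobian is triangular,
    so its log-determinant is just the sum of the log-scales."""

    def __init__(self, dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        # Small conditioner network predicting scale and shift (illustrative size).
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)  # bound the log-scale to keep the block well-conditioned
        z2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=-1)  # log |det J| of the coupling transform
        return torch.cat([x1, z2], dim=-1), log_det

    def inverse(self, z):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.net(z1).chunk(2, dim=-1)
        s = torch.tanh(s)
        x2 = (z2 - t) * torch.exp(-s)
        return torch.cat([z1, x2], dim=-1)
```

Stacking such blocks (with permutations in between) and applying the change-of-variables formula log p_X(x) = log p_Z(f(x)) + log|det J_f(x)| yields an exact, cheaply computable density; the first part of the talk quantifies how expressive such constrained architectures can be.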
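Free-Form Flows drop the invertibility constraint by training an unconstrained encoder-decoder pair on a maximum-likelihood objective in which the gradient of the log-determinant is estimated rather than computed exactly. The following is a minimal sketch of that idea under stated assumptions: a single Hutchinson sample, a standard-normal latent, and the decoder Jacobian standing in for the inverse encoder Jacobian. The function name and the hyperparameter beta are illustrative; consult the Free-Form Flows paper for the actual estimator and training details.

```python
import torch
import torch.nn.functional as F

def fff_loss(encoder, decoder, x, beta=1.0):
    """Sketch of a Free-Form-Flows-style objective (single Hutchinson sample).

    encoder, decoder: arbitrary feed-forward modules mapping R^d -> R^d.
    """
    v = torch.randn_like(x)  # Hutchinson probe vector

    # z = f(x) and u = J_f(x) v; create_graph=True keeps u differentiable
    # with respect to the encoder parameters.
    z, u = torch.autograd.functional.jvp(encoder, x, v, create_graph=True)

    # w = v^T J_g(z), with the decoder Jacobian treated as a constant
    # approximation of J_f(x)^{-1} (no graph is built here).
    _, w = torch.autograd.functional.vjp(decoder, z.detach(), v)

    # Surrogate whose parameter gradient matches tr(J_f^{-1} dJ_f), i.e. the
    # gradient of log|det J_f(x)|, to the extent that J_g ≈ J_f^{-1}.
    log_det_surrogate = (w * u).sum(dim=-1)

    # Negative log-likelihood under a standard-normal latent (up to a constant),
    # plus a reconstruction term that keeps the decoder close to f^{-1}.
    nll = 0.5 * z.pow(2).sum(dim=-1) - log_det_surrogate
    recon = F.mse_loss(decoder(z), x, reduction="none").sum(dim=-1)
    return (nll + beta * recon).mean()
```

Because the objective only ever evaluates the encoder and decoder as black boxes, any architecture with matching input and output dimensions can be trained this way (the injective case relaxes the dimension match), which is what enables the equivariant, manifold, and autoencoder-based variants listed in the abstract.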