Automatic Differentiation for Statistical and Topological Losses

07/28/2023, 8:00 PM — 8:30 PM UTC
32-124

Abstract:

We present a new Julia library, TDAOpt.jl, which provides a unified framework for automatic differentiation and gradient-based optimization of statistical and topological losses using persistent homology. TDAOpt.jl is designed to be efficient and easy to use, as well as highly flexible and modular. This allows users to easily incorporate topological regularization into machine learning models to optimize shapes, encode domain-specific knowledge, and improve model interpretability.

Description:

Persistent homology is a mathematical framework for studying topological features of data, such as connected components, loops, and voids. It has a wide range of applications, including data analysis, computer vision, and shape optimization. However, the use of persistent homology in optimization and machine learning has been limited by the difficulty of computing derivatives of topological quantities.
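To make the difficulty (and its resolution) concrete, here is a minimal, self-contained sketch of why one topological quantity is differentiable almost everywhere. It is not TDAOpt.jl's API; all function names here are illustrative. For 0-dimensional persistent homology of a point cloud under the Vietoris–Rips filtration, the death times of connected components are exactly the edge lengths of the Euclidean minimum spanning tree, so the total persistence is a piecewise-smooth function of the point coordinates, and its gradient can be computed by holding the pairing fixed:

```julia
using LinearAlgebra

# Deaths of 0-dim features under a Rips filtration = edge lengths of the
# Euclidean minimum spanning tree (Kruskal's algorithm on all pairwise edges).
function mst_edges(pts::Vector{Vector{Float64}})
    n = length(pts)
    edges = [(norm(pts[i] - pts[j]), i, j) for i in 1:n-1 for j in i+1:n]
    sort!(edges; by = first)
    parent = collect(1:n)
    find(x) = parent[x] == x ? x : (parent[x] = find(parent[x]))
    chosen = Tuple{Int,Int}[]
    for (_, i, j) in edges
        ri, rj = find(i), find(j)
        if ri != rj
            parent[ri] = rj
            push!(chosen, (i, j))
        end
    end
    chosen
end

# Total persistence (sum of finite death times) and its gradient with respect
# to the point coordinates. The pairing is held fixed, which is valid almost
# everywhere: it only changes on a measure-zero set of configurations.
function total_persistence_grad(pts::Vector{Vector{Float64}})
    grads = [zeros(length(p)) for p in pts]
    loss = 0.0
    for (i, j) in mst_edges(pts)
        d = pts[i] - pts[j]
        nd = norm(d)
        loss += nd
        grads[i] .+= d ./ nd   # ∂‖pᵢ − pⱼ‖ / ∂pᵢ
        grads[j] .-= d ./ nd   # ∂‖pᵢ − pⱼ‖ / ∂pⱼ
    end
    loss, grads
end
```

Gradient steps on this loss either spread points apart (maximizing persistence) or merge them (minimizing it); in practice one uses automatic differentiation rather than hand-written gradients, which is the gap libraries like TDAOpt.jl aim to fill.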

In our presentation, we will introduce the basics of persistent homology and demonstrate how to use our library to optimize statistical and topological losses in a variety of settings, including shape optimization of point clouds and generative models. We will also discuss the benefits of using Julia for this type of work and how our library fits into the broader Julia ecosystem.

We believe the talk will be of interest to a wide audience, including machine learning researchers and practitioners, as well as those working in topology and scientific computing.

Platinum sponsors

JuliaHub

Gold sponsors

ASML

Silver sponsors

Pumas AI
QuEra Computing Inc.
Relational AI
Jeffrey Sarnoff

Bronze sponsors

Jolin.io
Beacon Biosignals
MIT CSAIL
Boeing

Academic partners

NAWA

Local partners

Postmates

Fiscal Sponsor

NumFOCUS