
 

8:45 AM - 9:00 AM

Introduction

  

9:00 AM - 10:00 AM

Mark Girolami/Michael Betancourt - The Geometrical Foundations of Hamiltonian Monte Carlo

Abstract

Because of its inherent complexities, big data must be complemented by both big models and the statistical algorithms that can fit them. Hamiltonian Monte Carlo has empirically demonstrated performance that can scale to these problems, but a corresponding theoretical understanding has not been as forthcoming.

In this talk I will discuss how Hamiltonian Monte Carlo arises naturally when considering the problem from a differential geometric perspective, and how the resulting theory motivates robust implementations of the algorithm with scalable performance. In particular, I will examine the critical role that Riemannian geometry plays in the construction of the algorithm.
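
To make the construction concrete, below is a minimal sketch of a single HMC transition using the standard Euclidean (identity-mass) kinetic energy; the Riemannian variants discussed in the talk replace this flat metric with a position-dependent one. Function names and defaults are illustrative, not taken from the talk.

```python
import numpy as np

def hmc_step(log_prob, grad_log_prob, x0, step_size=0.1, n_leapfrog=20, rng=None):
    """One HMC transition: sample a momentum, integrate Hamilton's
    equations with the leapfrog integrator, then apply a Metropolis
    correction for the discretization error."""
    rng = rng or np.random.default_rng()
    x = x0.copy()
    p = rng.standard_normal(x.shape)            # auxiliary momentum ~ N(0, I)
    h0 = -log_prob(x0) + 0.5 * p @ p            # initial Hamiltonian

    p += 0.5 * step_size * grad_log_prob(x)     # leapfrog: half momentum step
    for _ in range(n_leapfrog - 1):
        x += step_size * p                      # full position step
        p += step_size * grad_log_prob(x)       # full momentum step
    x += step_size * p
    p += 0.5 * step_size * grad_log_prob(x)     # final half momentum step

    h1 = -log_prob(x) + 0.5 * p @ p
    return x if rng.random() < np.exp(h0 - h1) else x0   # accept or reject
```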

 

10:00 AM - 10:30 AM

Coffee break

 

10:30 AM - 11:30 AM

Thomas Fletcher - Probabilistic Geodesic Models for Regression and Dimensionality Reduction on Riemannian Manifolds

Abstract

Manifold representations are useful for many different types of data, including directional data, transformation matrices, tensors, and shape. Statistical analysis of these data is an important problem in a wide range of image analysis and computer vision applications. This talk will cover two closely related statistical models on Riemannian manifolds: geodesic regression and principal geodesic analysis. These are direct generalizations of linear regression and principal component analysis to the manifold setting. Previous work on modeling manifold data has treated the problem as a geometric optimization, e.g., minimizing the sum-of-squared geodesic distances from a model to the data. In this talk, I will discuss recent developments in putting these models into a coherent probabilistic formulation.
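
As a concrete instance of the geometric-optimization view described above, the sketch below writes out the sum-of-squared geodesic distances criterion for geodesic regression on the unit sphere; all names are illustrative, and the probabilistic formulation discussed in the talk would place a likelihood around this geodesic rather than minimizing the loss directly.

```python
import numpy as np

def sphere_exp(p, v):
    """Exponential map on the unit sphere: follow the geodesic from p
    along the tangent vector v for unit time."""
    nv = np.linalg.norm(v)
    return p if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * (v / nv)

def sphere_dist(a, b):
    """Great-circle (geodesic) distance between unit vectors a and b."""
    return np.arccos(np.clip(a @ b, -1.0, 1.0))

def geodesic_regression_loss(p, v, ts, ys):
    """Least-squares criterion for geodesic regression: the sum of squared
    geodesic distances from the model t -> Exp_p(t v) to the data y_i."""
    return sum(sphere_dist(sphere_exp(p, t * v), y) ** 2 for t, y in zip(ts, ys))
```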

 

11:30 AM - 12:30 PM

Richard Hartley - Kernels on Riemannian Manifolds

Abstract

I will talk about recent results from a number of people in my group on Riemannian manifolds in computer vision. In many vision problems, Riemannian manifolds arise as a natural model: the data can be represented as points on a Riemannian manifold. This talk will give an intuitive introduction to Riemannian manifolds and show how they can be applied in many situations.

Manifolds of interest include the manifold of positive-definite matrices and the Grassmann manifolds, which play a role in object recognition and classification, and the Kendall shape manifold, which represents the shapes of 2D objects.

Of particular interest is the question of when one can define positive-definite kernels on Riemannian manifolds. This would allow kernel techniques such as SVMs, kernel FDA, and dictionary learning to be applied directly on the manifold.
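
One concrete example: on the manifold of symmetric positive-definite matrices equipped with the log-Euclidean distance, the Gaussian RBF kernel is known to be positive definite for every bandwidth, which is what licenses SVMs, kernel FDA, and the like directly on the manifold. The sketch below (illustrative names, numpy only) computes this kernel.

```python
import numpy as np

def spd_log(X):
    """Matrix logarithm of a symmetric positive-definite matrix,
    computed through its eigendecomposition: V diag(log w) V^T."""
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T

def log_euclidean_dist(X, Y):
    """Log-Euclidean distance: Frobenius norm of log(X) - log(Y)."""
    return np.linalg.norm(spd_log(X) - spd_log(Y), 'fro')

def spd_gaussian_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel on SPD matrices; with the log-Euclidean
    distance this kernel is positive definite for all gamma > 0."""
    return np.exp(-gamma * log_euclidean_dist(X, Y) ** 2)
```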

 

12:30 PM - 2:30 PM

Lunch break

 

2:30 PM - 3:30 PM

Anuj Srivastava - Riemannian Geometries of Function Spaces for Use in Computer Vision and Statistical Analysis

Abstract

Functional data analysis is gaining prominence in both statistics and computer vision due to the increasing availability of functional data. While the classical approach uses time-series models to capture observed variability, the newer idea is to view these functions as elements of an appropriate Hilbert space and its subsets. One imposes Riemannian structures on these infinite-dimensional spaces to obtain metrics that are useful for generating data summaries and discovering modes of variability using functional PCA. The key property in choosing a metric is that it should be invariant to re-parameterizations, or warpings, of functions. A well-known metric with this property, used previously for analyzing PDFs and CDFs, is the non-parametric Fisher-Rao metric. I will present its extensions to several data objects that are relevant in computer vision: (1) real-valued functions, with applications to bio-signals; (2) curves in arbitrary Euclidean spaces, with applications to shape analysis; (3) surfaces in R^3, with applications to medical shape analysis and graphics; and (4) trajectories on Riemannian manifolds, with applications to activity recognition and visual speech recognition.
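
As a worked illustration of the invariance property mentioned above: under the square-root slope transform q = sign(f') sqrt(|f'|), the non-parametric Fisher-Rao metric becomes the ordinary L2 metric, and a warping gamma acts by the isometry (q, gamma) -> (q o gamma) sqrt(gamma'). The numerical sketch below, with illustrative names, computes the transform on a sampled function.

```python
import numpy as np

def srsf(f, t):
    """Square-root slope function q = sign(f') * sqrt(|f'|), under which
    the Fisher-Rao metric on functions reduces to the L2 metric."""
    df = np.gradient(f, t)
    return np.sign(df) * np.sqrt(np.abs(df))

def warp(f, t, gamma):
    """Re-parameterize a sampled function f(t) by a warping gamma(t),
    i.e. return f(gamma(t)) by linear interpolation."""
    return np.interp(gamma, t, f)

# The L2 distance between SRSFs is unchanged when both functions are
# warped by the same gamma, which is the invariance the Fisher-Rao
# metric provides.
```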

 

3:30 PM - 4:30 PM

Bart Vandereycken - Riemannian Optimization on Low-rank Matrices and Tensors

Abstract

Minimizing a smooth objective function on the variety of matrices of bounded rank leads to a class of problems that are typically well suited for Riemannian optimization techniques. I will illustrate this by explaining how the embedded geometry of this set can be used in a very simple, yet highly efficient algorithm for solving the matrix completion problem. 
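
As a sketch of how the embedded geometry enters in the matrix case: one projects the Euclidean gradient of the completion objective onto the tangent space of the rank-r manifold at the current iterate and retracts back with a truncated SVD. The plain gradient step and all names below are illustrative simplifications; the efficient algorithms referred to in the talk use, for example, nonlinear conjugate gradients and factored arithmetic that never forms the full matrix.

```python
import numpy as np

def riemannian_grad_step(U, S, Vt, mask, A_obs, step=1.0):
    """One Riemannian gradient step for low-rank matrix completion on the
    manifold of rank-r matrices, X = U diag(S) Vt with U: m x r, Vt: r x n."""
    r = S.shape[0]
    X = U @ np.diag(S) @ Vt
    G = mask * (X - A_obs)     # Euclidean gradient of 0.5*||P_Omega(X - A)||^2

    # Project G onto the tangent space at X: U U^T G + G V V^T - U U^T G V V^T
    GV = G @ Vt.T
    PG = U @ (U.T @ G) + GV @ Vt - U @ (U.T @ GV) @ Vt

    # Retraction: truncated SVD maps the stepped point back to rank r
    U2, S2, Vt2 = np.linalg.svd(X - step * PG, full_matrices=False)
    return U2[:, :r], S2[:r], Vt2[:r, :]
```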

Matrix product states (tensor trains) and Tucker tensors are two numerically attractive generalizations of low-rank matrices to tensors. For both formats, the tensors of fixed rank form a smooth Riemannian manifold as well. I will show how the generalization of matrix completion, namely low-rank tensor completion, can likewise be solved efficiently by Riemannian algorithms.

 

4:30 PM - 5:00 PM

Coffee break

 

5:00 PM - 6:00 PM

Panel discussion and conclusions

 
