Optimisation and Numerical Analysis seminar

The Optimisation and Numerical Analysis seminar usually takes place on Wednesdays at 12:00 during term time in Room R17/18, Watson building.

Spring 2020

A quotient geometry on the manifold of fixed-rank positive-semidefinite matrices

  • Estelle Massart (NPL-postdoc, University of Oxford)
  • Wednesday 11 March 2020, 12:00
  • Physics West 103

Riemannian optimization aims to design optimization algorithms for constrained problems, where the constraints require the variables to belong to a Riemannian manifold. Classical examples of Riemannian manifolds include the set of orthogonal matrices, the set of subspaces of a given dimension (the Grassmann manifold), and the set of fixed-rank matrices.

After a quick introduction to Riemannian optimization, and more specifically Riemannian gradient descent (RGD), we will present the tools needed to run RGD on the manifold of fixed-rank positive-semidefinite matrices, seen as a quotient of the set of full-rank rectangular matrices by the orthogonal group. We will also present recent results about related geometrical tools on that manifold. This manifold is particularly relevant when dealing with low-rank approximations of large positive-(semi)definite matrices.
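As a toy illustration of the RGD scheme sketched above, the following uses the unit sphere in place of the fixed-rank PSD manifold discussed in the talk (whose quotient geometry needs considerably more machinery), minimising the Rayleigh quotient with the standard project-then-retract steps; all names and parameters here are illustrative, not from the talk.

```python
import numpy as np

# Minimal sketch of Riemannian gradient descent (RGD) on the unit sphere,
# minimising the Rayleigh quotient f(x) = x^T A x; minimisers are
# eigenvectors associated with the smallest eigenvalue of A.

def rgd_sphere(A, x0, step=0.05, iters=2000):
    x = x0 / np.linalg.norm(x0)          # start on the manifold
    for _ in range(iters):
        egrad = 2 * A @ x                # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x
        x = x - step * rgrad             # take a step in the tangent space
        x = x / np.linalg.norm(x)        # retract back onto the sphere
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A + A.T                              # symmetric test matrix
x0 = rng.standard_normal(5)
f0 = (x0 @ A @ x0) / (x0 @ x0)           # initial objective value
x = rgd_sphere(A, x0)                    # x approaches a minimal eigenvector
```

The projection step is what distinguishes RGD from plain gradient descent: only the component of the gradient tangent to the manifold is followed, and the retraction (here, renormalisation) keeps the iterates feasible.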

Fenchel Duality Theory and a Primal-Dual Algorithm on Riemannian Manifolds

  • Ronny Bergmann (TU Chemnitz, Germany)
  • Wednesday 26 February 2020, 12:00
  • Physics West 103

In this talk we introduce a new duality theory that generalizes the classical Fenchel conjugation to functions defined on Riemannian manifolds. We investigate its properties, e.g., the Fenchel–Young inequality and the characterization of the convex subdifferential using the analogue of the Fenchel–Moreau theorem. These properties of the Fenchel conjugate are employed to derive a Riemannian primal-dual optimization algorithm, and to prove its convergence for the case of Hadamard manifolds under appropriate assumptions. Numerical results illustrate the performance of the algorithm, which competes with the recently derived Douglas–Rachford algorithm on manifolds of nonpositive curvature. Furthermore, we show numerically that our novel algorithm even converges on manifolds of positive curvature.
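For reference, the classical Euclidean notions that the talk generalizes to manifolds are the Fenchel conjugate and the Fenchel–Young inequality:

```latex
f^*(y) = \sup_{x} \,\bigl( \langle x, y \rangle - f(x) \bigr),
\qquad
f(x) + f^*(y) \ge \langle x, y \rangle,
```

with equality in the Fenchel–Young inequality exactly when $y \in \partial f(x)$; the Fenchel–Moreau theorem states that $f^{**} = f$ for proper, convex, lower semicontinuous $f$.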

Multi-dimensional vector assignment problems (MVA): Complexity, Approximation and Algorithms

  • Trivikram Dokka (Lancaster University)
  • Wednesday 19 February 2020, 12:00
  • Physics West 103
  • Tea and coffee will be provided after the talk in the common room

I will formally introduce multi-dimensional (binary) vector assignment problems (MVA) and discuss some motivation for studying them. I will then review the complexity and approximability results on MVA. For the major part of my talk I will discuss column generation approaches, both exact and heuristic, to solve the problem. I will also discuss some ongoing and future work on solving large-scale assignment problems.

Operator Preconditioning and Some Recent Developments for Boundary Integral Equations

  • Carolina Urzua Torres (University of Oxford)
  • Wednesday 12 February 2020, 12:00
  • Physics West 103

In this talk, I will give an introduction to operator preconditioning as a general and robust strategy to precondition linear systems arising from Galerkin discretization of PDEs or boundary integral equations. Then, in order to illustrate the applicability of this preconditioning technique, I will discuss the simple case of weakly singular and hypersingular integral equations, arising from exterior Dirichlet and Neumann BVPs for the Laplacian in 3D. Finally, I will show how we can also tackle operators with a more difficult structure, like the electric field integral equation (EFIE) on screens, which models the scattering of time-harmonic electromagnetic waves at perfectly conducting bounded infinitely thin objects, like patch antennas in 3D.

Accurate and efficient numerical methods for molecular dynamics and data science using adaptive thermostats

  • Xiaocheng Shang (University of Birmingham, School of Mathematics)
  • Wednesday 5 February 2020, 12:00
  • Physics West 103

I will discuss the design of state-of-the-art numerical methods for sampling probability measures in high dimension where the underlying model is only approximately identified with a gradient system. Extended stochastic dynamical methods, known as adaptive thermostats that automatically correct thermodynamic averages using a negative feedback loop, are discussed which have application to molecular dynamics and Bayesian sampling techniques arising in emerging machine learning applications. I will also discuss the characteristics of different algorithms, including the convergence of averages and the accuracy of numerical discretizations.

First and second order shape optimization based on restricted mesh deformations

  • Roland Herzog (TU Chemnitz, Germany)
  • Wednesday 29 January 2020, 12:00
  • Physics West 103 

We consider shape optimization problems subject to elliptic partial differential equations. In the context of the finite element method, the geometry to be optimized is represented by the computational mesh, and the optimization proceeds by repeatedly updating the mesh node positions. It is well known that such a procedure may eventually lead to a deterioration of mesh quality. As a remedy, we propose a restriction of the admissible mesh deformations, inspired by the Hadamard structure theorem. First- and second-order methods are considered in this setting. Numerical results show that mesh degeneracy can be overcome, avoiding the need for remeshing or other strategies.

Evaluation of sources of intelligence using a multi-armed bandit framework

  • Livia Stark (Lancaster University)
  • Wednesday 22 January 2020, 12:00
  • Physics West 103

Intelligence is the product resulting from the collection, processing, integration, evaluation, analysis and interpretation of available information concerning foreign nations, hostile or potentially hostile forces or elements, or areas of actual or potential operations. One issue with intelligence is that more data are collected than can be processed into intelligence, as analytical capabilities lag behind. Our aim is to determine which pieces of information, otherwise known as tips, will prove useful even before evaluation takes place. That can be achieved by focusing on the sources of the tips, learning their properties and evaluating them in a Bayesian fashion. The novelty of our approach is to model the sources as semi-Markov bandits, where the time between decision epochs is random, corresponding to the random evaluation time of a tip. New policies were developed to direct decision making under this modelling assumption.
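The bandit formulation of source evaluation can be sketched with a standard Bayesian bandit: each source is an arm whose tips are useful with an unknown probability, tracked by a Beta posterior and explored via Thompson sampling. This is a deliberately simplified stand-in; the semi-Markov timing aspect (random evaluation times between decisions) and the talk's new policies are not represented, and all names here are illustrative.

```python
import numpy as np

# Toy Bayesian bandit: pick which source's tip to evaluate next by
# Thompson sampling from Beta posteriors over each source's usefulness rate.

def thompson(true_probs, n_rounds, seed=2):
    rng = np.random.default_rng(seed)
    k = len(true_probs)
    wins = np.ones(k)                    # Beta(1, 1) priors: successes + 1
    losses = np.ones(k)                  # failures + 1
    pulls = np.zeros(k, dtype=int)
    for _ in range(n_rounds):
        theta = rng.beta(wins, losses)   # sample a plausible rate per source
        arm = int(np.argmax(theta))      # evaluate a tip from the best draw
        reward = rng.random() < true_probs[arm]
        wins[arm] += reward
        losses[arm] += 1 - reward
        pulls[arm] += 1
    return pulls

pulls = thompson([0.2, 0.5, 0.8], 5000)
# evaluation effort concentrates on the most reliable source over time
```

The posterior sampling step is what balances exploring poorly-understood sources against exploiting sources already known to be reliable, which is the core trade-off in deciding which tips to evaluate.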

Find out more

There is a complete list of talks in the Optimisation and Numerical Analysis Seminar on talks@bham.