North British Differential Equations Seminar

Talks by Professor Papanicolaou

Basic mathematical problems in array imaging

In array imaging, probing signals are sent out, and from the echoes that are received and recorded we construct images of the objects that produced them. This is the case in seismic imaging, in ultrasonic non-destructive testing, in wide-band radar, in sonar, and elsewhere. Imaging is done by some form of computational back propagation of the signals recorded at the array. Since the background medium in which the scattering objects lie is also unknown, this back propagation, or migration, process is very much affected by how we model the background. I will discuss the mathematical problems that arise in trying to assess the resolution of the images and their sensitivity to the background model. I will also discuss and compare imaging with physical time reversal, in which the background medium is "known" in all its details since the back propagation is done physically by emitting signals from the array. I will show how the focusing properties of time reversal improve when the background medium is complex or random.
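To fix ideas, a standard form of travel-time (Kirchhoff) migration makes the computational back propagation concrete; the notation here is illustrative and not taken from the talk. If P(x_r, x_s; t) denotes the trace recorded at receiver x_r when the source at x_s fires, and \tau(x, y) is the travel time from x to y computed in the assumed background model, then the migration image at a search point y is

    I^{KM}(y) = \sum_{s} \sum_{r} P\big(x_r, x_s;\, \tau(x_s, y) + \tau(y, x_r)\big).

Both the resolution of I^{KM} and its stability hinge on how well the travel times \tau of the assumed background match those of the true medium.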

Imaging in clutter

When the objects that we want to image lie in a cluttered or randomly inhomogeneous medium that we cannot possibly know in detail, what should we do with the array data? We should not back propagate them directly, because they are corrupted by the clutter and we will get very unstable images. We should instead compute space-time correlations of the data and back propagate those. This is interferometric imaging, and it is easy to see that it will do better, because correlations tend to enhance the signal in the data and diminish the noise. The problem now is finding the all-important implementation algorithms for computing the correlations. I will discuss these issues in detail and show the results of extensive numerical simulations. I will also describe some theoretical results that assess the resolution of images in clutter. Finally, I will describe some adaptive interferometric imaging algorithms that are heuristic at present but quite promising.
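One way to write such an interferometric functional, as a sketch in the spirit of the coherent interferometry literature rather than a formula from the talk, is the following. Writing \hat P(x_r, \omega) for the Fourier transform of the trace at receiver x_r, the image back propagates cross-correlations of pairs of traces over nearby receivers and nearby frequencies:

    I^{CINT}(y) = \sum_{|x_r - x_{r'}| \le X_d} \int\!\!\int_{|\omega - \omega'| \le \Omega_d} \hat P(x_r, \omega)\, \overline{\hat P(x_{r'}, \omega')}\, e^{-i[\omega \tau(x_r, y) - \omega' \tau(x_{r'}, y)]}\, d\omega\, d\omega'.

The cutoffs X_d and \Omega_d are decoherence parameters of the clutter (a decoherence length and frequency); choosing them adaptively, from the image itself, is where the heuristic, adaptive step mentioned above enters.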

Diffusion in cellular flows at high Péclet number

I will analyze convection-diffusion problems in which convection dominates, so that the Péclet number is high. I will consider only steady, incompressible two-dimensional flows that are periodic (cellular) or random. The key analytical technique for dealing with such problems is a rather unusual form of saddle-point variational principle. I will introduce these variational principles in a general way and then explain how they are used to estimate the effective diffusivity in high-Péclet-number convection-diffusion. I will also briefly connect these problems to the stability of two-dimensional cellular flows at high Reynolds number, using the eddy viscosity and variational principles for its assessment.
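As a minimal statement of the setup (the flow and the scaling below are the classical cellular-flow example, given here for orientation): for a steady two-dimensional flow u = \nabla^{\perp}\psi with stream function \psi = UL \sin(x/L)\sin(y/L) and molecular diffusivity \kappa, the effective diffusivity is obtained from the cell problem

    u \cdot \nabla \chi_j - \kappa \Delta \chi_j = -u_j, \qquad \kappa^*_{ij} = \kappa\big(\delta_{ij} + \langle \nabla \chi_i \cdot \nabla \chi_j \rangle\big),

and for such cellular flows it exhibits the boundary-layer scaling \kappa^* \sim c\,\kappa\,\mathrm{Pe}^{1/2} as the Péclet number \mathrm{Pe} = UL/\kappa \to \infty. Because the convective term makes the cell problem non-symmetric, direct minimum principles are unavailable; this is where the saddle-point variational principles enter, yielding matching upper and lower bounds on \kappa^*.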

Stochastic volatility models for financial markets and applications to portfolio optimization and derivative pricing

It has been known for a long time that the constant-volatility stochastic models underlying the Black-Scholes-Merton derivative pricing theory do not capture the complexity of observed prices of derivatives such as options on equities. Models with variable volatilities, either deterministic or random, are the natural extension to consider, but it has turned out to be very difficult to estimate these volatilities from market data. I will describe why even simple mean-reverting stochastic volatility models are hard to calibrate to markets. I will then introduce and motivate an asymptotic theory for fast mean-reverting stochastic volatility, for which calibration involves only the observable implied volatilities, in a parsimonious way. I will apply this theory to derivative pricing and portfolio optimization.
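A canonical example from the fast mean-reversion asymptotics makes the parsimony concrete (a sketch; the Ornstein-Uhlenbeck driver below is one standard choice, not necessarily the model used in the talk). The volatility is a function f of a process Y running on a fast time scale \varepsilon,

    dS_t = \mu S_t\, dt + f(Y_t) S_t\, dW_t, \qquad dY_t = \frac{1}{\varepsilon}(m - Y_t)\, dt + \frac{\nu\sqrt{2}}{\sqrt{\varepsilon}}\, d\hat Z_t,

and the asymptotic theory predicts that, to leading order, the implied volatility surface is affine in the log-moneyness-to-maturity ratio,

    I(K, T) \approx a\, \frac{\log(K/S)}{T - t} + b.

Calibration then reduces to fitting the two group parameters a and b to observed implied volatilities, rather than estimating the unobservable volatility path itself.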