Michael J. Black, Brown University, Feb. 22
Stochastic Tracking of Human Motion
A probabilistic method for tracking 3D articulated human figures in
monocular image sequences is presented. Within a Bayesian framework,
we define a generative model of image appearance, a robust likelihood
function based on image gray-level differences, and a prior probability
distribution over pose and joint angles that models how humans move.
The posterior probability distribution over model parameters is
represented using a discrete set of samples and is propagated over
time using particle filtering. The approach extends previous work on
parameterized optical flow estimation to exploit a complex 3D
articulated motion model. It also extends previous work on human
motion tracking by including a perspective camera model, by modeling
limb self-occlusion, and by recovering 3D motion from a monocular
sequence. The explicit posterior probability distribution represents
ambiguities due to image matching, model singularities, and
perspective projection. The method relies only on a frame-to-frame
assumption of brightness constancy and hence is able to track people
under changing viewpoints, in grayscale image sequences, and with
complex unknown backgrounds.
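The sampling scheme described above can be sketched in a few lines. This is a minimal, hypothetical illustration only: it uses a one-dimensional "pose" state, a Lorentzian robust penalty standing in for the paper's gray-level-difference likelihood, and Gaussian noise standing in for the learned motion prior; none of the parameter values come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def robust_log_likelihood(residuals, sigma=0.5):
    # Lorentzian robust penalty on brightness-constancy residuals;
    # sigma is a hypothetical scale parameter, not the paper's value.
    return -np.log1p(0.5 * (residuals / sigma) ** 2).sum(axis=-1)

def particle_filter_step(particles, weights, dynamics, log_likelihood):
    # Resample according to the posterior weights at time t-1 ...
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    # ... predict by sampling from the temporal motion prior ...
    particles = dynamics(particles[idx])
    # ... and re-weight each sample by the new image likelihood.
    logw = log_likelihood(particles)
    w = np.exp(logw - logw.max())  # subtract max for numerical stability
    return particles, w / w.sum()

# Toy demo: track a fixed 1-D "pose" of 1.0 from a diffuse start.
true_pose = 1.0
dynamics = lambda p: p + rng.normal(0.0, 0.1, size=p.shape)
log_lik = lambda p: robust_log_likelihood(p - true_pose)

particles = rng.normal(0.0, 1.0, size=(500, 1))
weights = np.ones(500) / 500
for _ in range(10):
    particles, weights = particle_filter_step(
        particles, weights, dynamics, log_lik)
estimate = float((weights * particles[:, 0]).sum())  # posterior mean
```

Because the posterior is carried as a weighted sample set rather than a single estimate, multiple modes (e.g. from depth ambiguity under perspective projection) can survive from frame to frame, which is the point of the particle-filtering representation described in the abstract.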