[Banner image: poster of synthetic vegetable textures]

Representation and Synthesis of Visual Texture

Javier Portilla and Eero P. Simoncelli

What is a "Visual Texture"?

There is a long history of research in the study of visual texture, but the definition is rather imprecise. Most researchers agree that an image of visual texture should be spatially homogeneous, and typically contains repeated structures, often with some random variation (e.g., random positions, orientations or colors). This loose definition suggests that textures might best be synthesized by laying down texture elements at periodic or random locations within an image. But this approach does not lend itself well to the analysis of texture images.

Texture Representation

Loosely stated, our goal is to establish a minimal set of statistical measurements such that two textures are identical in appearance if and only if they agree on these measurements. This definition is perceptual, and was first proposed by Bela Julesz in the 1960s. Julesz studied it using hand-constructed binary (black-and-white) images with matching pairwise and, later, triplet pixel statistics. Our model is based on the same conceptual framework, but (1) uses statistics inspired by the representations found in the early stages of the human visual system, and (2) uses the power of modern computing to synthesize example images with matching statistics.

We first decompose an example texture image using multi-scale oriented linear filters (specifically, a Steerable Pyramid). The resulting coefficients are complex-valued: the real and imaginary parts correspond to even- and odd-symmetric (quadrature-pair) filters. For each pair of coefficients at nearby positions, orientations, and scales, we measure the spatially averaged values of their product, the product of their magnitudes, and their relative phase. We also measure a few statistical moments of the pixel distribution (sample mean, variance, skew, kurtosis, and range).
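As a concrete illustration, here is a minimal Python sketch (not the released Matlab code) of the two kinds of measurements; `img` stands for a grayscale image, and `a` and `b` for two equal-size complex-valued subbands from some oriented multi-scale decomposition. All names are illustrative:

```python
import numpy as np
from scipy import stats

def pixel_marginals(img):
    """Sample moments of the pixel distribution: mean, variance, skew, kurtosis, range."""
    x = np.asarray(img, dtype=float).ravel()
    return {
        "mean": x.mean(),
        "variance": x.var(),
        "skew": stats.skew(x),
        "kurtosis": stats.kurtosis(x, fisher=False),  # standard (non-excess) kurtosis
        "range": (x.min(), x.max()),
    }

def pairwise_stats(a, b):
    """Spatially averaged products for a pair of equal-size complex subbands."""
    a, b = np.ravel(a), np.ravel(b)
    prod = np.mean(a * np.conj(b))            # average product of coefficients
    return {
        "correlation": prod,
        "magnitude_correlation": np.mean(np.abs(a) * np.abs(b)),
        "relative_phase": np.angle(prod),     # phase of the average product
    }
```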

The number of statistical parameters depends on the number of subbands, and the size of spatial neighborhoods used. For the examples given below, we used a total of 710 parameters. Note that:
* We find that this set of parameters provides sufficient constraint to generate a wide variety of patterns (see examples below).
* We show (in the paper below) that each type of included statistic is necessary, in that removing it leads to a failure to synthesize at least one type of texture. Nevertheless, for any given texture, it is usually possible to obtain good results using a subset of these parameters.
* The result is interesting only when the representation is minimal: the larger the set of statistical constraints, the less diverse the set of images that satisfies them. When the set of constraints becomes very large (larger than the number of input pixels; see results by Gatys et al., 2015), only one image (along with its translated copies) will satisfy the constraints, and this will, of course, look "the same" as the original!

Texture Synthesis

Texture synthesis provides a stringent test of the quality of a texture representation (much more so than commonly used classification tasks). We have developed a method of synthesizing a random texture that matches the statistics of any given reference image. Abstractly, these images are samples from the equivalence class of all images that share those statistics. Starting with an image of Gaussian white noise, we alternate between (1) forcing the sample statistics of each steerable pyramid subband to match those of the reference texture, and (2) reconstructing an image from the pyramid and forcing the sample statistics of its pixels to match those of the reference image. We enforce each statistical constraint by moving in the direction of the gradient of the constraint function until the constraint is satisfied. Note that apart from the choice of initial image, the algorithm is deterministic. Although we cannot measure the entropy of the resulting distribution of images, it appears to be quite diverse. Details may be found in the references given below.
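For the simplest constraints, the adjustment has a closed form. The following runnable Python toy (an illustrative sketch, not our implementation) shows the idea for the pixel mean and variance; the full algorithm applies gradient-based adjustments of this kind to all of the statistics, alternating between the pyramid and pixel domains:

```python
import numpy as np

def match_mean_var(x, target_mean, target_var):
    """Project an image onto the set of images with a given sample mean and variance."""
    x = x - x.mean()                        # remove the current mean
    x = x * np.sqrt(target_var / x.var())   # rescale to the target variance
    return x + target_mean                  # impose the target mean

rng = np.random.default_rng(seed=0)
reference = rng.gamma(2.0, 1.0, size=(256, 256))   # stand-in for a reference texture
img = rng.standard_normal((256, 256))              # initial Gaussian white noise
img = match_mean_var(img, reference.mean(), reference.var())
assert np.isclose(img.mean(), reference.mean())
assert np.isclose(img.var(), reference.var())
```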

Synthesis Examples

Click on any thumbnail to pop up a window filled with synthesized texture. The reference texture (256x256) is shown inside a yellow box. (Note: no attempt is made to align its contents with the surrounding synthesized texture.)

* Artificial/periodic: More Examples...
* Artificial/non-periodic: More Examples...
* Photographic/quasi-periodic: More Examples...
* Photographic/random: More Examples...
* Photographic/structured: More Examples...
* Inhomogeneous (non-textures): More Examples...
* Color: More Examples...

Texture "Painting"

The algorithm can be modified (see references below) to extrapolate a texture image beyond its boundaries, or to fill in holes (sometimes called "texture inpainting"). Some examples were shown in the original paper (IJCV, 2000), and more are provided here.
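One simple way to adapt the iterative scheme for this purpose (a sketch under assumed helper names, not the published modification) is to clamp the known pixels back to their original values after each statistics-matching pass, so that only the hole or border region is free to change:

```python
import numpy as np

def inpaint(image, mask, synthesis_pass, n_iters=50, seed=0):
    """Fill unknown pixels (mask == True) with synthesized texture.

    `synthesis_pass` is a hypothetical callable performing one iteration of
    the statistics-matching loop described above.
    """
    rng = np.random.default_rng(seed)
    out = np.where(mask, rng.standard_normal(image.shape), image)  # noise in the hole
    for _ in range(n_iters):
        out = synthesis_pass(out)          # impose texture statistics everywhere
        out = np.where(mask, out, image)   # clamp known pixels to their original values
    return out
```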

Initializing the synthesis process with a photograph (e.g., a face) instead of white noise injects interesting global structures into the result (unpublished).
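In terms of the synthesis loop sketched above, this amounts to changing only the initialization (the `photograph` variable below is illustrative):

```python
# Instead of:  img = rng.standard_normal(shape)
img = np.asarray(photograph, dtype=float)  # seed the iterations with a photo
```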

Texture Synthesis through "Resampling"

A number of recent algorithms create synthetic texture images by cleverly resampling from the original texture image. Although these do not provide an explicit model of texture, the results are visually stunning! If you're looking for a good texture synthesizer for graphics applications, we suggest you look at these examples (listed in reverse chronological order): Xu et al., Bar-Joseph et al., Wei, Efros & Leung, De Bonet & Viola

Another related idea appears in the fractal image compression literature, in which blocks of coefficients at one scale are copied into a finer scale (with suitable translation, rotation, or other parametric distortion); see "Fractal Image Coding: A Review", A. Jacquin, Proc. IEEE, Oct 1993.

Software

A Matlab implementation of our texture synthesis algorithm is available (released March 2001): Source code (GitHub)
An extension to color images is also available (released April 2013): Source code

References: This model

J. Portilla and E. P. Simoncelli, "A Parametric Texture Model Based on Joint Statistics of Complex Wavelet Coefficients", International Journal of Computer Vision, 40(1):49-70, 2000.

References: Perceptual and physiological extensions/tests of this model

References: Texture Models that Inspired our Work

This material is based upon work partially supported by the National Science Foundation under CAREER Grant No. 9796040. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Updated: January 13, 2020.
Created: August 1999.