Understanding Appearance


Texture Perception

Large parts of our visual environment consist of textures such as cloth, bark, or gravel, and humans are adept at perceiving subtle variations in their material properties. To investigate which image features are important for texture perception, we psychophysically compare a recent parametric model of texture appearance, built on the features encoded by a deep CNN, with the venerable Portilla and Simoncelli model (Wallis et al., 2017).
Generated textures.
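The CNN-based texture model in question summarizes an image by the Gram matrices of feature maps at several network layers: pairwise correlations between feature channels, with spatial layout discarded. A minimal sketch of that summary statistic, using random NumPy arrays as a stand-in for real network activations (assumption: the toy data and shapes here are illustrative, not taken from the actual model):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a CNN feature map of shape (channels, height, width).

    The CNN texture model summarizes texture by the pairwise
    correlations between feature channels, discarding spatial layout:
    images whose Gram matrices match at several layers tend to be
    perceived as the same texture.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)   # flatten the spatial dimensions
    return f @ f.T / (h * w)         # (c, c) channel-by-channel correlations

# Toy stand-in for network activations (random data, not a real CNN output).
rng = np.random.default_rng(0)
feats = rng.standard_normal((64, 32, 32))
G = gram_matrix(feats)
print(G.shape)  # (64, 64)
```

Texture synthesis then searches for a new image whose Gram matrices match those of the target texture, which is why the statistic rather than the pixels determines appearance.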


Peripheral Vision

We are interested in scene appearance and how the human brain generates peripheral vision. We find that the visibility of texture-like distortions depends more on local image content than on retinal eccentricity, and suggest that texture summary models may not be sufficient to explain peripheral appearance (Wallis, Funke et al., 2019).
Texture distortions.
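Bouma's law, which the paper below argues against as the dominant factor, says that the critical spacing for crowding grows roughly linearly with retinal eccentricity, at about half the eccentricity. Texture-summary models of peripheral vision typically pool statistics over regions that scale at this rate. A toy illustration of that scaling rule (assumption: the 0.5 slope is the conventional textbook value, not a fitted parameter from this work):

```python
def bouma_critical_spacing(eccentricity_deg, slope=0.5):
    """Critical spacing (deg) predicted by Bouma's law: roughly half
    the retinal eccentricity. The finding above is that distortion
    visibility tracks local image content more than this scaling."""
    return slope * eccentricity_deg

for ecc in (2, 5, 10):
    print(ecc, bouma_critical_spacing(ecc))
# 2 1.0
# 5 2.5
# 10 5.0
```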

Key Papers

T. S. A. Wallis, C. M. Funke, A. S. Ecker, L. A. Gatys, F. A. Wichmann, and M. Bethge
A Parametric Texture Model Based on Deep Convolutional Features Closely Matches Texture Appearance for Humans
Journal of Vision, 17(12), 2017
#visual textures, #style transfer, #perceptual image synthesis, #cnns, #psychophysics, #appearance
Code, URL, DOI, Stimuli, Preprint, BibTex

T. S. A. Wallis, C. M. Funke, A. S. Ecker, L. A. Gatys, F. A. Wichmann, and M. Bethge
Image content is more important than Bouma's Law for scene metamers
bioRxiv, 2018
URL, DOI, BibTex
University of Tübingen, BCCN, CIN, MPI