Mapping texture images onto smoothly approximated surfaces is often used to conceal the loss of their real, fine-grained relief. A limitation of mapping a fixed texture in such cases is that it is only correct for one viewing direction and one illumination direction. The presence of geometric surface detail causes appearance changes that simple foreshortening and global color scaling cannot model well. Hence, one would like to synthesize different textures for different viewing conditions. A texture model is presented that accounts for viewpoint-dependent changes in texture appearance. It is highly compact and avoids copy-and-paste-like repetitions. The model is learned from example images taken from different viewpoints, and it supports texture synthesis for previously unseen conditions.
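The abstract describes learning texture appearance from example views and then synthesizing it for viewpoints not in the training set. As a rough illustration of that general idea only, and not of the paper's actual model, the following Python sketch fits a small PCA basis to example texture patches captured at different viewing angles and linearly regresses the basis coefficients against the angle, so that a patch can be predicted for an unseen angle. The synthetic data, the number of components, and the linear regression are all illustrative assumptions.

# Hedged illustration: a minimal view-dependent texture model.
# This is NOT the model from the paper; it only sketches the pattern
# "learn appearance from example views, synthesize for an unseen view".
import numpy as np

# Synthetic stand-in data: 8 example texture patches (16x16, grayscale),
# each "captured" at a different viewing elevation angle in degrees.
angles = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0])
examples = np.stack([
    np.clip(0.5 + 0.4 * np.sin(np.linspace(0, a / 10.0, 256)), 0, 1).reshape(16, 16)
    for a in angles
])

# Flatten the examples and build a low-dimensional PCA basis.
X = examples.reshape(len(angles), -1)          # (n_views, n_pixels)
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 3                                          # retained components (assumed)
basis = Vt[:k]                                 # (k, n_pixels)
coeffs = (X - mean) @ basis.T                  # per-view coefficients, (n_views, k)

# Regress each coefficient linearly against the viewing angle.
A = np.stack([np.ones_like(angles), angles], axis=1)   # (n_views, 2) design matrix
W, *_ = np.linalg.lstsq(A, coeffs, rcond=None)         # (2, k) regression weights

def synthesize(angle_deg: float) -> np.ndarray:
    """Predict a texture patch for a previously unseen viewing angle."""
    c = np.array([1.0, angle_deg]) @ W         # predicted coefficients, (k,)
    return np.clip(mean + c @ basis, 0.0, 1.0).reshape(16, 16)

# Synthesize a texture for a viewpoint not in the training set.
novel = synthesize(45.0)
print(novel.shape, float(novel.min()), float(novel.max()))

A real multiview texture model would also condition on the illumination direction and work with richer texture statistics than a per-pixel linear basis; the sketch only demonstrates the learn-from-views, synthesize-for-unseen-views idea.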
Multiview texture models
01.01.2001
1215240 bytes
Article (Conference)
Electronic resource
English
British Library Conference Proceedings | 2001
Similar items:
3D Articulated Models and Multiview Tracking with Physical Forces | British Library Online Contents | 2001
Oriented Visibility for Multiview Reconstruction | British Library Conference Proceedings | 2006
Operation of a touch-sensitive multiview display screen (Betrieb eines berührempfindlichen Multiview-Anzeigeschirms) | Europäisches Patentamt | 2023
Multiview reconstruction of space curves | IEEE | 2003