Context-aware gaze prediction applied to game level design, level-of-detail and stereo manipulation

Abstract

The prediction of visual attention can significantly improve many aspects of computer graphics and games. For example, image synthesis can be accelerated by reducing complex computations on non-attended scene regions. Current gaze prediction models often fail to accurately predict user fixations because they encode limited information about the context of the scene, relying instead on low-level image features such as luminance, contrast and motion. These features do not reliably drive user attention in an interactive synthetic scene, e.g. a video game, where the user controls the viewport and consciously ignores low-level salient features in order to navigate the scene or perform a task. This dissertation contributes two novel scene-context-based predictive models of attention that yield more accurate attention predictions than those derived from state-of-the-art methods. Both models take into account critical high-level scene-context features …
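The abstract contrasts two ingredients: bottom-up, low-level saliency features (luminance, contrast, motion) and the rendering savings available once attention is predicted. As a rough illustration only, and not the models proposed in the thesis, the following Python sketch computes a minimal center-surround luminance-contrast saliency map and then maps a region's predicted attention to a level-of-detail choice; all function names, parameters and the random stand-in frame are hypothetical.

import numpy as np
from scipy.ndimage import gaussian_filter

def luminance(rgb):
    # Rec. 709 luma from an H x W x 3 float image in [0, 1].
    return 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]

def contrast_saliency(rgb, center_sigma=2.0, surround_sigma=16.0):
    # Difference-of-Gaussians center-surround response on luminance,
    # normalized to [0, 1]: the kind of bottom-up feature the abstract
    # argues fails to track task-driven gaze in interactive scenes.
    luma = luminance(rgb)
    dog = np.abs(gaussian_filter(luma, center_sigma)
                 - gaussian_filter(luma, surround_sigma))
    return dog / (dog.max() + 1e-8)

def lod_for_region(saliency_map, region, levels=(0, 1, 2, 3)):
    # Pick a level of detail for a screen-space region (y0, y1, x0, x1):
    # the less attention a region is predicted to receive, the coarser
    # its geometry and shading can be (0 = finest, 3 = coarsest).
    y0, y1, x0, x1 = region
    attention = float(saliency_map[y0:y1, x0:x1].mean())
    index = min(int((1.0 - attention) * len(levels)), len(levels) - 1)
    return levels[index]

# Usage on a stand-in frame; a real system would take the renderer's
# frame and object bounds rather than random pixels.
frame = np.random.rand(240, 320, 3)
print(lod_for_region(contrast_saliency(frame), (40, 120, 60, 200)))

The point of the sketch is the plumbing, not the predictor: swapping the difference-of-Gaussians term for a context-aware attention model is where the dissertation's contribution sits.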


Handle URL
http://hdl.handle.net/10442/hedi/36421
ND
36421
Alternative title
Πρόβλεψη ακολουθίας βλέμματος με γνώση του γενικού πλαισίου σκηνής εφαρμοσμένη στο σχεδιασμό επιπέδων βιντεο-παιχνιδιών, βαθμού οπτικής πιστότητας γραφικών και διαχείρισης στερεογραφικών παραμέτρων [Gaze-sequence prediction informed by overall scene context, applied to video game level design, graphics level of detail and stereoscopic parameter management]
Author
Koulieris, Georgios-Alexandros
Date
2015
Degree Grantor
Technical University of Crete (TUC)
Committee members
Μανιά Αικατερίνη
Χριστοδουλάκης Σταύρος
Cunningham Douglas
Μπάλας Κωνσταντίνος
Ζερβάκης Μιχαήλ
Λαγουδάκης Μιχαήλ
McNamara Ann
Discipline
Natural Sciences
Computer and Information Sciences
Keywords
3D computer graphics; stereo manipulation; attention prediction; visual perception; level of detail
Country
Greece
Language
English
Description
157 pages; images, tables, figures, charts, index