ETRI Knowledge Sharing Platform

Biologically Inspired Computational Models of Visual Attention for Personalized Autonomous Agents: A Survey
Cited 0 times in Scopus
Authors
Jin-Young Moon, Hyung-Gik Lee, Chang-Seok Bae
Issue Date
2011-10
Citation
International Conference on Information Technology Convergence and Services (ITCS) / FTRA International Conference on Intelligent Robotics, Automations, Telecommunication Facilities, and Applications (IRoA) 2011 (LNEE 107), v.107, pp.547-555
Publisher
Springer
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1007/978-94-007-2598-0_58
Abstract
Perception is one of the essential capabilities of personalized autonomous agents, which act on behalf of their users without user intervention and must therefore understand the environment for themselves, as a human being does. Visual perception in humans plays a major role in interacting with objects or entities in the environment by interpreting visual sensing information. The major technical obstacle to visual perception is processing the enormous amount of visual stimuli efficiently in real time. Therefore, computational models of visual attention, which decide where to focus in a scene, have been proposed to reduce the visual processing load by mimicking the human visual system. This chapter provides the background knowledge of the cognitive theories on which these models are founded and analyzes the computational models needed to build a personalized autonomous agent that acts like a specific person as well as a typical human being. © 2011 Springer Science+Business Media B.V.
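
The attention models surveyed here typically compute a saliency map over the scene and direct processing to its peak. The sketch below is a rough illustration of that idea, not code from the paper: it computes a single-channel center-surround contrast in the spirit of the well-known Itti-Koch saliency model. The toy scene, function names, and filter sizes are illustrative assumptions.

import numpy as np

def box_blur(img, k):
    # k x k box filter via an integral image; 'edge' padding keeps the output size.
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    integ = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    integ = np.pad(integ, ((1, 0), (1, 0)))
    h, w = img.shape
    return (integ[k:k + h, k:k + w] - integ[:h, k:k + w]
            - integ[k:k + h, :w] + integ[:h, :w]) / (k * k)

def center_surround_saliency(intensity, center_k=3, surround_k=15):
    # Center-surround contrast: locations whose fine-scale (center) response
    # differs strongly from the coarse-scale (surround) response are salient.
    center = box_blur(intensity, center_k)
    surround = box_blur(intensity, surround_k)
    saliency = np.abs(center - surround)
    return (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-8)

if __name__ == "__main__":
    # Toy scene: a dark background with one bright "pop-out" square.
    scene = np.zeros((64, 64))
    scene[28:36, 28:36] = 1.0
    sal = center_surround_saliency(scene)
    focus = np.unravel_index(np.argmax(sal), sal.shape)
    print("most salient location (row, col):", focus)  # lands inside the bright square

A full biologically inspired model would repeat this contrast computation across color, intensity, and orientation channels at several scales and combine the resulting maps, but even this single-channel version shows how attention reduces the processing load: only the neighborhood of the saliency peak needs further analysis.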
KSP Keywords
Background knowledge, Biologically inspired, Cognitive theories, Computational Model, Processing load, Real-time, Visual attention, Visual perception, Visual stimuli, autonomous agents, human being