ETRI Knowledge Sharing Platform

Pixel-based Continuous State Prediction with Perceptual Loss
Cited 1 time in Scopus
Authors: Donghun Lee, Jun Hee Park
Issue Date: 2021-10
Citation: International Conference on Information and Communication Technology Convergence (ICTC) 2021, pp.1117-1119
Publisher: IEEE
Language: English
Type: Conference Paper
DOI: https://dx.doi.org/10.1109/ICTC52510.2021.9621189
Abstract
Predicting the future is an essential but challenging task because the dynamics of natural systems are inherently uncertain. In this paper, a perceptual loss is used to reduce the uncertainty and blurriness of stochastic video prediction. The high-dimensional observation is compressed into a latent variable with a variant of the beta variational autoencoder. The latent variable captures temporal information from consecutive state images and is used to predict future states from past ones. A perceptual loss between the reconstructed image and the original image is employed instead of a per-pixel loss, so that high-level image features are compared. During training, the perceptual loss, computed with a pre-trained network, is combined with the KL divergence loss. The proposed algorithm shows improved reconstruction ability and produces clearer future state predictions.
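The abstract describes a training objective that combines a perceptual (feature-space) reconstruction loss from a pre-trained network with the KL divergence term of a beta-VAE. The paper does not specify the pre-trained network or the loss weights, so the sketch below is only an illustration of that combination under assumed choices: a frozen ImageNet VGG16 feature extractor, an MSE feature comparison, and example values for beta and the perceptual weight.

```python
# Minimal sketch (not the authors' code): perceptual loss + beta-weighted KL term.
# Assumptions: frozen VGG16 features, MSE in feature space, example beta/weight values.
import torch
import torch.nn as nn
import torchvision.models as models


class PerceptualLoss(nn.Module):
    """Feature-space loss using a frozen, pre-trained VGG16 (assumed backbone)."""

    def __init__(self, layer_index=16):
        super().__init__()
        vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features
        # Keep only the first `layer_index` layers and freeze them.
        self.features = nn.Sequential(*list(vgg.children())[:layer_index]).eval()
        for p in self.features.parameters():
            p.requires_grad = False
        self.criterion = nn.MSELoss()

    def forward(self, reconstruction, target):
        # Compare high-level feature maps instead of raw pixels.
        return self.criterion(self.features(reconstruction), self.features(target))


def kl_divergence(mu, logvar):
    # KL(q(z|x) || N(0, I)) for a diagonal-Gaussian posterior.
    return -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())


def training_loss(reconstruction, target, mu, logvar,
                  perceptual_loss, beta=4.0, perceptual_weight=1.0):
    # Combined objective: perceptual reconstruction term + beta-weighted KL term.
    return (perceptual_weight * perceptual_loss(reconstruction, target)
            + beta * kl_divergence(mu, logvar))
```

Replacing the per-pixel term with a feature-space comparison is what the abstract credits for sharper predictions, since matching pre-trained features penalizes blurry averages less forgivingly than pixel-wise MSE does.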
KSP Keywords: Continuous state, Image Features, KL divergence, Latent variables, Pixel-based, Reconstructed image, Reconstruction ability, State Prediction, Stochastic approach, Uncertain model, Video prediction