ETRI-Knowledge Sharing Platform


Conference Paper
Pixel-based Continuous State Prediction with Perceptual Loss
Cited 1 time in Scopus · Downloaded 4 times
Authors
Donghun Lee, Park Jun Hee
Issue Date
October 2021
Source
International Conference on Information and Communication Technology Convergence (ICTC) 2021, pp. 1117-1119
DOI
https://dx.doi.org/10.1109/ICTC52510.2021.9621189
Project Code
21ZR1100, A Study of Hyper-Connected Thinking Internet Technology by autonomous connecting, controlling and evolving ways, Park Jun Hee
Abstract
Predicting the future is an essential but challenging task because the underlying dynamics of the natural world are inherently uncertain. In this paper, a perceptual loss is utilized to reduce the uncertainty and blurriness of video prediction in a stochastic approach. The high-dimensional input is reduced to a latent variable using a variation of the beta variational autoencoder. The latent variable captures temporal information from consecutive state images and is used to predict future states from past ones. A perceptual loss, rather than a per-pixel loss, is computed between the reconstructed image and the original image so that high-level image features are compared. The perceptual loss, obtained with a pre-trained network, is combined with the KL divergence loss during training. The proposed algorithm shows improved reconstruction ability and produces clearer future state predictions.
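The sketch below illustrates the loss construction described in the abstract: a perceptual reconstruction term computed in the feature space of a pre-trained network, combined with a beta-weighted KL divergence term as in a beta-VAE. The abstract does not name the pre-trained network or any hyper-parameters, so the use of VGG16, the layer cut-off, and the beta value here are assumptions for illustration, not details from the paper.

```python
# Hedged sketch of a perceptual + KL divergence loss, assuming a frozen
# VGG16 feature extractor (the paper does not specify the network).
import torch
import torch.nn as nn
import torchvision.models as models


class PerceptualLoss(nn.Module):
    """Feature-space L2 distance computed with a frozen pre-trained VGG16."""

    def __init__(self, layer_index: int = 16):
        super().__init__()
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features
        # Keep only the first `layer_index` layers and freeze them.
        self.features = vgg[:layer_index].eval()
        for p in self.features.parameters():
            p.requires_grad = False

    def forward(self, reconstruction: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Compare high-level features rather than raw pixel values.
        return torch.mean((self.features(reconstruction) - self.features(target)) ** 2)


def beta_vae_loss(reconstruction, target, mu, logvar, perceptual, beta: float = 4.0):
    """Perceptual reconstruction term plus beta-weighted KL divergence.

    `mu` and `logvar` parameterize the approximate posterior q(z|x);
    the KL term is taken against a unit Gaussian prior. `beta` is an
    assumed hyper-parameter, not a value reported in the paper.
    """
    recon_term = perceptual(reconstruction, target)
    kl_term = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_term + beta * kl_term
```

Using a feature-space distance instead of a per-pixel distance is what the abstract credits for the sharper reconstructions: per-pixel losses tend to average over plausible futures and blur the output, whereas matching features of a pre-trained network penalizes loss of structure.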
KSP Keywords
Continuous state, High dimensionality, Image feature, KL divergence, Latent Variable, Pixel-based, Reconstruction ability, State Prediction, Stochastic approach, Uncertain model, Video prediction