ETRI Knowledge Sharing Platform

Details

Journal Article
Brain-inspired Predictive Coding Improves the Performance of Machine Challenging Tasks
Cited 1 time in Scopus, downloaded 40 times
Authors
이장호, 조정희, 이병화, 이정훈, 윤성로
Publication Date
November 2022
Source
Frontiers in Computational Neuroscience, v.16, pp.1-14
ISSN
1662-5188
Publisher
Frontiers Media
DOI
https://dx.doi.org/10.3389/fncom.2022.1062678
Project
22ZS1100, Research on Fundamental Technologies for Self-Growing Composite Artificial Intelligence, 송화전
Abstract
Backpropagation has long been regarded as the most favorable algorithm for training artificial neural networks. However, it has been criticized as biologically implausible because its learning mechanism contradicts how the human brain learns. Although backpropagation has achieved super-human performance in various machine learning applications, it often shows limited performance on specific tasks. We collectively refer to such tasks as machine-challenging tasks (MCTs) and investigate methods to enhance machine learning for them. Specifically, we start with a natural question: can a learning mechanism that mimics the human brain improve MCT performance? We hypothesized that a learning mechanism replicating the human brain is effective for tasks that machine intelligence finds difficult. We performed multiple experiments on specific types of MCTs, where machine intelligence has room to improve, using predictive coding, a learning algorithm that is more biologically plausible than backpropagation. This study regards incremental learning, long-tailed recognition, and few-shot recognition as representative MCTs. Through extensive experiments, we show that predictive coding robustly outperforms backpropagation-trained networks on these MCTs. We demonstrate that predictive coding-based incremental learning alleviates catastrophic forgetting, that predictive coding-based learning mitigates the classification bias in long-tailed recognition, and that a network trained with predictive coding can correctly predict the corresponding targets from few samples. We analyze the experimental results by drawing analogies between the properties of predictive coding networks and those of the human brain, and discuss the potential of predictive coding networks in general machine learning.
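The abstract contrasts predictive coding with backpropagation as a learning rule. To make the difference concrete, below is a minimal NumPy sketch of supervised predictive coding in the general style of Whittington and Bogacz (2017): value nodes are relaxed to minimize layer-wise prediction errors, and weights are then updated from purely local error signals. This is not the authors' implementation; the layer sizes, learning rates, iteration counts, and random toy data are assumptions for illustration only.

```python
# Minimal sketch of supervised predictive coding (PC) training.
# Illustrative only; hyperparameters and toy data are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def f(x):   # activation
    return np.tanh(x)

def df(x):  # derivative of the activation
    return 1.0 - np.tanh(x) ** 2

# Toy network: 10 -> 32 -> 5
sizes = [10, 32, 5]
W = [rng.normal(0, 0.1, (sizes[l + 1], sizes[l])) for l in range(len(sizes) - 1)]

def pc_step(x_in, y_target, W, n_infer=20, lr_x=0.1, lr_w=0.01):
    """One PC training step: relax value nodes to minimize the prediction-error
    energy, then update each weight matrix from its own local error signal."""
    L = len(W)
    # Value nodes: input clamped; hidden/output initialized to their predictions.
    x = [x_in] + [None] * L
    for l in range(1, L + 1):
        x[l] = W[l - 1] @ f(x[l - 1])
    x[L] = y_target  # clamp the output layer to the target during training

    # Inference: gradient descent on F = 0.5 * sum_l ||x_l - W_{l-1} f(x_{l-1})||^2
    for _ in range(n_infer):
        e = [None] + [x[l] - W[l - 1] @ f(x[l - 1]) for l in range(1, L + 1)]
        for l in range(1, L):  # update hidden value nodes only
            x[l] += lr_x * (-e[l] + df(x[l]) * (W[l].T @ e[l + 1]))

    # Local weight updates from the relaxed errors
    e = [None] + [x[l] - W[l - 1] @ f(x[l - 1]) for l in range(1, L + 1)]
    for l in range(1, L + 1):
        W[l - 1] += lr_w * np.outer(e[l], f(x[l - 1]))
    return float(sum(np.sum(e[l] ** 2) for l in range(1, L + 1)))

# Usage on random toy data (a real task would replace this loop with mini-batches)
x_in = rng.normal(size=10)
y_target = rng.normal(size=5)
for epoch in range(100):
    energy = pc_step(x_in, y_target, W)
print("final energy:", energy)
```

Unlike backpropagation, no global backward pass is needed: each weight update depends only on the error at its own layer and the activity of the layer below, which is the sense in which predictive coding is considered more biologically plausible.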
KSP Suggested Keywords
Artificial Neural Network, Brain-inspired, Catastrophic forgetting, Experimental Result, Few Samples, Incremental learning, Predictive Coding, Super-human, human brain, human performance, learning algorithms
This work is available under the Creative Commons Attribution (CC BY) license.