ETRI Knowledge Sharing Platform

Method for Expanding Search Space With Hybrid Operations in DynamicNAS
Cited 1 time in Scopus · Downloaded 40 times
Authors
Iksoo Shin, Changsik Cho, Seon-Tae Kim
Issue Date
2024-01
Citation
IEEE Access, v.12, pp.10242-10253
ISSN
2169-3536
Publisher
Institute of Electrical and Electronics Engineers Inc.
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.1109/ACCESS.2024.3350732
Abstract
Recently, a novel neural architecture search method, which is referred to as DynamicNAS (Dynamic Neural Architecture Search) in this paper, has shown great potential. Not only can various sizes of models be trained with a single training session through DynamicNAS, but the subnets trained by DynamicNAS show improved performance compared to the subnets trained by conventional methods. Although DynamicNAS has many strengths compared to conventional NAS, it has the drawback that different types of operations cannot be used simultaneously within a layer as a search space. In this paper, we present a method that allows DynamicNAS to use different types of operations in a layer as a search space, without undermining the benefits of DynamicNAS, such as one-time training and superior subnet performance. Our experiments show that common operation mixing methods, such as convex combination and set sampling, are inadequate for the problem, although they have a structure that is similar to the proposed method. The proposed method finds, from a supernet of hybrid operations, a superior architecture that cannot be found from a single-operation supernet.
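To make the abstract's terms concrete, here is a minimal toy sketch of a layer whose search space mixes operation types. The operation names, the single-path sampling scheme, and the class interface are illustrative assumptions, not the method proposed in the paper; the convex-combination path shows the common mixing scheme the abstract says is inadequate for this problem.

```python
import random

# Toy candidate operations standing in for different operation types
# within one layer (e.g., a 3x3 conv vs. a 5x5 conv). These are
# hypothetical stand-ins, not the paper's operations.
def op_double(x):
    return [2 * v for v in x]

def op_negate(x):
    return [-v for v in x]

class HybridLayer:
    """A supernet layer whose search space holds several operation types."""

    def __init__(self, ops):
        self.ops = ops

    def forward_sampled(self, x, rng):
        # Single-path sampling: pick one candidate operation per forward
        # pass, so each subnet uses exactly one operation in this layer.
        return rng.choice(self.ops)(x)

    def forward_convex(self, x, weights):
        # Convex combination: weighted sum over all candidate outputs,
        # the common operation-mixing scheme mentioned in the abstract.
        outs = [op(x) for op in self.ops]
        return [sum(w * o[i] for w, o in zip(weights, outs))
                for i in range(len(x))]

layer = HybridLayer([op_double, op_negate])
mixed = layer.forward_convex([1.0, 2.0], [0.5, 0.5])  # [0.5, 1.0]
```

In a real DynamicNAS-style supernet the candidates would be neural operations sharing weights across subnets; this sketch only illustrates the difference between sampling a single operation and mixing all of them.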
KSP Keywords
Conventional methods, Search Space, convex combination, improved performance, search method, training session
This work is distributed under the terms of the Creative Commons License (CC BY).