Registered
SENTENCE EMBEDDING METHOD AND APPARATUS BASED ON SUBWORD EMBEDDING AND SKIP-THOUGHTS
- Inventors
- Euisok Chung, Jung Ho Young, Hyun Woo Kim, Hwa Jeon Song, Yoo Rhee Oh, Kang Byung Ok, Park Jeon Gue, Lee Yunkeun
- Application No.
- 16671773 (2019.11.01)
- Publication No.
- 20200175119 (2020.06.04)
- Registration No.
- 11423238 (2022.08.23)
- Country
- UNITED STATES
- Project Code
- 18ZS1100, Core Technology Research for Self-Improving Artificial Intelligence System, Lee Yunkeun
- Abstract
- Provided are a sentence embedding method and apparatus based on subword embedding and skip-thoughts. To integrate the skip-thought sentence embedding learning methodology with a subword embedding technique, a skip-thought sentence embedding learning method based on subword embedding and a multitask learning methodology that jointly performs subword embedding learning and skip-thought sentence embedding learning are provided, so that intra-sentence contextual information can be applied to subword embedding during subword embedding learning. This makes it possible to apply the sentence embedding approach, in a bag-of-words form, to agglutinative languages such as Korean. Because the skip-thought sentence embedding methodology is integrated with the subword embedding technique, intra-sentence contextual information is available during subword embedding learning. The proposed model minimizes the additional training parameters required for sentence embedding, so that most of the training results are accumulated in the subword embedding parameters (an illustrative sketch of this joint objective follows this record).
- KSP Keywords
- Agglutinative languages, Bag-of-words, Contextual information, Embedding Technique, Learning methods, Proposed model, Training results, embedding learning, embedding method, learning methodologies, multi-task learning
- Family
-
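The sketch below is a minimal, hypothetical illustration of the general idea summarized in the abstract, not the patented implementation: a sentence embedding is formed as a bag-of-subwords (mean of subword vectors), and a single subword embedding table is trained with a skip-thought-style sentence objective plus an intra-sentence subword objective in a multitask fashion, so that most learned parameters accumulate in the embedding table. All names (SubwordSkipThought, sent_embed) and the specific loss forms are assumptions made for this example.

```python
# Hypothetical sketch, assuming a bag-of-subwords sentence embedding and a
# two-part multitask loss; not the patented method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SubwordSkipThought(nn.Module):
    def __init__(self, vocab_size: int, dim: int = 128):
        super().__init__()
        # Single subword embedding table: nearly all trainable parameters live here.
        self.emb = nn.Embedding(vocab_size, dim)

    def sent_embed(self, subword_ids: torch.Tensor) -> torch.Tensor:
        # Bag-of-subwords sentence embedding: mean of subword vectors
        # (padding is ignored here for simplicity).
        return self.emb(subword_ids).mean(dim=1)

    def forward(self, center, neighbor, negative):
        # center/neighbor/negative: (batch, n_subwords) subword-id tensors.
        c = self.sent_embed(center)
        pos = (c * self.sent_embed(neighbor)).sum(-1)   # adjacent sentence score
        neg = (c * self.sent_embed(negative)).sum(-1)   # random sentence score
        # Skip-thought-style ranking loss: neighboring sentences should score
        # higher than random ones.
        sent_loss = F.softplus(neg - pos).mean()
        # Simplified intra-sentence subword loss standing in for a skip-gram-style
        # objective: each subword should be close to its own sentence embedding.
        sub_loss = F.softplus(-(self.emb(center) * c.unsqueeze(1)).sum(-1)).mean()
        return sent_loss + sub_loss   # multitask objective

# Toy usage with random subword ids.
model = SubwordSkipThought(vocab_size=1000)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
center = torch.randint(0, 1000, (4, 6))
neighbor = torch.randint(0, 1000, (4, 6))
negative = torch.randint(0, 1000, (4, 6))
opt.zero_grad()
loss = model(center, neighbor, negative)
loss.backward()
opt.step()
```

Because both losses backpropagate into the same embedding table, the sentence-level (skip-thought) signal and the subword-level signal are learned jointly, which mirrors the multitask design described in the abstract.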