Distance Based Korean WordNet(alias. KorLex) Embedding Model
The objective of this study was to create graph embedding vectors using Korean WordNet (KorLex) and apply them to neural network word-embedding models. Semantic knowledge, especially lexical semantic knowledge in a language, can be represented by word-embedding vectors or graph structures of lexical databases, such as WordNet.
| Main Authors: | SeongReol Park, JoongMin Shin, Sanghyun Cho, Hyuk-Chul Kwon, Jung-Hun Lee |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Taylor & Francis Group, 2024-12-01 |
| Series: | Applied Artificial Intelligence |
| Online Access: | https://www.tandfonline.com/doi/10.1080/08839514.2024.2398920 |
| _version_ | 1850064198788186112 |
|---|---|
| author | SeongReol Park; JoongMin Shin; Sanghyun Cho; Hyuk-Chul Kwon; Jung-Hun Lee |
| author_facet | SeongReol Park; JoongMin Shin; Sanghyun Cho; Hyuk-Chul Kwon; Jung-Hun Lee |
| author_sort | SeongReol Park |
| collection | DOAJ |
| description | The objective of this study was to create graph embedding vectors using Korean WordNet (KorLex) and apply them to neural network word-embedding models. Semantic knowledge, especially lexical semantic knowledge in a language, can be represented by word-embedding vectors or graph structures of lexical databases, such as WordNet. Both representations capture common semantics; however, some semantic knowledge is only captured in a specific way or not at all. In a previous study, Path2vec mapped WordNet graphs to graph-embedding vectors using similarity scores between two words. In this study, we propose two main approaches. First, we mapped the knowledge in the Korean lexical database KorLex onto graph-embedding vectors. We then applied these embedding vectors to deep neural network word embeddings to capture additional semantic knowledge in the Korean language. On a custom test set, the proposed approach improved performance by capturing additional semantic knowledge in similarity and analogy analyses. We plan to apply a variant of this to other deep neural embedding models. |
| format | Article |
| id | doaj-art-acae19d3f125427ca17de4814fb2f602 |
| institution | DOAJ |
| issn | 0883-9514; 1087-6545 |
| language | English |
| publishDate | 2024-12-01 |
| publisher | Taylor & Francis Group |
| record_format | Article |
| series | Applied Artificial Intelligence |
| spelling | doaj-art-acae19d3f125427ca17de4814fb2f602 (2025-08-20T02:49:22Z); eng; Taylor & Francis Group; Applied Artificial Intelligence; 0883-9514; 1087-6545; 2024-12-01; vol. 38, iss. 1; 10.1080/08839514.2024.2398920; Distance Based Korean WordNet(alias. KorLex) Embedding Model; SeongReol Park, JoongMin Shin, Sanghyun Cho, Hyuk-Chul Kwon (Department of Information Convergence Engineering, Pusan National University, Busan, South Korea); Jung-Hun Lee (Department of Artificial Intelligence, Dong-Eui University, Busan, South Korea); The objective of this study was to create graph embedding vectors using Korean WordNet (KorLex) and apply them to neural network word-embedding models. Semantic knowledge, especially lexical semantic knowledge in a language, can be represented by word-embedding vectors or graph structures of lexical databases, such as WordNet. Both representations capture common semantics; however, some semantic knowledge is only captured in a specific way or not at all. In a previous study, Path2vec mapped WordNet graphs to graph-embedding vectors using similarity scores between two words. In this study, we propose two main approaches. First, we mapped the knowledge in the Korean lexical database KorLex onto graph-embedding vectors. We then applied these embedding vectors to deep neural network word embeddings to capture additional semantic knowledge in the Korean language. On a custom test set, the proposed approach improved performance by capturing additional semantic knowledge in similarity and analogy analyses. We plan to apply a variant of this to other deep neural embedding models. https://www.tandfonline.com/doi/10.1080/08839514.2024.2398920 |
| spellingShingle | SeongReol Park; JoongMin Shin; Sanghyun Cho; Hyuk-Chul Kwon; Jung-Hun Lee; Distance Based Korean WordNet(alias. KorLex) Embedding Model; Applied Artificial Intelligence |
| title | Distance Based Korean WordNet(alias. KorLex) Embedding Model |
| title_full | Distance Based Korean WordNet(alias. KorLex) Embedding Model |
| title_fullStr | Distance Based Korean WordNet(alias. KorLex) Embedding Model |
| title_full_unstemmed | Distance Based Korean WordNet(alias. KorLex) Embedding Model |
| title_short | Distance Based Korean WordNet(alias. KorLex) Embedding Model |
| title_sort | distance based korean wordnet alias korlex embedding model |
| url | https://www.tandfonline.com/doi/10.1080/08839514.2024.2398920 |
| work_keys_str_mv | AT seongreolpark distancebasedkoreanwordnetaliaskorlexembeddingmodel AT joongminshin distancebasedkoreanwordnetaliaskorlexembeddingmodel AT sanghyuncho distancebasedkoreanwordnetaliaskorlexembeddingmodel AT hyukchulkwon distancebasedkoreanwordnetaliaskorlexembeddingmodel AT junghunlee distancebasedkoreanwordnetaliaskorlexembeddingmodel |
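The abstract describes a Path2vec-style pipeline: pairwise similarity scores are read off the WordNet graph and used as training targets for graph-embedding vectors. A minimal sketch of the distance-based similarity such a pipeline starts from is shown below; it uses a tiny hand-built hypernym taxonomy as a stand-in, since KorLex itself is not reproduced in this record, and all node names are illustrative.

```python
from collections import deque

# Tiny stand-in hypernym taxonomy (KorLex/WordNet-style is-a edges).
# These nodes are illustrative placeholders, not taken from KorLex.
EDGES = {
    "entity": ["animal", "plant"],
    "animal": ["dog", "cat"],
    "plant": ["tree"],
}

def build_undirected(edges):
    """Treat is-a links as undirected edges for path counting."""
    graph = {}
    for parent, children in edges.items():
        for child in children:
            graph.setdefault(parent, set()).add(child)
            graph.setdefault(child, set()).add(parent)
    return graph

def shortest_path_len(graph, a, b):
    """BFS over the taxonomy; returns the edge count between two nodes."""
    if a == b:
        return 0
    seen, frontier = {a}, deque([(a, 0)])
    while frontier:
        node, dist = frontier.popleft()
        for nxt in graph[node]:
            if nxt == b:
                return dist + 1
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return None  # disconnected nodes

def path_similarity(graph, a, b):
    """WordNet-style path similarity: 1 / (1 + shortest path length)."""
    d = shortest_path_len(graph, a, b)
    return None if d is None else 1.0 / (1.0 + d)

graph = build_undirected(EDGES)
print(path_similarity(graph, "dog", "cat"))   # siblings: 1/(1+2) ≈ 0.333
print(path_similarity(graph, "dog", "tree"))  # across branches: 1/(1+4) = 0.2
```

Path2vec then trains dense vectors whose dot products approximate such pairwise graph scores; the study applies the same idea to KorLex before combining the resulting graph embeddings with neural word embeddings.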