Extending Embedding Representation by Incorporating Latent Relations
The semantic representation of words is a fundamental task in natural language processing and text mining. Learning word embeddings has proven powerful across a variety of tasks. Most studies aim to generate the embedding representation of a word by encoding its context information. However, many la...
| Main Authors: | Gao Yang, Wang Wenbo, Liu Qian, Huang Heyan, Yuefeng Li |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE 2018-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/8444048/ |
Similar Items
- Slovene and Croatian word embeddings in terms of gender occupational analogies
  by: Matej Ulčar, et al.
  Published: (2021-07-01)
- Enhancing Word Embeddings for Improved Semantic Alignment
  by: Julian Szymański, et al.
  Published: (2024-12-01)
- Combining computational linguistics with sentence embedding to create a zero-shot NLIDB
  by: Yuriy Perezhohin, et al.
  Published: (2024-12-01)
- Advancing Arabic Word Embeddings: A Multi-Corpora Approach with Optimized Hyperparameters and Custom Evaluation
  by: Azzah Allahim, et al.
  Published: (2024-11-01)
- Text Alignment in the Service of Text Reuse Detection
  by: Hadar Miller, et al.
  Published: (2025-03-01)