GDRMA: Graph Neural Networks for Document Retrievals With Mean Aggregation
| Main Author: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2024-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10781331/ |
| Summary: | With the proliferation of cloud services and high-capacity hard drives, the volume of stored document data is rapidly increasing. Consequently, large-scale document retrieval tasks have been attracting significant attention. Recently, embedding-based methods, including language models and graph neural networks (GNNs), have been developed to handle synonyms in documents effectively. However, a major limitation of these approaches is scalability: when N-grams are taken into account, many query keywords are not supported by language models, and existing GNN-based methods can exhaust GPU memory. To address this issue, we propose Graph neural networks for Document Retrievals with Mean Aggregation (GDRMA). First, we carefully select a subset of words as important words and derive document embeddings using our novel GNNs on the important-words-documents graph to reduce GPU memory usage. Then, we quickly learn an embedding of the target query keyword using “mean aggregation” and generate a ranking of related documents on CPUs. The main advantage is that our proposed GNN connects these two steps smoothly, and the generated ranking incorporates synonyms based on co-occurrence relationships. We conducted exhaustive experiments on real datasets and confirmed that GDRMA outperforms comparable methods. |
| ISSN: | 2169-3536 |
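
The two-step pipeline described in the summary (GNN-derived embeddings from the important-words-documents graph, followed by CPU-side mean aggregation and ranking) can be illustrated with a short sketch. The following is a minimal, hypothetical reconstruction of the second step only, assuming the query keyword's embedding is the mean of the embeddings of important words it co-occurs with, and that documents are ranked by cosine similarity; the paper's actual GNN, graph construction, and scoring function are not reproduced here, and all names and dimensions below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for step 1 of GDRMA: embeddings that would come from the GNN
# trained on the important-words-documents graph (one vector per important
# word and per document). Here they are random for illustration only.
word_emb = {w: rng.standard_normal(64) for w in ["storage", "disk", "cloud"]}
doc_emb = rng.standard_normal((1000, 64))  # 1000 documents, 64-dim embeddings


def embed_query(cooccurring_words, word_emb):
    """Mean-aggregate the embeddings of important words that co-occur
    with the (possibly unseen) query keyword."""
    vecs = np.stack([word_emb[w] for w in cooccurring_words])
    return vecs.mean(axis=0)


def rank_documents(query_vec, doc_emb, top_k=10):
    """Rank documents by cosine similarity to the query embedding
    (runs entirely on the CPU)."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_emb / np.linalg.norm(doc_emb, axis=1, keepdims=True)
    scores = d @ q
    return np.argsort(-scores)[:top_k]


# An unseen query keyword (e.g., an N-gram unsupported by a language model)
# still gets an embedding via the important words it co-occurs with.
query_vec = embed_query(["storage", "disk"], word_emb)
print(rank_documents(query_vec, doc_emb))
```

Because the query embedding is a simple mean over precomputed vectors, no GPU or model forward pass is needed at query time, which matches the abstract's claim that ranking is generated quickly on CPUs.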