Research on RAG-Based Cognitive Large Language Model Training Method for Power Standard Knowledge
Electrical standards encompass complex technical requirements across multiple disciplines, making their management and application a significant challenge that urgently requires efficient solutions. This paper proposes a knowledge graph retrieval-enhanced training method for large language models (LLMs).
| Main Authors: | Sai Zhang, Xiaoxuan Fan, Bochuan Song, Xiao Liang, Qiang Zhang, Zhihao Wang, Bo Zhang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Ital Publication, 2025-06-01 |
| Series: | HighTech and Innovation Journal |
| Subjects: | Electric Standards Knowledge; LLM; RAG; Knowledge Graph; Semantic Reasoning |
| Online Access: | https://hightechjournal.org/index.php/HIJ/article/view/969 |
| author | Sai Zhang Xiaoxuan Fan Bochuan Song Xiao Liang Qiang Zhang Zhihao Wang Bo Zhang |
| collection | DOAJ |
| description |
Electrical standards encompass complex technical requirements across multiple disciplines, making their management and application a significant challenge that urgently requires efficient solutions. This paper proposes a knowledge graph retrieval-enhanced training method for large language models (LLMs). By leveraging a pre-trained language model (PLM), highly similar subgraphs are retrieved from the electrical standards knowledge graph. These subgraphs are then parsed into triples using entity linking and semantic reasoning. The triples are converted into natural language text by the LLM, which combines them with the input question to perform reasoning and generate accurate answers. The proposed method addresses the complexity of question answering for electrical standards and offers a novel approach for managing and applying these standards in the field of electrical engineering. Experimental results demonstrate that this approach significantly enhances the model's understanding of electrical standards, enabling it to generate more accurate answers.
|
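The pipeline in the description above (retrieve similar subgraphs, parse them into triples, verbalize the triples, and combine the text with the question for LLM reasoning) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy knowledge graph, the function names, and the token-overlap similarity (a stand-in for PLM embedding similarity) are all assumptions.

```python
# Illustrative sketch of the KG-RAG pipeline described in the abstract.
# A real system would use a pre-trained language model (PLM) to embed the
# question and subgraphs, and an LLM to generate the final answer; here
# token-set Jaccard overlap stands in for embedding similarity.

KG = [  # hypothetical electrical-standards triples: (head, relation, tail)
    ("circuit breaker", "rated voltage", "12 kV"),
    ("circuit breaker", "governing standard", "IEC 62271-100"),
    ("transformer", "cooling class", "ONAN"),
]

def jaccard(a: str, b: str) -> float:
    """Stand-in for PLM similarity: Jaccard overlap of lowercase token sets."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def verbalize(triple) -> str:
    """Convert a (head, relation, tail) triple into natural-language text."""
    head, relation, tail = triple
    return f"The {relation} of the {head} is {tail}."

def retrieve(question: str, kg, k: int = 2):
    """Return the k triples whose verbalization best matches the question."""
    return sorted(kg, key=lambda tr: jaccard(verbalize(tr), question),
                  reverse=True)[:k]

def build_prompt(question: str, kg) -> str:
    """Combine retrieved knowledge with the input question for LLM reasoning."""
    facts = "\n".join(verbalize(tr) for tr in retrieve(question, kg))
    return f"Context:\n{facts}\n\nQuestion: {question}\nAnswer:"

print(build_prompt("What is the rated voltage of the circuit breaker?", KG))
```

The prompt produced this way grounds the LLM's answer in the retrieved standard facts rather than in its parametric memory, which is the core idea of the retrieval-enhanced method the record describes.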
| format | Article |
| id | doaj-art-010e05eb5b73494bac9aa45e96b7918e |
| institution | Kabale University |
| issn | 2723-9535 |
| language | English |
| publishDate | 2025-06-01 |
| publisher | Ital Publication |
| record_format | Article |
| series | HighTech and Innovation Journal |
| doi | 10.28991/HIJ-2025-06-02-05 |
| volume | 6 |
| issue | 2 |
| author_details | Sai Zhang (ORCID 0009-0000-0963-8893), Xiaoxuan Fan (ORCID 0009-0006-2463-8983), Bochuan Song, Xiao Liang, Qiang Zhang, Zhihao Wang: State Grid Laboratory of Grid Advanced Computing and Applications, China Electric Power Research Institute Co., Ltd., Beijing; Bo Zhang: State Grid Wuxi Power Supply Company of Jiangsu Electric Power Co., Ltd., Wuxi |
| title | Research on RAG-Based Cognitive Large Language Model Training Method for Power Standard Knowledge |
| topic | Electric Standards Knowledge LLM RAG Knowledge Graph Semantic Reasoning |
| url | https://hightechjournal.org/index.php/HIJ/article/view/969 |