Research on RAG-Based Cognitive Large Language Model Training Method for Power Standard Knowledge


Bibliographic Details
Main Authors: Sai Zhang, Xiaoxuan Fan, Bochuan Song, Xiao Liang, Qiang Zhang, Zhihao Wang, Bo Zhang
Format: Article
Language: English
Published: Ital Publication 2025-06-01
Series: HighTech and Innovation Journal
Subjects:
Online Access:https://hightechjournal.org/index.php/HIJ/article/view/969
Description
Summary: Electrical standards encompass complex technical requirements across multiple disciplines, making their management and application a significant challenge that urgently requires efficient solutions. This paper proposes a knowledge graph retrieval-enhanced training method for large language models (LLMs). By leveraging a pre-trained language model (PLM), highly similar subgraphs are retrieved from the electrical standards knowledge graph. These subgraphs are then parsed into triples using entity linking and semantic reasoning. The triples are converted into natural language text by the LLM, which combines them with the input question to perform reasoning and generate accurate answers. The proposed method addresses the complexity of question answering for electrical standards and offers a novel approach for managing and applying these standards in the field of electrical engineering. Experimental results demonstrate that this approach significantly enhances the model's understanding of electrical standards, enabling it to generate more accurate answers.
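The retrieval step described in the summary — verbalizing retrieved knowledge-graph triples and combining them with the question into a prompt for the LLM — can be sketched as follows. This is a minimal illustrative outline, not the authors' implementation: the function names, the triple format `(head, relation, tail)`, and the placeholder facts are all assumptions.

```python
# Minimal sketch of the triple-verbalization and prompt-construction steps.
# All names and example triples are illustrative, not from the paper.

def triples_to_text(triples):
    """Verbalize (head, relation, tail) triples as plain sentences."""
    return " ".join(f"{head} {relation} {tail}." for head, relation, tail in triples)

def build_prompt(question, triples):
    """Combine the verbalized triples with the input question so the
    LLM can reason over the retrieved subgraph context."""
    context = triples_to_text(triples)
    return f"Context: {context}\nQuestion: {question}\nAnswer:"

# Hypothetical retrieved subgraph, flattened into triples.
triples = [
    ("StandardA", "applies to", "distribution transformers"),
    ("StandardA", "specifies", "insulation test procedures"),
]
prompt = build_prompt("Which standard covers insulation testing?", triples)
```

In a full pipeline, the `triples` list would come from subgraph retrieval with a PLM encoder followed by entity linking, and `prompt` would be passed to the LLM for answer generation.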
ISSN:2723-9535