Review of key technologies in knowledge graphs powered by code large language models

Traditional knowledge graph technologies still face significant challenges in converting human knowledge, expressed in natural language, into a formal language-based knowledge graph and utilizing it effectively. In recent years, code large language models (LLMs) have demonstrated remarkable capabilities in understanding both natural and formal languages, as well as in translating between them. These advancements are expected to drive significant breakthroughs in developing next-generation knowledge graph technologies. This paper reviews the application of code LLMs to knowledge graphs (KGs). Firstly, it systematically analyzes the role of code LLMs in enhancing key knowledge graph technologies across three critical areas: construction, reasoning, and question answering. Secondly, it introduces the existing methodologies in these areas in detail. Finally, it summarizes the current state of the field and offers insights into the future of knowledge graph technologies empowered by code LLMs. In the future, knowledge representation based on programming languages is expected to enable more efficient, automated, and complex operations on knowledge graphs, realizing knowledge programming.


Bibliographic Details
Main Authors: LI Zixuan, BAI Long, REN Weicheng, SU Miao, LIU Wenxuan, CHEN Lei, JIN Xiaolong
Format: Article
Language: zho
Published: China InfoCom Media Group 2025-03-01
Series: 大数据
Subjects:
Online Access:http://www.j-bigdataresearch.com.cn/thesisDetails#10.11959/j.issn.2096-0271.2025022
_version_ 1850140687792603136
author LI Zixuan
BAI Long
REN Weicheng
SU Miao
LIU Wenxuan
CHEN Lei
JIN Xiaolong
author_facet LI Zixuan
BAI Long
REN Weicheng
SU Miao
LIU Wenxuan
CHEN Lei
JIN Xiaolong
author_sort LI Zixuan
collection DOAJ
description Traditional knowledge graph technologies still face significant challenges in converting human knowledge, expressed in natural language, into a formal language-based knowledge graph and utilizing it effectively. In recent years, code large language models (LLMs) have demonstrated remarkable capabilities in understanding both natural and formal languages, as well as in translating between them. These advancements are expected to drive significant breakthroughs in developing next-generation knowledge graph technologies. This paper reviews the application of code LLMs to knowledge graphs (KGs). Firstly, it systematically analyzes the role of code LLMs in enhancing key knowledge graph technologies across three critical areas: construction, reasoning, and question answering. Secondly, it introduces the existing methodologies in these areas in detail. Finally, it summarizes the current state of the field and offers insights into the future of knowledge graph technologies empowered by code LLMs. In the future, knowledge representation based on programming languages is expected to enable more efficient, automated, and complex operations on knowledge graphs, realizing knowledge programming.
format Article
id doaj-art-d9ec6c8cc7914816bcc4a0f4ac2b73f3
institution OA Journals
issn 2096-0271
language zho
publishDate 2025-03-01
publisher China InfoCom Media Group
record_format Article
series 大数据
spelling doaj-art-d9ec6c8cc7914816bcc4a0f4ac2b73f32025-08-20T02:29:42ZzhoChina InfoCom Media Group大数据2096-02712025-03-0111192886967612Review of key technologies in knowledge graphs powered by code large language modelsLI ZixuanBAI LongREN WeichengSU MiaoLIU WenxuanCHEN Lei³JIN XiaolongTraditional knowledge graph technologies still face significant challenges in converting human knowledge, expressed in natural language, into a formal language-based knowledge graph and utilizing it effectively. In recent years, code large language models (LLMs) have demonstrated remarkable capabilities in understanding both natural and formal languages, as well as in translating between them. These advancements are expected to drive significant breakthroughs in developing next-generation knowledge graph technologies. This paper reviews the application of code LLMs in KGs. Firstly, this paper systematically analyzes the role of code LLMs in enhancing key knowledge graph technologies across three critical areas: construction, reasoning, and question-answering. Secondly, a relatively detailed introduction to the existing methodologies in these areas is provided. Finally, this paper summarizes the current state in the field and offers insights into the future of knowledge graph technologies empowered by code LLMs. In the future, knowledge representation based on programming languages is expected to enable more efficient, automated, and complex operations on knowledge graphs, realizing knowledge programming.http://www.j-bigdataresearch.com.cn/thesisDetails#10.11959/j.issn.2096-0271.2025022code large language modellarge language model
spellingShingle LI Zixuan
BAI Long
REN Weicheng
SU Miao
LIU Wenxuan
CHEN Lei
JIN Xiaolong
Review of key technologies in knowledge graphs powered by code large language models
大数据
code large language model
large language model
title Review of key technologies in knowledge graphs powered by code large language models
title_full Review of key technologies in knowledge graphs powered by code large language models
title_fullStr Review of key technologies in knowledge graphs powered by code large language models
title_full_unstemmed Review of key technologies in knowledge graphs powered by code large language models
title_short Review of key technologies in knowledge graphs powered by code large language models
title_sort review of key technologies in knowledge graphs powered by code large language models
topic code large language model
large language model
url http://www.j-bigdataresearch.com.cn/thesisDetails#10.11959/j.issn.2096-0271.2025022
work_keys_str_mv AT lizixuan reviewofkeytechnologiesinknowledgegraphspoweredbycodelargelanguagemodels
AT bailong reviewofkeytechnologiesinknowledgegraphspoweredbycodelargelanguagemodels
AT renweicheng reviewofkeytechnologiesinknowledgegraphspoweredbycodelargelanguagemodels
AT sumiao reviewofkeytechnologiesinknowledgegraphspoweredbycodelargelanguagemodels
AT liuwenxuan reviewofkeytechnologiesinknowledgegraphspoweredbycodelargelanguagemodels
AT chenlei reviewofkeytechnologiesinknowledgegraphspoweredbycodelargelanguagemodels
AT jinxiaolong reviewofkeytechnologiesinknowledgegraphspoweredbycodelargelanguagemodels