Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data
Automatic text summarization (ATS) has emerged as a vital method for condensing large volumes of text into concise, useful summaries, making information retrieval more effective. ATS reduces textual data to coherent, shorter versions; applied to psychological text, it helps extract insights and emotional states, supporting better analysis and understanding of psychological content.
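The abstract evaluates summaries by their unigram and bigram overlap with references (ROUGE-1 and ROUGE-2). As an illustration of what those metrics measure, here is a minimal sketch of ROUGE-N precision, recall, and F1 over whitespace tokens; it is not the paper's own scoring code, and the example texts are invented.

```python
from collections import Counter

def ngrams(tokens, n):
    """Multiset of all contiguous n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n(candidate, reference, n):
    """ROUGE-N precision, recall, and F1 between two whitespace-tokenized texts."""
    cand = ngrams(candidate.lower().split(), n)
    ref = ngrams(reference.lower().split(), n)
    if not cand or not ref:
        return {"p": 0.0, "r": 0.0, "f1": 0.0}
    overlap = sum((cand & ref).values())  # clipped n-gram matches
    p = overlap / sum(cand.values())
    r = overlap / sum(ref.values())
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return {"p": p, "r": r, "f1": f1}

# Hypothetical reference and generated summary, for illustration only.
ref = "the model summarizes psychological text effectively"
cand = "the model summarizes text well"
print(rouge_n(cand, ref, 1))  # ROUGE-1: unigram overlap
print(rouge_n(cand, ref, 2))  # ROUGE-2: bigram overlap
```

A model that scores well on both ROUGE-1 and ROUGE-2, as the abstract claims for T5-LSTM FusionNet, matches not just individual words but their ordering, which is the sense in which it "retains sequence integrity."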
Saved in:
| Main Authors: | Bilal Khan, Muhammad Usman, Inayat Khan, Jawad Khan, Dildar Hussain, Yeong Hyeon Gu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | Automatic text summarization, T5-LSTM FusionNet, psychology, natural language processing, machine learning |
| Online Access: | https://ieeexplore.ieee.org/document/10903679/ |
| _version_ | 1850072239603449856 |
|---|---|
| author | Bilal Khan Muhammad Usman Inayat Khan Jawad Khan Dildar Hussain Yeong Hyeon Gu |
| author_facet | Bilal Khan Muhammad Usman Inayat Khan Jawad Khan Dildar Hussain Yeong Hyeon Gu |
| author_sort | Bilal Khan |
| collection | DOAJ |
| description | Automatic text summarization (ATS) has emerged as a vital method for condensing large volumes of text into concise, useful summaries, making information retrieval more effective. ATS reduces textual data to coherent, shorter versions; applied to psychological text, it helps extract insights and emotional states, supporting better analysis and understanding of psychological content. In this context, this study proposes a new hybrid model, T5-LSTM FusionNet, to enhance text summarization in the field of psychology. The motivation derives from the growing volume and online accessibility of psychological literature, which demands techniques that extract significant findings quickly and reliably. The proposed T5-LSTM FusionNet model combines the strengths of the Text-to-Text Transfer Transformer (T5) and Long Short-Term Memory (LSTM) networks. A dataset of 5480 records, collected from numerous psychology-related websites, was used to verify its performance. T5-LSTM FusionNet's overall performance is evaluated against several recent models, including T5, LSTM, BERT, and DistilBERT. Metrics such as accuracy, precision, recall, F1-score, and ROUGE scores are used to assess the model's summarization quality. With T5-LSTM FusionNet achieving a precision of 0.72, recall of 0.72, F1-score of 0.71, and accuracy of 0.74, the results show clear improvement over the individual T5 and LSTM models, as well as competitive models such as BERT and DistilBERT, in terms of summarization effectiveness and accuracy. Furthermore, T5-LSTM FusionNet performs well in capturing both unigram and bigram overlaps with reference summaries, as shown by a comparative study using ROUGE metrics. This suggests that T5-LSTM FusionNet can retain sequence integrity and relevance in summarization tasks. This work advances ATS techniques in psychology by presenting a hybrid model that combines sequential and transformer-based learning strategies. |
| format | Article |
| id | doaj-art-4de2dbe2a39b43f2abd70b7c8601eb7c |
| institution | DOAJ |
| issn | 2169-3536 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| spelling | doaj-art-4de2dbe2a39b43f2abd70b7c8601eb7c2025-08-20T02:47:07ZengIEEEIEEE Access2169-35362025-01-0113375573757110.1109/ACCESS.2025.354059010903679Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological DataBilal Khan0https://orcid.org/0000-0002-6816-3776Muhammad Usman1https://orcid.org/0000-0001-8363-0179Inayat Khan2https://orcid.org/0009-0002-7542-736XJawad Khan3https://orcid.org/0000-0001-8263-7213Dildar Hussain4https://orcid.org/0000-0001-9007-6284Yeong Hyeon Gu5https://orcid.org/0000-0002-0002-9386Department of Computer Software Engineering, University of Engineering and Technology, Mardan, PakistanDepartment of Computer Science, University of Engineering and Technology, Mardan, PakistanDepartment of Computer Science, University of Engineering and Technology, Mardan, PakistanSchool of Computing, Gachon University, Seongnam-si, Republic of KoreaDepartment of Artificial Intelligence and Data Science, Sejong University, Seoul, Republic of KoreaDepartment of Artificial Intelligence and Data Science, Sejong University, Seoul, Republic of KoreaAutomatic text summarization (ATS) has developed as a vital method for compressing massive amounts of textual content into concise and useful summaries, to retrieve more effective and useful information. ATS reduces textual statistics into coherent and shorter versions especially focusing on psychological text summarization to extract insights and emotional states, assisting in better analysis and understanding of psychological contents. In this context, this study proposes a new hybrid model T5-LSTM FusionNet, to enhance textual content summarization in the field of psychology. The motivation derives from the developing extent and accessibility of psychological literature online, which necessitates exceptional techniques for extracting significant findings quickly and reliably. 
The recommended T5-LSTM FusionNet model combines the benefits of Text-to-Text Transfer Transformer (T5) and Long Short-Term Memory (LSTM). The dataset with 5480 records, accumulated from numerous psychology-associated websites, has been used to verify its performance. T5-LSTM FusionNet’s overall performance is evaluated against several latest models along with T5, LSTM, BERT, and DistilBERT. Measures such as accuracy, precision, recall, F1-score, and ROUGE rankings are used to evaluate the model’s exceptional summarization. With T5-LSTM FusionNet accomplishing a precision of 0.72, recall of 0.72, F1-score of 0.71, and accuracy of 0.74, the effects show significant improvement over individual models like T5 and LSTM, as well as competitive models like BERT and DistilBERT, in terms of summarization effectiveness and accuracy. Furthermore, T5-LSTM FusionNet plays thoroughly in catching both unigram and bigram overlaps concerning summaries, as proven through a comparison study using ROUGE metrics. This suggests that T5-LSTM FusionNet can retain sequence integrity and relevance in summarizing tasks. This work advances ATS techniques in psychology by presenting a hybrid model that combines sequential and transformer-based learning strategies.https://ieeexplore.ieee.org/document/10903679/Automatic text summarizationT5-LSTM FusionNetpsychologynatural language processingmachine learning |
| spellingShingle | Bilal Khan Muhammad Usman Inayat Khan Jawad Khan Dildar Hussain Yeong Hyeon Gu Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data IEEE Access Automatic text summarization T5-LSTM FusionNet psychology natural language processing machine learning |
| title | Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data |
| title_full | Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data |
| title_fullStr | Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data |
| title_full_unstemmed | Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data |
| title_short | Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data |
| title_sort | next generation text summarization a t5 lstm fusionnet hybrid approach for psychological data |
| topic | Automatic text summarization T5-LSTM FusionNet psychology natural language processing machine learning |
| url | https://ieeexplore.ieee.org/document/10903679/ |
| work_keys_str_mv | AT bilalkhan nextgenerationtextsummarizationat5lstmfusionnethybridapproachforpsychologicaldata AT muhammadusman nextgenerationtextsummarizationat5lstmfusionnethybridapproachforpsychologicaldata AT inayatkhan nextgenerationtextsummarizationat5lstmfusionnethybridapproachforpsychologicaldata AT jawadkhan nextgenerationtextsummarizationat5lstmfusionnethybridapproachforpsychologicaldata AT dildarhussain nextgenerationtextsummarizationat5lstmfusionnethybridapproachforpsychologicaldata AT yeonghyeongu nextgenerationtextsummarizationat5lstmfusionnethybridapproachforpsychologicaldata |