41
Electric Vehicle Sentiment Analysis Using Large Language Models
Published 2024-11-01. “…EV companies are becoming significant competitors in the automotive industry and are projected to cover up to 30% of the United States light vehicle market by 2030. In this study, we present a comparative study of large language models (LLMs) including bidirectional encoder representations from transformers (BERT), robustly optimised BERT (RoBERTa), and a generalised autoregressive pre-training method (XLNet) using Lucid Motors and Tesla Motors YouTube datasets. …”
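For orientation, a minimal sketch of this kind of encoder comparison, assuming the Hugging Face transformers and datasets libraries; the checkpoints, toy comments, and hyperparameters below are illustrative stand-ins, not details taken from the study:

```python
# Hedged sketch: fine-tune and compare three encoder checkpoints on toy sentiment data.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Placeholder comments and labels, not the study's YouTube data.
toy = Dataset.from_dict({
    "text": ["love the range on this car", "charging network is unreliable",
             "build quality feels premium", "service wait times are awful"],
    "label": [1, 0, 1, 0],
}).train_test_split(test_size=0.5, seed=0)

def evaluate_checkpoint(ckpt):
    tok = AutoTokenizer.from_pretrained(ckpt)
    enc = toy.map(lambda b: tok(b["text"], truncation=True, padding="max_length",
                                max_length=64), batched=True)
    model = AutoModelForSequenceClassification.from_pretrained(ckpt, num_labels=2)
    trainer = Trainer(model=model,
                      args=TrainingArguments(output_dir="tmp", num_train_epochs=1,
                                             per_device_train_batch_size=4),
                      train_dataset=enc["train"], eval_dataset=enc["test"])
    trainer.train()
    return trainer.evaluate()

for ckpt in ["bert-base-uncased", "roberta-base", "xlnet-base-cased"]:
    print(ckpt, evaluate_checkpoint(ckpt))
```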
42
Enhancing Named Entity Recognition on HiNER Dataset Using Advanced NLP Techniques
Published 2025-05-01. “…Conversely, it lacks speed and accuracy. Therefore, the present research uses advanced NLP models such as bidirectional encoder representations from transformers (BERT), DistilBERT and the robustly optimized BERT approach (RoBERTa) for effective entity prediction performance. …”
43
Does the Choice of Topic Modeling Technique Impact the Interpretation of Aviation Incident Reports? A Methodological Assessment
Published 2025-05-01. “…This study presents a comparative analysis of four topic modeling techniques, Latent Dirichlet Allocation (LDA), Bidirectional Encoder Representations from Transformers (BERT), Probabilistic Latent Semantic Analysis (pLSA), and Non-negative Matrix Factorization (NMF), applied to aviation safety reports from the ATSB dataset spanning 2013–2023. …”
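As a hedged illustration of two of the techniques this entry compares, LDA and NMF can be fitted with scikit-learn roughly as follows; the three toy narratives and the parameter values are placeholders, not the ATSB data or the study's configuration, and the pLSA and BERT-based variants are not reproduced here:

```python
# Hedged sketch: fit LDA (on raw counts) and NMF (on TF-IDF) and list each topic's top terms.
from sklearn.decomposition import NMF, LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

narratives = ["engine failure during initial climb",          # placeholder incident texts
              "runway incursion during night operations",
              "bird strike shortly after take-off"]

def top_terms(model, vocab, k=4):
    return [[vocab[i] for i in comp.argsort()[-k:]] for comp in model.components_]

counts = CountVectorizer(stop_words="english")
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts.fit_transform(narratives))
print("LDA:", top_terms(lda, counts.get_feature_names_out()))

tfidf = TfidfVectorizer(stop_words="english")
nmf = NMF(n_components=2, random_state=0).fit(tfidf.fit_transform(narratives))
print("NMF:", top_terms(nmf, tfidf.get_feature_names_out()))
```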
44
Comparison of Deep Learning Sentiment Analysis Methods, Including LSTM and Machine Learning
Published 2023-11-01. “…In this case, it is crucial for researchers to explore the possibilities of updating certain tools, either combining them or developing them further to adapt them to modern tasks, in order to provide a clearer understanding of the results they produce. We present a comparison of several deep learning models, including convolutional neural networks, recurrent neural networks, and bidirectional long short-term memory networks, evaluated using different word embedding approaches, including Bidirectional Encoder Representations from Transformers (BERT) and its variants, FastText and Word2Vec. …”
45
Vietnamese Sentence Fact Checking Using the Incremental Knowledge Graph, Deep Learning, and Inference Rules on Online Platforms
Published 2025-01-01. “…ViKGFC integrates a Knowledge Graph (KG), inference rules, and the Knowledge Graph-Bidirectional Encoder Representations from Transformers (KG-BERT) deep learning model. …”
46
Detecting Chinese Disinformation with Fine-Tuned BERT and Contextual Techniques
Published 2025-12-01. “…Building on large language models (LLMs) like BERT (Bidirectional Encoder Representations from Transformers) provides a promising avenue for addressing this challenge. …”
47
A Cross-Product Analysis of Earphone Reviews Using Contextual Topic Modeling and Association Rule Mining
Published 2024-12-01. “…Therefore, this study addresses the need and rationale for having comprehensive sentiment analysis systems by integrating topic modeling and association rule mining to analyze online customer reviews of earphones sold on Amazon. It employs Bidirectional Encoder Representations from Transformers for Topic Modeling (BERTopic), a technique that generates coherent topics by effectively capturing contextual information, and Frequent Pattern Growth (FP-Growth), an efficient association rule mining algorithm used for discovering patterns and relationships in a dataset without candidate generation. …”
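To give a flavour of the association-rule half of such a pipeline, a hedged sketch using mlxtend's FP-Growth implementation follows; the aspect "transactions" are invented placeholders, and the upstream BERTopic stage is only indicated in a comment rather than reproduced:

```python
# Hedged sketch: mine frequent aspect combinations from invented earphone reviews with FP-Growth.
# A contextual topic-modeling stage would sit upstream, roughly:
#   from bertopic import BERTopic
#   topics, _ = BERTopic().fit_transform(review_texts)
import pandas as pd
from mlxtend.frequent_patterns import fpgrowth
from mlxtend.preprocessing import TransactionEncoder

transactions = [["bass", "mic"], ["fit", "battery"],
                ["mic", "battery"], ["bass", "battery"]]   # aspects mentioned per review

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(transactions).transform(transactions), columns=te.columns_)
print(fpgrowth(onehot, min_support=0.5, use_colnames=True))
```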
48
Optimizing an LSTM Self-Attention Architecture for Portuguese Sentiment Analysis Using a Genetic Algorithm
Published 2025-06-01. “…A key outcome of this study was that the optimization process produced a model that is competitive with a Bidirectional Encoder Representations from Transformers (BERT) model retrained for Portuguese, which was used as the baseline. …”
49
Content analysis of multi-annual time series of flood-related Twitter (X) data
Published 2025-02-01. “…We implement bidirectional encoder representations from transformers in combination with unsupervised clustering techniques (BERTopic) to automatically extract social media content, addressing transferability issues that arise from commonly used bag-of-words representations. …”
50
Graph neural networks embedded with domain knowledge for cyber threat intelligence entity and relationship mining
Published 2025-04-01. “…Specifically, first, domain knowledge is collected to build a domain knowledge graph, which is then embedded using graph convolutional networks (GCN) to enhance the feature representation of threat intelligence text. Next, the features from domain knowledge graph embedding and those generated by the bidirectional encoder representations from transformers (BERT) model are fused using the LayerNorm algorithm. …”
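A minimal, hedged sketch of the fusion step described here, in PyTorch: concatenate a graph-derived vector with a BERT feature vector and apply LayerNorm. The dimensions and random tensors are placeholders, and the GCN and BERT encoders themselves are not reproduced:

```python
# Hedged sketch: fuse knowledge-graph and BERT features by concatenation followed by LayerNorm.
import torch
import torch.nn as nn

class Fusion(nn.Module):
    def __init__(self, bert_dim=768, graph_dim=128):
        super().__init__()
        self.norm = nn.LayerNorm(bert_dim + graph_dim)

    def forward(self, bert_feats, graph_feats):
        return self.norm(torch.cat([bert_feats, graph_feats], dim=-1))

bert_feats = torch.randn(4, 768)    # stand-in for BERT text features
graph_feats = torch.randn(4, 128)   # stand-in for GCN knowledge-graph embeddings
print(Fusion()(bert_feats, graph_feats).shape)   # torch.Size([4, 896])
```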
51
Analysis of Short Texts Using Intelligent Clustering Methods
Published 2025-05-01. “…This article presents a comprehensive review of short text clustering using state-of-the-art methods: Bidirectional Encoder Representations from Transformers (BERT), Term Frequency-Inverse Document Frequency (TF-IDF), and the novel hybrid method Latent Dirichlet Allocation + BERT + Autoencoder (LDA + BERT + AE). …”
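For the TF-IDF baseline mentioned here, a hedged sketch with scikit-learn; the toy texts and cluster count are placeholders, and the BERT and LDA + BERT + AE variants are not reproduced:

```python
# Hedged sketch: cluster short texts from TF-IDF vectors with k-means.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

texts = ["train delayed again this morning", "the bus was late twice this week",
         "great coffee at the new downtown cafe", "best espresso in town"]
X = TfidfVectorizer(stop_words="english").fit_transform(texts)
print(KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X))
```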
52
Empowering geoportals HCI with task-oriented chatbots through NLP and deep transfer learning
Published 2024-10-01. “…The notion of deep transfer learning (DTL) was then put into practice by customizing a pre-trained BERT (Bidirectional Encoder Representations from Transformers) model for our particular aim and creating a task-oriented conversational agent. …”
53
IndoGovBERT: A Domain-Specific Language Model for Processing Indonesian Government SDG Documents
Published 2024-11-01. “…This circumstance makes it difficult to automate document processing and improve the efficacy of SDG-related government efforts. The presented study introduces IndoGovBERT, a Bidirectional Encoder Representations from Transformers (BERT)-based pre-trained language model (PTLM) built with domain-specific corpora, leveraging the Indonesian government’s public and internal documents. …”
54
Automated and efficient Bangla signboard detection, text extraction, and novel categorization method for underrepresented languages in smart cities
Published 2025-06-01. “…Finally, fine-tuning of the pre-trained multilingual Bidirectional Encoder Representations from Transformers (BERT) model is implemented to mimic human perception to achieve Named Entity Recognition (NER) capabilities. …”
55
Ransomware detection and family classification using fine-tuned BERT and RoBERTa models
Published 2025-06-01. “…This research explores these challenges and proposes a novel approach using hyperparameter-optimized transfer learning-based models, Bidirectional Encoder Representations from Transformers (BERT), and a Robustly Optimized BERT Approach (RoBERTa), to not only detect but also classify ransomware targeting IoT devices by analyzing dynamically executed API call sequences in a sandbox environment. …”
56
Intelligent integration of AI and IoT for advancing ecological health, medical services, and community prosperity
Published 2025-08-01. “…Convolutional neural networks (CNNs) with transfer learning enabled by ResNet provide high-accuracy image recognition, which can be used for waste classification. Bidirectional Encoder Representations from Transformers (BERT) allows multilingual users to interact and communicate properly in any linguistic environment. …”
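As a hedged sketch of the ResNet transfer-learning idea mentioned here, assuming torchvision; the class count and dummy input are placeholders, and the pretrained weights are downloaded on first use:

```python
# Hedged sketch: freeze a pretrained ResNet backbone and retrain only a new final layer.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False                       # keep pretrained features fixed
model.fc = nn.Linear(model.fc.in_features, 4)     # e.g. four waste categories (placeholder)

print(model(torch.randn(1, 3, 224, 224)).shape)   # torch.Size([1, 4])
```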
57
TSB-Forecast: A Short-Term Load Forecasting Model in Smart Cities for Integrating Time Series Embeddings and Large Language Models
Published 2025-01-01. “…The model uses Sentence Bidirectional Encoder Representations from Transformers (SBERT) to extract semantic characteristics from textual news and Time to Vector (Time2Vec) to capture temporal patterns, acquiring cyclical behavior and context-sensitive impacts. …”
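A hedged sketch of the two feature extractors named here, assuming the sentence-transformers library and a small hand-written Time2Vec layer; the checkpoint, dimensions, and inputs are placeholders, and the downstream forecaster is not reproduced:

```python
# Hedged sketch: SBERT embedding for a news headline plus a Time2Vec encoding of a timestamp.
import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer

class Time2Vec(nn.Module):
    """One linear term plus sinusoidal terms, following the Time2Vec formulation."""
    def __init__(self, out_dim=8):
        super().__init__()
        self.linear = nn.Linear(1, 1)
        self.periodic = nn.Linear(1, out_dim - 1)

    def forward(self, t):                          # t: (batch, 1) scalar timestamps
        return torch.cat([self.linear(t), torch.sin(self.periodic(t))], dim=-1)

sbert = SentenceTransformer("all-MiniLM-L6-v2")    # placeholder checkpoint
text_vec = torch.tensor(sbert.encode(["grid maintenance announced for Friday"]))
time_vec = Time2Vec()(torch.tensor([[0.75]]))      # e.g. normalised hour of day
print(torch.cat([text_vec, time_vec], dim=-1).shape)   # combined input for a forecaster
```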
58
Obfuscated Malware Detection and Classification in Network Traffic Leveraging Hybrid Large Language Models and Synthetic Data
Published 2025-01-01. “…This phase leverages a fine-tuned LLM, Bidirectional Encoder Representations from Transformers (BERT), with classification layers. …”
59
M.I.N.I.-KID interviews with adolescents: a corpus-based language analysis of adolescents with depressive disorders and the possibilities of continuation using Chat GPT
Published 2024-12-01. “…The transcribed interviews comprised 4,077 question-answer pairs, with which we predicted the clinical rating (depressive/non-depressive) using a feedforward neural network that received BERT (Bidirectional Encoder Representations from Transformers) vectors of interviewer questions and patient answers as input. …”
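A hedged sketch of the prediction set-up described here: BERT vectors for a question and an answer are concatenated and passed to a small feedforward network. The checkpoint, layer sizes, and example sentences are placeholders, not the study's configuration or data:

```python
# Hedged sketch: [CLS] vectors of a question and an answer feed a small feedforward classifier.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

ckpt = "bert-base-uncased"                         # placeholder checkpoint
tok = AutoTokenizer.from_pretrained(ckpt)
bert = AutoModel.from_pretrained(ckpt)

def cls_vector(text):
    with torch.no_grad():
        out = bert(**tok(text, return_tensors="pt", truncation=True))
    return out.last_hidden_state[:, 0]             # (1, 768) [CLS] representation

clf = nn.Sequential(nn.Linear(2 * 768, 128), nn.ReLU(), nn.Linear(128, 2))
pair = torch.cat([cls_vector("How have you been sleeping?"),
                  cls_vector("I barely sleep at all.")], dim=-1)
print(clf(pair).softmax(dim=-1))                   # untrained scores for the two ratings
```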
60
Enhancing Pulmonary Disease Prediction Using Large Language Models With Feature Summarization and Hybrid Retrieval-Augmented Generation: Multicenter Methodological Study Based on R...
Published 2025-06-01. “…The traditional deep learning model, BERT (Bidirectional Encoder Representations from Transformers), was also compared to assess the superiority of LLMs. …”