Showing 41 - 60 results of 66 for search 'bidirectional encoder presentation from (transformers OR transformed)', query time: 0.13s
  41.

    Electric Vehicle Sentiment Analysis Using Large Language Models by Hemlata Sharma, Faiz Ud Din, Bayode Ogunleye

    Published 2024-11-01
    “…EV companies are becoming significant competitors in the automotive industry and are projected to cover up to 30% of the United States light vehicle market by 2030. In this study, we present a comparative study of large language models (LLMs), including bidirectional encoder representations from transformers (BERT), robustly optimised BERT (RoBERTa), and a generalised autoregressive pre-training method (XLNet), using Lucid Motors and Tesla Motors YouTube datasets. …”
    Get full text
    Article
  42.

    ENHANCING NAMED ENTITY RECOGNITION ON HINER DATASET USING ADVANCED NLP TECHNIQUES by Harshvardhan Pardeshi, Prof. Piyush Pratap Singh

    Published 2025-05-01
    “…Conversely, it lacks speed and accuracy. Therefore, the present research uses advanced NLP models such as bidirectional encoder representations from transformers (BERT), DistilBERT, and the robustly optimized BERT approach (RoBERTa) for effective entity prediction performance. …”
    Get full text
    Article
  43.

    Does the Choice of Topic Modeling Technique Impact the Interpretation of Aviation Incident Reports? A Methodological Assessment by Aziida Nanyonga, Keith Joiner, Ugur Turhan, Graham Wild

    Published 2025-05-01
    “…This study presents a comparative analysis of four topic modeling techniques—Latent Dirichlet Allocation (LDA), Bidirectional Encoder Representations from Transformers (BERT), Probabilistic Latent Semantic Analysis (pLSA), and Non-negative Matrix Factorization (NMF)—applied to aviation safety reports from the ATSB dataset spanning 2013–2023. …”
    Get full text
    Article
  44.

    Comparison of Deep Learning Sentiment Analysis Methods, Including LSTM and Machine Learning by Jean Max T. Habib, A. A. Poguda

    Published 2023-11-01
    “…In this case, it is crucial for researchers to explore ways of updating existing tools, combining or extending them to suit modern tasks and to provide a clearer understanding of their results. We present a comparison of several deep learning models, including convolutional neural networks, recurrent neural networks, and bidirectional long short-term memory, evaluated using different word embedding approaches, including Bidirectional Encoder Representations from Transformers (BERT) and its variants, FastText and Word2Vec. …”
    Get full text
    Article
  45.

    Vietnamese Sentence Fact Checking Using the Incremental Knowledge Graph, Deep Learning, and Inference Rules on Online Platforms by Huong To Duong, Van Hai Ho, Phuc Do

    Published 2025-01-01
    “…ViKGFC integrates a Knowledge Graph (KG), inference rules, and the Knowledge Graph-Bidirectional Encoder Representations from Transformers (KG-BERT) deep learning model. …”
    Get full text
    Article
  46.

    Detecting Chinese Disinformation with Fine-Tuned BERT and Contextual Techniques by Lixin Yun, Sheng Yun, Haoran Xue

    Published 2025-12-01
    “…Building on large language models (LLMs) like BERT (Bidirectional Encoder Representations from Transformers) provides a promising avenue for addressing this challenge. …”
    Get full text
    Article
  47.

    A Cross-Product Analysis of Earphone Reviews Using Contextual Topic Modeling and Association Rule Mining by Ugbold Maidar, Minyoung Ra, Donghee Yoo

    Published 2024-12-01
    “…Therefore, this study addresses the need and rationale for having comprehensive sentiment analysis systems by integrating topic modeling and association rule mining to analyze online customer reviews of earphones sold on Amazon. It employs Bidirectional Encoder Representations from Transformers for Topic Modeling (BERTopic), a technique that generates coherent topics by effectively capturing contextual information, and Frequent Pattern Growth (FP-Growth), an efficient association rule mining algorithm used for discovering patterns and relationships in a dataset without candidate generation. …”
    Get full text
    Article
  48.

    Optimizing an LSTM Self-Attention Architecture for Portuguese Sentiment Analysis Using a Genetic Algorithm by Daniel Parada, Alexandre Branco, Marcos Silva, Fábio Mendonça, Sheikh Mostafa, Fernando Morgado-Dias

    Published 2025-06-01
    “…A key outcome of this study was that the optimization process produced a model that is competitive with a Bidirectional Encoder Representations from Transformers (BERT) model retrained for Portuguese, which was used as the baseline. …”
    Get full text
    Article
  49.

    Content analysis of multi-annual time series of flood-related Twitter (X) data by N. Veigel, H. Kreibich, J. A. de Bruijn, J. C. J. H. Aerts, A. Cominola

    Published 2025-02-01
    “…We implement bidirectional encoder representations from transformers in combination with unsupervised clustering techniques (BERTopic) to automatically extract social media content, addressing transferability issues that arise from commonly used bag-of-words representations. …”
    Get full text
    Article
  50.

    Graph neural networks embedded with domain knowledge for cyber threat intelligence entity and relationship mining by Gan Liu, Kai Lu, Saiqi Pi

    Published 2025-04-01
    “…Specifically, first, domain knowledge is collected to build a domain knowledge graph, which is then embedded using graph convolutional networks (GCN) to enhance the feature representation of threat intelligence text. Next, the features from domain knowledge graph embedding and those generated by the bidirectional encoder representations from transformers (BERT) model are fused using the Layernorm algorithm. …”
    Get full text
    Article
  51.

    Analysis of Short Texts Using Intelligent Clustering Methods by Jamalbek Tussupov, Akmaral Kassymova, Ayagoz Mukhanova, Assyl Bissengaliyeva, Zhanar Azhibekova, Moldir Yessenova, Zhanargul Abuova

    Published 2025-05-01
    “…This article presents a comprehensive review of short text clustering using state-of-the-art methods: Bidirectional Encoder Representations from Transformers (BERT), Term Frequency-Inverse Document Frequency (TF-IDF), and the novel hybrid method Latent Dirichlet Allocation + BERT + Autoencoder (LDA + BERT + AE). …”
    Get full text
    Article
  52.

    Empowering geoportals HCI with task-oriented chatbots through NLP and deep transfer learning by Mohammad H. Vahidnia

    Published 2024-10-01
    “…The notion of deep transfer learning (DTL) was then put into practice by customizing a pre-trained BERT (Bidirectional Encoder Representations from Transformers) model for our particular aim and creating a task-oriented conversational agent. …”
    Get full text
    Article
  53.

    IndoGovBERT: A Domain-Specific Language Model for Processing Indonesian Government SDG Documents by Agus Riyadi, Mate Kovacs, Uwe Serdült, Victor Kryssanov

    Published 2024-11-01
    “…This circumstance makes it difficult to automate document processing and improve the efficacy of SDG-related government efforts. The presented study introduces IndoGovBERT, a Bidirectional Encoder Representations from Transformers (BERT)-based PTLM built with domain-specific corpora, leveraging the Indonesian government’s public and internal documents. …”
    Get full text
    Article
  54.

    Automated and efficient Bangla signboard detection, text extraction, and novel categorization method for underrepresented languages in smart cities by Tanmoy Mazumder, Fariha Nusrat, Abu Bakar Siddique Mahi, Jolekha Begum Brishty, Rashik Rahman, Tanjina Helaly

    Published 2025-06-01
    “…Finally, fine-tuning of the pre-trained multilingual Bidirectional Encoder Representations from Transformers (BERT) model is implemented to mimic human perception to achieve Named Entity Recognition (NER) capabilities. …”
    Get full text
    Article
  55.

    Ransomware detection and family classification using fine-tuned BERT and RoBERTa models by Amjad Hussain, Ayesha Saadia, Faeiz M. Alserhani

    Published 2025-06-01
    “…This research explores these challenges and proposes a novel approach using hyperparameter-optimized transfer learning-based models, Bidirectional Encoder Representations from Transformers (BERT), and a Robustly Optimized BERT Approach (RoBERTa), to not only detect but also classify ransomware targeting IoT devices by analyzing dynamically executed API call sequences in a sandbox environment. …”
    Get full text
    Article
  56.

    Intelligent integration of AI and IoT for advancing ecological health, medical services, and community prosperity by Abdulrahman Alzahrani, Patty Kostkova, Hamoud Alshammari, Safa Habibullah, Ahmed Alzahrani

    Published 2025-08-01
    “…CNN (convolutional neural networks) with transfer learning enabled by ResNet provides high-accuracy image recognition, which can be used for waste classification. Bidirectional Encoder Representations from Transformers (BERT) allows multilingual users to interact and communicate properly in any linguistic environment. …”
    Get full text
    Article
  57.

    TSB-Forecast: A Short-Term Load Forecasting Model in Smart Cities for Integrating Time Series Embeddings and Large Language Models by Mohamed Mahmoud Hasan, Neamat El-Tazi, Ramadan Moawad, Amany H. B. Eissa

    Published 2025-01-01
    “…The model uses Sentence Bidirectional Encoder Representations from Transformers (SBERT) to extract semantic characteristics from textual news and Time to Vector (Time2Vec) to capture temporal patterns, acquiring cyclical behavior and context-sensitive impacts. …”
    Get full text
    Article
  58.

    Obfuscated Malware Detection and Classification in Network Traffic Leveraging Hybrid Large Language Models and Synthetic Data by Mehwish Naseer, Farhan Ullah, Samia Ijaz, Hamad Naeem, Amjad Alsirhani, Ghadah Naif Alwakid, Abdullah Alomari

    Published 2025-01-01
    “…This phase leverages a fine-tuned LLM, Bidirectional Encoder Representations from Transformers (BERT), with classification layers. …”
    Get full text
    Article
  59.

    M.I.N.I.-KID interviews with adolescents: a corpus-based language analysis of adolescents with depressive disorders and the possibilities of continuation using Chat GPT by Irina Jarvers, Angelika Ecker, Pia Donabauer, Katharina Kampa, Maximilian Weißenbacher, Daniel Schleicher, Stephanie Kandsperger, Romuald Brunner, Bernd Ludwig

    Published 2024-12-01
    “…The transcribed interviews comprised 4,077 question-answer-pairs, with which we predicted the clinical rating (depressive/non-depressive) with use of a feedforward neural network that received BERT (Bidirectional Encoder Representations from Transformers) vectors of interviewer questions and patient answers as input. …”
    Get full text
    Article
  60.

    Enhancing Pulmonary Disease Prediction Using Large Language Models With Feature Summarization and Hybrid Retrieval-Augmented Generation: Multicenter Methodological Study Based on R... by Ronghao Li, Shuai Mao, Congmin Zhu, Yingliang Yang, Chunting Tan, Li Li, Xiangdong Mu, Honglei Liu, Yuqing Yang

    Published 2025-06-01
    “…The traditional deep learning model, BERT (Bidirectional Encoder Representations from Transformers), was also compared to assess the superiority of LLMs. …”
    Get full text
    Article