Showing 41 - 60 results of 66 for search 'Bidirectional encoder presentation from transformed', query time: 0.12s
  1. 41

    VitroBERT: modeling DILI by pretraining BERT on in vitro data by Muhammad Arslan Masood, Anamya Ajjolli Nagaraja, Katia Belaid, Natalie Mesens, Hugo Ceulemans, Samuel Kaski, Dorota Herman, Markus Heinonen

    Published 2025-08-01
    “…We therefore introduce VitroBERT, a bidirectional encoder representations from transformers (BERT) model pretrained on large-scale in vitro assay profiles to generate biologically informed molecular embeddings. …”
    Get full text
    Article
  2. 42

    Does the Choice of Topic Modeling Technique Impact the Interpretation of Aviation Incident Reports? A Methodological Assessment by Aziida Nanyonga, Keith Joiner, Ugur Turhan, Graham Wild

    Published 2025-05-01
    “…This study presents a comparative analysis of four topic modeling techniques —Latent Dirichlet Allocation (LDA), Bidirectional Encoder Representations from Transformers (BERT), Probabilistic Latent Semantic Analysis (pLSA), and Non-negative Matrix Factorization (NMF)—applied to aviation safety reports from the ATSB dataset spanning 2013–2023. …”
    Get full text
    Article
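
    A minimal, hedged illustration of two of the compared techniques (not code from the article): the scikit-learn sketch below fits LDA on raw term counts and NMF on TF-IDF weights; the toy reports, component counts, and other parameters are assumptions rather than the study's configuration.

        # Sketch: LDA on term counts vs. NMF on TF-IDF with scikit-learn.
        # The toy corpus and hyperparameters are placeholders, not ATSB data.
        from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
        from sklearn.decomposition import LatentDirichletAllocation, NMF

        reports = [
            "runway incursion during taxi in low visibility",
            "engine failure shortly after takeoff, returned to field",
            "bird strike on approach, no damage reported",
            "loss of separation between two aircraft on descent",
        ]

        # LDA is fitted on raw term counts.
        counts = CountVectorizer(stop_words="english").fit_transform(reports)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

        # NMF is usually applied to TF-IDF weights instead.
        tfidf_vec = TfidfVectorizer(stop_words="english")
        tfidf = tfidf_vec.fit_transform(reports)
        nmf = NMF(n_components=2, init="nndsvd", random_state=0).fit(tfidf)

        terms = tfidf_vec.get_feature_names_out()
        for k, topic in enumerate(nmf.components_):
            print(f"NMF topic {k}:", [terms[i] for i in topic.argsort()[-3:][::-1]])
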
  3. 43

    Electric Vehicle Sentiment Analysis Using Large Language Models by Hemlata Sharma, Faiz Ud Din, Bayode Ogunleye

    Published 2024-11-01
    “…EV companies are becoming significant competitors in the automotive industry and are projected to cover up to 30% of the United States light vehicle market by 2030. In this study, we present a comparative study of large language models (LLMs), including bidirectional encoder representations from transformers (BERT), robustly optimised BERT (RoBERTa), and a generalised autoregressive pre-training method (XLNet), using Lucid Motors and Tesla Motors YouTube datasets. …”
    Get full text
    Article
  4. 44

    ENHANCING NAMED ENTITY RECOGNITION ON HINER DATASET USING ADVANCED NLP TECHNIQUES by Harshvardhan Pardeshi, Prof. Piyush Pratap Singh

    Published 2025-05-01
    “…Conversely, it lacks speed and accuracy. Therefore, the present research uses advanced NLP models such as bidirectional encoder representations from transformers (BERT), DistilBERT, and the robustly optimized BERT approach (RoBERTa) for effective entity prediction performance. …”
    Get full text
    Article
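
    As a rough illustration of transformer-based NER (not the article's HiNER setup), the sketch below uses the Hugging Face token-classification pipeline with an assumed public English checkpoint, "dslim/bert-base-NER"; a HiNER-tuned model would be used the same way.

        # Sketch: NER with a pretrained transformer via the HF pipeline API.
        # "dslim/bert-base-NER" is an assumed English checkpoint for illustration.
        from transformers import pipeline

        ner = pipeline("token-classification",
                       model="dslim/bert-base-NER",
                       aggregation_strategy="simple")

        for entity in ner("Sachin Tendulkar was born in Mumbai, India."):
            print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
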
  5. 45

    Vietnamese Sentence Fact Checking Using the Incremental Knowledge Graph, Deep Learning, and Inference Rules on Online Platforms by Huong To Duong, Van Hai Ho, Phuc Do

    Published 2025-01-01
    “…ViKGFC integrates a Knowledge Graph (KG), inference rules, and the Knowledge Graph-Bidirectional Encoder Representations from Transformers (KG-BERT) deep learning model. …”
    Get full text
    Article
  6. 46

    A Cross-Product Analysis of Earphone Reviews Using Contextual Topic Modeling and Association Rule Mining by Ugbold Maidar, Minyoung Ra, Donghee Yoo

    Published 2024-12-01
    “…Therefore, this study addresses the need and rationale for having comprehensive sentiment analysis systems by integrating topic modeling and association rule mining to analyze online customer reviews of earphones sold on Amazon. It employs Bidirectional Encoder Representations from Transformers for Topic Modeling (BERTopic), a technique that generates coherent topics by effectively capturing contextual information, and Frequent Pattern Growth (FPGrowth), an efficient association rule mining algorithm used for discovering patterns and relationships in a dataset without candidate generation. …”
    Get full text
    Article
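
    To make the pipeline above concrete, here is a hedged sketch (not the article's code) that fits BERTopic on a stand-in corpus and then mines frequent itemsets with mlxtend's FP-Growth over toy (topic, sentiment) transactions; the 20-newsgroups corpus, the transaction encoding, and the support threshold are all assumptions.

        # Sketch: BERTopic for contextual topics + FP-Growth for frequent itemsets.
        import pandas as pd
        from sklearn.datasets import fetch_20newsgroups
        from bertopic import BERTopic
        from mlxtend.preprocessing import TransactionEncoder
        from mlxtend.frequent_patterns import fpgrowth

        # Stand-in corpus; the study analyzed Amazon earphone reviews instead.
        docs = fetch_20newsgroups(subset="train",
                                  remove=("headers", "footers", "quotes")).data[:2000]
        topic_model = BERTopic()
        topics, _ = topic_model.fit_transform(docs)
        print(topic_model.get_topic_info().head())

        # Toy "transactions" pairing a topic label with a sentiment flag.
        transactions = [["topic_battery", "negative"], ["topic_battery", "negative"],
                        ["topic_sound", "positive"], ["topic_fit", "positive"]]
        te = TransactionEncoder()
        onehot = pd.DataFrame(te.fit_transform(transactions), columns=te.columns_)
        print(fpgrowth(onehot, min_support=0.3, use_colnames=True))
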
  7. 47

    Detecting Chinese Disinformation with Fine-Tuned BERT and Contextual Techniques by Lixin Yun, Sheng Yun, Haoran Xue

    Published 2025-12-01
    “…Building on large language models (LLMs) like BERT (Bidirectional Encoder Representations from Transformers) provides a promising avenue for addressing this challenge. …”
    Get full text
    Article
  8. 48

    Content analysis of multi-annual time series of flood-related Twitter (X) data by N. Veigel, H. Kreibich, J. A. de Bruijn, J. C. J. H. Aerts, A. Cominola

    Published 2025-02-01
    “…We implement bidirectional encoder representations from transformers in combination with unsupervised clustering techniques (BERTopic) to automatically extract social media content, addressing transferability issues that arise from commonly used bag-of-words representations. …”
    Get full text
    Article
  9. 49

    Optimizing an LSTM Self-Attention Architecture for Portuguese Sentiment Analysis Using a Genetic Algorithm by Daniel Parada, Alexandre Branco, Marcos Silva, Fábio Mendonça, Sheikh Mostafa, Fernando Morgado-Dias

    Published 2025-06-01
    “…A key outcome of this study was that the optimization process produced a model that is competitive with a Bidirectional Encoder Representations from Transformers (BERT) model retrained for Portuguese, which was used as the baseline. …”
    Get full text
    Article
  10. 50

    Graph neural networks embedded with domain knowledge for cyber threat intelligence entity and relationship mining by Gan Liu, Kai Lu, Saiqi Pi

    Published 2025-04-01
    “…Specifically, first, domain knowledge is collected to build a domain knowledge graph, which is then embedded using graph convolutional networks (GCN) to enhance the feature representation of threat intelligence text. Next, the features from domain knowledge graph embedding and those generated by the bidirectional encoder representations from transformers (BERT) model are fused using the LayerNorm algorithm. …”
    Get full text
    Article
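
    The fusion step described in the snippet can be pictured with a short PyTorch sketch; the dimensions, the linear projection, and the additive fusion are illustrative assumptions, not the architecture reported in the article.

        # Sketch: fuse KG-derived features with BERT features, then apply LayerNorm.
        import torch
        import torch.nn as nn

        class FusionLayer(nn.Module):
            def __init__(self, kg_dim=128, bert_dim=768):
                super().__init__()
                self.kg_proj = nn.Linear(kg_dim, bert_dim)  # align KG features to BERT width
                self.norm = nn.LayerNorm(bert_dim)          # normalize the fused vector

            def forward(self, kg_feat, bert_feat):
                return self.norm(self.kg_proj(kg_feat) + bert_feat)

        kg_feat = torch.randn(4, 128)    # e.g., GCN output for 4 text spans
        bert_feat = torch.randn(4, 768)  # e.g., BERT [CLS] vectors for the same spans
        print(FusionLayer()(kg_feat, bert_feat).shape)  # torch.Size([4, 768])
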
  11. 51

    Empowering geoportals HCI with task-oriented chatbots through NLP and deep transfer learning by Mohammad H. Vahidnia

    Published 2024-10-01
    “…The notion of deep transfer learning (DTL) was then put into practice by customizing a pre-trained BERT (Bidirectional Encoder Representations from Transformers) model for our particular aim and creating a task-oriented conversational agent. …”
    Get full text
    Article
  12. 52

    Analysis of Short Texts Using Intelligent Clustering Methods by Jamalbek Tussupov, Akmaral Kassymova, Ayagoz Mukhanova, Assyl Bissengaliyeva, Zhanar Azhibekova, Moldir Yessenova, Zhanargul Abuova

    Published 2025-05-01
    “…This article presents a comprehensive review of short text clustering using state-of-the-art methods: Bidirectional Encoder Representations from Transformers (BERT), Term Frequency-Inverse Document Frequency (TF-IDF), and the novel hybrid method Latent Dirichlet Allocation + BERT + Autoencoder (LDA + BERT + AE). …”
    Get full text
    Article
  13. 53

    Innovative Sentiment Analysis and Prediction of Stock Price Using FinBERT, GPT-4 and Logistic Regression: A Data-Driven Approach by Olamilekan Shobayo, Sidikat Adeyemi-Longe, Olusogo Popoola, Bayode Ogunleye

    Published 2024-10-01
    “…This study explores the comparative performance of cutting-edge AI models, i.e., Finance Bidirectional Encoder Representations from Transformers (FinBERT), Generative Pre-trained Transformer 4 (GPT-4), and Logistic Regression, for sentiment analysis and stock index prediction using financial news and the NGX All-Share Index data label. …”
    Get full text
    Article
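
    A hedged sketch of the sentiment step (not the authors' pipeline): it scores headlines with a public FinBERT checkpoint ("ProsusAI/finbert" is assumed here) and feeds the labels to a toy logistic-regression classifier; the headlines and up/down labels are placeholders.

        # Sketch: FinBERT sentiment scores as features for logistic regression.
        import numpy as np
        from transformers import pipeline
        from sklearn.linear_model import LogisticRegression

        finbert = pipeline("text-classification", model="ProsusAI/finbert")
        headlines = ["NGX All-Share Index rallies on strong bank earnings",
                     "Market slides as inflation data disappoints"]
        scores = finbert(headlines)  # e.g., [{'label': 'positive', 'score': ...}, ...]

        label_to_num = {"positive": 1.0, "neutral": 0.0, "negative": -1.0}
        X = np.array([[label_to_num[s["label"]]] for s in scores])
        y = np.array([1, 0])  # placeholder next-day up/down labels
        print(LogisticRegression().fit(X, y).predict(X))
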
  14. 54

    IndoGovBERT: A Domain-Specific Language Model for Processing Indonesian Government SDG Documents by Agus Riyadi, Mate Kovacs, Uwe Serdült, Victor Kryssanov

    Published 2024-11-01
    “…This circumstance makes it difficult to automate document processing and improve the efficacy of SDG-related government efforts. The presented study introduces IndoGovBERT, a Bidirectional Encoder Representations from Transformers (BERT)-based pre-trained language model (PTLM) built with domain-specific corpora, leveraging the Indonesian government’s public and internal documents. …”
    Get full text
    Article
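
    Domain-specific pretraining of this kind is often done as continued masked-language-model training; the sketch below shows that pattern with Hugging Face tools, where the base checkpoint ("indobenchmark/indobert-base-p1"), the two-sentence corpus, and all hyperparameters are assumptions rather than the IndoGovBERT recipe.

        # Sketch: continued MLM pretraining on a (tiny) domain corpus.
        from datasets import Dataset
        from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                                  DataCollatorForLanguageModeling,
                                  Trainer, TrainingArguments)

        base = "indobenchmark/indobert-base-p1"  # assumed Indonesian base model
        tok = AutoTokenizer.from_pretrained(base)
        model = AutoModelForMaskedLM.from_pretrained(base)

        docs = ["Laporan capaian tujuan pembangunan berkelanjutan tahun ini.",  # SDG progress report
                "Kebijakan pengentasan kemiskinan di tingkat kabupaten."]       # poverty-reduction policy
        ds = Dataset.from_dict({"text": docs}).map(
            lambda batch: tok(batch["text"], truncation=True, max_length=128),
            batched=True, remove_columns=["text"])

        trainer = Trainer(
            model=model,
            args=TrainingArguments(output_dir="mlm-out", num_train_epochs=1,
                                   per_device_train_batch_size=2, report_to=[]),
            train_dataset=ds,
            data_collator=DataCollatorForLanguageModeling(tok, mlm_probability=0.15))
        trainer.train()
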
  15. 55

    A Novel HDF-Based Data Compression and Integration Approach to Support BIM-GIS Practical Applications by Zeyu Pan, Jianyong Shi, Liu Jiang

    Published 2020-01-01
    “…Next, bidirectional transformation methods for BIM and GIS modeling data, images, and analytical data into HDF are proposed. …”
    Get full text
    Article
  16. 56

    Automated and efficient Bangla signboard detection, text extraction, and novel categorization method for underrepresented languages in smart cities by Tanmoy Mazumder, Fariha Nusrat, Abu Bakar Siddique Mahi, Jolekha Begum Brishty, Rashik Rahman, Tanjina Helaly

    Published 2025-06-01
    “…Finally, fine-tuning of the pre-trained multilingual Bidirectional Encoder Representations from Transformers (BERT) model is implemented to mimic human perception to achieve Named Entity Recognition (NER) capabilities. …”
    Get full text
    Article
  17. 57

    Ransomware detection and family classification using fine-tuned BERT and RoBERTa models by Amjad Hussain, Ayesha Saadia, Faeiz M. Alserhani

    Published 2025-06-01
    “…This research explores these challenges and proposes a novel approach using hyperparameter-optimized transfer learning-based models, Bidirectional Encoder Representations from Transformers (BERT), and a Robustly Optimized BERT Approach (RoBERTa), to not only detect but also classify ransomware targeting IoT devices by analyzing dynamically executed API call sequences in a sandbox environment. …”
    Get full text
    Article
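
    To illustrate the idea of treating an API-call trace as text for a BERT-style classifier, here is a sketch under assumptions (the checkpoint, label set, and preprocessing are placeholders, not the article's configuration):

        # Sketch: classify a dynamically traced API-call sequence with BERT.
        import torch
        from transformers import AutoTokenizer, AutoModelForSequenceClassification

        labels = ["benign", "ransomware_family_a", "ransomware_family_b"]  # placeholders
        tok = AutoTokenizer.from_pretrained("bert-base-uncased")
        model = AutoModelForSequenceClassification.from_pretrained(
            "bert-base-uncased", num_labels=len(labels))

        api_trace = "RegOpenKeyExA CreateFileW CryptEncrypt WriteFile DeleteFileW"
        inputs = tok(api_trace, return_tensors="pt", truncation=True, max_length=128)

        with torch.no_grad():
            logits = model(**inputs).logits
        print(labels[int(logits.argmax(dim=-1))])  # untrained head, so the output is arbitrary
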
  18. 58

    Intelligent integration of AI and IoT for advancing ecological health, medical services, and community prosperity by Abdulrahman Alzahrani, Patty Kostkova, Hamoud Alshammari, Safa Habibullah, Ahmed Alzahrani

    Published 2025-08-01
    “…CNN (convolutional neural networks) with transfer learning enabled by ResNet provides high-accuracy image recognition, which can be used for waste classification. Bidirectional Encoder Representations from Transformers (BERT) allow multilingual users to interact and communicate properly in any linguistic environment. …”
    Get full text
    Article
  19. 59

    TSB-Forecast: A Short-Term Load Forecasting Model in Smart Cities for Integrating Time Series Embeddings and Large Language Models by Mohamed Mahmoud Hasan, Neamat El-Tazi, Ramadan Moawad, Amany H. B. Eissa

    Published 2025-01-01
    “…The model uses Sentence Bidirectional Encoder Representations from Transformers (SBERT) to extract semantic characteristics from textual news and Time to Vector (Time2Vec) to capture temporal patterns, acquiring cyclical behavior and context-sensitive impacts. …”
    Get full text
    Article
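
    The combination of text and time features described above can be sketched as follows; the SBERT checkpoint ("all-MiniLM-L6-v2"), the sin-based Time2Vec layer, and the feature sizes are assumptions, not the TSB-Forecast architecture itself.

        # Sketch: concatenate an SBERT news embedding with a Time2Vec encoding.
        import torch
        import torch.nn as nn
        from sentence_transformers import SentenceTransformer

        class Time2Vec(nn.Module):
            """Sin-based Time2Vec (Kazemi et al., 2019): one linear term, k-1 periodic terms."""
            def __init__(self, out_dim=8):
                super().__init__()
                self.w = nn.Parameter(torch.randn(out_dim))
                self.b = nn.Parameter(torch.randn(out_dim))

            def forward(self, t):  # t: (batch, 1) timestamps
                v = t * self.w + self.b
                return torch.cat([v[:, :1], torch.sin(v[:, 1:])], dim=-1)

        sbert = SentenceTransformer("all-MiniLM-L6-v2")
        news_vec = torch.tensor(sbert.encode(["Heatwave expected to raise cooling demand"]))
        time_vec = Time2Vec()(torch.tensor([[14.0]]))       # e.g., hour of day
        features = torch.cat([news_vec, time_vec], dim=-1)  # joint input to the forecaster
        print(features.shape)                               # (1, 384 + 8)
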
  20. 60

    M.I.N.I.-KID interviews with adolescents: a corpus-based language analysis of adolescents with depressive disorders and the possibilities of continuation using ChatGPT by Irina Jarvers, Angelika Ecker, Pia Donabauer, Katharina Kampa, Maximilian Weißenbacher, Daniel Schleicher, Stephanie Kandsperger, Romuald Brunner, Bernd Ludwig

    Published 2024-12-01
    “…The transcribed interviews comprised 4,077 question-answer-pairs, with which we predicted the clinical rating (depressive/non-depressive) with use of a feedforward neural network that received BERT (Bidirectional Encoder Representations from Transformers) vectors of interviewer questions and patient answers as input. …”
    Get full text
    Article
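
    A minimal sketch of that setup (not the study's model: the German checkpoint "bert-base-german-cased", the example question-answer pair, and the layer sizes are assumptions):

        # Sketch: BERT [CLS] vectors of a question and an answer feed a small
        # feedforward classifier (depressive vs. non-depressive).
        import torch
        import torch.nn as nn
        from transformers import AutoTokenizer, AutoModel

        tok = AutoTokenizer.from_pretrained("bert-base-german-cased")
        bert = AutoModel.from_pretrained("bert-base-german-cased")

        def cls_vector(text):
            enc = tok(text, return_tensors="pt", truncation=True, max_length=64)
            with torch.no_grad():
                return bert(**enc).last_hidden_state[:, 0]  # (1, 768) [CLS] embedding

        question = cls_vector("Fühlst du dich oft traurig?")   # "Do you often feel sad?"
        answer = cls_vector("Ja, eigentlich fast jeden Tag.")  # "Yes, almost every day."

        classifier = nn.Sequential(nn.Linear(2 * 768, 128), nn.ReLU(), nn.Linear(128, 2))
        logits = classifier(torch.cat([question, answer], dim=-1))
        print(logits.softmax(dim=-1))  # untrained, so the probabilities are arbitrary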