Showing results 1–20 of 66 for the search 'Bidirectional encoder presentation from transformed' (query time: 0.19 s)
  1. Bidirectional Encoder representation from Image Transformers for recognizing sunflower diseases from photographs, by V.A. Baboshina, P.A. Lyakhov, U.A. Lyakhova, V.A. Pismennyy

    Published 2025-06-01
    “…This paper proposes a modern system for recognizing sunflower diseases based on Bidirectional Encoder representation from Image Transformers (BEIT). …”
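BEIT, cited in the entry above, ships with Hugging Face Transformers, so the recognition step the snippet describes can be sketched. This is a minimal inference sketch, not the paper's system; the generic ImageNet-pretrained checkpoint `microsoft/beit-base-patch16-224` stands in for the (unpublished) sunflower-disease weights.

```python
# Minimal BEIT image-classification sketch; the checkpoint is a generic
# ImageNet-pretrained stand-in, not the paper's fine-tuned model.
import torch
from PIL import Image
from transformers import BeitImageProcessor, BeitForImageClassification

processor = BeitImageProcessor.from_pretrained("microsoft/beit-base-patch16-224")
model = BeitForImageClassification.from_pretrained("microsoft/beit-base-patch16-224")

image = Image.open("leaf.jpg")                    # a sunflower-leaf photograph
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits               # shape [1, num_labels]
print(model.config.id2label[logits.argmax(-1).item()])
```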
  2. Assessing Scientific Text Similarity: A Novel Approach Utilizing Non-Negative Matrix Factorization and Bidirectional Encoder Representations from Transformer, by Zhixuan Jia, Wenfang Tian, Wang Li, Kai Song, Fuxin Wang, Congjing Ran

    Published 2024-10-01
    “…This approach integrates a patent’s content with international patent classification (IPC), leveraging bidirectional encoder representations from transformers (BERT), and non-negative matrix factorization (NMF). …”
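The two ingredients named in this entry, BERT embeddings and NMF, can be combined in a simple similarity score. The sketch below is an assumption about the fusion: it averages cosine similarities from the two representations, and applies NMF to a TF-IDF matrix (NMF requires non-negative input, which raw BERT embeddings are not); the paper's actual weighting and IPC integration are not in the snippet.

```python
# Sketch: text similarity from BERT [CLS] embeddings plus NMF topic
# vectors. Averaging the two cosine similarities is an assumption.
import torch
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from transformers import AutoModel, AutoTokenizer

docs = ["a patent about battery electrodes", "an electrode coating patent"]

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
with torch.no_grad():
    enc = tok(docs, padding=True, truncation=True, return_tensors="pt")
    emb = bert(**enc).last_hidden_state[:, 0].numpy()  # [CLS] embeddings

tfidf = TfidfVectorizer().fit_transform(docs)     # non-negative, as NMF requires
topics = NMF(n_components=2, init="nndsvda").fit_transform(tfidf)

sim = 0.5 * cosine_similarity(emb)[0, 1] + 0.5 * cosine_similarity(topics)[0, 1]
print(f"combined similarity: {sim:.3f}")
```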
  3. A Centrality-Weighted Bidirectional Encoder Representation from Transformers Model for Enhanced Sequence Labeling in Key Phrase Extraction from Scientific Texts, by Tsitsi Zengeya, Jean Vincent Fonou Dombeu, Mandlenkosi Gwetu

    Published 2024-12-01
    “…Deep learning approaches, utilizing Bidirectional Encoder Representation from Transformers (BERT) and advanced fine-tuning techniques, have achieved state-of-the-art accuracies in the domain of term extraction from texts. …”
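The sequence-labeling setup this entry refers to is BERT token classification over a BIO tag scheme. A minimal sketch follows; the centrality weighting that is the paper's contribution is omitted, and the classification head here is untrained, so its labels are random until fine-tuned.

```python
# Sketch of BERT sequence labeling for key-phrase extraction with BIO
# tags; the head must be fine-tuned before predictions are meaningful.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["O", "B-KEY", "I-KEY"]                  # BIO scheme for key phrases
tok = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels)
)

text = "Deep learning improves term extraction from scientific texts."
enc = tok(text, return_tensors="pt")
with torch.no_grad():
    pred = model(**enc).logits.argmax(-1)[0]      # one label id per subword
for token, label_id in zip(tok.convert_ids_to_tokens(enc["input_ids"][0]), pred):
    print(token, labels[label_id])
```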
  5. LDAViewer: An Automatic Language-Agnostic System for Discovering State-of-the-Art Topics in Research Using Topic Modeling, Bidirectional Encoder Representations From Transformers,... by Timothy Dillan, Dhomas Hatta Fudholi

    Published 2023-01-01
    “…Subsequently, a numeric document-phrase matrix is created and analyzed using latent Dirichlet allocation (LDA) and bidirectional encoder representations from transformers (BERT) to discover and label topics automatically. …”
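The LDA half of the pipeline this entry describes, fitting latent Dirichlet allocation to a numeric document-term matrix, is a few lines in scikit-learn. The sketch below shows only that step; the BERT-based automatic topic labeling the paper adds is not reproduced.

```python
# Sketch of the LDA step: build a document-term matrix and fit LDA,
# then print the top words per discovered topic.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "transformer models for text classification",
    "topic modeling of research abstracts",
    "graph neural networks for molecules",
]
vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(docs)                     # numeric document-term matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):     # top words per topic
    top = weights.argsort()[::-1][:3]
    print(f"topic {k}:", ", ".join(terms[i] for i in top))
```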
  6. Application of the Bidirectional Encoder Representations from Transformers Model for Predicting the Abbreviated Injury Scale in Patients with Trauma: Algorithm Development and Vali... by Jun Tang, Yang Li, Keyu Luo, Jiangyuan Lai, Xiang Yin, Dongdong Wu

    Published 2025-05-01
    “…We used a robust optimization Bidirectional Encoder Representations from Transformers (BERT) pretraining method to embed these features and constructed a prediction model based on BERT. …”
  11. Utilizing Machine Learning Techniques for Cancer Prediction and Classification based on Gene Expression Data, by Mariwan Mahmood Hama Aziz, Sozan Abdullah Mahmood

    Published 2025-06-01
    “…In this paper, we propose a unique approach that utilizes DistilBERT, a distilled version of the Bidirectional Encoder Representations from Transformers, for cancer classification and prediction. …”
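The DistilBERT classification step this entry mentions follows the standard sequence-classification fine-tuning pattern, sketched below. How the paper serializes gene-expression profiles into token sequences is not in the snippet, so the toy text inputs here are an assumption.

```python
# Sketch of DistilBERT sequence classification; the gene-expression
# inputs shown are placeholder text, not the paper's encoding.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tok = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2       # e.g. tumor vs. normal
)

batch = tok(["BRCA1 high TP53 low", "BRCA1 low TP53 high"],
            padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])

out = model(**batch, labels=labels)               # loss computed internally
out.loss.backward()                               # one training step (optimizer omitted)
print(out.loss.item(), out.logits.shape)
```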
  12. Deep Learning-Based Short Text Summarization: An Integrated BERT and Transformer Encoder–Decoder Approach, by Fahd A. Ghanem, M. C. Padma, Hudhaifa M. Abdulwahab, Ramez Alkhatib

    Published 2025-04-01
    “…The proposed approach combines bidirectional encoder representations from transformers (BERT) with a transformer-based encoder–decoder architecture (TEDA), incorporating an attention mechanism to improve contextual understanding. …”
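Combining BERT with a transformer encoder–decoder, as this entry describes, resembles the warm-started BERT-to-BERT seq2seq pattern in Hugging Face Transformers. The sketch below is that generic pattern, not the paper's TEDA architecture, and the model is untrained, so generated output is meaningless until fine-tuned on summarization data.

```python
# Sketch of a warm-started BERT-to-BERT encoder-decoder for abstractive
# summarization; requires fine-tuning before output is useful.
from transformers import AutoTokenizer, EncoderDecoderModel

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"      # BERT encoder, BERT decoder
)
model.config.decoder_start_token_id = tok.cls_token_id
model.config.pad_token_id = tok.pad_token_id

text = "Short text summarization condenses a document into a few sentences."
ids = tok(text, return_tensors="pt").input_ids
summary_ids = model.generate(ids, decoder_start_token_id=tok.cls_token_id,
                             max_length=20)
print(tok.decode(summary_ids[0], skip_special_tokens=True))
```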
  13. A dual-phase deep learning framework for advanced phishing detection using the novel OptSHQCNN approach, by Srikanth Meda, Vangipuram Sesha Srinivas, Killi Chandra Bhushana Rao, Repudi Ramesh, Narasimha Rao Yamarthi

    Published 2025-07-01
    “…Results: In the post-deployment phase, the URL is encoded using Optimized Bidirectional Encoder Representations from Transformers (OptBERT), after which the features are extracted. …”
  14. The geometry of meaning: evaluating sentence embeddings from diverse transformer-based models for natural language inference, by Mohammed Alsuhaibani

    Published 2025-06-01
    “…Natural language inference (NLI) is a fundamental task in natural language processing that focuses on determining the relationship between pairs of sentences. In this article, we present a simple and straightforward approach to evaluate the effectiveness of various transformer-based models such as bidirectional encoder representations from transformers (BERT), Generative Pre-trained Transformer (GPT), robustly optimized BERT approach (RoBERTa), and XLNet in generating sentence embeddings for NLI. …”
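Producing sentence embeddings from an encoder like BERT, as this entry evaluates, is commonly done by mean pooling over token states. The sketch below uses that pooling as an assumption (the article's exact pooling is not in the snippet); RoBERTa, GPT, or XLNet checkpoints can be swapped in via `AutoModel`.

```python
# Sketch: mean-pooled BERT sentence embeddings for an NLI pair, scored
# with cosine similarity. Mean pooling is one common baseline choice.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

premise = "A man is playing a guitar on stage."
hypothesis = "A musician is performing."
enc = tok([premise, hypothesis], padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**enc).last_hidden_state       # [2, seq_len, 768]

mask = enc["attention_mask"].unsqueeze(-1)        # ignore padding positions
emb = (hidden * mask).sum(1) / mask.sum(1)        # mean-pooled embeddings
cos = torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0)
print(f"cosine similarity: {cos.item():.3f}")
```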
  15. Leveraging large language models for spelling correction in Turkish, by Ceren Guzel Turhan

    Published 2025-06-01
    “…To address this, the research introduces a novel dataset, referred to as NoisyWikiTr, to explore encoder-only models based on bidirectional encoder representations from transformers (BERT) and existing auto-correction tools. …”
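Encoder-only spelling correction, as referenced here, can be framed as masked-token prediction: mask the suspect word and let the model rank replacements. The sketch below shows that framing only; the Turkish checkpoint name (BERTurk) is an assumption, and the paper's NoisyWikiTr setup is richer than this.

```python
# Sketch: masked-LM candidates for a suspect token. In a corrector, the
# candidate closest in edit distance to the misspelling would be chosen.
from transformers import pipeline

fill = pipeline("fill-mask", model="dbmdz/bert-base-turkish-cased")
# Mask the position of the misspelled word and inspect the candidates.
for cand in fill("İstanbul Türkiye'nin en büyük [MASK].", top_k=3):
    print(cand["token_str"], round(cand["score"], 3))
```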
  16. Using a transformer language model to curate a pulmonary embolism dataset from the Medical Information Mart for Intensive Care IV: MIMIC-IV-Ext-PE, by Barbara D. Lam, Shengling Ma, Iuliia Kovalenko, Peiqi Wang, Omid Jafari, Ang Li, Steven Horng

    Published 2025-05-01
    “…Using this as our gold standard, we compared the performance of a fine-tuned Bio_ClinicalBERT (bidirectional encoder representations from transformers) transformer language model, known as venous thromboembolism-BERT, with diagnosis codes in the ability to classify reports as PE positive or negative. …”
  17. Building sustainable information systems and transformer models on demand, by Thomas Asselborn, Sylvia Melzer, Simon Schiff, Magnus Bender, Florian Andreas Marwitz, Said Aljoumani, Stefan Thiemann, Konrad Hirschler, Ralf Möller

    Published 2025-02-01
    “…On the one hand, we have achieved a substantial reduction in the development time of an information system, from months to seconds, as well as the ability to fine-tune BERT (Bidirectional Encoder Representations from Transformers) models without specific knowledge in selecting models or tools. …”
  18. Detecting sarcasm in user-generated content integrating transformers and gated graph neural networks, by Zhenkai Qin, Qining Luo, Zhidong Zang, Hongpeng Fu

    Published 2025-04-01
    “…To address this issue, the present study proposes a novel sarcasm detection model that combines bidirectional encoder representations from transformers (BERT) with gated graph neural networks (GGNN), further enhanced by a self-attention mechanism to more effectively capture ironic cues. …”
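Coupling BERT token embeddings with a gated graph layer, as this entry describes, can be sketched with PyTorch Geometric's `GatedGraphConv`. The graph construction below (a simple chain linking adjacent tokens) and the mean-pool classifier are assumptions; the paper's graph structure and self-attention mechanism are not in the snippet.

```python
# Sketch: BERT token embeddings as node features for a gated graph
# layer, pooled into a binary sarcasm classifier. The chain graph over
# adjacent tokens is a placeholder assumption.
import torch
from torch_geometric.nn import GatedGraphConv
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
ggnn = GatedGraphConv(out_channels=768, num_layers=2)
classifier = torch.nn.Linear(768, 2)              # sarcastic vs. literal

enc = tok("Oh great, another Monday.", return_tensors="pt")
x = bert(**enc).last_hidden_state[0]              # [seq_len, 768] node features
n = x.size(0)
src = torch.arange(n - 1)                         # edges: token i <-> token i+1
edge_index = torch.cat([torch.stack([src, src + 1]),
                        torch.stack([src + 1, src])], dim=1)

logits = classifier(ggnn(x, edge_index).mean(0))  # pool nodes, then classify
print(logits.softmax(-1))
```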
  19. Leveraging Multilingual Transformer for Multiclass Sentiment Analysis in Code-Mixed Data of Low-Resource Languages, by Muhammad Kashif Nazir, Cm Nadeem Faisal, Muhammad Asif Habib, Haseeb Ahmad

    Published 2025-01-01
    “…Subsequently, the Multilingual Bidirectional Encoder Representations from Transformers (mBERT) model was optimized and trained for multiclass sentiment analysis on the code-mixed data. …”
  20. Trajectory-Ordered Objectives for Self-Supervised Representation Learning of Temporal Healthcare Data Using Transformers: Model Development and Evaluation Study, by Ali Amirahmadi, Farzaneh Etminani, Jonas Björk, Olle Melander, Mattias Ohlsson

    Published 2025-06-01
    “…Methods: We introduce Trajectory Order Objective BERT (Bidirectional Encoder Representations from Transformers; TOO-BERT), a transformer-based model that advances the MLM pretraining approach by integrating a novel TOO to better learn the complex sequential dependencies between medical events. …”
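The MLM pretraining that TOO-BERT builds on is standard masked-token prediction, sketched below with Hugging Face's language-modeling collator. The trajectory-order objective itself is the paper's novel addition and is not reproduced; the toy "medical event" tokens are placeholders for real visit sequences.

```python
# Sketch of the standard MLM pretraining step underlying TOO-BERT; the
# collator masks 15% of tokens and the model predicts them.
from transformers import (AutoTokenizer, BertForMaskedLM,
                          DataCollatorForLanguageModeling)

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
collator = DataCollatorForLanguageModeling(tok, mlm_probability=0.15)

# Placeholder event tokens; real inputs would be coded visit sequences.
examples = [tok("diagnosis I10 drug C09AA02 visit end")]
batch = collator([{"input_ids": e["input_ids"]} for e in examples])

loss = model(**batch).loss                        # masked-token prediction loss
loss.backward()                                   # one pretraining step (optimizer omitted)
print(loss.item())
```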