1. Bidirectional Encoder Representation from Image Transformers for recognizing sunflower diseases from photographs
Published 2025-06-01. “…This paper proposes a modern system for recognizing sunflower diseases based on Bidirectional Encoder representation from Image Transformers (BEIT). …”
2. Assessing Scientific Text Similarity: A Novel Approach Utilizing Non-Negative Matrix Factorization and Bidirectional Encoder Representations from Transformer
Published 2024-10-01. “…This approach integrates a patent’s content with international patent classification (IPC), leveraging bidirectional encoder representations from transformers (BERT), and non-negative matrix factorization (NMF). …”
3. A Centrality-Weighted Bidirectional Encoder Representation from Transformers Model for Enhanced Sequence Labeling in Key Phrase Extraction from Scientific Texts
Published 2024-12-01. “…Deep learning approaches, utilizing Bidirectional Encoder Representation from Transformers (BERT) and advanced fine-tuning techniques, have achieved state-of-the-art accuracies in the domain of term extraction from texts. …”
5. LDAViewer: An Automatic Language-Agnostic System for Discovering State-of-the-Art Topics in Research Using Topic Modeling, Bidirectional Encoder Representations From Transformers,...
Published 2023-01-01. “…Subsequently, a numeric document-phrase matrix is created and analyzed using latent Dirichlet allocation (LDA) and bidirectional encoder representations from transformers (BERT) to discover and label topics automatically. …”
6. Application of the Bidirectional Encoder Representations from Transformers Model for Predicting the Abbreviated Injury Scale in Patients with Trauma: Algorithm Development and Vali...
Published 2025-05-01. “…We used a robust optimization Bidirectional Encoder Representations from Transformers (BERT) pretraining method to embed these features and constructed a prediction model based on BERT. …”
7
Advancing smart tourism destinations: A case study using bidirectional encoder representations from transformers‐based occupancy predictions in torrevieja (Spain)
Published 2024-12-01“…In this extended study, we delve deeper into the realm of social sensing by utilising bidirectional encoder representations from transformers (BERT) for topic modelling. …”
9. Domain-Generalized Emotion Recognition on German Text Corpora
Published 2025-01-01.
10. Short-term cryptocurrency price forecasting based on news headline analysis
Published 2025-07-01.
11. Utilizing Machine Learning Techniques for Cancer Prediction and Classification based on Gene Expression Data
Published 2025-06-01. “…In this paper, we propose a unique approach that utilizes DistilBERT, a distilled version of the Bidirectional Encoder Representations from Transformers, for cancer classification and prediction. …”
12. Deep Learning-Based Short Text Summarization: An Integrated BERT and Transformer Encoder–Decoder Approach
Published 2025-04-01. “…The proposed approach combines bidirectional encoder representations from transformers (BERT) with a transformer-based encoder–decoder architecture (TEDA), incorporating an attention mechanism to improve contextual understanding. …”
13. A dual-phase deep learning framework for advanced phishing detection using the novel OptSHQCNN approach
Published 2025-07-01. “…Results: In the post-deployment phase, the URL is encoded using Optimized Bidirectional Encoder Representations from Transformers (OptBERT), after which the features are extracted. …”
14. The geometry of meaning: evaluating sentence embeddings from diverse transformer-based models for natural language inference
Published 2025-06-01. “…Natural language inference (NLI) is a fundamental task in natural language processing that focuses on determining the relationship between pairs of sentences. In this article, we present a simple and straightforward approach to evaluate the effectiveness of various transformer-based models such as bidirectional encoder representations from transformers (BERT), Generative Pre-trained Transformer (GPT), robustly optimized BERT approach (RoBERTa), and XLNet in generating sentence embeddings for NLI. …”
15. Leveraging large language models for spelling correction in Turkish
Published 2025-06-01. “…To address this, the research introduces a novel dataset, referred to as NoisyWikiTr, to explore encoder-only models based on bidirectional encoder representations from transformers (BERT) and existing auto-correction tools. …”
16. Using a transformer language model to curate a pulmonary embolism dataset from the Medical Information Mart for Intensive Care IV: MIMIC-IV-Ext-PE
Published 2025-05-01. “…Using this as our gold standard, we compared the performance of a fine-tuned Bio_ClinicalBERT (bidirectional encoder representations from transformers) transformer language model, known as venous thromboembolism-BERT, with diagnosis codes in the ability to classify reports as PE positive or negative. …”
17. Building sustainable information systems and transformer models on demand
Published 2025-02-01. “…On the one hand, we have achieved a substantial reduction in the development time of an information system, from months to seconds, as well as the ability to fine-tune BERT (Bidirectional Encoder Representations from Transformers) models without specific knowledge in selecting models or tools. …”
18. Detecting sarcasm in user-generated content integrating transformers and gated graph neural networks
Published 2025-04-01. “…To address this issue, the present study proposes a novel sarcasm detection model that combines bidirectional encoder representations from transformers (BERT) with gated graph neural networks (GGNN), further enhanced by a self-attention mechanism to more effectively capture ironic cues. …”
19. Leveraging Multilingual Transformer for Multiclass Sentiment Analysis in Code-Mixed Data of Low-Resource Languages
Published 2025-01-01. “…Subsequently, the Multilingual Bidirectional Encoder Representations from Transformers (mBERT) model was optimized and trained for multiclass sentiment analysis on the code-mixed data. …”
20. Trajectory-Ordered Objectives for Self-Supervised Representation Learning of Temporal Healthcare Data Using Transformers: Model Development and Evaluation Study
Published 2025-06-01. “…Methods: We introduce Trajectory Order Objective BERT (Bidirectional Encoder Representations from Transformers; TOO-BERT), a transformer-based model that advances the MLM pretraining approach by integrating a novel TOO to better learn the complex sequential dependencies between medical events. …”