Do larger models yield better results? Streamlined severity classification of ADHD-related concerns using BERT-based knowledge distillation.
This work focuses on the efficiency of the knowledge distillation approach in generating a lightweight yet powerful BERT-based model for natural language processing (NLP) applications. After model creation, we applied the resulting model, LastBERT, to a real-world task: classifying severity levels...
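The abstract describes distilling a large BERT teacher into a lightweight student. As a rough illustration only, not the paper's reported setup (the function name, temperature, and loss weighting here are assumptions), a distillation objective of this kind is commonly written as a blend of a softened teacher-student KL term and ordinary cross-entropy:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend a soft-target KL term (teacher -> student) with hard-label CE."""
    # Temperature-softened log-probabilities for both models.
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    log_p_teacher = F.log_softmax(teacher_logits / temperature, dim=-1)
    # KL divergence between the softened distributions; the T^2 factor
    # keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(log_p_student, log_p_teacher, log_target=True,
                  reduction="batchmean") * temperature ** 2
    # Ordinary cross-entropy against the ground-truth class labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In a typical training loop, the teacher runs frozen under torch.no_grad() while only the student's parameters receive gradients, which is what yields the smaller deployable model.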
| Main Authors | Ahmed Akib Jawad Karim, Kazi Hafiz Md Asad, Md Golam Rabiul Alam |
|---|---|
| Format | Article |
| Language | English |
| Published | Public Library of Science (PLoS), 2025-01-01 |
| Series | PLoS ONE |
| Online Access | https://doi.org/10.1371/journal.pone.0315829 |
Similar Items
- Distillation of Essential Oils
  by: Elise V. Pearlstine
  Published: (2011-06-01)
- ADHD in Early Childhood: Part 1 - Understanding ADHD in Preschoolers
  by: Allie Munch, et al.
  Published: (2012-07-01)
- Thinning Florida Peaches for Larger Fruit
  by: Yuru Chang, et al.
  Published: (2019-01-01)
- DockCADD: A streamlined in silico pipeline for the identification of potent ribosomal S6 Kinase 2 (RSK2) inhibitors
  by: El Mehdi Karim, et al.
  Published: (2025-03-01)
- Streamlining Resiliency: Regulatory Considerations in Permitting Small Scale Living Shorelines
  by: Thomas T. Ankersen, et al.
  Published: (2018-04-01)