BERT2DAb: a pre-trained model for antibody representation based on amino acid sequences and 2D-structure

Prior research has generated a vast number of antibody sequences, which has allowed the pre-training of language models on amino acid sequences to improve the efficiency of antibody screening and optimization. However, compared to those for proteins, there are fewer pre-trained language models avail...


Bibliographic Details
Main Authors: Xiaowei Luo, Fan Tong, Wenbin Zhao, Xiangwen Zheng, Jiangyu Li, Jing Li, Dongsheng Zhao
Format: Article
Language: English
Published: Taylor & Francis Group 2023-12-01
Series: mAbs
Online Access: https://www.tandfonline.com/doi/10.1080/19420862.2023.2285904