RSGPT: a generative transformer model for retrosynthesis planning pre-trained on ten billion datapoints
Abstract Retrosynthesis planning is a crucial task in organic synthesis, and deep-learning methods have enhanced and accelerated this process. With the emergence of large language models, the demand for data is rapidly increasing. However, available retrosynthesis data are limited...
| Main Authors: | Yafeng Deng, Xinda Zhao, Hanyu Sun, Yu Chen, Xiaorui Wang, Xi Xue, Liangning Li, Jianfei Song, Chang-Yu Hsieh, Tingjun Hou, Xiandao Pan, Taghrid Saad Alomar, Xiangyang Ji, Xiaojian Wang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-07-01 |
| Series: | Nature Communications |
| Online Access: | https://doi.org/10.1038/s41467-025-62308-6 |
Similar Items
- HiCLR: Knowledge-Induced Hierarchical Contrastive Learning with Retrosynthesis Prediction Yields a Reaction Foundation Model
  by: Jialu Wu, et al.
  Published: (2025-06-01)
- Vehicle Trajectory Repair Under Full Occlusion and Limited Datapoints with Roadside LiDAR
  by: Qiyang Luo, et al.
  Published: (2025-02-01)
- Generating diversity and securing completeness in algorithmic retrosynthesis
  by: Florian Mrugalla, et al.
  Published: (2025-05-01)
- Improving route development using convergent retrosynthesis planning
  by: Paula Torren-Peraire, et al.
  Published: (2025-02-01)
- Single-step retrosynthesis prediction via multitask graph representation learning
  by: Peng-Cheng Zhao, et al.
  Published: (2025-01-01)