CGM: Copy Mechanism GPT with Mask for Ellipsis and Anaphora Resolution in Dialogue
GPT (Generative Pre-trained Transformer) is a generative language model that demonstrates outstanding performance in the field of text generation. Generally, the attention mechanism of the transformer model behaves similarly to a copy distribution. However, due to the absence of a dedicated encoder,...
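The excerpt describes using the transformer's attention weights as a copy distribution over the dialogue context. As a rough illustration only (not the paper's exact CGM formulation), the sketch below mixes a decoder's vocabulary distribution with an attention-derived copy distribution in pointer-generator style; all function and variable names are hypothetical.

```python
import torch
import torch.nn.functional as F

def mixed_distribution(vocab_logits, attn_weights, input_ids, p_copy):
    # Hypothetical pointer/copy-style mix, not the authors' implementation.
    # vocab_logits: (batch, vocab_size) raw scores from the language-model head
    # attn_weights: (batch, src_len)    attention over the context, rows sum to 1
    # input_ids:    (batch, src_len)    token ids of the context
    # p_copy:       (batch, 1)          gate in [0, 1] choosing copy vs. generate
    gen_dist = F.softmax(vocab_logits, dim=-1)

    # Scatter attention mass onto the vocabulary positions of the context tokens.
    copy_dist = torch.zeros_like(gen_dist)
    copy_dist.scatter_add_(1, input_ids, attn_weights)

    # Convex combination of generation and copy distributions (still sums to 1).
    return (1.0 - p_copy) * gen_dist + p_copy * copy_dist

# Toy usage
batch, src_len, vocab_size = 2, 5, 100
logits = torch.randn(batch, vocab_size)
attn = F.softmax(torch.randn(batch, src_len), dim=-1)
ids = torch.randint(0, vocab_size, (batch, src_len))
gate = torch.sigmoid(torch.randn(batch, 1))
probs = mixed_distribution(logits, attn, ids, gate)
assert torch.allclose(probs.sum(-1), torch.ones(batch))
```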
| Main Authors: | Ji-Won Cho, Jinyoung Oh, Jeong-Won Cha |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2024-12-01 |
| Series: | Applied Sciences |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2076-3417/15/1/5 |
Similar Items
- SAB: Self-Adaptive Bias
  by: Suchan Choi, et al.
  Published: (2024-12-01)
- Generalization Increases the Adaptive Value of Mate Choice Copying When Immediate Copying Is Costly
  by: Geoff Kushnick
  Published: (2024-10-01)
- Design and Implementation of Efficient and Transparent Zero Copy Read
  by: Jiwoong Park, et al.
  Published: (2024-01-01)
- An Algorithm for Anaphora Resolution in Spanish Texts
  by: Manuel Palomar, et al.
  Published: (2021-03-01)
- Le Musée européen des copies de Charles Blanc comme « pendant » du Louvre
  by: Elisa Rodríguez Castresana
  Published: (2017-10-01)