Directed Acyclic Transformer Pre-training for High-quality Non-autoregressive Text Generation

Bibliographic Details
Main Authors: Fei Huang, Pei Ke, Minlie Huang
Format: Article
Language: English
Published: The MIT Press, 2023-08-01
Series: Transactions of the Association for Computational Linguistics
Online Access: http://dx.doi.org/10.1162/tacl_a_00582
_version_: 1850264856449515520
author: Fei Huang; Pei Ke; Minlie Huang
collection: DOAJ
format: Article
id: doaj-art-72d98cfd40fd487a92a6d7c554846bd6
institution: OA Journals
issn: 2307-387X
language: English
publishDate: 2023-08-01
publisher: The MIT Press
record_format: Article
series: Transactions of the Association for Computational Linguistics