Formal Language Recognition by Hard Attention Transformers: Perspectives from Circuit Complexity

Bibliographic Details
Main Authors: Yiding Hao, Dana Angluin, Robert Frank
Format: Article
Language: English
Published: The MIT Press, 2022-08-01
Series: Transactions of the Association for Computational Linguistics
Online Access: http://dx.doi.org/10.1162/tacl_a_00490
ISSN: 2307-387X