Formal Language Recognition by Hard Attention Transformers: Perspectives from Circuit Complexity
| Main Authors: | Yiding Hao, Dana Angluin, Robert Frank |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | The MIT Press, 2022-08-01 |
| Series: | Transactions of the Association for Computational Linguistics |
| Online Access: | http://dx.doi.org/10.1162/tacl_a_00490 |
Similar Items
- What Formal Languages Can Transformers Express? A Survey
  by: Lena Strobl, et al.
  Published: (2024-05-01)
- Enhancing Cross-Language Multimodal Emotion Recognition With Dual Attention Transformers
  by: Syed Aun Muhammad Zaidi, et al.
  Published: (2024-01-01)
- A lightweight transformer with linear self-attention for defect recognition
  by: Yuwen Zhai, et al.
  Published: (2024-09-01)
- Effective Partitioning Method With Predictable Hardness for CircuitSAT
  by: Konstantin Chukharev, et al.
  Published: (2025-01-01)
- Trends in Language Formalization in Architecture
  by: Franklim Morais Pereira
  Published: (2014-12-01)