| Title | Authors | Venue | Cited by | Year |
|---|---|---|---|---|
| Efficient Second-Order TreeCRF for Neural Dependency Parsing | Y Zhang, Z Li, M Zhang | ACL 2020 | 115 | 2020 |
| Fast and Accurate Neural CRF Constituency Parsing | Y Zhang, H Zhou, Z Li | IJCAI 2020 | 95 | 2020 |
| Semantic Role Labeling as Dependency Parsing: Exploring Latent Tree Structures Inside Arguments | Y Zhang, Q Xia, S Zhou, Y Jiang, G Fu, M Zhang | COLING 2022 | 30 | 2021 |
| Is POS Tagging Necessary or Even Helpful for Neural Dependency Parsing? | H Zhou, Y Zhang, Z Li, M Zhang | NLPCC 2020 | 23* | 2020 |
| HLT@SUDA at SemEval-2019 Task 1: UCCA Graph Parsing as Constituent Tree Parsing | W Jiang, Z Li, Y Zhang, M Zhang | SemEval 2019 | 23 | 2019 |
| Scalable MatMul-free Language Modeling | RJ Zhu, Y Zhang, E Sifferman, T Sheaves, Y Wang, D Richmond, P Zhou, ... | arXiv preprint arXiv:2406.02528 | 12 | 2024 |
| Parallelizing Linear Transformers with the Delta Rule over Sequence Length | S Yang, B Wang, Y Zhang, Y Shen, Y Kim | NeurIPS 2024 | 11 | 2024 |
| Fast and Accurate End-to-End Span-based Semantic Role Labeling as Word-based Graph Parsing | S Zhou, Q Xia, Z Li, Y Zhang, Y Hong, M Zhang | COLING 2022 | 8 | 2022 |
| FLA: A Triton-based Library for Hardware-Efficient Implementations of Linear Attention Mechanism | S Yang, Y Zhang | https://github.com/sustcsonglin/flash-linear-attention | 6 | 2024 |
| Non-autoregressive Text Editing with Copy-aware Latent Alignments | Y Zhang, Y Zhang, L Cui, G Fu | EMNLP 2023 | 4 | 2023 |
| Gated Slot Attention for Efficient Linear-Time Sequence Modeling | Y Zhang, S Yang, R Zhu, Y Zhang, L Cui, Y Wang, B Wang, F Shi, B Wang, ... | NeurIPS 2024 | | 2024 |