Yu Zhang
Efficient Second-Order TreeCRF for Neural Dependency Parsing
Y Zhang, Z Li, M Zhang
ACL 2020 · Cited by 115

Fast and Accurate Neural CRF Constituency Parsing
Y Zhang, H Zhou, Z Li
IJCAI 2020 · Cited by 95

Semantic Role Labeling as Dependency Parsing: Exploring Latent Tree Structures Inside Arguments
Y Zhang, Q Xia, S Zhou, Y Jiang, G Fu, M Zhang
COLING 2022 · Cited by 30

Is POS Tagging Necessary or Even Helpful for Neural Dependency Parsing?
H Zhou, Y Zhang, Z Li, M Zhang
NLPCC 2020 · Cited by 23*

HLT@SUDA at SemEval-2019 Task 1: UCCA Graph Parsing as Constituent Tree Parsing
W Jiang, Z Li, Y Zhang, M Zhang
SemEval 2019 · Cited by 23

Scalable MatMul-free Language Modeling
RJ Zhu, Y Zhang, E Sifferman, T Sheaves, Y Wang, D Richmond, P Zhou, ...
arXiv preprint arXiv:2406.02528, 2024 · Cited by 12

Parallelizing Linear Transformers with the Delta Rule over Sequence Length
S Yang, B Wang, Y Zhang, Y Shen, Y Kim
NeurIPS 2024 · Cited by 11

Fast and Accurate End-to-End Span-based Semantic Role Labeling as Word-based Graph Parsing
S Zhou, Q Xia, Z Li, Y Zhang, Y Hong, M Zhang
COLING 2022 · Cited by 8

FLA: A Triton-based library for hardware-efficient implementations of linear attention mechanism
S Yang, Y Zhang
https://github.com/sustcsonglin/flash-linear-attention, 2024 · Cited by 6

Non-autoregressive Text Editing with Copy-aware Latent Alignments
Y Zhang, Y Zhang, L Cui, G Fu
EMNLP 2023 · Cited by 4

Gated Slot Attention for Efficient Linear-Time Sequence Modeling
Y Zhang, S Yang, R Zhu, Y Zhang, L Cui, Y Wang, B Wang, F Shi, B Wang, ...
NeurIPS 2024