Shuangzhi Wu
Bytedance
Verified email at bytedance.com
Title
Cited by
Year
Achieving human parity on automatic chinese to english news translation
H Hassan, A Aue, C Chen, V Chowdhary, J Clark, C Federmann, X Huang, ...
arXiv preprint arXiv:1803.05567, 2018
685 · 2018
Sequence-to-dependency neural machine translation
S Wu, D Zhang, N Yang, M Li, M Zhou
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
112 · 2017
Alternating language modeling for cross-lingual pre-training
J Yang, S Ma, D Zhang, S Wu, Z Li, M Zhou
Proceedings of the AAAI Conference on Artificial Intelligence 34 (05), 9386-9393, 2020
75 · 2020
Regularizing neural machine translation by target-bidirectional agreement
Z Zhang, S Wu, S Liu, M Li, M Zhou, T Xu
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 443-450, 2019
67 · 2019
Dependency-to-dependency neural machine translation
S Wu, D Zhang, Z Zhang, N Yang, M Li, M Zhou
IEEE/ACM Transactions on Audio, Speech, and Language Processing 26 (11 …, 2018
64 · 2018
Unsupervised keyphrase extraction by jointly modeling local and global context
X Liang, S Wu, M Li, Z Li
arXiv preprint arXiv:2109.07293, 2021
56 · 2021
Improved Neural Machine Translation with Source Syntax.
S Wu, M Zhou, D Zhang
IJCAI, 4179-4185, 2017
39 · 2017
Improving unsupervised extractive summarization with facet-aware modeling
X Liang, S Wu, M Li, Z Li
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 …, 2021
38 · 2021
Punctuation prediction with transition-based parsing
D Zhang, S Wu, N Yang, M Li
Proceedings of the 51st Annual Meeting of the Association for Computational …, 2013
38 · 2013
Central limit theorem and bootstrap approximation in high dimensions: Near rates via implicit smoothing
ME Lopes
The Annals of Statistics 50 (5), 2492-2513, 2022
37 · 2022
Efficient disfluency detection with transition-based parsing
S Wu, D Zhang, M Zhou, T Zhao
Proceedings of the 53rd Annual Meeting of the Association for Computational …, 2015
29 · 2015
Tencent neural machine translation systems for the WMT20 news translation task
S Wu, X Wang, L Wang, F Liu, J Xie, Z Tu, S Shi, M Li
Proceedings of the Fifth Conference on Machine Translation, 313-319, 2020
27 · 2020
Attention calibration for transformer in neural machine translation
Y Lu, J Zeng, J Zhang, S Wu, M Li
Proceedings of the 59th Annual Meeting of the Association for Computational …, 2021
24 · 2021
DEPN: Detecting and editing privacy neurons in pretrained language models
X Wu, J Li, M Xu, W Dong, S Wu, C Bian, D Xiong
arXiv preprint arXiv:2310.20138, 2023
23 · 2023
Unleashing infinite-length input capacity for large-scale language models with self-controlled memory system
X Liang, B Wang, H Huang, S Wu, P Wu, L Lu, Z Ma, Z Li
arXiv preprint arXiv:2304.13343, 2023
20 · 2023
Learning confidence for transformer-based neural machine translation
Y Lu, J Zeng, J Zhang, S Wu, M Li
arXiv preprint arXiv:2203.11413, 2022
18 · 2022
UM4: Unified multilingual multiple teacher-student model for zero-resource neural machine translation
J Yang, Y Yin, S Ma, D Zhang, S Wu, H Guo, Z Li, F Wei
arXiv preprint arXiv:2207.04900, 2022
17 · 2022
Tencent translation system for the WMT21 news translation task
L Wang, M Li, F Liu, S Shi, Z Tu, X Wang, S Wu, J Zeng, W Zhang
Proceedings of the Sixth Conference on Machine Translation, 216-224, 2021
16 · 2021
Improving machine reading comprehension with single-choice decision and transfer learning
Y Jiang, S Wu, J Gong, Y Cheng, P Meng, W Lin, Z Chen
arXiv preprint arXiv:2011.03292, 2020
16 · 2020
Source dependency-aware transformer with supervised self-attention
C Wang, S Wu, S Liu
arXiv preprint arXiv:1909.02273, 2019
13 · 2019
Articles 1–20