Ziwei He
Verified email at sjtu.edu.cn
Title
Cited by
Year
RASAT: Integrating Relational Structures into Pretrained Seq2Seq Model for Text-to-SQL
J Qi, J Tang, Z He, X Wan, Y Cheng, C Zhou, X Wang, Q Zhang, Z Lin
EMNLP 2022: The 2022 Conference on Empirical Methods in Natural Language …, 2022
Cited by 57 · 2022
A comparative study of different approaches for tracking communities in evolving social networks
Z He, EG Tajeuna, S Wang, M Bouguessa
2017 IEEE International Conference on Data Science and Advanced Analytics …, 2017
Cited by 10 · 2017
Few-shot table-to-text generation with prompt planning and knowledge memorization
Z Guo, M Yan, J Qi, J Zhou, Z He, Z Lin, G Zheng, X Wang
arXiv preprint arXiv:2302.04415, 2023
Cited by 5 · 2023
Fourier Transformer: Fast Long Range Modeling by Removing Sequence Redundancy with FFT Operator
Z He, M Yang, M Feng, J Yin, X Wang, J Leng, Z Lin
ACL 2023: Findings of the Association for Computational Linguistics, 2023
Cited by 2 · 2023
Towards Controlled Table-to-Text Generation with Scientific Reasoning
Z Guo, J Zhou, J Qi, M Yan, Z He, G Zheng, Z Lin, X Wang, C Zhou
ICASSP 2024: IEEE International Conference on Acoustics, Speech and Signal …, 2024
Cited by 1 · 2024
Few-Shot Table-to-Text Generation with Prompt-based Adapter
Z Guo, M Yan, J Qi, J Zhou, Z He, Z Lin, G Zheng, X Wang
arXiv e-prints, arXiv: 2302.12468, 2023
Cited by 1 · 2023
Fovea Transformer: Efficient Long-Context Modeling with Structured Fine-to-Coarse Attention
Z He, J Yuan, L Zhou, J Leng, B Jiang
ICASSP 2024: The 2024 IEEE International Conference on Acoustics, Speech and …, 2023
2023
Adapting Prompt for Few-shot Table-to-Text Generation
Z Guo, M Yan, J Qi, J Zhou, Z He, Z Lin, G Zheng, X Wang
arXiv preprint arXiv:2302.12468, 2023
2023
Works 1–8