Yao Wan
Cited by
Improving automatic source code summarization via deep reinforcement learning
Y Wan, Z Zhao, M Yang, G Xu, H Ying, J Wu, PS Yu
Proceedings of the 33rd ACM/IEEE international conference on automated …, 2018
Find or classify? dual strategy for slot-value predictions on multi-domain dialog state tracking
JG Zhang, K Hashimoto, CS Wu, Y Wan, PS Yu, R Socher, C Xiong
The 9th Joint Conference on Lexical and Computational Semantics (*SEM 2020), 2019
Multi-modal attention network learning for semantic source code retrieval
Y Wan, J Shu, Y Sui, G Xu, Z Zhao, J Wu, PS Yu
Proceedings of the 34th ACM/IEEE International Conference on Automated …, 2019
KG-BART: Knowledge graph-augmented BART for generative commonsense reasoning
Y Liu, Y Wan, L He, H Peng, SY Philip
Proceedings of the AAAI Conference on Artificial Intelligence 35 (7), 6418-6425, 2021
Reinforcement-learning-guided source code summarization using hierarchical attention
W Wang, Y Zhang, Y Sui, Y Wan, Z Zhao, J Wu, SY Philip, G Xu
IEEE Transactions on Software Engineering 48 (1), 102-119, 2020
Discriminative nearest neighbor few-shot intent detection by transferring natural language inference
JG Zhang, K Hashimoto, W Liu, CS Wu, Y Wan, PS Yu, R Socher, C Xiong
The 2020 Conference on Empirical Methods in Natural Language Processing …, 2020
FCCA: Hybrid code representation for functional clone detection using attention networks
W Hua, Y Sui, Y Wan, G Liu, G Xu
IEEE Transactions on Reliability 70 (1), 304-318, 2020
SynCoBERT: Syntax-guided multi-modal contrastive pre-training for code representation
X Wang, Y Wang, F Mi, P Zhou, Y Wan, X Liu, L Li, H Wu, J Liu, X Jiang
arXiv preprint arXiv:2108.04556, 2021
Multi-view factorization machines for mobile app recommendation based on hierarchical attention
T Liang, L Zheng, L Chen, Y Wan, SY Philip, J Wu
Knowledge-Based Systems 187, 104821, 2020
Multi-modal generative adversarial network for short product title generation in mobile e-commerce
JG Zhang, P Zou, Z Li, Y Wan, X Pan, Y Gong, PS Yu
2019 Annual Conference of the North American Chapter of the Association for …, 2019
SCSMiner: mining social coding sites for software developer recommendation with relevance propagation
Y Wan, L Chen, G Xu, Z Zhao, J Tang, J Wu
World Wide Web 21, 1523-1543, 2018
Exploiting geographical location for team formation in social coding sites
Y Han, Y Wan, L Chen, G Xu, J Wu
Advances in Knowledge Discovery and Data Mining: 21st Pacific-Asia …, 2017
Are pretrained transformers robust in intent classification? a missing ingredient in evaluation of out-of-scope intent detection
JG Zhang, K Hashimoto, Y Wan, Y Liu, C Xiong, PS Yu
Proceedings of the 4th Workshop on NLP for Conversational AI, ACL 2022, 2021
Enriching Non-Autoregressive Transformer with Syntactic and Semantic Structures for Neural Machine Translation
Y Liu, Y Wan, JG Zhang, W Zhao, PS Yu
EACL 2021, 2021
What Do They Capture? A Structural Analysis of Pre-Trained Language Models for Source Code
Y Wan, W Zhao, H Zhang, Y Sui, G Xu, H Jin
ICSE 2022, 2022
Local-global knowledge distillation in heterogeneous federated learning with non-IID data
D Yao, W Pan, Y Dai, Y Wan, X Ding, H Jin, Z Xu, L Sun
arXiv preprint arXiv:2107.00051, 2021
Competitive multi-agent deep reinforcement learning with counterfactual thinking
Y Wang, Y Wan, C Zhang, L Bai, L Cui, P Yu
2019 IEEE International Conference on Data Mining (ICDM), 1366-1371, 2019
Incorporating heterogeneous information for mashup discovery with consistent regularization
Y Wan, L Chen, Q Yu, T Liang, J Wu
Advances in Knowledge Discovery and Data Mining: 20th Pacific-Asia …, 2016
FedBERT: When Federated Learning Meets Pre-training
Y Tian, Y Wan, L Lyu, D Yao, H Jin, L Sun
ACM Transactions on Intelligent Systems and Technology (TIST) 13 (4), 1-26, 2022
HetFormer: Heterogeneous transformer with sparse attention for long-text extractive summarization
Y Liu, JG Zhang, Y Wan, C Xia, L He, PS Yu
The 2021 Conference on Empirical Methods in Natural Language Processing …, 2021
Articles 1–20