Location: Hangzhou, China | Email: pemywei@gmail.com
[Google Scholar] [DBLP]
Feel free to contact me about intern positions and possible collaborations!
Biography
About Me: I am currently an Algorithm Expert in the Language Technology Lab at Alibaba DAMO Academy. I received my Ph.D. degree from the University of Chinese Academy of Sciences (UCAS) in 2021, supervised by Prof. Yue Hu.
My research interests include natural language generation and multilingual applications. I received an Outstanding Paper Award at ACL 2022 and was recognized as a Beijing Excellent Graduate. Currently, I focus on pretraining polyglot large language models.
2023.09 Our paper "EMMA-X: An EM-like Multilingual Pre-training Algorithm for Cross-lingual Representation Learning" has been accepted to the NeurIPS 2023 main conference.
2023.07 We are pleased to announce the launch of our new polyglot large language model, PolyLM.
2022.05 Our paper "Learning to Generalize to More: Continuous Semantic Augmentation for Neural Machine Translation" has received an Outstanding Paper Award at ACL 2022.
2022.02 Our paper "Learning to Generalize to More: Continuous Semantic Augmentation for Neural Machine Translation" has been accepted to the ACL 2022 main conference.
Publications
Ping Guo, Xiangpeng Wei*, Yue Hu*, Baosong Yang, Dayiheng Liu, Fei Huang and Jun Xie.
"EMMA-X: An EM-like Multilingual Pre-training Algorithm for Cross-lingual Representation Learning". In NeurIPS 2023.
Xiangpeng Wei, Haoran Wei, Huan Lin, Tianhao Li, Pei Zhang, Xingzhang Ren, Mei Li, Yu Wan, Zhiwei Cao, Binbin Xie, Tianxiang Hu, Shangjie Li, Binyuan Hui, Bowen Yu, Dayiheng Liu, Baosong Yang, Fei Huang and Jun Xie.
"PolyLM: An Open Source Polyglot Large Language Model". [paper][huggingface][modelscope]
Xiangpeng Wei, Heng Yu, Yue Hu, Rongxiang Weng, Weihua Luo, Jun Xie and Rong Jin.
"Learning to Generalize to More: Continuous Semantic Augmentation for Neural Machine Translation". In ACL 2022. Outstanding Paper Award. [paper][code][slides]
Xiangpeng Wei, Yue Hu, Rongxiang Weng, Luxi Xing, Heng Yu and Weihua Luo.
"On Learning Universal Representations Across Languages". In ICLR 2021. [paper]
Xiangpeng Wei, Heng Yu, Yue Hu, Rongxiang Weng, Luxi Xing and Weihua Luo.
"Uncertainty-Aware Semantic Augmentation for Neural Machine Translation". In EMNLP 2020. [paper][code]
Xiangpeng Wei, Heng Yu, Yue Hu, Yue Zhang, Rongxiang Weng and Weihua Luo.
"Multiscale Collaborative Deep Models for Neural Machine Translation". In ACL 2020. [paper][code]
Xiangpeng Wei, Yue Hu, Luxi Xing, Yipeng Wang and Li Gao. "Translating with Bilingual Topic Knowledge for Neural Machine Translation". In AAAI 2019. [paper]
Xiangpeng Wei, Yue Hu, Luxi Xing and Li Gao. "Unsupervised Neural Machine Translation with Future Rewarding". In CoNLL 2019. [paper]
Rongxiang Weng, Heng Yu, Xiangpeng Wei and Weihua Luo.
"Towards Enhancing Faithfulness for Neural Machine Translation". In EMNLP 2020.