Tao Gui (桂 韬)


Pre-tenured Associate Professor
Institute of Modern Languages and Linguistics
NLP Group, School of Computer Science
Fudan University

Contact: Shanghai, China, 200438

  • Room 242, Environmental Science Building, 220 Handan Rd

  • Room A3011, Intersection Building No.2, 2005 Songhu Rd

Email: tgui [@] fudan [DOT] edu [DOT] cn

[CV] [Github] [Google Scholar]

About me

Dr. Gui is a pre-tenured associate professor at the Institute of Modern Languages and Linguistics, Fudan University. He serves as vice-director of the NLP-Linguistic Intelligence lab (the NLP-LI lab of Fudan NLP, directed by Prof. Qi Zhang) and is a member of the larger NLP group led by Prof. Xuanjing Huang. He was fortunate to spend five years in this laboratory as a doctoral student, receiving his PhD in 2021. Before joining Fudan University, he earned a bachelor's degree from the National University of Defense Technology and served in the army for several years, an experience that instilled perseverance and team spirit.


My research interests include:

  • Natural Language Processing

  • Large Language Models

  • LLM-Powered Learning

Selected Publications


Secrets of RLHF in Large Language Models Part I: PPO
[Arxiv version], [Github]


TextFlint: Unified Multilingual Robustness Evaluation Toolkit for Natural Language Processing
[Arxiv version], [ACL demo version], [TextFlint platform], [Github]

  • RE-Matching: A Fine-Grained Semantic Matching Method for Zero-Shot Relation Extraction
    Jun Zhao, Wenyu Zhan, Tao Gui, Qi Zhang, Jin Ma and Ying Shan
    EMNLP 2022 Findings.

  • TextFusion: Privacy-Preserving Pre-trained Model Inference via Token Fusion
    Xin Zhou, Jinzhu Lu, Tao Gui, Ruotian Ma, Zichu Fei, Yuran Wang, Yong Ding, Yibo Cheung, Qi Zhang and Xuanjing Huang
    EMNLP 2022.

  • Efficient Adversarial Training with Robust Early-Bird Tickets
    Zhiheng Xi, Rui Zheng, Tao Gui, Qi Zhang and Xuanjing Huang
    EMNLP 2022.

  • Cross-Linguistic Syntactic Difference in Multilingual BERT: How Good is It and How Does It Affect Transfer?
    Ningyu Xu, Tao Gui, Ruotian Ma, Qi Zhang, Jingting Ye, Menghan Zhang and Xuanjing Huang
    EMNLP 2022.

  • Making Parameter-efficient Tuning More Efficient: A Unified Framework for Classification Tasks
    Xin Zhou, Ruotian Ma, Yicheng Zou, Xuanting Chen, Tao Gui, Qi Zhang, Xuanjing Huang, Rui Xie and Wei Wu
    COLING 2022.

  • Template-free Prompt Tuning for Few-shot NER
    Ruotian Ma, Xin Zhou, Tao Gui, Yiding Tan, Linyang Li, Qi Zhang, Xuanjing Huang
    NAACL 2022.

  • Searching for Optimal Subword Tokenization in Cross-domain NER
    Ruotian Ma, Yiding Tan, Xin Zhou, Xuanting Chen, Tao Gui, Di Liang, Sirui Wang, Wei Wu
    IJCAI 2022, Top 3.75%.

  • Flooding-X: Improving BERT's Resistance to Adversarial Attacks via Loss-Restricted Fine-Tuning
    Qin Liu, Rui Zheng, Bao Rong, Jingyi Liu, ZhiHua Liu, Zhanzhan Cheng, Liang Qiao, Tao Gui, Qi Zhang, Xuanjing Huang
    ACL 2022.

  • Robust Lottery Tickets for Pre-trained Language Models
    Rui Zheng, Rong Bao, Yuhao Zhou, Di Liang, Sirui Wang, Wei Wu, Tao Gui, Qi Zhang, Xuanjing Huang
    ACL 2022.

  • ONE2SET: Generating Diverse Keyphrases as a Set
    Jiacheng Ye, Tao Gui*, Yichao Luo, Yige Xu, and Qi Zhang. (* Corresponding author)
    ACL 2021. [pdf] [code]

  • SENT: Sentence-level Distant Relation Extraction via Negative Training
    Ruotian Ma, Tao Gui*, Linyang Li, Qi Zhang, Xuanjing Huang and Yaqian Zhou. (* Corresponding author)
    ACL 2021. [pdf] [code]

  • A Unified Generative Framework for Various NER Tasks
    Hang Yan, Tao Gui, Junqi Dai, Qipeng Guo, Zheng Zhang and Xipeng Qiu.
    ACL 2021. [pdf] [code]