
Faculty Directory

Rui Wang
Tenure-Track Associate Professor

Email: wangrui12@sjtu.edu.cn

Institute: Institute of General Artificial Intelligence

Homepage: https://wangruinlp.github.io/

Biography

I have been working at Shanghai Jiao Tong University since 2021. Before that, I worked at the National Institute of Information and Communications Technology (NICT), Japan, from 2016 to 2020. From 2012 to 2016, I was a joint Ph.D. student at Shanghai Jiao Tong University and the French National Centre for Scientific Research (CNRS).


Language intelligence is the capacity by which humans and machines learn about the outside world through natural language, form highly abstract linguistic representations, and then understand, improve, and create within that world. We aim to explore and extend the boundaries of language intelligence in the following areas:


1. Computational Linguistics: An interdisciplinary field spanning computer science, linguistics, cognitive science, and psychology.

2. Language Modeling: The mechanisms and applications of language models.

3. Machine Translation: I have worked on it for over ten years and will never give it up.


Teaching

Lecture

CS3966: Natural Language Processing and Large Language Model (for the John Class), 2024-
CS3602: Natural Language Processing (for the CS and AI major), 2021-
CS438: Information Extraction, 2021-2023
CS247: Data Mining, 2021-2022


Tutorial

Advances and Challenges in Unsupervised Neural Machine Translation
    Rui Wang and Hai Zhao
    EACL, 2021
Syntax in End-to-End Natural Language Processing
    Hai Zhao, Rui Wang, and Kehai Chen
    EMNLP, 2021
Domain Adaptation for Neural Machine Translation
    Chenhui Chu and Rui Wang
    CCMT, 2019

Publications

[Google Scholar] [DBLP]

Representative Works 

[Neural Theory-of-Mind Networks]: Spontaneous High-Order Generalization in Neural Theory-of-Mind Networks. arXiv, 2025

[DeepMath-103K]: DeepMath-103K: A Large-Scale, Challenging, Decontaminated, and Verifiable Mathematical Dataset for Advancing Reasoning. arXiv, 2025

[Overthinking]: Do NOT Think That Much for 2+3=? On the Overthinking of o1-Like LLMs. ICML, 2025

[Chain-of-Embedding]: Latent Space Chain-of-Embedding Enables Output-free LLM Self-Evaluation. ICLR, 2025

[Unsupervised Machine Translation]: Towards Unsupervised Machine Translation. EACL, 2021

[Human-like Machine Translation]: Exploring Human-Like Translation Strategy with Large Language Models. TACL, 2024

Funded Projects

PI of NSFC "Multilingual Unsupervised Machine Translation", 2022-2025

PI of JSPS "Toward Unsupervised Machine Translation", 2019-2020

PI of a series of CCF-Tencent/Baidu/Ant/Alibaba joint projects on NLP and LLMs, 2021-present

Awards

WMT-2024: 1st place in Translation into Low-Resource Languages of Spain [Paper]
WMT-2023: 1st place in Word-Level AutoCompletion [Paper]
WMT-2022: 1st place in Livonian<->English [Paper]
IWSLT-2022: 1st place in the Simultaneous Speech Translation task [Results] [Paper]
WMT-2020: 1st place in three tasks (English->Chinese, Polish->English, and German-Upper Sorbian) [Results] [Paper]
CoNLL-2019: 1st place in the DM sub-task and 2nd place overall [Results] [Paper]
WMT-2019: 1st place in the only unsupervised MT task (German-Czech) [Results] [Paper]
WAT-2018: 1st place in Myanmar (Burmese) <- English [Results] [Paper]
WMT-2018: 1st place in four tasks (English<->Estonian and English<->Finnish) [Results] [Paper]

Academic Service

Program Chair: MT Summit 2023 (Research Track)
Senior Area Chair: EACL-2026, EMNLP-2025, NAACL-2025, ACL-2024
Area Chair: ICML (2023-2025), NeurIPS (2022-2025), ICLR (2021-2025), EMNLP (2022), NAACL (2021), CoNLL (2021-2022), CCL (2018-2019)
Associate Editor: ACL Rolling Review (2024), IEICE Transactions on Information and Systems (2022-2024)
Standing Reviewer: CL (2020-2024) and TACL (2020-2024)