Penghui Yang 杨鹏辉

Penghui Yang is a PhD student at the College of Computing and Data Science, Nanyang Technological University, supervised by Prof. Bo An. He received his B.Sc. in Computer Science from Nanjing University of Aeronautics and Astronautics in 2023, where he was advised by Prof. Sheng-Jun Huang. He previously collaborated closely with Dr. Ming-Kun Xie and Prof. Lei Feng, and currently works closely with Dr. Cunxiao Du.

Google Scholar | DBLP

News
  • [Jul 2023] One paper was accepted to ICCV 2023.

Research Highlights

My long-term research goal is to develop efficient and scalable methods for accelerating and compressing machine learning models. My current focus is speculative decoding for speeding up LLM inference; my earlier work used knowledge distillation to accelerate computer vision models. I aim to push the boundaries of fast, lightweight AI systems and make them more practical and widely accessible.
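
For visitors less familiar with this line of work, below is a minimal, self-contained sketch of one round of speculative decoding (the rejection-sampling variant). The `draft_probs` and `target_probs` functions are hypothetical toy stand-ins for a small draft model and the large target model; in a real system the target model scores all drafted tokens in a single parallel forward pass, which is where the speedup comes from.

```python
# Minimal sketch of one speculative-decoding round (rejection-sampling
# variant). draft_probs / target_probs are hypothetical toy stand-ins
# for real language models over an 8-token vocabulary.
import numpy as np

VOCAB = 8
rng = np.random.default_rng(0)

def _softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()

def draft_probs(prefix):
    # Stand-in for the small draft model: a fixed distribution per prefix.
    r = np.random.default_rng(hash(tuple(prefix)) % 2**32)
    return _softmax(r.standard_normal(VOCAB))

def target_probs(prefix):
    # Stand-in for the large target model.
    r = np.random.default_rng((hash(tuple(prefix)) ^ 0xBEEF) % 2**32)
    return _softmax(r.standard_normal(VOCAB))

def speculative_round(prefix, k=4):
    # 1) Draft: the small model proposes k tokens autoregressively.
    ctx, drafts, qs = list(prefix), [], []
    for _ in range(k):
        q = draft_probs(ctx)
        t = int(rng.choice(VOCAB, p=q))
        drafts.append(t); qs.append(q); ctx.append(t)
    # 2) Verify: the target model scores each draft position
    #    (a single parallel forward pass in a real system).
    ps = [target_probs(list(prefix) + drafts[:i]) for i in range(k)]
    # 3) Accept draft token t with prob min(1, p[t]/q[t]); on the first
    #    rejection, resample from the residual max(p - q, 0), which keeps
    #    the output distributed exactly as the target model alone.
    out = list(prefix)
    for i, t in enumerate(drafts):
        p, q = ps[i], qs[i]
        if rng.random() < min(1.0, p[t] / q[t]):
            out.append(t)
        else:
            resid = np.maximum(p - q, 0.0)
            out.append(int(rng.choice(VOCAB, p=resid / resid.sum())))
            return out
    # All k drafts accepted: take one "bonus" token from the target model.
    out.append(int(rng.choice(VOCAB, p=target_probs(out))))
    return out

print(speculative_round([1, 2, 3]))
```

With k drafted tokens per round, each target-model forward pass can emit up to k + 1 tokens instead of one, which is the source of the wall-clock speedup.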

If you are interested in collaboration, feel free to get in touch with me.

Publications

A Unified Open Adapter for Open-World Noisy Label Learning: Data-Centric and Learning-Based Insights
Chen-Chen Zong, Peng-Hui Yang, Ming-Kun Xie, Sheng-Jun Huang

TCSVT 2025 Paper | Code

LongSpec: Long-Context Speculative Decoding with Efficient Drafting and Verification
Penghui Yang*, Cunxiao Du*, Fengzhuo Zhang, Haonan Wang, Tianyu Pang, Chao Du, Bo An

arXiv 2025 Paper | Project Page | Code

Sailor2: Sailing in South-East Asia with Inclusive Multilingual LLMs
Sailor2 Team

arXiv 2025 Paper | Project Page | Code

Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head
Penghui Yang, Chen-Chen Zong, Sheng-Jun Huang, Lei Feng, Bo An

arXiv 2024 Paper

Mitigating Backdoor Attacks in Federated Learning via Flipping Weight Updates of Low-Activation Input Neurons
Bin-Bin Ding, Penghui Yang, Ze-Qing Ge, Sheng-Jun Huang

arXiv 2024 Paper

Multi-Label Knowledge Distillation
Penghui Yang*, Ming-Kun Xie*, Chen-Chen Zong, Lei Feng, Gang Niu, Masashi Sugiyama, Sheng-Jun Huang

ICCV 2023 Paper | Code