Haotong Qin

Center for Project-Based Learning D-ITET, ETH Zürich
Office: ETZ D 97.4, Gloriastrasse 35, 8092 Zürich, Switzerland
Email: qinhaotong@gmail.com

I am a Postdoctoral Researcher at the Center for Project-Based Learning (PBL) D-ITET, ETH Zürich, Switzerland, working with PD Dr. Michele Magno. Previously, I received my Ph.D. degree from the State Key Laboratory of Complex & Critical Software Environment (SKLCCSE), Beihang University, supervised by Prof. Wei Li and Prof. Xianglong Liu. I obtained my B.E. degree from the School of Computer Science and Engineering (SCSE), Beihang University. I interned at Microsoft Research Asia, ByteDance AI Lab, and Tencent WeChat.

My research interests broadly include efficient deep learning. Specifically, I focus on deep model compression (e.g., network binarization, quantization, and distillation), efficient generative models (e.g., efficient large language models and diffusion models), neuromorphic computing (e.g., spiking neural networks), hardware acceleration (e.g., hardware-aware architecture search), etc. I serve or have served as an Area Chair for BMVC'2024 and AISTATS'2025, a Program Committee member for ICML'(2023-2024), NeurIPS'(2023-2024), ICLR'(2024-2025), CVPR'(2023-2025), ICCV'2023, ECCV'2024, etc., and an Organizer or Challenge Chair for workshops at CVPR'(2022-2024), AAAI'(2022-2023), and IEEE CAI'2024.

For prospective collaborators: if you are interested in joining or visiting our lab, or in working with us remotely, please email your self-introduction, the project you are interested in, and your CV to haotong.qin@pbl.ee.ethz.ch.
[Special Issue] Model Compression in the Era of Large Language Models @ Neural Networks

News

2024-11 I receive the CSIG 2024 Outstanding PhD Thesis Award (10 people nationwide).
2024-10 I am serving as an Area Chair for AISTATS 2025.
2024-09 Four papers (one spotlight) and one workshop paper are accepted by NeurIPS 2024.
2024-06 I receive the WAIC Yunfan Award (Rising Star) 2024 (15 people worldwide).
2024-05 I am serving as a Guest Editor for Neural Networks.
2024-05 One paper on LLM quantization is accepted by ACL 2024.
2024-05 Five papers (one oral) on efficient LLMs and DMs are accepted by ICML 2024.
2024-05 I am serving as an Area Chair for BMVC 2024.
2024-04 I receive the Electronics 2023 Best PhD Thesis Award (2 people worldwide).
2024-01 I am named an Outstanding Graduate of Beijing 2024.
2024-01 I receive the Baidu Scholarship 2023 (10 people worldwide).
2023-09 Two first-authored papers (one spotlight) on model quantization are accepted by NeurIPS 2023.
2023-09 I receive the China National Scholarship (3rd time).
2023-06 We release our open source project "Awesome Efficient AIGC".
2023-05 I am named one of the CVPR 2023 Outstanding Reviewers.
2023-04 One first-authored paper on data-free quantization is accepted by IEEE TPAMI.
2023-04 One first-authored paper on benchmarking model binarization is accepted by ICML 2023.
2023-04 I receive the DAAD AInet Fellowship 2023.
2023-02 One first-authored paper on FSMN binarization is accepted by IEEE TNNLS.
2022-12 I am selected for the Rising Stars in AI Symposium 2023 organized by the KAUST AI Initiative.
2022-10 I receive the ByteDance Scholarship 2022.
2022-10 Our BiPointNet (ICLR'21) is integrated into Amazon's Deep Graph Library (DGL).
2022-09 One first-authored paper on model binarization is accepted by IJCV.
2022-06 One co-authored paper on ViT quantization is accepted by ACM MM 2022.
2022-06 I receive the Beihang Top-10 PhD Students Award 2022.
2022-05 Our BiBERT (ICLR'22) is integrated into Baidu's deep learning platform PaddlePaddle.
2022-04 One first-authored paper on FSMN binarization is accepted by IJCAI 2022.
2022-03 Our survey paper on binary neural networks is selected as an ESI Highly Cited Paper.
2022-01 One first-authored paper on BERT binarization is accepted by ICLR 2022.
2021-09 Our BiPointNet (ICLR'21) receives the Most Popular Paper award in the Beijing area.
2021-09 I receive the China National Scholarship 2021 (2nd time).
2021-05 I receive the Huawei Scholarship 2021.
2021-03 One co-first-authored oral paper on data-free quantization is accepted by CVPR 2021.
2021-01 One first-authored paper on PointNet binarization is accepted by ICLR 2021.
2020-09 I receive the China National Scholarship 2020.
2020-09 We release our open source project "Awesome Model Quantization".
2020-02 One first-authored paper on model binarization is accepted by CVPR 2020.
2020-02 One first-authored survey paper on binary neural networks is accepted by Pattern Recognition (PR).

Selected Publications

  1. Binarized Diffusion Model for Image Super-Resolution
    Zheng Chen, Haotong Qin*, Yong Guo, Xiongfei Su, Xin Yuan, Linghe Kong, Yulun Zhang*
    In Conference on Neural Information Processing Systems (NeurIPS), 2024
  2. BiDM: Pushing the Limit of Quantization for Diffusion Models
    Xingyu Zheng, Xianglong Liu, Yichen Bian, Xudong Ma, Yulun Zhang, Jiakai Wang, Jinyang Guo, Haotong Qin
    In Conference on Neural Information Processing Systems (NeurIPS), 2024
  3. Accurate LoRA-Finetuning Quantization of LLMs via Information Retention
    Haotong Qin, Xudong Ma, Xingyu Zheng, Xiaoyang Li, Yang Zhang, Shouda Liu, Jie Luo, Xianglong Liu, Michele Magno
    In International Conference on Machine Learning (ICML Oral), 2024
  4. BiLLM: Pushing the Limit of Post-Training Quantization for LLMs
    Wei Huang, Yangdong Liu, Haotong Qin*, Ying Li, Shiming Zhang, Xianglong Liu, Michele Magno, Xiaojuan Qi
    In International Conference on Machine Learning (ICML), 2024
  5. Flexible Residual Binarization for Image Super-Resolution
    Yulun Zhang, Haotong Qin, Zixiang Zhao, Xianglong Liu, Martin Danelljan, Fisher Yu
    In International Conference on Machine Learning (ICML), 2024
  6. QuantSR: Accurate Low-bit Quantization for Efficient Image Super-Resolution
    Haotong Qin, Yulun Zhang, Yifu Ding, Yifan Liu, Xianglong Liu, Martin Danelljan, Fisher Yu
    In Conference on Neural Information Processing Systems (NeurIPS Spotlight), 2023
  7. BiMatting: Efficient Video Matting via Binarization
    Haotong Qin, Lei Ke, Xudong Ma, Martin Danelljan, Yu-Wing Tai, Chi-Keung Tang, Xianglong Liu, Fisher Yu
    In Conference on Neural Information Processing Systems (NeurIPS), 2023
  8. BiBench: Benchmarking and Analyzing Network Binarization
    Haotong Qin, Mingyuan Zhang, Yifu Ding, Aoyu Li, Zhongang Cai, Ziwei Liu, Fisher Yu, Xianglong Liu
    In International Conference on Machine Learning (ICML), 2023
  9. BiBERT: Accurate Fully Binarized BERT
    Haotong Qin, Yifu Ding, Mingyuan Zhang, Qinghua Yan, Aishan Liu, Qingqing Dang, Ziwei Liu, Xianglong Liu
    In International Conference on Learning Representations (ICLR), 2022
  10. BiFSMN: Binary Neural Network for Keyword Spotting
    Haotong Qin, Xudong Ma, Yifu Ding, Xiaoyang Li, Yang Zhang, Yao Tian, Zejun Ma, Jie Luo, Xianglong Liu
    In International Joint Conference on Artificial Intelligence (IJCAI), 2022
  11. Diversifying Sample Generation for Accurate Data-Free Quantization
    Xiangguo Zhang, Haotong Qin, Yifu Ding, Ruihao Gong, Qinghua Yan, Renshuai Tao, Yuhang Li, Fengwei Yu, Xianglong Liu
    In Computer Vision and Pattern Recognition (CVPR Oral), 2021
  12. BiPointNet: Binary Neural Network for Point Clouds
    Haotong Qin, Zhongang Cai, Mingyuan Zhang, Yifu Ding, Haiyu Zhao, Shuai Yi, Xianglong Liu, Hao Su
    In International Conference on Learning Representations (ICLR), 2021
  13. Forward and Backward Information Retention for Accurate Binary Neural Networks
    Haotong Qin, Ruihao Gong, Xianglong Liu, Mingzhu Shen, Ziran Wei, Fengwei Yu, Jingkuan Song
    In Computer Vision and Pattern Recognition (CVPR), 2020