Welcome to the website of the Learning and INference Systems (LINs) Lab at Westlake University!

Our research interests lie at the intersection of optimization and generalization for deep learning:

  • leveraging theoretical and empirical understanding (e.g., loss landscapes and training dynamics)
  • to design efficient & robust methods (for both learning and inference)
  • for deep learning (centralized) and collaborative deep learning (distributed and/or decentralized),
  • under imperfect environments (e.g., noisy, heterogeneous, and hardware-constrained).

News

Feb 27, 2024 Our paper on efficient dataset distillation was accepted at CVPR 2024. Congratulations to Peng.
Feb 4, 2024 We are looking for self-motivated research interns! Please email us with your CV.
Jan 16, 2024 Two papers from our group were accepted at ICLR 2024: one on model selection for robust multi-modal model reasoning, and one on parameter-efficient fine-tuning. Congratulations to Xiangyan, Rongxue, Haobo, and Hao. In addition, we will present workshop papers on collaborative knowledge editing for LLMs, flash tree-attention for efficient LLM inference, federated unlearning, and any-scale dataset distillation.
Sep 22, 2023 One paper was accepted to NeurIPS 2023. Congratulations to Lin.
Jul 14, 2023 One paper was accepted to ICCV 2023. Congratulations to Zexi.

Selected publications

  1. ICML 2023
    On Pitfalls of Test-time Adaptation
    In International Conference on Machine Learning (ICML) 2023; abridged versions in the ICLR 2023 Workshop on Trustworthy ML and the ICLR 2023 Workshop on DG (spotlight)
  2. ICLR 2023
    Test-Time Robust Personalization for Federated Learning
    In International Conference on Learning Representations (ICLR) 2023
  3. ICML 2021
    Quasi-global Momentum: Accelerating Decentralized Deep Learning on Heterogeneous Data
    In International Conference on Machine Learning (ICML) 2021
  4. NeurIPS 2020
    Ensemble Distillation for Robust Model Fusion in Federated Learning
    In Advances in Neural Information Processing Systems (NeurIPS) 2020
  5. ICLR 2020
    Decentralized Deep Learning with Arbitrary Communication Compression
    In International Conference on Learning Representations (ICLR) 2020
  6. ICLR 2020
    Don’t Use Large Mini-batches, Use Local SGD
    In International Conference on Learning Representations (ICLR) 2020