PRoLoRA: Partial Rotation Empowers More Parameter-Efficient LoRA
Aug 1, 2024
Sheng Wang
Boyang Xue
Jiacheng Ye
Jiyue Jiang
Liheng Chen
Lingpeng Kong
Chuan Wu

Abstract
PRoLoRA enhances LoRA with a partial rotation mechanism that raises parameter efficiency, matching or outperforming vanilla LoRA while using fewer trainable parameters.
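
The snippet below is a minimal, hypothetical sketch of the idea summarized above: a LoRA-style update whose low-rank factors reuse one small shared chunk of trainable parameters, "rotated" (circularly shifted) to fill every rank. The class name, shapes, and the exact rotation scheme are illustrative assumptions for exposition, not the paper's reference implementation.

```python
# Illustrative sketch only: parameter sharing via rotated copies of a small
# shared chunk, in the spirit of partial rotation for LoRA. Not the authors' code.
import torch
import torch.nn as nn


class PartiallyRotatedLoRA(nn.Module):
    def __init__(self, d_in, d_out, rank=8, shared_rank=2, scale=1.0):
        super().__init__()
        assert rank % shared_rank == 0
        self.copies = rank // shared_rank   # rotated copies needed to fill the full rank
        self.scale = scale
        # Only the small shared chunks are trainable, so there are fewer
        # trainable parameters than a plain rank-`rank` LoRA adapter.
        self.A_shared = nn.Parameter(torch.randn(shared_rank, d_in) * 0.01)
        self.B_shared = nn.Parameter(torch.zeros(d_out, shared_rank))

    def _expand(self, chunk, roll_dim, cat_dim):
        # Build the full-rank factor by stacking circularly shifted ("rotated")
        # copies of the shared chunk along the rank dimension.
        shifted = [torch.roll(chunk, shifts=i, dims=roll_dim) for i in range(self.copies)]
        return torch.cat(shifted, dim=cat_dim)

    def forward(self, x, base_out):
        # x: (batch, d_in); base_out: (batch, d_out) output of the frozen base layer.
        A = self._expand(self.A_shared, roll_dim=1, cat_dim=0)   # (rank, d_in)
        B = self._expand(self.B_shared, roll_dim=0, cat_dim=1)   # (d_out, rank)
        return base_out + self.scale * (x @ A.T) @ B.T


# Toy usage: the adapter adds a low-rank residual to a frozen layer's output.
layer = PartiallyRotatedLoRA(d_in=512, d_out=512, rank=8, shared_rank=2)
x = torch.randn(4, 512)
print(layer(x, base_out=torch.zeros(4, 512)).shape)  # torch.Size([4, 512])
```

In this sketch the trainable parameter count scales with `shared_rank` rather than `rank`, and `B_shared` is zero-initialized so the residual update starts at zero, mirroring standard LoRA practice.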
Type: Publication
Published in: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

Authors
Sheng Wang
(Forence)
PhD Graduate in Computer Science
Sheng Wang is a PhD graduate from The University of Hong Kong, supervised by Prof. Chuan Wu and Prof. Lingpeng Kong.
His research focuses on LLM agents, super-alignment, and data synthesis. He has published 14+ papers at top-tier
conferences, including NeurIPS 2025 (Spotlight), ICLR 2025, ACL 2024/2025, and EMNLP 2025.