Parameter-Efficient Fine-Tuning


MoS: Unleashing parameter efficiency of low-rank adaptation with mixture of shards

A paper at ICLR 2025 presenting MoS, a method for more parameter-efficient LoRA via a mixture of shards.

Sheng Wang

PRoLoRA: Partial rotation empowers more parameter-efficient LoRA

A main conference paper at ACL 2024 presenting PRoLoRA for more parameter-efficient fine-tuning.

Sheng Wang

LoRA meets dropout under a unified framework

An ACL 2024 Findings paper unifying LoRA and dropout within a single framework.

Sheng Wang