MoS: Unleashing parameter efficiency of low-rank adaptation with mixture of shards
A paper at ICLR 2025 presenting MoS, a method for more parameter-efficient LoRA through mixture of shards.
A main conference paper at ACL 2024 presenting PRoLoRA for more parameter-efficient fine-tuning.
An ACL 2024 Findings paper unifying LoRA and dropout within a single framework.