[Paper Review] DoRA: Weight-Decomposed Low-Rank Adaptation
2024. 2. 19.

Paper link: https://arxiv.org/abs/2402.09353

From the abstract: "Among the widely used parameter-efficient finetuning (PEFT) methods, LoRA and its variants have gained considerable popularity because of avoiding additional inference costs. However, there still often exists an accuracy gap between these methods and full fine-tuning."

LoRA, an effective parameter-efficient finetuning method that is currently drawing attention, is based on low-rank..
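Since the preview cuts off at LoRA's low-rank decomposition, here is a minimal sketch of the two merged-weight computations the abstract contrasts. This is an illustration under assumed shapes and names (W0, A, B, m, V, col_norm are all placeholders chosen here), not the paper's reference implementation.

import numpy as np

d, k, r = 64, 32, 4          # output dim, input dim, low rank (r << min(d, k))
rng = np.random.default_rng(0)

W0 = rng.standard_normal((d, k))        # frozen pretrained weight
B = np.zeros((d, r))                    # LoRA "up" factor, initialized to zero
A = rng.standard_normal((r, k)) * 0.01  # LoRA "down" factor

# LoRA: adapt the frozen weight with a low-rank additive update.
W_lora = W0 + B @ A

def col_norm(W):
    # Per-column L2 norm, shape (1, k), so it broadcasts over rows.
    return np.linalg.norm(W, axis=0, keepdims=True)

# DoRA: decompose the weight into a per-column magnitude m and a
# direction; apply the low-rank update to the direction, renormalize,
# and rescale by the (trainable) magnitude.
m = col_norm(W0)                        # trainable magnitude vector
V = W0 + B @ A                          # direction with low-rank update
W_dora = m * V / col_norm(V)            # W' = m * (W0 + BA) / ||W0 + BA||_c

print(W_lora.shape, W_dora.shape)       # (64, 32) (64, 32)

With B initialized to zero, both W_lora and W_dora reduce to W0, so finetuning starts from the pretrained model; DoRA's extra degree of freedom is that the magnitude m and the direction V are updated separately.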