npollock | 8 months ago

LoRA adapters modify the model's internal weights

make3 | 8 months ago

Not unless they're explicitly merged, which isn't a requirement; merging is just a small inference-speed optimization.
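
A minimal sketch of what "kept separate" means (PyTorch, illustrative names; not any particular library's API): the base weights stay frozen and unmodified, and the low-rank adapter just runs alongside them.

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        # Wraps a frozen nn.Linear and adds a trainable low-rank update on the side.
        def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad_(False)       # base weights are never modified
            self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
            self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: adapter starts as a no-op
            self.scale = alpha / r

        def forward(self, x):
            # adapter output is added alongside; the base weight is untouched
            return self.base(x) + (x @ self.A.T @ self.B.T) * self.scale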

_ea1k | 8 months ago

Yeah, I honestly think some of the language used around LoRA gets in the way of people understanding it. It becomes much easier to understand once you look at an actual implementation, including how the adapters can be merged into the base weights or kept separate.
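
A small self-contained demo of that merge equivalence (plain PyTorch, made-up shapes): folding B @ A into the base weight gives exactly the same function as keeping the adapter separate, which is why merging is only a speed optimization.

    import torch

    torch.manual_seed(0)
    out_f, in_f, r = 4, 6, 2
    W = torch.randn(out_f, in_f)        # frozen base weight
    A = torch.randn(r, in_f) * 0.01     # LoRA down-projection
    B = torch.randn(out_f, r)           # LoRA up-projection (nonzero here to show the effect)
    scale = 2.0                         # alpha / r

    x = torch.randn(3, in_f)
    separate = x @ W.T + (x @ A.T @ B.T) * scale   # adapter kept separate
    merged_W = W + (B @ A) * scale                 # adapter folded into the weight
    merged = x @ merged_W.T

    print(torch.allclose(separate, merged, atol=1e-5))  # True: same function

Merging saves the extra adapter matmuls at inference time, at the cost of no longer being able to swap or disable the adapter without keeping a copy of the original weights.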