abcdabcd987 | 2 years ago
We are polishing the 4-bit code. It will be added to Punica code base soon. Please stay tuned :)
Palmik | 2 years ago
So Atom base models would be compatible with Punica?
I also wonder: since many people already train LoRAs with the base model in 8-bit or even 4-bit, would it make sense to match the quantization algorithm used during training with the one used at inference?
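A minimal sketch of why the question matters, using NumPy and two hypothetical 4-bit schemes (neither is the actual algorithm used by Atom or by any particular training framework): the LoRA delta is trained against the *quantized* base weights, so if inference quantizes the base with a different algorithm, the effective weights the adapter was fitted to no longer match.

```python
import numpy as np

def quantize_absmax_4bit(w):
    # Symmetric absmax 4-bit fake-quant (hypothetical training-side scheme).
    scale = np.abs(w).max() / 7.0
    return np.clip(np.round(w / scale), -8, 7) * scale

def quantize_asym_4bit(w):
    # Asymmetric min/max 4-bit fake-quant (hypothetical inference-side scheme).
    lo, hi = w.min(), w.max()
    scale = (hi - lo) / 15.0
    return np.round((w - lo) / scale) * scale + lo

rng = np.random.default_rng(0)
W = rng.standard_normal((16, 16)).astype(np.float32)   # frozen base weight
A = rng.standard_normal((16, 4)).astype(np.float32) * 0.01  # LoRA factors
B = rng.standard_normal((4, 16)).astype(np.float32) * 0.01

# During QLoRA-style training the adapter compensates for THIS base:
W_train = quantize_absmax_4bit(W)
# If inference re-quantizes with a different algorithm, the base shifts:
W_infer = quantize_asym_4bit(W)

x = rng.standard_normal((1, 16)).astype(np.float32)
y_train = x @ (W_train + A @ B)
y_infer = x @ (W_infer + A @ B)
print("max output drift from mismatched quantizers:",
      float(np.abs(y_train - y_infer).max()))
```

The drift is purely from the base-weight mismatch, not from the adapter, which is why serving stacks that merge quantized bases with LoRAs generally want the same quantization scheme (or the same dequantized weights) at both ends.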