enum | 4 months ago
Does it work if you change to torch.bfloat16?
- https://publish.obsidian.md/aixplore/Practical+Applications/... The PyTorch 2.9 wheels do work. You can pip install torch --index-url <whatever-it-is> and it just works. You do need to build flash attention from source, which takes an hour or so.
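For context on the bfloat16 question: bfloat16 keeps float32's 8 exponent bits (so the same dynamic range) and cuts the mantissa from 23 bits to 7, which is why changing a model's dtype to torch.bfloat16 often avoids the overflow problems float16 can hit. A minimal stdlib-only sketch of the format, converting by simple truncation of the top 16 bits of a float32 (real conversions, including PyTorch's, typically round rather than truncate):

```python
import struct

def float_to_bfloat16_bits(x: float) -> int:
    """Encode x as bfloat16 by truncating a float32 to its top 16 bits."""
    f32_bits = struct.unpack(">I", struct.pack(">f", x))[0]
    return f32_bits >> 16  # keep sign + 8 exponent bits + top 7 mantissa bits

def bfloat16_bits_to_float(b: int) -> float:
    """Decode bfloat16 bits back to a float by zero-padding the low 16 bits."""
    return struct.unpack(">f", struct.pack(">I", b << 16))[0]

# Values with <= 7 mantissa bits survive the round trip exactly:
print(bfloat16_bits_to_float(float_to_bfloat16_bits(1.0)))       # 1.0
print(bfloat16_bits_to_float(float_to_bfloat16_bits(3.140625)))  # 3.140625
```

Anything needing more than 7 mantissa bits loses precision (3.14159 comes back as 3.140625), but the exponent range matches float32, so large activations don't overflow the way they do in float16.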