djfobbz | 2 months ago
Lol, right?!?! I would've expected sequential PNGs followed by SVGs once the model improved.

CamperBob2 | 2 months ago
That's what the example code at https://old.reddit.com/r/StableDiffusion/comments/1pqnghp/qw... generates. You get 0.png, 1.png ... n.png, where n = the requested number of layers - 1. It'll drop a 600W RTX 6000 to its knees for about a minute, but it does work.

dvrp | 2 months ago
I saw some people at a company called Pruna AI got it down to 8 seconds with Cloudflare/Replicate, but I don't know whether it was on consumer hardware or an A100/H100/H200, and I don't know if the inference optimization is open-source yet.
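A minimal sketch of the output naming CamperBob2 describes: for a requested number of layers, the example code writes one PNG per layer, numbered 0 through n. This only illustrates the filename sequence; the actual generation code is in the linked Reddit post.

```python
def layer_filenames(num_layers: int) -> list[str]:
    """Filenames for a layered-PNG export: 0.png, 1.png, ..., (num_layers-1).png."""
    return [f"{i}.png" for i in range(num_layers)]

print(layer_filenames(4))  # → ['0.png', '1.png', '2.png', '3.png']
```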