gbieler's comments

gbieler | 1 year ago

We just launched our cloud solution to deploy any ComfyUI workflow on scalable infrastructure. It works with any node and model.

When you upload your workflow, our system detects all the custom nodes and installs them automatically. It will also download all the models that it recognises. If it can't find a model, for example because you are using a custom LoRA, it will ask for a download link and add it to the deployment.
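As a rough illustration of the detection step (the built-in node list and the detection logic here are my own assumptions, not ViewComfy's actual implementation), a ComfyUI API-format workflow is a JSON object mapping node IDs to nodes with a `class_type`, so custom nodes can be found by comparing each `class_type` against the set of built-ins:

```python
# Minimal sketch of custom-node detection in a ComfyUI API-format workflow.
# BUILTIN_NODES is a toy subset; the real built-in list is much longer.
BUILTIN_NODES = {
    "KSampler", "CheckpointLoaderSimple", "CLIPTextEncode",
    "VAEDecode", "EmptyLatentImage", "SaveImage",
}

def find_custom_nodes(workflow: dict) -> set:
    """Return the class_types used by the workflow that are not built-in."""
    return {
        node["class_type"]
        for node in workflow.values()
        if node["class_type"] not in BUILTIN_NODES
    }

# Example API-format workflow: node id -> {"class_type": ..., "inputs": ...}
wf = {
    "1": {"class_type": "CheckpointLoaderSimple", "inputs": {}},
    "2": {"class_type": "UltimateSDUpscale", "inputs": {}},  # custom node
}
```

Here `find_custom_nodes(wf)` would flag `UltimateSDUpscale` as the one node that needs installing.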

We have 7 different GPUs available to cover all use cases:

- T4 (16GB VRAM)
- L4 (24GB VRAM)
- A10G (24GB VRAM)
- A100 (40GB VRAM)
- A100-80GB (80GB VRAM)
- L40S (48GB VRAM)
- H100 (80GB VRAM)

We’ve optimised this solution to work with our open-source app builder for Comfy workflows: https://github.com/ViewComfy/ViewComfy. This means that you can turn a workflow into a web app running in the cloud in just a few minutes.

The deployments can also be accessed via APIs that can easily be integrated into existing apps.
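As a sketch of what such an integration might look like (the endpoint URL, payload shape, and auth header are all placeholders I've assumed, not the actual ViewComfy API), a client could submit workflow parameters as a JSON POST request:

```python
import json
import urllib.request

# Hypothetical endpoint and token -- placeholders, not the real API.
API_URL = "https://example.com/api/v1/workflows/run"
API_TOKEN = "YOUR_API_TOKEN"

def build_request(workflow_params: dict) -> urllib.request.Request:
    """Build an HTTP request submitting workflow parameters as JSON."""
    body = json.dumps({"params": workflow_params}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )

req = build_request({"prompt": "a watercolor fox", "steps": 20})
```

Sending `req` with `urllib.request.urlopen` (or any HTTP client) would then trigger the deployed workflow with those parameters.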

gbieler | 1 year ago | on: Show HN: Open-source app builder for comfy workflows

Hey, thanks for sharing! I used it before and it is indeed useful for that kind of use case.

Like you said, there seems to be a gap between the people who can use Comfy as a development tool and those workflows' target users. The idea is to develop a set of tools to help bridge that gap.
