Documentation Index

Fetch the complete documentation index at: https://docs.runcomfy.com/llms.txt

Use this file to discover all available pages before exploring further.

RunComfy provides four API products, an MCP server for AI assistants, and a CLI for the terminal. The APIs share the same high-level flow (submit > get a request_id > fetch status/results), but they solve different problems.

Serverless API (ComfyUI)

Deploy ComfyUI workflows as serverless endpoints.

Trainer API (LoRA)

Run AI Toolkit LoRA training jobs on GPUs — bring your dataset + YAML config.

Serverless API (LoRA)

Deploy LoRAs as serverless endpoints.

Model API

Run hosted models on-demand with no deployment — pay per request.

MCP

Connect AI assistants (Claude, Cursor, Windsurf) to your deployments via MCP.

CLI

Run RunComfy models from your terminal or any AI agent. One command to submit, poll, and download.

Which API should I use?

Use this as a quick decision guide:
| What you are trying to do | Recommended API | What you call with | Deployment required? |
| --- | --- | --- | --- |
| Train/fine-tune an AI Toolkit LoRA (upload dataset, run training, download artifacts) | Trainer API | dataset_id + job_id | No |
| Run a model from the RunComfy Models catalog (or a hosted pipeline) | Model API | model_id | No |
| Run inference with a LoRA without deploying anything | Model API | model_id + LoRA inputs | No |
| Turn a ComfyUI workflow into a production endpoint (versions, autoscaling, webhooks, instance proxy) | Serverless API (ComfyUI) | deployment_id | Yes |
| Serve a LoRA behind a dedicated, scalable endpoint | Serverless API (LoRA) | deployment_id | Yes |
One important mental model:
Both Serverless API (LoRA) and Serverless API (ComfyUI) are built on the same serverless deployment system. The difference is what you deploy and therefore what the request schema looks like.
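A purely illustrative sketch of that difference, with hypothetical field names (these are not the documented RunComfy request schemas): both products address a deployment by its deployment_id, while the shape of the inputs reflects what was deployed.

```python
# Hypothetical request bodies -- field names are illustrative only,
# NOT the documented RunComfy schemas.

# Serverless API (ComfyUI): inputs feed the deployed workflow's nodes.
comfyui_request = {
    "deployment_id": "dep_comfy_example",
    "inputs": {"prompt": "a red fox", "seed": 42},
}

# Serverless API (LoRA): inputs target the deployed LoRA instead.
lora_request = {
    "deployment_id": "dep_lora_example",
    "inputs": {"prompt": "a red fox", "lora_scale": 0.8},
}

# Same envelope, different input schema:
assert comfyui_request.keys() == lora_request.keys()
assert comfyui_request["inputs"] != lora_request["inputs"]
```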

Getting started


Common request pattern

Most RunComfy endpoints are asynchronous:
  1. Submit a job (POST …) > get an ID (request_id, job_id, etc.)
  2. Poll status (GET …/status) until it completes
  3. Fetch outputs (GET …/result) or use webhooks for push-based updates
If you are deploying workflows (Serverless API), you can also manage the deployment lifecycle (create/update/delete) and interact with live instances through the Instance Proxy.
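The three steps above can be sketched as a transport-agnostic polling helper. The status strings and callable signatures here are assumptions for illustration, not the documented RunComfy API; wire the callables to the actual submit/status/result endpoints of whichever product you use.

```python
import time


def run_job(submit, get_status, get_result, poll_interval=2.0, timeout=600.0):
    """Submit a job, poll until it reaches a terminal state, then fetch the result.

    `submit`, `get_status`, and `get_result` are callables wrapping the real
    HTTP calls (hypothetical here -- endpoint paths differ per API product).
    """
    request_id = submit()                        # step 1: POST ... -> request_id
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:           # step 2: poll GET .../status
        status = get_status(request_id)
        if status == "completed":
            return get_result(request_id)        # step 3: GET .../result
        if status in ("failed", "cancelled"):
            raise RuntimeError(f"job {request_id} ended with status {status!r}")
        time.sleep(poll_interval)
    raise TimeoutError(f"job {request_id} did not finish within {timeout}s")
```

Webhooks invert this pattern: instead of polling, you register a URL at submit time and the platform pushes the completion event to you.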