Common questions about the RunComfy MCP server.

What deployments can I access?

The MCP server uses your API token to call the RunComfy Serverless API. You see exactly the same deployments as on your Deployments dashboard — both ComfyUI workflow deployments and LoRA deployments.

How is my API key handled?

Your API token is sent with every request in the Authorization: Bearer <token> header. The MCP server is stateless — it does not store, log, or cache your token. Each tool call forwards your token directly to api.runcomfy.net, which handles authentication and billing attribution.
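To illustrate, here is a minimal sketch of how a stateless forwarder constructs that header. This is illustrative only — the MCP server does this internally, and the environment variable name RUNCOMFY_API_TOKEN is an assumption, not part of the documented API:

```python
import os

def build_headers(token: str) -> dict:
    # The token travels only in the Authorization header of each request;
    # nothing is written to disk, logged, or cached between calls.
    return {"Authorization": f"Bearer {token}"}

# Hypothetical usage: read the token from the environment per request.
headers = build_headers(os.environ.get("RUNCOMFY_API_TOKEN", "rc_example"))
```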

What does it cost?

The MCP server itself is free. You pay only for the RunComfy resources you use:
  • Inference requests are billed the same as calling the Serverless API directly
  • Deployments with min_instances > 0 incur GPU uptime charges even when idle
  • Deployments with min_instances = 0 cost nothing when idle (scale-to-zero)
See Serverless API Billing for pricing details.

How do I pass images or videos as inputs?

When a workflow node requires an image, video, or audio file, pass the input directly in the overrides object of submit_request:
  • Public HTTPS URL (recommended): "image": "https://example.com/photo.jpg" — use a URL that returns the raw file without authentication
  • Base64 data URI: "image": "data:image/jpeg;base64,/9j/4AAQ..." — for inline content
No separate upload step is needed. See Async Queue Endpoints for more details and examples.
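The two input forms above can be sketched as helper functions. This is a hedged example, not part of the MCP tool surface: the node ID "12" and the input name "image" are hypothetical and must match your workflow's actual schema (see the node-ID question below):

```python
import base64

def image_override_from_url(node_id: str, url: str) -> dict:
    # Public HTTPS URL form: the URL must return the raw file without auth.
    return {node_id: {"inputs": {"image": url}}}

def image_override_from_bytes(node_id: str, data: bytes,
                              mime: str = "image/jpeg") -> dict:
    # Base64 data URI form: embed the file bytes inline.
    uri = f"data:{mime};base64," + base64.b64encode(data).decode("ascii")
    return {node_id: {"inputs": {"image": uri}}}

overrides = image_override_from_url("12", "https://example.com/photo.jpg")
```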

Are there rate limits?

The MCP server respects the same rate limits as the RunComfy Serverless API. There are no additional MCP-specific rate limits on top of that.

How do I find node IDs for my workflow?

Call get_deployment with include_payload=true. The response includes a payload_summary with every node’s ID, class type, and input names. Use these to build the overrides object for submit_request. For example, if the summary shows node_id: "6" with class_type: "CLIPTextEncode" and input_names: ["text", "clip"], your override would be:
{
  "6": {
    "inputs": {
      "text": "your prompt here"
    }
  }
}
For more context on workflow files and node schemas, see Workflow Files.

Can I use a different workflow without redeploying?

Yes. The submit_request tool accepts an optional workflow_api_json parameter that lets you send a full ComfyUI workflow inline. The deployment’s stored workflow is bypassed for that request. This is useful for testing workflow changes before updating the deployment. See Async Queue Endpoints — Send dynamic workflow for details.
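As a rough sketch, the arguments for such a call might look like the following. The deployment ID and the two-node workflow are hypothetical, and whether workflow_api_json is passed as a JSON string or an object may depend on your MCP client — treat this as an illustration of the shape, not a definitive schema:

```python
import json

# Hypothetical minimal ComfyUI workflow graph (node IDs and wiring invented
# for illustration; a real workflow has many more nodes).
workflow = {
    "6": {
        "class_type": "CLIPTextEncode",
        "inputs": {"text": "a red fox", "clip": ["4", 1]},
    },
}

request_args = {
    "deployment_id": "dep_example",              # hypothetical deployment ID
    "workflow_api_json": json.dumps(workflow),   # inline workflow bypasses the stored one
}
```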

Does it work with ChatGPT?

ChatGPT Enterprise supports MCP connectors but requires OAuth 2.0 authentication, which is not yet supported by the RunComfy MCP server. We are actively working on OAuth support. In the meantime, you can use RunComfy from ChatGPT via Custom GPT Actions with the Serverless API’s REST endpoints directly. Contact us at hi@runcomfy.com for a sample OpenAPI spec.

Where can I learn more about the Serverless API?

The MCP tools map directly to the Serverless API endpoints: