Deploy FLUX.2

FLUX.2 is Black Forest Labs' 32B-parameter image model with multi-image editing capabilities. The Dev FP8 variant runs on RTX 4090/5090-class cards, while the full model requires 48GB+ VRAM for maximum quality.

Deploy FLUX.2 in minutes

Starting at $0.53/hr on dedicated GPU

Available Variants (3)

Model               Variant              GPU              VRAM    Price
FLUX.2 Dev FP8      Dev FP8 (Balanced)   RTX A6000        48 GB   $0.66/hr
FLUX.2 Dev Full     Dev Full (64GB)      A100 80GB PCIe   80 GB   $1.85/hr
FLUX.2 Dev Q4 GGUF  Dev Q4 (Low VRAM)    L4               24 GB   $0.53/hr

Prices include 30% service fee. Billed per minute while running.
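
Per-minute billing means a short session costs only a fraction of the hourly rate. A minimal sketch of the arithmetic, using rates from the table above (the helper name is illustrative, not part of any API):

```python
def session_cost(hourly_rate: float, minutes: int) -> float:
    """Estimate the cost of a session billed per minute (illustrative helper)."""
    return round(hourly_rate / 60 * minutes, 2)

# Example: 45 minutes on the L4 at $0.53/hr
print(f"${session_cost(0.53, 45):.2f}")
```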

Requirements

FLUX.2 requires 24–80GB VRAM depending on variant. Consumer GPUs like the RTX 5080 (16GB) or RTX 4090 (24GB) may not have enough memory for larger variants.

On ModelPilot, deploy on a dedicated cloud GPU (up to 80GB VRAM) starting at $0.53/hr with no setup required.

Includes full ComfyUI environment with custom node support.

Use Cases

  • Multi-image editing workflows
  • Professional image generation
  • Fine-grained image control
  • Production-quality outputs

Frequently Asked Questions

How much VRAM does FLUX.2 need?

FLUX.2 requires 24–80GB VRAM depending on the variant.

How much does it cost to run FLUX.2?

Starting at $0.53/hr on a dedicated GPU. Billed per minute while running, with auto-stop when credits run out.

How long does FLUX.2 take to deploy?

Most deployments complete in 10–20 minutes including model download and environment setup.

Can I run FLUX.2 on my local GPU?

You can run smaller variants locally if your GPU has enough VRAM. For larger variants or sustained production use, cloud GPUs offer more capacity and reliability.
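
As a rough sanity check before picking a variant, you can compare a card's memory against the VRAM figures listed on this page. A minimal sketch, assuming the per-variant minimums match the table above (the dictionary and function names are illustrative):

```python
# Approximate VRAM needed per variant, in GB (figures from the variants table)
VRAM_REQUIRED = {
    "Dev Q4 GGUF": 24,
    "Dev FP8": 48,
    "Dev Full": 80,
}

def runnable_variants(gpu_vram_gb: int) -> list[str]:
    """Return the FLUX.2 variants that fit in the given VRAM (illustrative)."""
    return [name for name, need in VRAM_REQUIRED.items() if gpu_vram_gb >= need]

# A 24 GB card fits only the Q4 quantization
print(runnable_variants(24))  # → ['Dev Q4 GGUF']
```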

Ready to deploy FLUX.2?

Pick your GPU and have it running in minutes. No infrastructure setup required.