Feel the Power: Run ComfyUI on Cloud GPUs - Full VM Setup Guide
Running ComfyUI locally is great, but unless you're running an RTX 5090 or similar high-end hardware, you'll run into bottlenecks sooner rather than later.
Using cloud-hosted GPUs from platforms like CloudRift will not only spare your own hardware, it also gives you much faster results. Plus, you only pay for what you use. In a previous article I gave you a walkthrough on how to set up ComfyUI in a pre-configured container. That's a great way to start out, but as soon as you want to add checkpoints and configure your workflows further, you'll need some additional freedom.
If that's what you're looking for, and you want to configure your own environment instead of using a pre-built container, this guide shows how to launch ComfyUI on a rented GPU-powered VM in minutes.
This is Part 2 of our ComfyUI series. Part 1: Preconfigured Containers →
Prerequisites
Before starting, you'll need a GPU VM:
- Register at CloudRift Console
- Add a few dollars in credits
- Click "New" and select "VM" under the Virtualization step


- Choose Ubuntu 24.04 as your OS

Once your VM is running, continue below.
Quick Overview
- Works on Ubuntu 22.04 and 24.04
- Tested on RTX 4090 instances
- No extra dependencies
- Takes less than 3 minutes
Step-by-Step Setup
1. Connect to your VM
Open your terminal (or Command Prompt on Windows) and connect to your VM via SSH:
ssh riftuser@<VM_IP>
Replace <VM_IP> with your actual VM's IP address. You'll find it in your Dashboard.


2. Verify GPU
Once connected to your VM via SSH, verify your GPU is accessible by running:
nvidia-smi
If you see your GPU listed, everything is ready.

3. Check if Docker is installed
docker --version
This Ubuntu version typically ships with Docker pre-installed. If it's missing, install it with:
sudo apt update && sudo apt install -y docker.io
sudo systemctl enable docker --now
4. Fix permissions
sudo usermod -aG docker $USER
newgrp docker
docker ps
An empty list confirms Docker is working without sudo.
5. Confirm GPU access (optional)
For Ubuntu 22.04:
docker run --rm --gpus all nvidia/cuda:12.3.2-base-ubuntu22.04 nvidia-smi
For Ubuntu 24.04:
docker run --rm --gpus all nvidia/cuda:12.4.0-base-ubuntu24.04 nvidia-smi
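If this test fails with an error like "could not select device driver ... [[gpu]]", Docker's NVIDIA integration is missing. A sketch of the usual fix on Ubuntu, assuming NVIDIA's apt repository is already configured on the image:

```shell
# Install the NVIDIA Container Toolkit so Docker can pass GPUs into containers
sudo apt update && sudo apt install -y nvidia-container-toolkit
# Register the NVIDIA runtime with Docker, then restart the daemon
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```

Afterwards, rerun the `docker run --rm --gpus all ... nvidia-smi` test above to confirm.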
6. Launch ComfyUI
mkdir -p ~/comfyui/models/checkpoints
docker run -d \
--name comfyui \
--gpus all \
-p 8188:8188 \
-e WEB_ENABLE_AUTH=false \
-v ~/comfyui/models:/opt/ComfyUI/models \
ghcr.io/ai-dock/comfyui:latest-cuda

7. Add a model checkpoint
cd ~/comfyui/models/checkpoints
wget -O flux1-schnell-fp8.safetensors \
"https://huggingface.co/Comfy-Org/flux1-schnell/resolve/main/flux1-schnell-fp8.safetensors"
docker restart comfyui
This is the Flux Schnell model, which is a great starting point. Of course, you can choose whichever model you like.
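Swapping in a different model later follows the same pattern: download into the mounted checkpoints folder, then restart the container so ComfyUI picks it up. A sketch with a placeholder URL (fill in a direct download link to a `.safetensors` file from Hugging Face or Civitai):

```shell
cd ~/comfyui/models/checkpoints
# <MODEL_URL> is a placeholder - use the direct link to your chosen checkpoint
wget -O my-model.safetensors "<MODEL_URL>"
docker restart comfyui
```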

8. Open ComfyUI in your browser
To access ComfyUI, go to your browser and enter: http://<VM_IP>:8188
Replace <VM_IP> with your VM's IP address.
If the port is blocked, go back to your terminal and run:
sudo ufw allow 8188/tcp
Once ComfyUI loads, you'll see the default workflow. You can adjust your prompt, set the CFG to 1 for the Flux Schnell model for best results, and click "Queue Prompt" to start generating images.
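Once the UI works, you can also drive ComfyUI over its HTTP API, which is handy for scripting batches from your own machine. A hedged sketch: `workflow_api.json` is a hypothetical filename for a workflow exported via "Save (API Format)" in the ComfyUI menu.

```shell
HOST="http://<VM_IP>:8188"
# Health check - returns system and GPU info as JSON
curl -s "$HOST/system_stats"
# Queue a workflow: wrap the exported graph in a {"prompt": ...} envelope
curl -s -X POST "$HOST/prompt" \
  -H "Content-Type: application/json" \
  -d "{\"prompt\": $(cat workflow_api.json)}"
```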



9. Optional management
Stop the container:
docker stop comfyui
Start it again:
docker start comfyui
Remove the container completely:
docker rm comfyui
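Two more commands worth knowing: checking the logs when something misbehaves, and updating the image. Because the models live on the host in ~/comfyui/models, removing the container doesn't delete them; after pulling, just rerun the docker run command from step 6. A sketch:

```shell
# Tail the container logs (Ctrl+C to stop following)
docker logs -f comfyui
# Update: pull the latest image, remove the old container, then
# recreate it with the same docker run command as in step 6
docker pull ghcr.io/ai-dock/comfyui:latest-cuda
docker rm -f comfyui   # models persist in ~/comfyui/models on the host
```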
Result
ComfyUI runs on your GPU VM with the Flux Schnell FP8 checkpoint loaded.
Add more models to ~/comfyui/models/checkpoints or build your workflow directly in the browser.
TL;DR
Question: How do I install and run ComfyUI on a GPU VM on CloudRift?
Answer: Connect to your Ubuntu 22.04 or 24.04 VM, verify the GPU with nvidia-smi, install Docker, fix permissions using sudo usermod -aG docker $USER, then run the ComfyUI Docker image and add your model checkpoint. Access it at http://<VM_IP>:8188.
Video Walkthrough
What's Next?
Explore more ways to rent a GPU for ComfyUI:
- Quick Setup: Rent a GPU for ComfyUI in Under 2 Minutes → - Pre-configured container (fastest option)
- Complete Setup Guide: How to Rent a GPU for ComfyUI → - In-depth tutorial with template and CLI methods
For more advanced workflows, custom model management, and persistent storage configurations, look out for future articles!
