Running Stable Diffusion on AWS: A Step-by-Step Guide

Running Stable Diffusion in the cloud (AWS) has many advantages. You rent the hardware on-demand and only pay for the time you use. You don’t need to worry about maintaining the hardware.

Recently, my Stable Diffusion PC has been down, so I set up a personal cloud server to run AUTOMATIC1111, ComfyUI, and SD Forge. To save storage space, the three programs share the same model files.

This article is based on the notes I took while setting up the server on AWS.

When do you want to use the Cloud?

The benefits of using a personal cloud server to run Stable Diffusion are:

  • You don’t need to buy and maintain hardware. The cloud provider is responsible for the capital cost and maintenance.
  • You can easily rent a more powerful GPU if you need it.
  • You can access the machine anywhere, even when you are traveling.

Alternative

Setting up a cloud server requires some technical expertise and can be time-consuming.

Check out Think Diffusion for a fully managed A1111 and ComfyUI online service. They offer 20% extra credits to our readers (and a small commission to support this site).

Cloud server setup

This article will guide you in setting up a Stable Diffusion cloud server for personal use. You will use Amazon Web Service (AWS) to set up the cloud system.

AWS is Amazon’s cloud computing service. You can rent computer resources such as CPU, GPU, RAM, storage, and public IP addresses on demand. You only pay for the hours you use.

We will use:

  • EC2: Compute instance to host the Stable Diffusion server. You can select the CPU, GPU, and RAM specs. The instance will have options to run A1111, ComfyUI, or SD Forge.
  • Elastic IP: Fixes the IP address of the EC2 instance. Without it, the IP address will change every time you stop and start the instance.
  • S3 bucket: Optional. For storing the AI models more economically.

Notes:

  • You should stop the instance after each session. Otherwise, it will keep charging at the rate of a running instance.
  • The storage is persistent, meaning that all the files and settings stay the same between sessions. It is no different from your local PC.
  • See the summary of commands.

Prerequisite

To follow this tutorial, you should have a basic knowledge of using Linux with a terminal.

Create a new EC2 instance

Log in to AWS. In the top search bar, type “EC2”. Click the EC2 service.

Click Launch instance.

Now, you should be on the Launch Instance page. Use the following settings.

EC2 instance settings

Enter the following settings for the EC2 instance.

Name: stable diffusion

Amazon Machine Image: Ubuntu Server 24.04 LTS

Instance type: g4dn.xlarge

The g4dn.xlarge instance has 4 vCPU, 16 GB RAM, and one T4 GPU with 16GB of VRAM.

If you want more RAM, the next level up is g4dn.2xlarge with 8 vCPU and 32 GB RAM.

G4dn is not the only option. You can pick a different instance type with a different price and speed tradeoff. You can see the pricing on the pages below.

  • G4dn – T4 GPU
  • G6 – L4 GPU (Up to 2x faster)
  • G5 – A10 GPU (Up to 3x faster)

Security key

Next, you will need to create a security key pair. In Key pair (login), click Create new key pair. Give it a name (e.g., aws_sd) and click Create key pair.

The pem key file should be downloaded automatically.

Configure Storage

In Configure Storage, change the storage to 100 GiB.

Click Launch instance.

Set up EC2 instance

 Open the Amazon EC2 console.

Select Instances in the sidebar. You should see your instance initializing. The machine is ready to use when the Status check changes to checks passed.

Select the EC2 instance and click Connect.

In the EC2 Instance Connect, ensure the Username is ubuntu and click Connect.

You should now have access to the machine’s terminal. Alternatively, you can SSH into the machine from your local PC using the information in the SSH client tab.

Update software

The packages on the machine image are likely outdated. Update them by running the following commands.

sudo apt update
sudo apt upgrade

Install NVidia driver

You need to install the NVidia driver before using the GPU. Run the following command.

sudo apt install nvidia-driver-535

You can install any newer driver version (the last three digits) that is available.
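
If you are unsure which driver versions are available, you can list the packages first (an optional check; the version numbers will vary):

apt-cache search --names-only nvidia-driver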

Install Python 3.10

You will need Python 3.10 to run the Stable Diffusion software.

First, add the deadsnakes repository.

sudo add-apt-repository ppa:deadsnakes/ppa

Install Python 3.10.

sudo apt install python3.10

Set Python 3.10 as the default when you type python3.

sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.10 2

Verify that python3 now points to Python 3.10.

python3 --version

Install Python 3.10 venv. You will need it later when setting up AUTOMATIC1111.

sudo apt install python3.10-venv

Restart the instance

You need to restart the EC2 instance to initialize the GPU driver.

Restart the instance by using the AWS interface. Instance state > Reboot instance.

When the reboot is complete, and the instance is ready, start a new terminal.

Confirm that the NVidia driver is working.

nvidia-smi

You should see a table listing the T4 GPU and the driver version.

Now, the GPU is ready to be used.

Elastic IP

The public IP address of the EC2 instance will change whenever the instance is restarted.

An Elastic IP is a fixed public IP address you rent from AWS. It is not strictly necessary, but for a small fee, you get a fixed IP address for your machine.

 Open the Amazon EC2 console.

Select Elastic IPs under Network & Security.

Click Allocate Elastic IP > Allocate.

Select the Elastic IP > Actions > Associate Elastic IP address.

Choose the EC2 instance you just created under Instance.

Click Associate.

Now, your EC2 instance has a persistent IP address.

You can confirm the IP address by going to EC2 > Instances and selecting your instance.

The public IP is listed under Public IPv4 address.
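
As an optional check, you can also confirm the public IP from the instance’s own terminal. This sketch uses AWS’s public IP-echo endpoint and assumes outbound internet access:

curl -s https://checkip.amazonaws.com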

AUTOMATIC1111

Clone repository

Go to the home directory.

cd ~

Run the following command to clone AUTOMATIC1111 to your EC2.

git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui

Set command line arguments

In the stable-diffusion-webui directory, edit the file webui-user.sh. The following example uses nano to edit the file. Feel free to use other editors.

cd stable-diffusion-webui
nano webui-user.sh

Uncomment the line with COMMANDLINE_ARGS and change it to the following.

export COMMANDLINE_ARGS="--listen --opt-sdp-attention --enable-insecure-extension-access"

This enables:

  • Connecting to A1111 from another computer, such as your local PC.
  • Using the faster scaled dot-product (SDP) attention when running models.
  • Allowing you to install extensions when connecting remotely.

Install and run webui

Start webui:

./webui.sh

It will take some time to install. When it is done, you should see the local URL http://0.0.0.0:7860.

You will need to open port 7860 on the EC2 instance to connect to it.

Open port

Go to the instance > Security. Click the link to the security group.

Inbound rules > Edit inbound rules.

Click Add rule. Add a Custom TCP rule for port 7860. Set the source to My IP.

This restricts access to your Stable Diffusion port 7860 to your IP address only.
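
If you prefer the command line, the same rule can be added with the AWS CLI from any machine configured with EC2 permissions. This is only a sketch; sg-0123456789abcdef0 and 203.0.113.10 are placeholders for your security group ID and your public IP.

aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 7860 --cidr 203.0.113.10/32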

Test connection

Now, you should be able to access your AUTOMATIC1111 webui by the elastic IP address and the port 7860.

Go to EC2 > Instances. Click on your instance and note the Public IPv4 address. For example, if the public IPv4 address is 12.3.456.789, you can add “:7860” to the address to access AUTOMATIC1111 from your browser.

http://12.3.456.789:7860

Generate an image to make sure it works.
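
If the page does not load, a quick way to check whether the port is reachable is to request the headers from your local terminal (using the example address above; substitute your own Elastic IP). You should get an HTTP response if A1111 is running and the port is open.

curl -I http://12.3.456.789:7860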

ComfyUI

Install ComfyUI

To install ComfyUI, first go to the home folder.

cd ~

Clone ComfyUI.

git clone https://github.com/comfyanonymous/ComfyUI

Share models with AUTOMATIC1111

Configure ComfyUI to use AUTOMATIC1111’s models.

cd ComfyUI
cp extra_model_paths.yaml.example extra_model_paths.yaml

Edit extra_model_paths.yaml with nano.

nano extra_model_paths.yaml

Change the line.

    base_path: path/to/stable-diffusion-webui/

To:

    base_path: /home/ubuntu/stable-diffusion-webui
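
To confirm the edit took effect, you can print the line you changed (optional, run from the ComfyUI directory):

grep base_path extra_model_paths.yaml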

Run ComfyUI

You can use AUTOMATIC1111’s Python environment so that you don’t need to install all the libraries again. This saves some space.

~/stable-diffusion-webui/venv/bin/python main.py --listen

You should see that it uses port 8188.
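
If you want to confirm that ComfyUI is listening, you can check the port from another terminal on the instance (ss ships with Ubuntu):

ss -tlnp | grep 8188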

Open port

Open the Amazon EC2 console > Instances. Select your Stable Diffusion instance > Security > click the link to the security group.

Click Edit inbound rules. Add a custom TCP rule for port 8188 and make it accessible only from your IP address.

See the previous instructions.

SD Forge

Install Forge

First, go to the home folder.

cd ~

Clone the repository.

git clone https://github.com/lllyasviel/stable-diffusion-webui-forge

Edit webui-user.sh. Uncomment the COMMANDLINE_ARGS line and add the path pointing to A1111. This lets you use A1111’s model files.

export COMMANDLINE_ARGS="--listen --forge-ref-a1111-home /home/ubuntu/stable-diffusion-webui"

Complete the installation and start SD Forge.

./webui.sh

You will find that the InsightFace library is missing.

Quit Forge and install InsightFace.

./venv/bin/python -m pip install insightface

Start Forge

./webui.sh

Storing models in an S3 bucket (optional)

You can store all your models in Elastic Block Storage (the local hard drive of your EC2 instance). It is fast and convenient, but it is not the most cost-effective option.

Alternatively, you can use AWS’s S3 object storage service, which is a few times cheaper. The tradeoff is extra setup steps and longer loading times when switching models.

Note: I would only recommend using an S3 bucket if you need many checkpoint models. Otherwise, a more straightforward method is to increase the storage size to 200 GB, for example.
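
Before deciding, it may help to check how much space your models actually use. A quick check, assuming the default paths in this guide:

df -h /
du -sh ~/stable-diffusion-webui/models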

Create an S3 bucket

Search S3 in the top search bar and select the S3 service.

Click Create Bucket. Enter a unique bucket name. You may need to add some numbers to the end to make it unique.

Click Create Bucket.

Create an IAM role

Open the IAM console.

On the sidebar, click Roles > Create Role.

On the Select trusted entity page, select AWS Service and EC2 under Use Case. Click Next.

On the Add permission page, search and select AmazonS3FullAccess. Click Next.

On the next page, enter the role name, e.g. s3fullaccess. Click Create role.

Now, you have created a role that allows an EC2 instance to access S3 buckets.

Attach the IAM role to the EC2 instance

 Open the Amazon EC2 console.

On the sidebar, select Instances.

Select your Stable Diffusion EC2 instance. Then select Actions > Security > Modify IAM role.

Select the IAM role you just created and click Update IAM role.

Verifying access to S3

In the terminal of your EC2 instance, run the following commands to install the AWS command line interface tool.

sudo apt install unzip
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

Run the following command to verify access to your S3 bucket. Substitute BUCKET_NAME with the unique name of your S3 bucket.

aws s3 ls s3://BUCKET_NAME

You have access to your S3 bucket if the command runs without error (the output is empty for a new bucket). If you change it to a non-existent bucket name, you should see an error.
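
You can also test a round trip by uploading, listing, and deleting a small file (s3test.txt is just a throwaway name for this test):

echo "hello" > s3test.txt
aws s3 cp s3test.txt s3://BUCKET_NAME/
aws s3 ls s3://BUCKET_NAME
aws s3 rm s3://BUCKET_NAME/s3test.txt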

Mounting the S3 bucket

We will use the s3fs tool to mount the S3 bucket.

First, install the s3fs tool.

sudo apt install s3fs

Create the mount point.

sudo mkdir /s3

Finally, mount the S3 bucket to the mount point.

sudo s3fs BUCKET_NAME /s3 -o iam_role=s3fullaccess -o use_cache=/tmp -o allow_other -o uid=1001 -o mp_umask=002 -o multireq_max=5 -o use_path_request_style -o url=https://s3.us-east-1.amazonaws.com -o endpoint=us-east-1

You will need to substitute BUCKET_NAME with your S3 bucket’s name. You will also need to use an iam_role that matches the one you attached. You may need to change the url and endpoint parameters to match your EC2’s region.

The last step needs to be done every time you restart the instance. I simply create a script that runs this command. You can also use /etc/fstab to automate the mount, but this is out of the scope of this tutorial.
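
For example, the remount step can be wrapped in a small helper script. This is a sketch; mount_s3.sh is a name I made up, and you still need to substitute BUCKET_NAME, the role name, and the region:

#!/bin/bash
# mount_s3.sh - remount the S3 bucket after a restart
sudo s3fs BUCKET_NAME /s3 -o iam_role=s3fullaccess -o use_cache=/tmp -o allow_other -o uid=1001 -o mp_umask=002 -o multireq_max=5 -o use_path_request_style -o url=https://s3.us-east-1.amazonaws.com -o endpoint=us-east-1

Make it executable with chmod +x mount_s3.sh, then run ./mount_s3.sh after each restart.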

Moving the A1111 models to the S3 bucket

Now, we are in a position to move all the models to the S3 bucket. The best way is to move the models folder.

Make sure the S3 bucket is mounted to /s3. Then run:

cd ~/stable-diffusion-webui
mv models /s3

These commands move the whole models folder to the S3 bucket’s root location.

Then, create a symbolic link to the new models location.

ln -s /s3/models
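
You can verify that the link points to the S3 mount (optional):

ls -ld ~/stable-diffusion-webui/models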

Summary of usage commands

You would normally stop (but not terminate) the instance after each usage session.

Start the instance

 Open the Amazon EC2 console.

Select your Stable Diffusion instance > Instance state > Start instance.

Then connect to the terminal: Actions > Connect.

Mount S3

If you use an S3 bucket to store models, run the following command (with your modifications) to mount it.

sudo s3fs BUCKET_NAME /s3 -o iam_role=s3fullaccess -o use_cache=/tmp -o allow_other -o uid=1001 -o mp_umask=002 -o multireq_max=5 -o use_path_request_style -o url=https://s3.us-east-1.amazonaws.com -o endpoint=us-east-1

Start AUTOMATIC1111

cd ~/stable-diffusion-webui; ./webui.sh

Start ComfyUI

cd ~/ComfyUI; ~/stable-diffusion-webui/venv/bin/python main.py --listen

Start SD Forge

cd ~/stable-diffusion-webui-forge; ./webui.sh

Stop the instance

When you are done, stop the instance to avoid extra charges.

Open the Amazon EC2 console. Select your Stable Diffusion instance > Instance state > Stop instance.
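
If you have the AWS CLI configured on your local PC with EC2 permissions, you can also stop the instance from a terminal. This is a sketch; i-0123456789abcdef0 is a placeholder for your instance ID.

aws ec2 stop-instances --instance-ids i-0123456789abcdef0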
