
Home Assistant: Easy Automated Backups to S3

Firstly, sorry it’s been a while since I last blogged. It is indeed strange times we find ourselves in with the worldwide pandemic!

During lockdown here in the UK I decided to take a look at the open source home automation software Home Assistant. I started looking at Home Assistant (HA) as a way to bring control of the many different smart devices I have under a single control panel.

Time To Backup Home Assistant

I have been using HA for some time now, and thought it was about time that I took a look at how to back up my config to AWS S3. I’m going to go through what I have done to achieve my goal of backing up to S3 and automating daily and weekly backups.

I’m going to assume that you already have HA installed and have HACS and the Supervisor set up on your system. If you don’t have HACS installed already, I recommend this video from Bearded Tinker, who goes through the process.

There are two things we need to configure the backups. First, we need the Amazon S3 Backup add-on. This isn’t an add-on that comes as part of HASS; it lives in a separate repository. The add-on was created by Greg Rapp and is located within his HASS repo. The second thing we require is the HASS Auto Backup integration, which can be found within HACS.


To add the Amazon S3 Backup add-on to the Supervisor, go to Supervisor -> Add-on Store, click on the three vertical dots in the top right-hand corner, then click Repositories. Copy the repository link below into the box and click Add.

Once added, you should see a new header on the add-ons page called Greg’s Home Assistant Addons. Under that header you will find the Amazon S3 Backup add-on. Click on it and click Install. The configuration is fairly straightforward, and you can see all the options that can be specified. My config looks a little like this:

log_level: info
aws_access_key: xxxxx
aws_secret_access_key: xxxxx
bucket_name: xxxxx
bucket_region: eu-west-2
storage_class: STANDARD
upload_missing_files: true
keep_local_snapshots: 3

You will require an aws_access_key and aws_secret_access_key for an IAM user with permissions to write to your chosen bucket. You can also choose which storage class you want to store your backups in; I went with standard as this is enough for my needs.

S3 Storage Costs

Before we continue, a quick note on cost. S3 storage is ridiculously cheap. Depending on how important your data is, you can select different storage tiers. For example, standard storage is designed for 99.999999999% durability and 99.9% availability.

However, if availability isn’t a priority, you could choose to use One Zone-Infrequent Access. You could also choose S3 Glacier, which is the cheapest form of storage. One thing to keep in mind about Glacier, however, is the retrieval time: retrieving objects stored in Glacier can take anywhere from minutes to hours, and the speed of retrieval depends on how much you’re willing to spend! Also note that you are charged for object retrieval on all storage classes.

AWS also provides a free tier for the first year of your account. You can have 5GB of standard storage free for 12 months. After that, standard prices apply.

Screenshot of S3 free tier allowance

To give you an idea of costs, you can visit the AWS Calculator and input how much storage you expect to use. For my example, I based it on 4GB of storage per month and 1,000 PUT requests.
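Because S3 pricing is simple and linear, you can also sanity-check the calculator with a few lines of arithmetic. This is just a sketch: the per-GB and per-request prices below are my assumptions, roughly approximating S3 Standard pricing in eu-west-2 at the time of writing, so check the AWS Calculator for current figures.

```python
# Rough back-of-the-envelope S3 cost estimate.
# Prices are ASSUMED values approximating eu-west-2 S3 Standard pricing;
# always check the AWS pricing pages for current numbers.
STORAGE_GB = 4
PUT_REQUESTS = 1000

PRICE_PER_GB_MONTH = 0.024  # USD per GB-month (assumed)
PRICE_PER_1K_PUTS = 0.005   # USD per 1,000 PUT requests (assumed)

# Monthly cost = storage charge + request charge
monthly_cost = (STORAGE_GB * PRICE_PER_GB_MONTH
                + (PUT_REQUESTS / 1000) * PRICE_PER_1K_PUTS)
print(f"Estimated monthly cost: ${monthly_cost:.3f}")
```

At these assumed rates, a few gigabytes of snapshots costs pennies per month, which is why I don’t worry much about tier selection for a home setup.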

AWS Cost Calculator for various storage tiers

Setting Up AWS IAM User and Bucket

I decided to set up a separate IAM user for this project and provide only the required permissions (as per best practice). You can do this manually by going through the console; however, I prefer to write anything that I deploy into my AWS account as code.

I chose to write my infrastructure using Terraform. I first defined the provider, in this case AWS, and the region which I want to use; for me that was eu-west-2, or London:

provider "aws" {
  version = "2.33.0"
  region  = "eu-west-2"
}

I then defined my bucket and made sure that it was private:

resource "aws_s3_bucket" "b" {
    bucket = "my-home-assistant-bucket"
    acl    = "private"
}
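As an aside, if you’d rather have S3 itself expire old snapshots (on top of the retention the add-on and Auto Backup already provide), the bucket resource can carry a lifecycle rule. This is an optional sketch; the rule id and the 90-day expiry are my own choices, not something from my original setup:

resource "aws_s3_bucket" "b" {
    bucket = "my-home-assistant-bucket"
    acl    = "private"

    lifecycle_rule {
        id      = "expire-old-snapshots"
        enabled = true

        expiration {
            days = 90
        }
    }
}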

Finally, I specified the IAM user, access key and attached a policy to the user to allow write access to the bucket:

resource "aws_iam_user" "ha-backup-bucket-user" {
    name = "ha-backup-bucket-user"
}

resource "aws_iam_access_key" "ha-backup-bucket-user" {
    user = aws_iam_user.ha-backup-bucket-user.name
}

resource "aws_iam_policy" "ha_bucket_allow" {
    name        = "HASnapshotsS3Allow"
    path        = "/"
    description = "Policy to allow read and write to HA snapshot bucket"
    policy      = <<EOF
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": [
                "${aws_s3_bucket.b.arn}",
                "${aws_s3_bucket.b.arn}/*"
            ]
        }
    ]
}
EOF
}

resource "aws_iam_policy_attachment" "ha_bucket_allow_attach" {
    name       = "ha_bucket_allow_attach"
    users      = [aws_iam_user.ha-backup-bucket-user.name]
    policy_arn = aws_iam_policy.ha_bucket_allow.arn
}
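To pull the generated key pair out of Terraform for the add-on config, you could also declare outputs. This is a sketch; the output names are my own, and marking the secret as sensitive keeps it out of routine CLI output:

output "ha_backup_access_key_id" {
    value = aws_iam_access_key.ha-backup-bucket-user.id
}

output "ha_backup_secret_access_key" {
    value     = aws_iam_access_key.ha-backup-bucket-user.secret
    sensitive = true
}

After terraform apply, terraform output ha_backup_secret_access_key will print the secret when you need to paste it into the add-on configuration.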

Once deployed into my AWS account, I added the access keys, bucket name and region to the config within HA and started the add-on. As I had set the upload_missing_files option to true, all my previous snapshots were uploaded to my bucket.

Automating Backups

To create backups automatically, I installed the HASS Auto Backup integration. Once installed via HACS, I created a new config file and set up the integration with two automations, one daily backup and one weekly:

auto_backup:
  auto_purge: true

automation:
  - alias: Perform Daily Backup
    trigger:
      platform: time
      at: "00:00:00"
    condition:
      condition: time
      weekday:
        - tue
        - wed
        - thu
        - fri
        - sat
        - sun
    action:
      service: auto_backup.snapshot_full
      data:
        name: "DailyBackup: {{ now().strftime('%A, %B %-d, %Y') }}"
        keep_days: 7

  - alias: Perform Weekly Backup
    trigger:
      platform: time
      at: "00:00:00"
    condition:
      condition: time
      weekday:
        - mon
    action:
      service: auto_backup.snapshot_full
      data:
        name: "WeeklyBackup: {{ now().strftime('%A, %B %-d, %Y') }}"
        keep_days: 28

I restarted HA and there we go: automatic backups enabled!

Published in Home Assistant, AWS, Smart Home