
Dokku Postgres Backups

Dusty Candland | dokku, digitalocean, javascript, postgres

I'm using Dokku for my side projects and needed to make sure I had database backups. My Dokku instance is on DigitalOcean, so I wanted the backups to be there too.

Create a new Spaces bucket on DO

I'm using a bucket per project, but you don't need to. Create a bucket and a set of access keys.

Next, set up Dokku

You can do some of this through SSH, but it's easier to just do it all on the server.

proofreader_production = the Dokku Postgres service name
proofreader-backups = the DigitalOcean bucket

We'll set up the authentication, then do a test backup, then schedule daily backups.

dokku postgres:backup-auth proofreader_production XXXXXXXXXXXXXXXXXXXX xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx us s3v4 https://sfo3.digitaloceanspaces.com

# RUN
dokku postgres:backup proofreader_production proofreader-backups

# SCHEDULE 9:14 AM UTC
dokku postgres:backup-schedule proofreader_production "14 9 * * *" proofreader-backups
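The schedule argument is a standard five-field cron expression. As a quick sketch (not part of the post's script), here's how the fields in "14 9 * * *" break down:

```javascript
// Split a cron expression into its five named fields:
// minute, hour, day of month, month, day of week.
function parseCron(expr) {
  const [minute, hour, dayOfMonth, month, dayOfWeek] = expr.trim().split(/\s+/);
  return { minute, hour, dayOfMonth, month, dayOfWeek };
}

// "14 9 * * *" -> minute 14, hour 9, every day: 09:14 UTC daily.
console.log(parseCron("14 9 * * *"));
```

So changing the first two fields moves the backup time; the trailing `* * *` keeps it running every day.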

Dokku Postgres Docs

Setup lifecycle rules to clean up old backups

I don't want to keep backups forever; 30 days seems good. You can't add these rules using the DigitalOcean web UI, but you can with code. Below is the script I'm using, modified from How to delete DigitalOcean Spaces files after X days with lifecycle rules.

I changed it to take the bucket name from the command line and to get the configured rules after.

// https://www.codemzy.com/blog/digitalocean-spaces-lifecycle-rules

const {
  S3Client,
  GetBucketLifecycleConfigurationCommand,
  PutBucketLifecycleConfigurationCommand,
} = require("@aws-sdk/client-s3");

// connect to spaces
const s3 = new S3Client({
  endpoint: "https://sfo3.digitaloceanspaces.com",
  forcePathStyle: false,
  region: "us",
  credentials: {
    accessKeyId: process.env.DO_KEY,
    secretAccessKey: process.env.DO_SECRET,
  },
});

const bucketName = process.argv[2];

// create the lifecycle policy
const putConfigCommand = new PutBucketLifecycleConfigurationCommand({
  Bucket: bucketName,
  LifecycleConfiguration: {
    Rules: [
      {
        ID: "autodelete_rule",
        Expiration: { Days: 30 },
        Status: "Enabled",
        Prefix: "", // Unlike AWS in DO this parameter is required
      },
    ],
  },
});

const getConfigCommand = new GetBucketLifecycleConfigurationCommand({
  Bucket: bucketName,
});

async function run() {
  try {
    console.log(`Enabling lifecycle policy for ${bucketName}`);
    await s3.send(putConfigCommand);
    console.log("Lifecycle policy enabled!");
  } catch (error) {
    console.error(error);
  }

  try {
    console.log(`Getting lifecycle policy for ${bucketName}`);
    const result = await s3.send(getConfigCommand);
    console.log(result);
  } catch (error) {
    console.error(error);
  }
}

run();

Run it with:

DO_KEY=XXXX DO_SECRET=xxxx node index.js your-bucket-name

That's it! Daily Postgres backups from Dokku to DigitalOcean with a rolling 30-day window.
