docs: clean up s3 examples
ricellis committed Jan 26, 2024
1 parent 1e1f085 commit 9660717
Showing 5 changed files with 247 additions and 159 deletions.
24 changes: 21 additions & 3 deletions .secrets.baseline
@@ -3,7 +3,7 @@
"files": "package-lock.json|test/fixtures|^.secrets.baseline$",
"lines": null
},
"generated_at": "2024-01-11T17:39:14Z",
"generated_at": "2024-01-25T12:14:13Z",
"plugins_used": [
{
"name": "AWSKeyDetector"
@@ -95,12 +95,30 @@
"verified_result": null
}
],
"examples/README.md": [
{
"hashed_secret": "43bce7a87dd0e4b8c09b44173613bc95ba77d714",
"is_secret": false,
"is_verified": false,
"line_number": 41,
"type": "Secret Keyword",
"verified_result": null
},
{
"hashed_secret": "745d0b2380e21353d526db47a87158f2065563ee",
"is_secret": false,
"is_verified": false,
"line_number": 72,
"type": "Basic Auth Credentials",
"verified_result": null
}
],
"examples/s3-backup-file.js": [
{
"hashed_secret": "9d4e1e23bd5b727046a9e3b4b7db57bd8d6ee684",
"is_secret": false,
"is_verified": false,
"line_number": 39,
"line_number": 41,
"type": "Basic Auth Credentials",
"verified_result": null
}
@@ -110,7 +128,7 @@
"hashed_secret": "9d4e1e23bd5b727046a9e3b4b7db57bd8d6ee684",
"is_secret": false,
"is_verified": false,
"line_number": 37,
"line_number": 38,
"type": "Basic Auth Credentials",
"verified_result": null
}
84 changes: 78 additions & 6 deletions examples/README.md
@@ -1,12 +1,9 @@
# CouchBackup Examples

This folder contains Node.js scripts which use the `couchbackup` library.
This folder contains example Node.js scripts which use the `couchbackup` library.

Use `npm install ../; npm install` in this folder to install the script
dependencies. This uses the checked out copy of couchbackup to ensure
everything is in sync.

Run a script without arguments to receive help.
These scripts are for inspiration and demonstration.
They are not a supported part of couchbackup and should not be considered production ready.

## Current examples

@@ -17,3 +14,78 @@ Run a script without arguments to receive help.
2. `s3-backup-stream.js` -- backup a database to an S3-API compatible store
by streaming the backup data directly from CouchDB or Cloudant into
an object.

#### Prerequisites

##### Install the dependencies

Use `npm install` in this folder to install the script
dependencies.
Note: this uses the latest release of couchbackup, not the
checked-out version.

##### AWS SDK configuration

The scripts expect AWS ini files (both defaults can be overridden, as shown below):
* shared credentials file `~/.aws/credentials`, or the file set by the `AWS_SHARED_CREDENTIALS_FILE` environment variable
* shared configuration file `~/.aws/config`, or the file set by the `AWS_CONFIG_FILE` environment variable
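
For example, to point the SDK at non-default files when running a script (the paths here are hypothetical, assuming a POSIX shell):

`AWS_SHARED_CREDENTIALS_FILE=/secure/aws_credentials AWS_CONFIG_FILE=/secure/aws_config node s3-backup-file.js`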

###### IBM COS

When using IBM Cloud Object Storage, create a service credential with the `Include HMAC Credential` option enabled.
The `access_key_id` and `secret_access_key` from the `cos_hmac_keys` entry in the generated credential are
the ones required to make an AWS credentials file, e.g.
```ini
[default]
aws_access_key_id=paste access_key_id here
aws_secret_access_key=paste secret_access_key here
```

Run the scripts with the `--s3url` option pointing to your COS instance's S3 endpoint.
The AWS SDK requires a region to initialize, so ensure the config file names one, e.g.
```ini
[default]
region=eu-west-2
```

#### Usage

Run a script without arguments to receive help e.g.

`node s3-backup-file.js`

The source database and destination bucket are required options.
The minimum needed to run the scripts is thus:

`node s3-backup-stream.js -s 'https://dbser:[email protected]/exampledb' -b 'examplebucket'`

The object created in the bucket for the backup file is
named from a prefix (default `couchbackup`), the database name, and a timestamp, e.g.

`couchbackup-exampledb-2024-01-25T09:45:11.730Z`
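
A minimal sketch of the naming scheme (the helper name `backupObjectKey` is hypothetical; the scripts build this key inline):

```js
// Hypothetical helper mirroring how the scripts derive the object key.
function backupObjectKey(prefix, dbName) {
  return `${prefix}-${dbName}-${new Date().toISOString()}`;
}

// backupObjectKey('couchbackup', 'exampledb')
// -> 'couchbackup-exampledb-2024-01-25T09:45:11.730Z' (timestamp varies)
```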

#### Progress and debug

To see detailed progress of the backup and upload, or additional debug information,
use the `DEBUG` environment variable with the label `s3-backup`, e.g.

`DEBUG='s3-backup' node s3-backup-stream.js -s 'https://dbser:[email protected]/exampledb' -b 'couchbackup' --s3url 'https://s3.eu-gb.cloud-object-storage.appdomain.cloud'`

```
s3-backup Creating a new backup of https://host.example/exampledb at couchbackup/couchbackup-exampledb-2024-01-25T09:45:11.730Z... +0ms
s3-backup Setting up S3 upload to couchbackup/couchbackup-exampledb-2024-01-25T09:45:11.730Z +686ms
s3-backup Starting streaming data from https://host.example/exampledb +2ms
s3-backup Couchbackup changes batch: 0 +136ms
s3-backup Fetched batch: 0 Total document revisions written: 15 Time: 0.067 +34ms
s3-backup couchbackup download from https://host.example/exampledb complete; backed up 15 +2ms
s3-backup S3 upload progress: {"loaded":6879,"total":6879,"part":1,"Key":"couchbackup-exampledb-2024-01-25T09:45:11.730Z","Bucket":"couchbackup"} +623ms
s3-backup S3 upload done +1ms
s3-backup Upload succeeded +0ms
s3-backup done. +0ms
```

#### Known issues

The S3 SDK does not appear to apply back-pressure to a Node `stream.Readable`. As such, in environments
where the upload speed to S3 is significantly slower than the speed of downloading from the database
or reading the backup file, the scripts may fail.
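
The `Upload` helper's own buffering can at least be bounded via the `queueSize` and `partSize` options the scripts already use. A minimal sketch, assuming smaller values suit your object size (this caps the helper's in-flight buffer but does not add back-pressure to the source stream):

```js
const { Upload } = require('@aws-sdk/lib-storage');

// Sketch only: client, bucket, key and inputStream are assumed to exist
// as in the example scripts.
async function boundedUpload(client, bucket, key, inputStream) {
  const upload = new Upload({
    client,
    params: { Bucket: bucket, Key: key, Body: inputStream },
    queueSize: 2, // at most 2 parts buffered/in flight at once
    partSize: 1024 * 1024 * 8 // 8 MB parts (the S3 minimum is 5 MB)
  });
  return upload.done();
}
```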
30 changes: 7 additions & 23 deletions examples/package.json
@@ -1,31 +1,15 @@
{
"name": "couchbackup-examples",
"version": "0.0.1",
"version": "0.0.2",
"description": "Examples of using CouchBackup as a library",
"dependencies": {
"aws-sdk": "^2.39.0",
"tmp": "^0.0.31",
"verror": "^1.10.0",
"yargs": "^7.0.2"
"@cloudant/couchbackup": "^2.9.16",
"@aws-sdk/client-s3": "^3.499.0",
"@aws-sdk/credential-providers": "^3.499.0",
"@aws-sdk/lib-storage": "^3.499.0",
"verror": "^1.10.1",
"yargs": "^17.7.2"
},
"devDependencies": {
"eslint": "^6.5.1",
"eslint-plugin-standard": "^3.0.1",
"eslint-plugin-import": "^2.2.0",
"eslint-plugin-node": "^4.2.2",
"eslint-plugin-promise": "^3.5.0",
"eslint-plugin-react": "^7.0.0",
"eslint-config-standard": "^10.2.1",
"eslint-config-semistandard": "^11.0.0",
"jsdoc": "^3.4.3",
"mocha": "^3.2.0",
"cloudant": "^1.7.1",
"uuid": "^3.0.1"
},
"scripts": {
"test": "eslint --ignore-path .gitignore . && mocha"
},
"author": "",
"license": "Apache-2.0"
}

133 changes: 67 additions & 66 deletions examples/s3-backup-file.js
@@ -1,4 +1,4 @@
// Copyright © 2017, 2018 IBM Corp. All rights reserved.
// Copyright © 2017, 2024 IBM Corp. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -20,14 +20,16 @@

'use strict';

const stream = require('stream');
const fs = require('fs');
const url = require('url');
const { createReadStream, createWriteStream, mkdtempSync } = require('node:fs');
const { tmpdir } = require('node:os');
const { join } = require('node:path');
const url = require('node:url');

const AWS = require('aws-sdk');
const couchbackup = require('@cloudant/couchbackup');
const { backup } = require('@cloudant/couchbackup');
const { fromIni } = require('@aws-sdk/credential-providers');
const { Upload } = require('@aws-sdk/lib-storage');
const { HeadBucketCommand, S3Client } = require('@aws-sdk/client-s3');
const debug = require('debug')('s3-backup');
const tmp = require('tmp');
const VError = require('verror').VError;

/*
@@ -45,42 +47,42 @@ function main() {
awsprofile: { nargs: 1, describe: 'The profile section to use in the ~/.aws/credentials file', default: 'default' }
})
.help('h').alias('h', 'help')
.epilog('Copyright (C) IBM 2017')
.epilog('Copyright (C) IBM 2017, 2024')
.argv;

const sourceUrl = argv.source;
const backupBucket = argv.bucket;
const backupName = new url.URL(sourceUrl).pathname.split('/').filter(function(x) { return x; }).join('-');
const backupKeyPrefix = `${argv.prefix}-${backupName}`;

const backupKey = `${backupKeyPrefix}-${new Date().toISOString()}`;
const backupTmpFile = tmp.fileSync();
const backupDate = Date.now();
const isoDate = new Date(backupDate).toISOString();
const backupKey = `${backupKeyPrefix}-${isoDate}`;
const backupTmpFile = join(mkdtempSync(join(tmpdir(), 'couchbackup-s3-backup-')), `${backupDate}`);

const s3Endpoint = argv.s3url;
const awsProfile = argv.awsprofile;

// Creds are from ~/.aws/credentials, environment etc. (see S3 docs).
const awsOpts = {
signatureVersion: 'v4',
credentials: new AWS.SharedIniFileCredentials({ profile: awsProfile })
credentials: fromIni({ profile: awsProfile })
};
if (typeof s3Endpoint !== 'undefined') {
awsOpts.endpoint = new AWS.Endpoint(s3Endpoint);
awsOpts.endpoint = s3Endpoint;
}
const s3 = new AWS.S3(awsOpts);
const s3 = new S3Client(awsOpts);

debug(`Creating a new backup of ${s(sourceUrl)} at ${backupBucket}/${backupKey}...`);
bucketAccessible(s3, backupBucket)
.then(() => {
return createBackupFile(sourceUrl, backupTmpFile.name);
return createBackupFile(sourceUrl, backupTmpFile);
})
.then(() => {
return uploadNewBackup(s3, backupTmpFile.name, backupBucket, backupKey);
return uploadNewBackup(s3, backupTmpFile, backupBucket, backupKey);
})
.then(() => {
debug('Backup successful!');
backupTmpFile.removeCallback();
debug('done.');
})
.catch((reason) => {
debug(`Error: ${reason}`);
@@ -96,18 +98,9 @@ function main() {
* @returns Promise
*/
function bucketAccessible(s3, bucketName) {
return new Promise(function(resolve, reject) {
const params = {
Bucket: bucketName
};
s3.headBucket(params, function(err, data) {
if (err) {
reject(new VError(err, 'S3 bucket not accessible'));
} else {
resolve();
}
});
});
return s3.send(new HeadBucketCommand({
Bucket: bucketName
})).catch(e => { throw new VError(e, 'S3 bucket not accessible'); });
}

/**
@@ -119,18 +112,27 @@
*/
function createBackupFile(sourceUrl, backupTmpFilePath) {
return new Promise((resolve, reject) => {
couchbackup.backup(
backup(
sourceUrl,
fs.createWriteStream(backupTmpFilePath),
(err) => {
createWriteStream(backupTmpFilePath),
(err, done) => {
if (err) {
return reject(new VError(err, 'CouchBackup process failed'));
reject(err);
} else {
resolve(done);
}
debug('couchbackup to file done; uploading to S3');
resolve('creating backup file complete');
}
);
});
)
.on('changes', batch => debug('Couchbackup changes batch: ', batch))
.on('written', progress => debug('Fetched batch:', progress.batch, 'Total document revisions written:', progress.total, 'Time:', progress.time));
})
.then((done) => {
debug(`couchbackup to file done; backed up ${done.total}`);
debug('Ready to upload to S3');
})
.catch((err) => {
throw new VError(err, 'CouchBackup process failed');
});
}

/**
@@ -143,38 +145,37 @@ function createBackupFile(sourceUrl, backupTmpFilePath) {
* @returns Promise
*/
function uploadNewBackup(s3, backupTmpFilePath, bucket, key) {
return new Promise((resolve, reject) => {
debug(`Uploading from ${backupTmpFilePath} to ${bucket}/${key}`);

function uploadFromStream(s3, bucket, key) {
const pass = new stream.PassThrough();

const params = {
debug(`Uploading from ${backupTmpFilePath} to ${bucket}/${key}`);
const inputStream = createReadStream(backupTmpFilePath);
try {
const upload = new Upload({
client: s3,
params: {
Bucket: bucket,
Key: key,
Body: pass
};
s3.upload(params, function(err, data) {
debug('S3 upload done');
if (err) {
debug(err);
reject(new VError(err, 'Upload failed'));
return;
}
Body: inputStream
},
queueSize: 5, // allow 5 parts at a time
partSize: 1024 * 1024 * 64 // 64 MB part size
});
upload.on('httpUploadProgress', (progress) => {
debug(`S3 upload progress: ${JSON.stringify(progress)}`);
});
// Return a promise for the completed or aborted upload
return upload.done().finally(() => {
debug('S3 upload done');
})
.then(() => {
debug('Upload succeeded');
debug(data);
resolve();
}).httpUploadProgress = (progress) => {
debug(`S3 upload progress: ${progress}`);
};

return pass;
}

const inputStream = fs.createReadStream(backupTmpFilePath);
const s3Stream = uploadFromStream(s3, bucket, key);
inputStream.pipe(s3Stream);
});
})
.catch(err => {
debug(err);
throw new VError(err, 'Upload failed');
});
} catch (err) {
debug(err);
return Promise.reject(new VError(err, 'Upload could not start'));
}
}

/**