Partitions #453

Closed
wants to merge 54 commits into from

Changes from all commits (54 commits):

9485558
crontab: import from production, except with changes
rjbs Apr 5, 2024
24a1a97
services: add systemd service definition for paused
rjbs Apr 5, 2024
31fdf9f
services: add pause-web service
rjbs Apr 5, 2024
887a978
bootstrap/selfconfig: used during "unpause" style bootstrap
rjbs Apr 5, 2024
ff16654
cron/recentfile-aggregate.sh: update PATH
rjbs Apr 6, 2024
f1e0e6e
crontab: run pause jobs as pause user
rjbs Apr 6, 2024
9dbd29d
cron: remove cleanup-apachecores.pl
rjbs Apr 6, 2024
1e06426
cron programs: use plenv-based perl shebang
rjbs Apr 6, 2024
ead472c
cron: remove assert-paused-running.zsh
rjbs Apr 6, 2024
6a0eecf
cron: remove "perl" from programs with shebang and +x
rjbs Apr 6, 2024
5d6396e
bootstrap: switch to perl
rjbs Apr 6, 2024
8cfddc4
cron/cron-p6daily.pl: chmod +x
rjbs Apr 6, 2024
d850b80
crontab: fix env var typo: ROOT used instead of REPO
rjbs Apr 6, 2024
91a4327
bootstrap config: set FTPPUB config option
rjbs Apr 6, 2024
8bd5829
cron/update-checksums.pl: use ~pause/run for run files
rjbs Apr 6, 2024
bdb9267
indexscripts.pl: adjust shebang
rjbs Apr 6, 2024
a41f390
bootstrap/import-mod-db: import production data
rjbs Apr 6, 2024
9f92110
cron/cron-daily.pl: put tmpfile in /tmp
rjbs Apr 6, 2024
1146736
cron/recentfile-aggregate: convert to perl, fix paths
rjbs Apr 6, 2024
cae75de
bootstrap: set up INCOMING_LOC
rjbs Apr 6, 2024
d7ae851
bootstrap: set CHECKSUMS_SIGNING_ARGS to placeholder
rjbs Apr 6, 2024
5b53975
crontab: log mldistwatch to ~pause/log for now
rjbs Apr 6, 2024
8b000b8
bootstrap: fix a missing ; in PAUSE config
rjbs Apr 19, 2024
ccbb6aa
bootstrap: chdir to PAUSE-git before git init
rjbs Apr 19, 2024
c4d844e
bootstrap: import database faster
rjbs Apr 19, 2024
fcb7ccb
bootstrap: compile with more processes
rjbs Apr 19, 2024
51c1a34
bootstrap: do not install man pages for plenv perl
rjbs Apr 19, 2024
95acba9
bootstrap: provide initial branch name for PAUSE-git
rjbs Apr 19, 2024
d43d8e0
bootstrap: learn to use prepackaged plenv
rjbs Apr 19, 2024
73cd579
Config: set a AUTHEN_BACKUP_DIR at least for now
rjbs Apr 25, 2024
d97104f
mysql-dump.pl: db is not replicated, do not act as if it is
rjbs Apr 25, 2024
1cdc17c
cron/rm_stale_links: never remove critical authors/id dir
rjbs Apr 25, 2024
0c6ff7c
cron-p6daily.pl: put a domain on From address
rjbs Apr 25, 2024
5803d2c
bootstrap: also set PAUSE_PUBLIC_DATA
rjbs Apr 25, 2024
2c87cb2
cron/cron-daily.pl: put a domain on From address
rjbs Apr 25, 2024
b6506f0
bootstrap/import-mod-db: stop services while mirroring
rjbs Apr 25, 2024
88f1dfb
recentfile-aggregate: use FindBin to find library dir
rjbs Apr 25, 2024
ea8b2d7
cron/make-mirror-yaml: get rid of it
rjbs Apr 25, 2024
51c1b9a
bootstrap: put TMP back in ~pause/tmp like production
rjbs Apr 25, 2024
3ed1588
mldistwatch: respect PAUSE.CLOSED file
rjbs Apr 25, 2024
9012812
bootstrap/import-mod-db: stop indexing during run
rjbs Apr 25, 2024
bb9d2e4
mldistwatch: initialize logger with its own name
rjbs Apr 25, 2024
8c95977
PAUSE::Logger: give the default logger a default name
rjbs Apr 25, 2024
466da0b
bootstrap: set up a GPG key to sign with
rjbs Apr 25, 2024
cd3d64b
bootstrap: get database passwords from command line
rjbs Apr 25, 2024
68d7967
bootstrap: fix how pause user is put into config
rjbs Apr 25, 2024
9bf6e85
bootstrap/import-mod-db: import both database dumps
rjbs Apr 25, 2024
fc40057
bootstrap/import-pause-data: rename, program is no longer just db
rjbs Apr 25, 2024
5cd10aa
bootstrap: consolidate old "boxo" program into pause.git
rjbs Apr 25, 2024
249266b
bootstrap: add an --enable-mail option to enable mail
wolfsage Apr 26, 2024
34026ea
PAUSE.pm: eliminate $IS_PAUSE_US, there is One Pause
rjbs Apr 26, 2024
403e2ef
eliminate more hostname-based configuration
rjbs Apr 26, 2024
750a0d6
bootstrap: Override CRONPATH to the new location in Config
wolfsage Apr 26, 2024
7cf3942
Add partitioning and mount config.
rspier Apr 26, 2024
2 changes: 1 addition & 1 deletion bin/indexscripts.pl
@@ -1,4 +1,4 @@
-#!/usr/local/bin/perl
+#!/home/pause/.plenv/shims/perl
 # Build the scripts index for PAUSE
 # Original author: KSTAR
 # Last modified: $Date: 2003/12/13 05:52:51 $
41 changes: 41 additions & 0 deletions bootstrap/import-pause-data
@@ -0,0 +1,41 @@
#!/bin/bash
if [ "$UID" != "0" ]; then
echo "import-pause-data is meant to be run as root" >&2
exit 1;
fi

if [ ! -e "moddump.current.bz2" -o ! -e "authen_pausedump.current.bz2" ]; then
# The code used to do this:
# rsync -vaP pause.perl.org::pausedata/moddump.current.bz2 .
#
# ...but this only gets the moddump, and really we are going to need authen
# also.
echo "both moddump.current.bz2 and authen_pausedump.current.bz2 must be in cwd" >&2
exit 1;
fi
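
# Creating /etc/PAUSE.CLOSED keeps mldistwatch from indexing while the import
# runs (see the "mldistwatch: respect PAUSE.CLOSED file" commit above); the
# file is removed again at the end of this script.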

touch /etc/PAUSE.CLOSED

systemctl stop paused

sudo -u pause rsync --progress -av pause.perl.org::PAUSE/ /home/pause/pub/PAUSE/
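
# Relax InnoDB durability while bulk-loading the dumps: with
# innodb_flush_log_at_trx_commit=2 the log is flushed to disk roughly once per
# second instead of at every commit. It is set back to 1 below.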

echo 'SET GLOBAL innodb_flush_log_at_trx_commit=2' | mysql

bunzip2 moddump.current.bz2
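
# The production dumps may carry CHANGE MASTER statements from the replication
# setup; strip them so this standalone server does not try to configure itself
# as a replica.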

pv moddump.current | \
perl -ple 's/^CHANGE MASTER.*//' | \
sudo mysql mod

bunzip2 authen_pausedump.current.bz2

pv authen_pausedump.current | \
perl -ple 's/^CHANGE MASTER.*//' | \
sudo mysql authen

echo 'SET GLOBAL innodb_flush_log_at_trx_commit=1' | mysql

systemctl start paused

rm /etc/PAUSE.CLOSED
315 changes: 315 additions & 0 deletions bootstrap/lib/Dobby/Client.pm
@@ -0,0 +1,315 @@
use v5.32.0;
use warnings;

package Dobby::Client;

use experimental 'signatures';
use parent 'IO::Async::Notifier';

use Carp ();
use Future::AsyncAwait;
use Future::Utils qw(repeat);
use IO::Async::Loop;
use JSON::MaybeXS;
use Net::Async::HTTP;

sub configure ($self, %param) {
my @missing;

KEY: for my $key (qw( bearer_token )) {
unless (defined $param{$key}) {
push @missing, $key;
next KEY;
}

$self->{"__dobby_$key"} = $param{$key};
}

if (@missing) {
Carp::confess("missing required Dobby::Client parameters: @missing");
}

return;
}

sub api_base {
return 'https://api.digitalocean.com/v2';
}

sub bearer_token { $_[0]{__dobby_bearer_token} }

sub http ($self) {
return $self->{__dobby_http} //= do {
my $http = Net::Async::HTTP->new(
user_agent => 'Dobby/0',
);

$self->loop->add($http);

$http;
};
}

async sub json_get ($self, $path) {
my $res = await $self->http->do_request(
method => 'GET',
uri => $self->api_base . $path,
headers => {
'Authorization' => "Bearer " . $self->bearer_token,
},
);

unless ($res->is_success) {
die "error getting $path at DigitalOcean: " . $res->as_string;
}

my $json = $res->decoded_content(charset => undef);
decode_json($json);
}

async sub json_get_pages_of ($self, $path, $key) {
my $url = $self->api_base . $path;

my @items;

while ($url) {
my $res = await $self->http->do_request(
method => 'GET',
uri => $url,
headers => {
'Authorization' => "Bearer " . $self->bearer_token,
},
);

unless ($res->is_success) {
die "error getting $path at DigitalOcean: " . $res->as_string;
}

my $json = $res->decoded_content(charset => undef);
my $data = decode_json($json);

die "no entry for $key in returned page"
unless exists $data->{$key};

push @items, $data->{$key}->@*;
$url = $data->{links}{pages}{next};
}

return \@items;
}

async sub _json_req_with_body ($self, $method, $path, $payload) {
my $res = await $self->http->do_request(
method => $method,
uri => $self->api_base . $path,
headers => {
'Authorization' => "Bearer " . $self->bearer_token,
},

content_type => 'application/json',
content => encode_json($payload),
);

unless ($res->is_success) {
die "error making $method to $path at DigitalOcean: " . $res->as_string;
}

my $json = $res->decoded_content(charset => undef);
decode_json($json);
}

async sub json_post ($self, $path, $payload) {
await $self->_json_req_with_body('POST', $path, $payload);
}

async sub json_put ($self, $path, $payload) {
await $self->_json_req_with_body('PUT', $path, $payload);
}

async sub delete_url ($self, $path) {
my $res = await $self->http->do_request(
method => 'DELETE',
uri => $self->api_base . $path,
headers => {
'Authorization' => "Bearer " . $self->bearer_token,
},
);

unless ($res->is_success) {
die "error deleting resource at $path in DigitalOcean: " . $res->as_string;
}

return;
}

async sub create_droplet ($self, $arg) {
state @required_keys = qw( name region size tags image ssh_keys );

my @missing;
KEY: for my $key (@required_keys) {
unless (defined $arg->{$key}) {
push @missing, $key;
next KEY;
}
}

if (@missing) {
Carp::confess("missing required Dobby::Client parameters: @missing");
}

my $create_res = await $self->json_post(
"/droplets",
{
$arg->%{ @required_keys },
},
);

my $droplet = $create_res->{droplet};

unless ($droplet) {
Carp::confess("Error creating Droplet.");
}

my $action_id = $create_res->{links}{actions}[0]{id};

unless (defined $action_id) {
Carp::confess(
"no action id from droplet action: " . encode_json($create_res)
);
}

my $waited = await $self->_do_action_status_f("/actions/$action_id");

return $droplet;
}

async sub take_droplet_action ($self, $droplet_id, $action, $payload = {}) {
my $action_res = await $self->json_post("/droplets/$droplet_id/actions", {
%$payload,
type => $action,
});

my $action_id = $action_res->{action}{id};

unless (defined $action_id) {
Carp::confess(
"no action id from droplet action: " . encode_json($action_res)
);
}

await $self->_do_action_status_f("/droplets/$droplet_id/actions/$action_id");

return;
}

async sub destroy_droplet ($self, $droplet_id) {
my $delete_res = await $self->delete_url("/droplets/$droplet_id");
return;
}

async sub _do_action_status_f ($self, $action_url) {
TRY: while (1) {
my $action = await $self->json_get($action_url);
my $status = $action->{action}{status};

# ugh, DO is now sometimes returning empty string in the status field
# -- michael, 2021-04-16
$status = 'completed' if ! $status && $action->{action}{completed_at};

if ($status eq 'in-progress') {
await $self->loop->delay_future(after => 5);
next TRY;
}

if ($status eq 'completed') {
return;
}

if ($status eq 'errored') {
Carp::confess("action $action_url failed: " . encode_json($action->{action}));
}

Carp::confess("action $action_url in unknown state: $status");
}
}

async sub _get_droplets ($self, $arg = {}) {
my $path = '/droplets?per_page=200';
$path .= "&tag_name=$arg->{tag}" if $arg->{tag};

my $droplets_data = await $self->json_get($path);

# TODO Obviously, this should lazily fetch etc.
if ($droplets_data->{links}{pages}{forward_links}) {
Carp::cluck("Single-page fetch did not find all droplets!");
}

unless ($droplets_data->{droplets}) {
Carp::cluck(
"getting /droplets didn't supply droplets: " . encode_json($droplets_data)
);
}

return $droplets_data->{droplets}->@*;
}

async sub get_all_droplets ($self) {
await $self->_get_droplets;
}

async sub get_droplets_with_tag ($self, $tag) {
await $self->_get_droplets({ tag => $tag });
}

async sub add_droplet_to_project ($self, $droplet_id, $project_id) {
my $path = "/projects/$project_id/resources";

await $self->json_post($path, {
resources => [ "do:droplet:$droplet_id" ],
});
}

async sub get_all_domain_records_for_domain ($self, $domain) {
my $path = '/domains/' . $domain . '/records';

# TODO Obviously, this should lazily fetch etc.
my $record_res = await $self->json_get("$path?per_page=200");
return unless $record_res->{domain_records};
return $record_res->{domain_records}->@*;
}

async sub remove_domain_records_for_ip ($self, $domain, $ip) {
my $path = '/domains/' . $domain . '/records';

my @records = await $self->get_all_domain_records_for_domain($domain);
my @to_delete = grep {; $_->{data} eq $ip } @records;
my @deletions = map {; $self->delete_url("$path/$_->{id}") } @to_delete;

return await Future->wait_all(@deletions);
}

async sub point_domain_record_at_ip ($self, $domain, $name, $ip) {
my $path = '/domains/' . $domain . '/records';

my @records = await $self->get_all_domain_records_for_domain($domain);

my (@existing) = grep {; $_->{name} eq $name && $_->{type} eq 'A' } @records;

if (@existing) {
my @to_update = map {; $self->json_put("$path/$_->{id}", { data => $ip }) }
@existing;
await Future->wait_all(@to_update);
return;
}

await $self->json_post($path, {
type => 'A',
name => $name,
data => $ip,
ttl => 30,
});

return;
}

1;
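
For orientation, here is a minimal sketch of how this client might be driven from a one-off script, assuming a DigitalOcean API token is available in an environment variable. The token variable name, droplet slugs, and SSH key ID below are illustrative placeholders, not values from this PR.

#!/home/pause/.plenv/shims/perl
use v5.32.0;
use warnings;

use IO::Async::Loop;
use lib 'bootstrap/lib';
use Dobby::Client;

my $loop  = IO::Async::Loop->new;
my $dobby = Dobby::Client->new(bearer_token => $ENV{DO_TOKEN});  # DO_TOKEN is a placeholder name
$loop->add($dobby);  # the client needs a loop before it can build its HTTP object

# create_droplet requires name, region, size, tags, image, and ssh_keys,
# and resolves once the creation action reports "completed".
my $future = $dobby->create_droplet({
  name     => 'pause-test',
  region   => 'nyc3',            # illustrative slugs, not taken from this PR
  size     => 's-4vcpu-8gb',
  image    => 'debian-12-x64',
  tags     => [ 'pause' ],
  ssh_keys => [ 123456 ],        # placeholder SSH key ID
});

$loop->await($future);
my $droplet = $future->get;
say "created droplet $droplet->{id} ($droplet->{name})";

The other async methods (take_droplet_action, point_domain_record_at_ip, and so on) follow the same pattern: call the method to get a future, then let the loop resolve it.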