Added debug_import class for federated reporting import debugging
Enables debug logging during the attach_feeder_schema() plpgsql function, which is
where many import issues occur.

Ticket: ENT-10896
Changelog: title
craigcomstock committed Nov 20, 2023
1 parent 4637203 commit 784766c
Showing 4 changed files with 28 additions and 1 deletion.
18 changes: 18 additions & 0 deletions MPF.md
@@ -2011,6 +2011,24 @@ config when it notices a change in *policy*.
**History**: Added in 3.11.

### Federated Reporting
#### Debug import process

To get detailed logs about import failures, define the class `default:cfengine_mp_fr_debug_import` on the _superhub_.

For example, to define this class via Augments:

```json
{
"classes": {
"cfengine_mp_fr_debug_import": [ "any::" ]
}
}
```

**History:**

* Added in CFEngine 3.23.0
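
For a quick one-off test, the same class can also be set for a single agent run instead of via Augments. This is a hedged sketch, not part of this commit or of the MPF documentation, using standard cf-agent options:

```sh
# Define the class only for this run; the Augments entry above is the
# persistent way to enable debug imports.
cf-agent -KI --define cfengine_mp_fr_debug_import
```

Because the class is transient here, the next scheduled agent run without it renders the Federated Reporting config with debugging turned off again.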

#### Enable Federated Reporting Distributed Cleanup

Hosts that report to multiple feeders result in duplicate entries and other issues. Distributed cleanup helps to deal with this condition.
3 changes: 3 additions & 0 deletions cfe_internal/enterprise/federation/federation.cf
@@ -458,6 +458,8 @@ bundle agent federation_manage_files
"workdir" data => parsejson('{"workdir":"$(sys.workdir)"}');
"handle_duplicates_value" string => ifelse("default:cfengine_mp_fr_handle_duplicate_hostkeys", "yes", "no");
"handle_duplicates" data => parsejson('{"handle_duplicates":"$(handle_duplicates_value)"}');
"debug_import_value" string => ifelse("default:cfengine_mp_fr_debug_import", "yes", "no");
"debug_import" data => parsejson('{"debug_import":"$(debug_import_value)"}');

files:
enterprise_edition.(policy_server|am_policy_hub)::
@@ -521,6 +523,7 @@ bundle agent federation_manage_files
@(feeder_username),
@(feeder),
parsejson('{"superhub_hostkeys": "$(superhub_hostkeys)"}'),
@(debug_import),
@(this_hostkey),
@(cf_version),
@(handle_duplicates),
2 changes: 2 additions & 0 deletions templates/federated_reporting/config.sh.mustache
@@ -83,6 +83,8 @@ CFE_FR_DB_USER="{{db_user}}"
CFE_FR_DB_USER="${CFE_FR_DB_USER:-cfpostgres}"
CFE_FR_HANDLE_DUPLICATES="{{handle_duplicates}}" # default is no (don't handle duplicates as it adds to time to import)
CFE_FR_HANDLE_DUPLICATES="${CFE_FR_HANDLE_DUPLICATES:-no}"
CFE_FR_DEBUG_IMPORT="{{debug_import}}" # default is no (don't run imports with debug level logging)
CFE_FR_DEBUG_IMPORT="${CFE_FR_DEBUG_IMPORT:-no}"

# distributed_cleanup dir
CFE_FR_DISTRIBUTED_CLEANUP_DIR="{{distributed_cleanup_dir}}"
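For orientation (not part of the diff): with `default:cfengine_mp_fr_debug_import` defined on the superhub, the policy above passes `debug_import: yes` into this template, so the rendered `config.sh` would carry roughly the following lines; the shell default expansion on the second line is left untouched by mustache.

```sh
# Sketch of the rendered config.sh lines when the debug class is defined.
CFE_FR_DEBUG_IMPORT="yes"
CFE_FR_DEBUG_IMPORT="${CFE_FR_DEBUG_IMPORT:-no}"
```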
6 changes: 5 additions & 1 deletion templates/federated_reporting/import.sh
@@ -126,7 +126,11 @@ for file in $dump_files; do
if [ ! -f "${file}.failed" ]; then
hostkey=$(basename "$file" | cut -d. -f1)
logfile="$WORKDIR"/outputs/"$hostkey"-schema-attach-$(date +%F-%T)-failure.log
"$CFE_BIN_DIR"/psql -U $CFE_FR_DB_USER -d cfdb --set "ON_ERROR_STOP=1" \
debug_import_arg=""
if [ "${CFE_FR_DEBUG_IMPORT}" = "yes" ]; then
debug_import_arg=' -c "SET client_min_messages TO DEBUG5" '
fi
"$CFE_BIN_DIR"/psql -U $CFE_FR_DB_USER -d cfdb --set "ON_ERROR_STOP=1" "$debug_import_arg" \
-c "SET SCHEMA 'public'; SELECT attach_feeder_schema('$hostkey', ARRAY[$table_whitelist]);" \
> "$logfile" 2>&1 || failed=1
if [ "$failed" = "0" ]; then
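If a failed import needs to be inspected by hand with the same verbosity, the statement the script issues can be run directly against `cfdb`. This is a sketch only, not part of the commit: `<hostkey>` and `<table_whitelist>` are placeholders, and `/var/cfengine/bin/psql` plus the `cfpostgres` user mirror the defaults in `config.sh.mustache` above.

```sh
# Manual reproduction sketch (placeholders: <hostkey>, <table_whitelist>).
# SET client_min_messages persists for the session, so the second -c runs
# attach_feeder_schema() with DEBUG5 logging, mirroring what import.sh does.
/var/cfengine/bin/psql -U cfpostgres -d cfdb --set "ON_ERROR_STOP=1" \
  -c "SET client_min_messages TO DEBUG5" \
  -c "SET SCHEMA 'public'; SELECT attach_feeder_schema('<hostkey>', ARRAY[<table_whitelist>]);" \
  > /tmp/<hostkey>-schema-attach-debug.log 2>&1
```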
