Updating jedi log data reader (JCSDA-internal#182)
## Description
We attempted to produce diagnostics from JEDI logs we had generated and ran into
several issues with the associated reader. The modifications below allow the
JEDI log diagnostics to be generated successfully.

List of changes:
- Added `residual_norm` to the list of variables to be built
- Clipped `var_array` to `total_iter` entries when `len(var_array) > total_iter`
- Kept a list of the variables actually saved in the convergence dataset, for
  use in the normalization logic later on
- Added a test config and an example log to eva's test cases
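For context, the reader drives extraction with parallel lists (`var_names`, `var_search_criteria`, `var_split`, `var_position`, `var_dtype`): each log line is matched against a search string, split on a delimiter, and the token at a given position is cast to the target dtype. A minimal sketch of that pattern (the helper name and the sample log line are illustrative, not the actual jedi_log.py API):

```python
import numpy as np

def extract_value(line, search_criteria, split_char, position, dtype):
    """Return the parsed value if the line matches the search string, else None.

    Hypothetical helper mirroring the search/split/cast pattern that
    jedi_log.py encodes in its parallel lists.
    """
    if search_criteria not in line:
        return None
    token = line.split(split_char)[position]
    return np.dtype(dtype).type(token)  # e.g. cast ' 0.0125' to float32

# A made-up minimizer log line for illustration:
line = 'Residual norm ( 3) = 0.0125'
value = extract_value(line, 'Residual norm (', '=', 1, 'float32')
```

With these settings, splitting on `=` and taking position 1 yields the numeric value that ends up in the `residual_norm` array.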
asewnath authored Mar 13, 2024
1 parent 455c377 commit eac6d99
Showing 3 changed files with 8,743 additions and 2 deletions.
19 changes: 17 additions & 2 deletions src/eva/data/jedi_log.py
@@ -237,6 +237,13 @@ def parse_convergence(self):
         var_position.append(1)
         var_dtype.append('float32')
 
+        # Residual norm
+        var_names.append('residual_norm')
+        var_search_criteria.append('Residual norm (')
+        var_split.append('=')
+        var_position.append(1)
+        var_dtype.append('float32')
+
         # Norm reduction
         var_names.append('norm_reduction')
         var_search_criteria.append('Norm reduction (')
@@ -278,6 +285,7 @@ def parse_convergence(self):
         # Concatenate chunks to simplify search algorithm
         min_and_j_chunks = minimizer_chunks + j_chunks
 
+        ds_vars = []
         for var_ind, var in enumerate(var_names):
             var_array = []
             for min_and_j_chunk in min_and_j_chunks:
@@ -291,7 +299,12 @@
             if var_array:
                 gn = f'convergence::{var_names[var_ind]}' # group::variable name
                 convergence_ds[gn] = xr.DataArray(np.zeros(total_iter, dtype=var_dtype[var_ind]))
+
+                # If var is greater than total_iter, clip var array
+                if len(var_array) > total_iter:
+                    var_array = var_array[0:total_iter]
                 convergence_ds[gn].data[:] = var_array
+                ds_vars.append(var)
 
         # Create special case variables
 
@@ -316,10 +329,12 @@ def parse_convergence(self):
 
         # Normalized versions of data
         # ---------------------------
-        normalize_var_names = ['gradient_reduction', 'norm_reduction', 'j', 'jb', 'jojc']
+        normalize_var_names = ['gradient_reduction', 'residual_norm',
+                               'norm_reduction', 'j', 'jb', 'jojc']
 
         for normalize_var_name in normalize_var_names:
-            if normalize_var_name in var_names:
+            if (normalize_var_name in var_names) and (normalize_var_name in ds_vars):
+
                 # Index in lists for the variable being normalized
                 var_ind = var_names.index(normalize_var_name)
 
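The allocate-clip-store-record logic in the hunks above can be sketched outside of xarray. Here `store_convergence_variable` is a hypothetical helper, and a plain dict stands in for the `convergence_ds` xarray Dataset used in jedi_log.py:

```python
import numpy as np

def store_convergence_variable(convergence_ds, name, var_array, total_iter,
                               ds_vars, dtype='float32'):
    """Allocate a fixed-length array, clip surplus values, store, and record.

    All names here are illustrative; `convergence_ds` is a dict stand-in for
    the xarray Dataset the real reader fills.
    """
    gn = f'convergence::{name}'  # group::variable name, as in the diff
    arr = np.zeros(total_iter, dtype=dtype)
    # Some logs report more values than total iterations, so clip the
    # parsed list down to the expected length before storing.
    if len(var_array) > total_iter:
        var_array = var_array[0:total_iter]
    arr[:len(var_array)] = var_array
    convergence_ds[gn] = arr
    ds_vars.append(name)  # remember which variables actually made it in
```

The `ds_vars` list is what the normalization loop later checks, so variables that never appeared in the parsed log are skipped instead of raising a KeyError.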
57 changes: 57 additions & 0 deletions src/eva/tests/config/testJediLog2.yaml
@@ -0,0 +1,57 @@
datasets:

- type: JediLog
  collection_name: jedi_log_test
  jedi_log_to_parse: ${data_input_path}/jedi_variational_log.txt
  data_to_parse:
    convergence: true

graphics:

  plotting_backend: Emcpy
  figure_list:

  - figure:
      layout: [3,1]
      figure size: [12,10]
      title: 'Residual norm and Norm Reduction Plots'
      output name: jedi_log/convergence/norm_gradient_reduction.png
    plots:
    - add_xlabel: 'Total inner iteration number'
      add_ylabel: 'Residual norm'
      layers:
      - type: LinePlot
        x:
          variable: jedi_log_test::convergence::total_iteration
        y:
          variable: jedi_log_test::convergence::residual_norm
        color: 'black'

    - add_xlabel: 'Total inner iteration number'
      add_ylabel: 'Norm reduction'
      layers:
      - type: LinePlot
        x:
          variable: jedi_log_test::convergence::total_iteration
        y:
          variable: jedi_log_test::convergence::norm_reduction
        color: 'black'

    - add_xlabel: 'Total inner iteration number'
      add_ylabel: 'Normalized Value'
      add_legend:
      layers:
      - type: LinePlot
        x:
          variable: jedi_log_test::convergence::total_iteration
        y:
          variable: jedi_log_test::convergence::residual_norm_normalized
        color: 'red'
        label: 'Normalized residual norm'
      - type: LinePlot
        x:
          variable: jedi_log_test::convergence::total_iteration
        y:
          variable: jedi_log_test::convergence::norm_reduction_normalized
        color: 'blue'
        label: 'Normalized norm reduction'
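The third panel plots `residual_norm_normalized` and `norm_reduction_normalized`, which the reader derives from the raw series. As an illustration only — assuming the normalization divides each series by its initial value so every curve starts at 1 (the actual scheme lives in jedi_log.py's normalization logic):

```python
import numpy as np

def normalize_by_first(values):
    # Assumed normalization for illustration: scale the series so it
    # starts at 1.0, making different quantities comparable on one axis.
    arr = np.asarray(values, dtype='float32')
    return arr / arr[0]
```

This is why the normalized residual norm and normalized norm reduction can share a single 'Normalized Value' axis in the plot.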
