
Generalize inheritance and composition of model classes #12

Merged
michaelosthege merged 2 commits into master from generalize-inheritance on Nov 18, 2021

Conversation

michaelosthege
Member

@michaelosthege michaelosthege commented Nov 14, 2021

This PR refactors the model classes such that, ultimately, code duplication is reduced.

The full reasoning behind this inheritance/composition design is described in a new documentation page.

Changes

Docs (done with #14)

  • Sphinx was configured to also render Markdown pages (see the config sketch below).
  • A Markdown document explaining the model classes, inheritance and distribution/noisemodel mixins was added.
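
For context, rendering Markdown pages with Sphinx usually only requires registering a Markdown parser extension in conf.py. A minimal sketch follows; the choice of the MyST parser is an assumption here, since the PR only states that Markdown rendering was enabled:

```python
# docs/source/conf.py (sketch) -- the use of MyST as the Markdown parser is an
# assumption; the PR only states that Markdown rendering was enabled.
extensions = [
    "sphinx.ext.autodoc",
    "myst_parser",  # lets Sphinx render .md pages such as calibr8_inheritance.md
]

# Map file extensions to parsers so .rst and .md pages can coexist.
source_suffix = {
    ".rst": "restructuredtext",
    ".md": "markdown",
}
```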

Utils (done with #14)

  • The HAS_PYMC constant is now exposed as calibr8.HAS_PYMC, and the pm module may be accessed via calibr8.utils.pm (see the sketch below).
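
The underlying mechanism is a guarded optional import. A rough sketch of that pattern, assuming a try/except import in calibr8/utils.py and a re-export from calibr8/__init__.py; the actual implementation may differ:

```python
# calibr8/utils.py (sketch) -- the actual import logic may differ.
try:
    import pymc as pm  # depending on the calibr8 version this may be `pymc3`
    HAS_PYMC = True
except ModuleNotFoundError:
    pm = None
    HAS_PYMC = False

# calibr8/__init__.py (sketch) -- re-export so users can check calibr8.HAS_PYMC
# without importing calibr8.utils explicitly:
# from .utils import HAS_PYMC, pm
```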

Generalization of noise model (see #16)

  • Implement calibr8.core.DistributionMixin and a few calibr8.*Noise classes (see the sketch after this list).
  • Update inheritance of existing contrib models to use mixins.
  • Refactor BaseModelT.loglikelihood to be distribution-agnostic.
  • Move BaseModelT.loglikelihood → CalibrationModel.loglikelihood.
  • Refactor (log)likelihood to slice correctly independent of ndim(x).
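
To make the composition design concrete, here is a simplified sketch of the pattern, not the actual calibr8 implementation; the concrete noise class names and the predict_dependent call shown here are assumptions:

```python
# Simplified sketch of the DistributionMixin pattern -- not the actual calibr8 code.
# Concrete noise class names and the predict_dependent signature are assumptions.
import numpy
import scipy.stats


class DistributionMixin:
    """Declares which noise distribution a calibration model uses."""
    scipy_dist = None  # e.g. scipy.stats.norm
    pymc_dist = None   # corresponding PyMC distribution, if PyMC is installed


class NormalNoise(DistributionMixin):
    scipy_dist = scipy.stats.norm


class LaplaceNoise(DistributionMixin):
    scipy_dist = scipy.stats.laplace


def loglikelihood(model, *, x, y, theta):
    """Distribution-agnostic log-likelihood: the model only predicts the
    distribution parameters; the mixin supplies the distribution itself."""
    params = model.predict_dependent(x, theta=theta)  # e.g. (mu, sigma)
    return numpy.sum(model.scipy_dist.logpdf(y, *params))
```

A contrib model then combines a model class with one of these noise mixins via multiple inheritance, which is what the mixin-related bullet above refers to.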

Inference type and dimensionality

  • Split InferenceResult into subclasses.
  • Introduce the ContinuousUnivariateModel and ContinuousMultivariateModel model subclasses with infer_independent implementations (see the sketch below).
  • Update contrib models to inherit from model subclasses.
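
A rough sketch of how this split could look; only CalibrationModel, ContinuousUnivariateModel, ContinuousMultivariateModel, InferenceResult and infer_independent are named by the PR, while the result subclass names and method signatures shown here are illustrative assumptions:

```python
# Sketch of the inference-related class hierarchy -- not the actual calibr8 code.
# Result subclass names and method signatures are illustrative assumptions.


class CalibrationModel:
    """Stand-in for calibr8's base class (see CalibrationModel.loglikelihood above)."""


class InferenceResult:
    """Common base class for the results of infer_independent()."""


class ContinuousUnivariateInference(InferenceResult):
    """Posterior of a scalar independent variable, e.g. from numerical integration."""


class ContinuousMultivariateInference(InferenceResult):
    """Posterior of a vector-valued independent variable, e.g. from MCMC sampling."""


class ContinuousUnivariateModel(CalibrationModel):
    def infer_independent(self, y, *, lower, upper) -> ContinuousUnivariateInference:
        """Integrate the likelihood over a 1D grid between `lower` and `upper`."""
        raise NotImplementedError("sketch only")


class ContinuousMultivariateModel(CalibrationModel):
    def infer_independent(self, y, **kwargs) -> ContinuousMultivariateInference:
        """Multivariate case; typically requires sampling instead of grid integration."""
        raise NotImplementedError("sketch only")
```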

Tests (see #17)

Addition of examples (deferred to follow-up)

  • Univariate, continuous with Laplace noise
  • Univariate, discrete with Poisson noise
  • Bivariate, continuous with Normal noise

@michaelosthege michaelosthege added the enhancement New feature or request label Nov 14, 2021
@michaelosthege michaelosthege self-assigned this Nov 14, 2021
@codecov-commenter

codecov-commenter commented Nov 14, 2021

Codecov Report

Merging #12 (70a26c6) into master (7d1b7a4) will increase coverage by 3.14%.
The diff coverage is 99.33%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master      #12      +/-   ##
==========================================
+ Coverage   93.67%   96.81%   +3.14%     
==========================================
  Files           5        6       +1     
  Lines         664      723      +59     
==========================================
+ Hits          622      700      +78     
+ Misses         42       23      -19     
| Impacted Files | Coverage Δ | |
|---|---|---|
| calibr8/core.py | 99.71% <98.88%> (+3.99%) | ⬆️ |
| calibr8/__init__.py | 100.00% <100.00%> (+33.33%) | ⬆️ |
| calibr8/contrib/base.py | 100.00% <100.00%> (+3.88%) | ⬆️ |
| calibr8/contrib/noise.py | 100.00% <100.00%> (ø) | |
| calibr8/utils.py | 90.47% <100.00%> (+0.06%) | ⬆️ |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 7d1b7a4...70a26c6.

Member

@lhelleckes lhelleckes left a comment


I think a good point to split would be the noise refactoring, which is a big enough change to stand on its own. I would definitely split off the docs etc., because they need to be changed once everything else is done.

I already added two comments here; I hope that doesn't confuse you. For a detailed review, I would split this into three smaller PRs:

noise-related, InferenceResult-related, and documentation-related changes.

docs/source/calibr8_inheritance.md (outdated review thread, resolved)
docs/source/conf.py (outdated review thread, resolved)
@lhelleckes
Member

You also asked for weaknesses: except for the (probably) unnecessary subclass that I commented on above, I think we should clarify how the distributions would work in the discrete case. It's okay that this is not implemented for the first publication, but it should at least be clarified (that is also a nice outlook for the paper).

Member

@lhelleckes lhelleckes left a comment


OK

@michaelosthege michaelosthege marked this pull request as ready for review November 18, 2021 10:10
@michaelosthege michaelosthege merged commit ebaf7c4 into master Nov 18, 2021
@michaelosthege michaelosthege deleted the generalize-inheritance branch November 18, 2021 10:10