Generalize inheritance and composition of model classes #12
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master      #12      +/-   ##
==========================================
+ Coverage   93.67%   96.81%   +3.14%
==========================================
  Files           5        6       +1
  Lines         664      723      +59
==========================================
+ Hits          622      700      +78
+ Misses         42       23      -19
Continue to review full report at Codecov.
I think a good point to split would be the noise refactoring, which is a big enough change to stand on its own. I would definitely split off the docs etc., because they need to be changed once everything else is done.
I added two comments here already, I hope that doesn't confuse you. For a detailed review, I would split this into three smaller PRs:
noise-related, InferenceResult-related and documentation-related changes.
You also asked for weaknesses: apart from the (probably) unnecessary subclass that I commented on above, I think we should clarify how the distributions would work in the discrete case. I think it's okay that this is not implemented for the first publication, but it should at least be clarified (that is also a nice outlook for the paper).
OK
This PR refactors the model classes such that code duplication is ultimately reduced.
The full reasoning behind this inheritance/composition design is described in a new documentation page.
Changes
- Docs (done with #14)
- Utils (done with #14)
  - The `HAS_PYMC` constant is now exposed as `calibr8.HAS_PYMC`, and the `pm` module may be accessed via `calibr8.utils.pm` (see the first sketch after this list).
- Generalization of the noise model (see #16, and the second sketch below)
  - New `calibr8.core.DistributionMixin` and a few `calibr8.*Noise` classes.
  - `BaseModelT.loglikelihood` is now distribution-agnostic.
  - `BaseModelT.loglikelihood` → `CalibrationModel.loglikelihood`.
  - `(log)likelihood` now slices correctly independent of `ndim(x)`.
- Inference type and dimensionality (see the third sketch below)
  - `InferenceResult` split into subclasses.
  - New `ContinuousUnivariateModel` and `ContinuousMultivariateModel` model subclasses with `infer_independent` implementations.
- Tests (see #17)
  - `loglikelihood` and `likelihood` shape broadcasting (#15)
- Addition of examples (deferred to a follow-up)
  - Univariate, discrete with Poisson noise
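To illustrate the new optional-PyMC exports mentioned under "Utils", here is a minimal sketch of how downstream code might consume them. Only `calibr8.HAS_PYMC` and `calibr8.utils.pm` come from this PR; the surrounding logic is illustrative.

```python
import calibr8

if calibr8.HAS_PYMC:
    # PyMC is re-exported, so downstream code needs no try/except import guard.
    pm = calibr8.utils.pm
    print("PyMC available:", pm.__name__)
else:
    print("PyMC is not installed; Bayesian features are unavailable.")
```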
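The noise generalization can be pictured with a small, self-contained sketch. Only the name `DistributionMixin` and the `*Noise` class pattern are taken from this PR; the attributes (`scipy_dist`, `to_scipy`) and the SciPy-based log-likelihood below are assumptions made for illustration, not calibr8's actual implementation.

```python
import numpy
import scipy.stats


class DistributionMixin:
    """Declares which distribution describes a model's observation noise."""
    scipy_dist = None  # hypothetical attribute, e.g. scipy.stats.norm


class NormalNoise(DistributionMixin):
    scipy_dist = scipy.stats.norm

    @staticmethod
    def to_scipy(mu, sigma):
        # Map model parameters onto the keyword names SciPy expects.
        return dict(loc=mu, scale=sigma)


def loglikelihood(noise, y, *params):
    """Distribution-agnostic log-likelihood: the noise class supplies the mapping."""
    return numpy.sum(noise.scipy_dist.logpdf(y, **noise.to_scipy(*params)))


# Example: Normal noise with mu=1.0, sigma=0.2 evaluated at three observations.
print(loglikelihood(NormalNoise(), numpy.array([0.9, 1.1, 1.0]), 1.0, 0.2))
```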
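Similarly, the split by inference type and dimensionality can be sketched as a small class hierarchy. `InferenceResult`, `ContinuousUnivariateModel`, `ContinuousMultivariateModel` and `infer_independent` are names from this PR; the result subclasses and the method bodies are hypothetical placeholders.

```python
import abc


class InferenceResult:
    """Common base for inference results."""


class UnivariateInference(InferenceResult):
    """Posterior of a scalar independent variable (hypothetical name)."""


class MultivariateInference(InferenceResult):
    """Joint posterior of a vector-valued independent variable (hypothetical name)."""


class CalibrationModel(abc.ABC):
    @abc.abstractmethod
    def infer_independent(self, y, **kwargs) -> InferenceResult:
        """Infer the independent variable from observations `y`."""


class ContinuousUnivariateModel(CalibrationModel):
    def infer_independent(self, y, **kwargs) -> UnivariateInference:
        # Placeholder: the real implementation evaluates the likelihood and
        # packages it into a dimensionality-specific result object.
        return UnivariateInference()


class ContinuousMultivariateModel(CalibrationModel):
    def infer_independent(self, y, **kwargs) -> MultivariateInference:
        return MultivariateInference()
```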