Commit

add treegamregressor
csinva committed Aug 22, 2023
1 parent f29235d commit 86645d0
Showing 12 changed files with 1,356 additions and 514 deletions.
311 changes: 0 additions & 311 deletions docs/algebraic/gam.html

This file was deleted.

10 changes: 5 additions & 5 deletions docs/algebraic/index.html
@@ -29,14 +29,14 @@
<section>
<h2 class="section-title" id="header-submodules">Sub-modules</h2>
<dl>
<dt><code class="name"><a title="imodels.algebraic.gam" href="gam.html">imodels.algebraic.gam</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="imodels.algebraic.slim" href="slim.html">imodels.algebraic.slim</a></code></dt>
<dd>
<div class="desc"><p>Wrapper for sparse, integer linear models …</p></div>
</dd>
<dt><code class="name"><a title="imodels.algebraic.tree_gam" href="tree_gam.html">imodels.algebraic.tree_gam</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</section>
<section>
@@ -65,8 +65,8 @@ <h1>Index 🔍</h1>
</li>
<li><h3><a href="#header-submodules">Sub-modules</a></h3>
<ul>
<li><code><a title="imodels.algebraic.gam" href="gam.html">imodels.algebraic.gam</a></code></li>
<li><code><a title="imodels.algebraic.slim" href="slim.html">imodels.algebraic.slim</a></code></li>
<li><code><a title="imodels.algebraic.tree_gam" href="tree_gam.html">imodels.algebraic.tree_gam</a></code></li>
</ul>
</li>
</ul>
872 changes: 872 additions & 0 deletions docs/algebraic/tree_gam.html

Large diffs are not rendered by default.
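Since the 872-line `tree_gam.html` diff is not rendered, here is a rough sketch of the idea behind the tree GAM this commit documents (an illustration only, not the actual `imodels.algebraic.tree_gam` implementation): cyclically boost a depth-one stump on one feature at a time, so the fitted model remains an additive sum of per-feature shape functions.

```python
# Toy tree-GAM sketch (NOT the imodels implementation): each round fits one
# stump per feature on the current residual, keeping the model additive.

def fit_stump(x, residual):
    """Best single-threshold split on one feature, predicting the mean
    residual on each side. Returns (threshold, left_mean, right_mean)."""
    best, best_err = (None, 0.0, 0.0), float("inf")
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residual) if xi <= t]
        right = [r for xi, r in zip(x, residual) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if err < best_err:
            best_err, best = err, (t, lm, rm)
    return best

def fit_tree_gam(X, y, n_rounds=20, lr=0.5):
    """X: list of rows. Returns stumps as (feature, threshold, left, right)."""
    pred, stumps = [0.0] * len(y), []
    for _ in range(n_rounds):
        for j in range(len(X[0])):  # one shrunken stump per feature per round
            residual = [yi - pi for yi, pi in zip(y, pred)]
            t, lm, rm = fit_stump([row[j] for row in X], residual)
            if t is None:
                continue
            stumps.append((j, t, lr * lm, lr * rm))
            pred = [pi + (lr * lm if row[j] <= t else lr * rm)
                    for pi, row in zip(pred, X)]
    return stumps

def predict(stumps, row):
    return sum(lm if row[j] <= t else rm for j, t, lm, rm in stumps)

# y depends additively on each feature, so the stump sum recovers it
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0.0, 1.0, 2.0, 3.0]  # y = 2*x0 + x1
model = fit_tree_gam(X, y)
```

Because each stump touches only one feature, the learned effect of any feature can be plotted on its own, which is what makes the GAM interpretable.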

81 changes: 67 additions & 14 deletions docs/index.html
@@ -158,7 +158,7 @@ <h3 id="supported-models">Supported models</h3>
</tr>
<tr>
<td style="text-align: left;">Tree GAM</td>
<td><a href="https://csinva.io/imodels/algebraic/gam.html">🗂️</a>, <a href="https://github.com/interpretml/interpret">🔗</a>, <a href="https://dl.acm.org/doi/abs/10.1145/2339530.2339556">📄</a></td>
<td><a href="https://csinva.io/imodels/algebraic/tree_gam.html">🗂️</a>, <a href="https://github.com/interpretml/interpret">🔗</a>, <a href="https://dl.acm.org/doi/abs/10.1145/2339530.2339556">📄</a></td>
<td>Generalized additive model fit with short boosted trees</td>
</tr>
<tr>
@@ -363,6 +363,12 @@ <h2 id="support-for-different-tasks">Support for different tasks</h2>
<td>Requires extra dependencies for speed</td>
</tr>
<tr>
<td style="text-align: left;">Tree GAM</td>
<td style="text-align: center;"><a href="https://csinva.io/imodels/algebraic/tree_gam.html">TreeGAMClassifier</a></td>
<td style="text-align: center;"><a href="https://csinva.io/imodels/algebraic/tree_gam.html">TreeGAMRegressor</a></td>
<td></td>
</tr>
<tr>
<td style="text-align: left;">Greedy tree sums (FIGS)</td>
<td style="text-align: center;"><a href="https://csinva.io/imodels/tree/figs.html#imodels.tree.figs.FIGSClassifier">FIGSClassifier</a></td>
<td style="text-align: center;"><a href="https://csinva.io/imodels/tree/figs.html#imodels.tree.figs.FIGSRegressor">FIGSRegressor</a></td>
@@ -444,6 +450,9 @@ <h3 id="hierarchical-shrinkage-post-hoc-regularization-for-tree-based-methods">Hierarchical shrinkage: post-hoc regularization for tree-based methods</h3>
<p align="center">
<i><b>HS Example.</b> HS applies post-hoc regularization to any decision tree by shrinking each node towards its parent.</i>
</p>
<h3 id="mdi-a-flexible-random-forest-based-feature-importance-framework">MDI+: A Flexible Random Forest-Based Feature Importance Framework</h3>
<p><a href="https://arxiv.org/pdf/2307.01932.pdf">📄 Paper</a>, <a href="https://scholar.google.com/scholar?hl=en&amp;as_sdt=0%2C23&amp;q=MDI%2B%3A+A+Flexible+Random+Forest-Based+Feature+Importance+Framework&amp;btnG=#d=gs_cit&amp;t=1690399844081&amp;u=%2Fscholar%3Fq%3Dinfo%3Axc0LcHXE_lUJ%3Ascholar.google.com%2F%26output%3Dcite%26scirp%3D0%26hl%3Den">📌 Citation</a></p>
<p>MDI+ is a novel feature importance framework that generalizes the popular mean decrease in impurity (MDI) importance score for random forests. At its core, MDI+ expands upon a recently discovered connection between linear regression and decision trees. In doing so, MDI+ enables practitioners to (1) tailor the feature importance computation to the data/problem structure and (2) incorporate additional features or knowledge to mitigate known biases of decision trees. In both real-data case studies and extensive real-data-inspired simulations, MDI+ outperforms commonly used feature importance measures (e.g., MDI, permutation-based scores, and TreeSHAP) by substantial margins.</p>
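For reference, the classic MDI score that MDI+ generalizes can be sketched in a few lines (an illustrative toy for regression trees, not the MDI+ implementation): each split contributes its impurity decrease, weighted by the fraction of samples that reach the node.

```python
# Toy MDI (mean decrease in impurity) for one split of a regression tree,
# using variance as the impurity measure. NOT the MDI+ implementation.

def variance(ys):
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

def mdi_for_split(y_parent, y_left, y_right, n_total):
    """Impurity decrease of one split, weighted by the fraction of all
    training samples that reach this node."""
    n = len(y_parent)
    decrease = variance(y_parent) - (
        len(y_left) / n * variance(y_left) + len(y_right) / n * variance(y_right)
    )
    return (n / n_total) * decrease

# A perfectly separating root split removes all impurity:
score = mdi_for_split([0, 0, 1, 1], [0, 0], [1, 1], n_total=4)  # → 0.25
```

A feature's MDI is the sum of such contributions over every split made on that feature; MDI+ replaces this fixed recipe with a flexible, regression-based one.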
<h2 id="references">References</h2>
<details>
<summary>Readings</summary>
@@ -511,7 +520,7 @@ <h2 id="references">References</h2>
# Github repo available [here](https://github.com/csinva/imodels)

from .algebraic.slim import SLIMRegressor, SLIMClassifier
from .algebraic.gam import TreeGAMClassifier
from .algebraic.tree_gam import TreeGAMClassifier, TreeGAMRegressor
from .discretization.discretizer import RFDiscretizer, BasicDiscretizer
from .discretization.mdlp import MDLPDiscretizer, BRLDiscretizer
from .experimental.bartpy import BART
@@ -529,27 +538,64 @@
from .rule_set.skope_rules import SkopeRulesClassifier
from .rule_set.slipper import SlipperClassifier
from .tree.c45_tree.c45_tree import C45TreeClassifier
from .tree.cart_ccp import DecisionTreeCCPClassifier, DecisionTreeCCPRegressor, HSDecisionTreeCCPClassifierCV, \
HSDecisionTreeCCPRegressorCV
from .tree.cart_ccp import (
DecisionTreeCCPClassifier,
DecisionTreeCCPRegressor,
HSDecisionTreeCCPClassifierCV,
HSDecisionTreeCCPRegressorCV,
)

# from .tree.iterative_random_forest.iterative_random_forest import IRFClassifier
# from .tree.optimal_classification_tree import OptimalTreeModel
from .tree.cart_wrapper import GreedyTreeClassifier, GreedyTreeRegressor
from .tree.figs import FIGSRegressor, FIGSClassifier, FIGSRegressorCV, FIGSClassifierCV
from .tree.gosdt.pygosdt import OptimalTreeClassifier
from .tree.gosdt.pygosdt_shrinkage import HSOptimalTreeClassifier, HSOptimalTreeClassifierCV
from .tree.hierarchical_shrinkage import HSTreeRegressor, HSTreeClassifier, HSTreeRegressorCV, HSTreeClassifierCV
from .tree.gosdt.pygosdt_shrinkage import (
HSOptimalTreeClassifier,
HSOptimalTreeClassifierCV,
)
from .tree.hierarchical_shrinkage import (
HSTreeRegressor,
HSTreeClassifier,
HSTreeRegressorCV,
HSTreeClassifierCV,
)
from .tree.tao import TaoTreeClassifier, TaoTreeRegressor
from .util.data_util import get_clean_dataset
from .util.distillation import DistilledRegressor
from .util.explain_errors import explain_classification_errors

CLASSIFIERS = [BayesianRuleListClassifier, GreedyRuleListClassifier, SkopeRulesClassifier,
BoostedRulesClassifier, SLIMClassifier, SlipperClassifier, BayesianRuleSetClassifier,
C45TreeClassifier, OptimalTreeClassifier, OptimalRuleListClassifier, OneRClassifier,
SlipperClassifier, RuleFitClassifier, TaoTreeClassifier,
FIGSClassifier, HSTreeClassifier, HSTreeClassifierCV] # , IRFClassifier
REGRESSORS = [RuleFitRegressor, SLIMRegressor, GreedyTreeClassifier, FIGSRegressor,
TaoTreeRegressor, HSTreeRegressor, HSTreeRegressorCV, BART]
CLASSIFIERS = [
BayesianRuleListClassifier,
GreedyRuleListClassifier,
SkopeRulesClassifier,
BoostedRulesClassifier,
SLIMClassifier,
SlipperClassifier,
BayesianRuleSetClassifier,
C45TreeClassifier,
OptimalTreeClassifier,
OptimalRuleListClassifier,
OneRClassifier,
SlipperClassifier,
RuleFitClassifier,
TaoTreeClassifier,
TreeGAMClassifier,
FIGSClassifier,
HSTreeClassifier,
HSTreeClassifierCV,
] # , IRFClassifier
REGRESSORS = [
RuleFitRegressor,
SLIMRegressor,
GreedyTreeClassifier,
FIGSRegressor,
TaoTreeRegressor,
TreeGAMRegressor,
HSTreeRegressor,
HSTreeRegressorCV,
BART,
]
ESTIMATORS = CLASSIFIERS + REGRESSORS
DISCRETIZERS = [RFDiscretizer, BasicDiscretizer, MDLPDiscretizer, BRLDiscretizer]</code></pre>
</details>
@@ -609,7 +655,14 @@ <h1>Index 🔍</h1>
<li><a href="#support-for-different-tasks">Support for different tasks</a><ul>
<li><a href="#extras">Extras</a></li>
</ul>
</li><li><a href="#references">References</a></li>
</li>
<li><a href="#our-favorite-models">Our favorite models</a><ul>
<li><a href="#figs-fast-interpretable-greedy-tree-sums">FIGS: Fast interpretable greedy-tree sums</a></li>
<li><a href="#hierarchical-shrinkage-post-hoc-regularization-for-tree-based-methods">Hierarchical shrinkage: post-hoc regularization for tree-based methods</a></li>
<li><a href="#mdi-a-flexible-random-forest-based-feature-importance-framework">MDI+: A Flexible Random Forest-Based Feature Importance Framework</a></li>
</ul>
</li>
<li><a href="#references">References</a></li>
</ul>
</div>
<ul id="index">
15 changes: 12 additions & 3 deletions docs/tree/figs.html
@@ -401,7 +401,10 @@
# require the tree to have more than 1 node, otherwise just leave importance_data_tree as zeros
if 1 &lt; next(node_counter):
tree_samples = _importances(tree_)
importance_data_tree /= tree_samples
if tree_samples != 0:
importance_data_tree /= tree_samples
else:
importance_data_tree = 0

importance_data.append(importance_data_tree)

@@ -1015,7 +1018,10 @@ <h2 id="params">Params</h2>
# require the tree to have more than 1 node, otherwise just leave importance_data_tree as zeros
if 1 &lt; next(node_counter):
tree_samples = _importances(tree_)
importance_data_tree /= tree_samples
if tree_samples != 0:
importance_data_tree /= tree_samples
else:
importance_data_tree = 0

importance_data.append(importance_data_tree)

@@ -1410,7 +1416,10 @@ <h3>Methods</h3>
# require the tree to have more than 1 node, otherwise just leave importance_data_tree as zeros
if 1 &lt; next(node_counter):
tree_samples = _importances(tree_)
importance_data_tree /= tree_samples
if tree_samples != 0:
importance_data_tree /= tree_samples
else:
importance_data_tree = 0

importance_data.append(importance_data_tree)

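The three identical hunks above apply the same fix to the FIGS feature-importance code: guard the per-tree normalization against a zero sample count instead of dividing blindly. A stripped-down illustration of the pattern (plain Python lists standing in for the numpy arrays FIGS actually uses):

```python
# Sketch of the guard introduced by this commit: if a degenerate tree
# contributes zero samples, fall back to zero importance rather than
# dividing by zero.

def normalize_importance(importance, tree_samples):
    """Normalize one tree's raw importances by its sample count."""
    if tree_samples != 0:
        return [v / tree_samples for v in importance]
    return [0.0 for _ in importance]  # no usable signal from this tree

print(normalize_importance([2.0, 4.0], 2))  # [1.0, 2.0]
print(normalize_importance([2.0, 4.0], 0))  # [0.0, 0.0], no ZeroDivisionError
```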