Various API fixes for pyshgp's first release. (#104)
erp12 authored Feb 22, 2019
1 parent 4ef508e commit a7a3e09
Showing 16 changed files with 216 additions and 131 deletions.
7 changes: 0 additions & 7 deletions ROADMAPS.md
@@ -69,10 +69,3 @@

- Users should be able to define their own PushTypes which get associated Stacks.
- This is meant to aid in domain-specific applications (ie. NeuroEvolution, Quantum Algorithms, etc)


## CLI

### To Do

- [x] Build instruction set documentation HTML/MD
Binary file modified docs/doctrees/environment.pickle
Binary file modified docs/doctrees/index.doctree
15 changes: 9 additions & 6 deletions docs/html/index.html
@@ -42,6 +42,8 @@ <h3>Navigation</h3>

<div class="section" id="pyghgp">
<h1>PyshGP<a class="headerlink" href="#pyghgp" title="Permalink to this headline"></a></h1>
<a class="reference external image-reference" href="https://badge.fury.io/py/pyshgp"><img alt="PyPI version" src="https://badge.fury.io/py/pyshgp.svg" /></a>
<a class="reference external image-reference" href="https://circleci.com/gh/erp12/pyshgp/tree/master"><img alt="CircleCI" src="https://circleci.com/gh/erp12/pyshgp/tree/master.svg?style=svg" /></a>
<p>Push Genetic Programming in Python</p>
<div class="section" id="motivation">
<h2>Motivation<a class="headerlink" href="#motivation" title="Permalink to this headline"></a></h2>
@@ -74,34 +76,35 @@ <h2>Installing pyshgp<a class="headerlink" href="#installing-pyshgp" title="Perm
<p><code class="docutils literal notranslate"><span class="pre">pyshgp</span></code> is compatible with python 3.5 and up.</p>
<div class="section" id="install-from-pip">
<h3>Install from pip<a class="headerlink" href="#install-from-pip" title="Permalink to this headline"></a></h3>
<p>Coming with the initial release of <code class="docutils literal notranslate"><span class="pre">pyshgp</span></code>.
Check the project milestones to get a sense of how far off this is.</p>
<div class="highlight-guess notranslate"><div class="highlight"><pre><span></span>pip install pyshgp
</pre></div>
</div>
</div>
<div class="section" id="build-frome-source">
<h3>Build from source<a class="headerlink" href="#build-frome-source" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li>Clone the repo</li>
<li>cd into the <code class="docutils literal notranslate"><span class="pre">pyshgp</span></code> repo directory</li>
<li>run <code class="docutils literal notranslate"><span class="pre">pip</span> <span class="pre">install</span> <span class="pre">.</span> <span class="pre">--upgrade</span></code></li>
<li>Thats it! Check out the examples and</li>
<li>That’s it! Check out the examples and documentation.</li>
</ul>
</div>
</div>
<div class="section" id="documentation">
<h2>Documentation<a class="headerlink" href="#documentation" title="Permalink to this headline"></a></h2>
<div class="section" id="example-usage">
<h3>Example Usage<a class="headerlink" href="#example-usage" title="Permalink to this headline"></a></h3>
<p>Example usages of <code class="docutils literal notranslate"><span class="pre">pyshgp</span></code> can be found in the <code class="docutils literal notranslate"><span class="pre">examples/</span></code> <a class="reference external" href="#">folder of the Github repository</a>.</p>
<p>Example usages of <code class="docutils literal notranslate"><span class="pre">pyshgp</span></code> can be found in the <code class="docutils literal notranslate"><span class="pre">examples/</span></code> <a class="reference external" href="https://github.com/erp12/pyshgp/tree/master/examples">folder of the Github repository</a>.</p>
</div>
<div class="section" id="api">
<h3>API<a class="headerlink" href="#api" title="Permalink to this headline"></a></h3>
<p>The full <code class="docutils literal notranslate"><span class="pre">pyshgp</span></code> API can be found on <a class="reference external" href="#">official website</a>.</p>
<p>The full <code class="docutils literal notranslate"><span class="pre">pyshgp</span></code> API can be found on the <a class="reference external" href="http://erp12.github.io/pyshgp">official website</a>.</p>
</div>
</div>
<div class="section" id="pysh-roadmap-contributing">
<h2>Pysh Roadmap / Contributing<a class="headerlink" href="#pysh-roadmap-contributing" title="Permalink to this headline"></a></h2>
<p>PyshGP isn’t quite ready for its 1.0 release. It still has a few key features that need implementing. More details can be found in <code class="docutils literal notranslate"><span class="pre">ROADMAPS.md</span></code> and in the projects tab on Github.</p>
<p>For information about contributing, see the <a class="reference external" href="#">Contributing Guide</a>.</p>
<p>For information about contributing, see the <a class="reference external" href="http://erp12.github.io/pyshgp/html/contributing.html">Contributing Guide</a>.</p>
</div>
</div>
<div class="section" id="contents">
2 changes: 1 addition & 1 deletion docs/html/searchindex.js

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion pyshgp/__init__.py
@@ -1,3 +1,3 @@
"""pyshgp."""

__version__ = "0.1.1"
__version__ = "0.1.2"
82 changes: 30 additions & 52 deletions pyshgp/gp/estimators.py
@@ -1,6 +1,5 @@
"""The :mod:`estimator` module defines a ``PushEstimator`` class."""
import json
import inspect
from typing import Union, Tuple, Sequence

import numpy as np
@@ -109,8 +108,8 @@ class PushEstimator:
"""

def __init__(self,
search: Union[sr.SearchAlgorithm, str] = "GA",
spawner: Union[GeneSpawner, str] = "default",
spawner: GeneSpawner,
search: str = "GA",
selector: Union[sl.Selector, str] = "lexicase",
variation_strategy: Union[vr.VariationStrategy, dict, str] = "umad",
population_size: int = 300,
@@ -119,7 +118,7 @@ def __init__(self,
simplification_steps: int = 2000,
interpreter: PushInterpreter = "default",
**kwargs):
self.search = search
self._search_name = search
self.spawner = spawner
self.selector = selector
self.variation_strategy = variation_strategy
@@ -135,64 +134,28 @@ def __init__(self,

self.ext = kwargs

def _build_component(self, component_cls, **kwargs):
arg_names = inspect.getfullargspec(component_cls.__init__)[0][1:]
args = kwargs
for arg_name in arg_names:
if arg_name in self.ext:
args[arg_name] = self.ext[arg_name]
return component_cls(**args)

def _build_search_algo(self):
if isinstance(self.spawner, GeneSpawner):
self._spawner = self.spawner
elif self.spawner == "default":
self._spawner = GeneSpawner(
instruction_set=self.interpreter.instruction_set,
literals=[],
erc_generators=[]
)
else:
raise ValueError("Bad spawner: {}".format(self.spawner))

if isinstance(self.selector, sl.Selector):
self._selector = self.selector
else:
selector_cls = sl.get_selector(self.selector)
self._selector = self._build_component(selector_cls)

if isinstance(self.variation_strategy, vr.VariationStrategy):
self._variation_strategy = self.variation_strategy
else:
self._variation_strategy = vr.VariationStrategy()
if isinstance(self.variation_strategy, dict):
for op_name, prob in self.variation_strategy.items():
var_op = vr.get_variation_operator(op_name)
if not isinstance(var_op, vr.VariationOperator):
var_op = self._build_component(var_op)
self._variation_strategy.add(var_op, prob)
else:
var_op = vr.get_variation_operator(self.variation_strategy)
if isinstance(self.variation_strategy, dict):
var_strat = vr.VariationStrategy()
for op_name, prob in self.variation_strategy.items():
var_op = vr.get_variation_operator(op_name)
if not isinstance(var_op, vr.VariationOperator):
var_op = self._build_component(var_op)
self._variation_strategy.add(var_op, 1.0)
var_strat.add(var_op, prob)
self.variation_strategy = var_strat

search_config = sr.SearchConfiguration(
spawning=self._spawner,
spawner=self.spawner,
evaluator=self.evaluator,
selection=self._selector,
variation=self._variation_strategy,
selection=self.selector,
variation=self.variation_strategy,
population_size=self.population_size,
max_generations=self.max_generations,
initial_genome_size=self.initial_genome_size,
simplification_steps=self.simplification_steps
)

if isinstance(self.search, sr.SearchAlgorithm):
self._search = self.search
else:
search_algo_cls = sr.get_search_algo(self.search)
self._search = self._build_component(search_algo_cls, config=search_config)
self.search = sr.get_search_algo(self._search_name, config=search_config, **self.ext)

def fit(self, X, y, verbose: bool = False):
"""Run the search algorithm to synthesize a push program.
@@ -209,8 +172,15 @@ def fit(self, X, y, verbose: bool = False):
Default is False.
"""
X, y = check_X_y(X, y, force_all_finite=False, allow_nd=True, multi_output=True)
X, y = check_X_y(
X, y,
dtype=None,
force_all_finite=False,
allow_nd=True,
multi_output=True
)

# Determine required arity of programs.
if isinstance(X, (pd.DataFrame, np.ndarray)):
if len(X.shape) > 1:
arity = X.shape[1]
@@ -225,6 +195,7 @@ def fit(self, X, y, verbose: bool = False):
arity = 1
self.interpreter.instruction_set.register_n_inputs(arity)

# Determine the output signature of programs.
if isinstance(y, pd.DataFrame):
y_types = list(y.dtypes)
elif isinstance(y, pd.Series):
@@ -243,7 +214,7 @@ def fit(self, X, y, verbose: bool = False):

self.evaluator = DatasetEvaluator(X, y, interpreter=self.interpreter)
self._build_search_algo()
best_seen = self._search.run(verbose)
best_seen = self.search.run(verbose)

self._result = SearchResult(best_seen.program, output_types)

@@ -282,6 +253,13 @@ def score(self, X, y):
"""
check_is_fitted(self, "_result")
X, y = check_X_y(
X, y,
dtype=None,
force_all_finite=False,
allow_nd=True,
multi_output=True
)
self.evaluator = DatasetEvaluator(X, y)
return self.evaluator.evaluate(self._result.program)

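The net effect of the `estimators.py` changes is that `PushEstimator` now requires a `GeneSpawner` up front (the old `spawner="default"` fallback is gone) and refers to search algorithms, selectors, and variation operators by name. A hypothetical usage sketch follows; the `GeneSpawner` arguments mirror the removed default-spawner code, while the import paths and the no-argument `PushInterpreter()` constructor are assumptions not shown in this diff.

```python
# Hypothetical sketch of the revised PushEstimator API: spawner is required,
# search/selector/variation are chosen by name. Items marked "assumed" are
# not taken from this diff.
import numpy as np
from pyshgp.push.interpreter import PushInterpreter  # assumed module path
from pyshgp.gp.genome import GeneSpawner             # assumed module path
from pyshgp.gp.estimators import PushEstimator

interpreter = PushInterpreter()                       # assumed default constructor
spawner = GeneSpawner(
    instruction_set=interpreter.instruction_set,      # mirrors the removed "default" spawner
    literals=[],
    erc_generators=[],
)

est = PushEstimator(spawner=spawner, search="GA", population_size=300)

X = np.arange(20).reshape(-1, 2)                      # toy regression data
y = X.sum(axis=1)
est.fit(X, y, verbose=True)
print(est.score(X, y))
```
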
59 changes: 28 additions & 31 deletions pyshgp/gp/search.py
@@ -13,6 +13,7 @@
from pyshgp.gp.population import Population
from pyshgp.gp.selection import Selector, get_selector
from pyshgp.gp.variation import VariationOperator, get_variation_operator
from pyshgp.utils import instantiate_using


# @TODO: Should SearchConfiguration be JSON serializable?
@@ -23,10 +24,10 @@ class SearchConfiguration:
----------
evaluator : Evaluator
The Evaluator to use when evaluating individuals.
spawning : Union[GeneSpawner, str], optional
The GeneSpawner, or DiscreteProbDistrib of gene spawners to use when
producing Genomes during initialization and variation.
selection :Union[Selector, DiscreteProbDistrib, str], optional
spawning : GeneSpawner
The GeneSpawner to use when producing Genomes during initialization and
variation.
selection : Union[Selector, DiscreteProbDistrib, str], optional
A Selector, or DiscreteProbDistrib of selectors, to use when selecting
parents. The default is lexicase selection.
variation : Union[VariationOperator, DiscreteProbDistrib, str], optional
@@ -51,49 +52,45 @@

def __init__(self,
evaluator: Evaluator,
spawning: Union[GeneSpawner, DiscreteProbDistrib],
spawner: GeneSpawner,
selection: Union[Selector, DiscreteProbDistrib, str] = "lexicase",
variation: Union[VariationOperator, DiscreteProbDistrib, str] = "umad",
population_size: int = 500,
max_generations: int = 100,
error_threshold: float = 0.0,
initial_genome_size: Tuple[int, int] = (10, 50),
simplification_steps: int = 2000):
simplification_steps: int = 2000,
**kwargs):
self.evaluator = evaluator
self.spawner = spawner
self.population_size = population_size
self.max_generations = max_generations
self.error_threshold = error_threshold
self.initial_genome_size = initial_genome_size
self.simplification_steps = simplification_steps
self.ext = kwargs

if isinstance(spawning, DiscreteProbDistrib):
self._spawning = spawning
else:
self._spawning = DiscreteProbDistrib().add(spawning, 1.0)

if isinstance(selection, str):
self._selection = DiscreteProbDistrib().add(get_selector(selection), 1.0)
if isinstance(selection, Selector):
self._selection = DiscreteProbDistrib().add(selection, 1.0)
elif isinstance(selection, DiscreteProbDistrib):
self._selection = selection
else:
self._selection = DiscreteProbDistrib().add(selection, 1.0)
selector = get_selector(selection, **self.ext)
self._selection = DiscreteProbDistrib().add(selector, 1.0)

if variation == "umad":
self._variation = DiscreteProbDistrib().add(get_variation_operator, 1.0)
if isinstance(variation, VariationOperator):
self._variation = DiscreteProbDistrib().add(variation, 1.0)
elif isinstance(variation, DiscreteProbDistrib):
self._variation = variation
else:
self._variation = DiscreteProbDistrib().add(variation, 1.0)

def get_spawner(self):
"""Return a GeneSpawner."""
return self._spawning.sample()
variationOp = get_variation_operator(variation, **self.ext)
self._variation = DiscreteProbDistrib().add(variationOp, 1.0)

def get_selector(self):
"""Return a Selector."""
return self._selection.sample()

def get_variation_operator(self):
def get_variation_op(self):
"""Return a VariationOperator."""
return self._variation.sample()

@@ -129,7 +126,7 @@ def init_population(self):
"""Initialize the population."""
self.population = Population()
for i in range(self.config.population_size):
spawner = self.config.get_spawner()
spawner = self.config.spawner
genome = spawner.spawn_genome(self.config.initial_genome_size)
self.population.add(Individual(genome))

@@ -214,10 +211,10 @@ def __init__(self, config: SearchConfiguration):
super().__init__(config)

def _make_child(self) -> Individual:
op = self.config.get_variation_operator()
op = self.config.get_variation_op()
selector = self.config.get_selector()
parent_genomes = [p.genome for p in selector.select(self.population, n=op.num_parents)]
child_genome = op.produce(parent_genomes, self.config.get_spawner())
child_genome = op.produce(parent_genomes, self.config.spawner)
return Individual(child_genome)

def step(self):
@@ -281,9 +278,9 @@ def step(self):
return

candidate = Individual(
self.config.get_variation_operator().produce(
self.config.get_variation_op().produce(
[self.population.best().genome],
self.config.get_spawner()
self.config.spawner
)
)
candidate.error_vector = self.config.evaluator.evaluate(candidate.program)
@@ -297,17 +294,17 @@ def step(self):
# ...


def get_search_algo(name: str) -> SearchAlgorithm:
def get_search_algo(name: str, **kwargs) -> SearchAlgorithm:
"""Return the search algorithm class with the given name."""
name_to_cls = {
"GA": GeneticAlgorithm,
"SA": SimulatedAnnealing,
# "ES": EvolutionaryStrategy,
}
search_algo = name_to_cls.get(name, None)
if search_algo is None:
_cls = name_to_cls.get(name, None)
if _cls is None:
raise ValueError("No search algo '{nm}'. Supported names: {lst}.".format(
nm=name,
lst=list(name_to_cls.keys())
))
return search_algo
return instantiate_using(_cls, kwargs)
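With `SearchConfiguration` taking a single `spawner` and resolving string names through `get_selector` / `get_variation_operator`, and with `get_search_algo` instantiating the chosen class via `instantiate_using`, the search layer can also be driven directly without `PushEstimator`. A hypothetical sketch follows; the import paths flagged as assumed and the no-argument `PushInterpreter()` constructor are not shown in this diff.

```python
# Hypothetical direct use of the revised search layer. Items marked "assumed"
# are not taken from this diff.
import numpy as np
import pyshgp.gp.search as sr
from pyshgp.gp.evaluation import DatasetEvaluator    # assumed module path
from pyshgp.gp.genome import GeneSpawner             # assumed module path
from pyshgp.push.interpreter import PushInterpreter  # assumed module path

X = np.arange(20).reshape(-1, 2)
y = X.sum(axis=1)

interpreter = PushInterpreter()                       # assumed default constructor
interpreter.instruction_set.register_n_inputs(X.shape[1])
spawner = GeneSpawner(
    instruction_set=interpreter.instruction_set,
    literals=[],
    erc_generators=[],
)

config = sr.SearchConfiguration(
    evaluator=DatasetEvaluator(X, y, interpreter=interpreter),
    spawner=spawner,            # renamed from the old "spawning" distribution
    selection="lexicase",       # resolved through get_selector(...)
    variation="umad",           # resolved through get_variation_operator(...)
    population_size=200,
    max_generations=20,
)
search = sr.get_search_algo("GA", config=config)      # built via instantiate_using
best = search.run(verbose=True)                       # best Individual seen
```
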
9 changes: 5 additions & 4 deletions pyshgp/gp/selection.py
@@ -8,6 +8,7 @@

from pyshgp.gp.individual import Individual
from pyshgp.gp.population import Population
from pyshgp.utils import instantiate_using


class Selector(ABC):
@@ -260,18 +261,18 @@ def select(self, population: Population, n: int = 1) -> Sequence[Individual]:
return population.best_n(n)


def get_selector(name: str) -> Selector:
def get_selector(name: str, **kwargs) -> Selector:
"""Get the selector class with the given name."""
name_to_cls = {
"roulette": FitnessProportionate,
"tournament": Tournament,
"lexicase": Lexicase,
"elite": Elite,
}
selector = name_to_cls.get(name, None)
if selector is None:
_cls = name_to_cls.get(name, None)
if _cls is None:
raise ValueError("No selector '{nm}'. Supported names: {lst}.".format(
nm=name,
lst=list(name_to_cls.keys())
))
return selector
return instantiate_using(_cls, kwargs)
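Both `get_search_algo` and `get_selector` now defer construction to `pyshgp.utils.instantiate_using`, whose implementation is not part of this diff. Below is a sketch of what it presumably does, modeled on the removed `PushEstimator._build_component` helper: keep only the keyword arguments that the target class's `__init__` declares, then instantiate. The real utility may differ in detail.

```python
# Sketch of what pyshgp.utils.instantiate_using presumably does, modeled on the
# removed PushEstimator._build_component helper. Illustrative only; the actual
# implementation in pyshgp.utils may differ.
import inspect


def instantiate_using(cls, kwargs: dict):
    accepted = inspect.getfullargspec(cls.__init__)[0][1:]          # drop "self"
    filtered = {k: v for k, v in kwargs.items() if k in accepted}   # ignore extras
    return cls(**filtered)
```

Under that reading, a call such as `get_selector("tournament", tournament_size=7, some_unused_flag=True)` would forward only the keywords that `Tournament.__init__` actually accepts (assuming it declares a `tournament_size` parameter, which this diff does not show).
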
