From 16c5ad9e4347ea11b83bcd541c2269eb95e62fd9 Mon Sep 17 00:00:00 2001 From: Ahmed Gad Date: Fri, 9 Sep 2022 01:38:58 -0400 Subject: [PATCH] PyGAD 2.18.0 Documentation MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit 1. Raise an exception if the sum of fitness values is zero while either roulette wheel or stochastic universal parent selection is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/129 2. Initialize the value of the `run_completed` property to `False`. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/122 3. The values of these properties are no longer reset with each call to the `run()` method `self.best_solutions, self.best_solutions_fitness, self.solutions, self.solutions_fitness`: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/123. Now, the user has the flexibility of calling the `run()` method more than once while extending the data collected after each generation. Another advantage appears when the instance is loaded and the `run()` method is called, as the old fitness values are shown on the graph alongside the new fitness values. Read more in this section: [Continue without Loosing Progress](https://pygad.readthedocs.io/en/latest/README_pygad_ReadTheDocs.html#continue-without-loosing-progress) 4. Thanks [Prof. Fernando Jiménez Barrionuevo](http://webs.um.es/fernan) (Dept. of Information and Communications Engineering, University of Murcia, Murcia, Spain) for editing this [comment](https://github.com/ahmedfgad/GeneticAlgorithmPython/blob/5315bbec02777df96ce1ec665c94dece81c440f4/pygad.py#L73) in the code. https://github.com/ahmedfgad/GeneticAlgorithmPython/commit/5315bbec02777df96ce1ec665c94dece81c440f4 5. Fixed a bug that occurred when `crossover_type=None`. 6. Support of elitism selection through a new parameter named `keep_elitism`. It defaults to 1, which means only the best solution of each generation is kept in the next generation. If assigned 0, it has no effect. 
Read more in this section: [Elitism Selection](https://pygad.readthedocs.io/en/latest/README_pygad_ReadTheDocs.html#elitism-selection). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/74 7. A new instance attribute named `last_generation_elitism` added to hold the elitism in the last generation. 8. A new parameter called `random_seed` added to accept a seed for the random function generators. Credit to this issue https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/70 and [Prof. Fernando Jiménez Barrionuevo](http://webs.um.es/fernan). Read more in this section: [Random Seed](https://pygad.readthedocs.io/en/latest/README_pygad_ReadTheDocs.html#random-seed). 9. Editing the `pygad.TorchGA` module to make sure the tensor data is moved from GPU to CPU. Thanks to Rasmus Johansson for opening this pull request: https://github.com/ahmedfgad/TorchGA/pull/2 --- docs/source/Footer.rst | 67 +++++- docs/source/README_pygad_ReadTheDocs.rst | 248 ++++++++++++++++++++++- docs/source/conf.py | 2 +- 3 files changed, 300 insertions(+), 17 deletions(-) diff --git a/docs/source/Footer.rst b/docs/source/Footer.rst index 1b71985..b079598 100644 --- a/docs/source/Footer.rst +++ b/docs/source/Footer.rst @@ -19,7 +19,7 @@ Release Date: 15 April 2020 .. _pygad-1020: PyGAD 1.0.20 ------------- +------------- Release Date: 4 May 2020 @@ -39,7 +39,7 @@ Release Date: 4 May 2020 .. _pygad-200: PyGAD 2.0.0 ------------ +------------ Release Date: 13 May 2020 @@ -258,7 +258,7 @@ Release date: 19 July 2020 .. _pygad-260: PyGAD 2.6.0 ------------ +------------ Release Date: 6 August 2020 @@ -378,7 +378,7 @@ Release Date: 3 October 2020 .. _pygad-290: PyGAD 2.9.0 ------------ +------------ Release Date: 06 December 2020 @@ -585,7 +585,7 @@ issue. .. _pygad-2130: PyGAD 2.13.0 ------------- +------------- Release Date: 12 March 2021 @@ -997,6 +997,63 @@ Release Date: 8 July 2022 PyGAD `__ section for more information and examples. +.. 
_pygad-2180: + +PyGAD 2.18.0 +------------ + +Release Date: 9 September 2022 + +1. Raise an exception if the sum of fitness values is zero while either + roulette wheel or stochastic universal parent selection is used. + https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/129 + +2. Initialize the value of the ``run_completed`` property to ``False``. + https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/122 + +3. The values of these properties are no longer reset with each call to + the ``run()`` method + ``self.best_solutions, self.best_solutions_fitness, self.solutions, self.solutions_fitness``: + https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/123. Now, + the user has the flexibility of calling the ``run()`` method + more than once while extending the data collected after each + generation. Another advantage appears when the instance is loaded and + the ``run()`` method is called, as the old fitness values are shown on + the graph alongside the new fitness values. Read more in this + section: `Continue without Loosing + Progress <https://pygad.readthedocs.io/en/latest/README_pygad_ReadTheDocs.html#continue-without-loosing-progress>`__ + +4. Thanks `Prof. Fernando Jiménez + Barrionuevo <http://webs.um.es/fernan>`__ (Dept. of Information and + Communications Engineering, University of Murcia, Murcia, Spain) for + editing this + `comment <https://github.com/ahmedfgad/GeneticAlgorithmPython/blob/5315bbec02777df96ce1ec665c94dece81c440f4/pygad.py#L73>`__ + in the code. + https://github.com/ahmedfgad/GeneticAlgorithmPython/commit/5315bbec02777df96ce1ec665c94dece81c440f4 + +5. Fixed a bug that occurred when ``crossover_type=None``. + +6. Support of elitism selection through a new parameter named + ``keep_elitism``. It defaults to 1, which means only the best + solution of each generation is kept in the next generation. If + assigned 0, it has no effect. Read more in this section: `Elitism + Selection <https://pygad.readthedocs.io/en/latest/README_pygad_ReadTheDocs.html#elitism-selection>`__. + https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/74 + +7. A new instance attribute named ``last_generation_elitism`` added to + hold the elitism in the last generation. + +8. A new parameter called ``random_seed`` added to accept a seed for the + random function generators. 
Credit to this issue + https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/70 and + `Prof. Fernando Jiménez Barrionuevo <http://webs.um.es/fernan>`__. + Read more in this section: `Random + Seed <https://pygad.readthedocs.io/en/latest/README_pygad_ReadTheDocs.html#random-seed>`__. + +9. Editing the ``pygad.TorchGA`` module to make sure the tensor data is + moved from GPU to CPU. Thanks to Rasmus Johansson for opening this + pull request: https://github.com/ahmedfgad/TorchGA/pull/2 + PyGAD Projects at GitHub ======================== diff --git a/docs/source/README_pygad_ReadTheDocs.rst b/docs/source/README_pygad_ReadTheDocs.rst index 0b78b19..71a4f7d 100644 --- a/docs/source/README_pygad_ReadTheDocs.rst +++ b/docs/source/README_pygad_ReadTheDocs.rst @@ -114,7 +114,22 @@ The ``pygad.GA`` class constructor supports the following parameters: value ``greater than 0`` means keeps the specified number of parents in the next population. Note that the value assigned to ``keep_parents`` cannot be ``< - 1`` or greater than the number of - solutions within the population ``sol_per_pop``. + solutions within the population ``sol_per_pop``. Starting from `PyGAD + 2.18.0 `__, + this parameter has an effect only when the ``keep_elitism`` + parameter is ``0``. + +- ``keep_elitism=1``: Added in `PyGAD + 2.18.0 `__. + It can take the value ``0`` or a positive integer that satisfies + (``0 <= keep_elitism <= sol_per_pop``). It defaults to ``1``, which + means only the best solution in the current generation is kept in the + next generation. If assigned ``0``, it has no effect. If + assigned a positive integer ``K``, then the best ``K`` solutions are + kept in the next generation. It cannot be assigned a value greater + than the value assigned to the ``sol_per_pop`` parameter. If this + parameter has a value different from ``0``, then the ``keep_parents`` + parameter will have no effect. 
- ``K_tournament=3``: In case that the parent selection type is ``tournament``, the ``K_tournament`` specifies the number of parents @@ -387,6 +402,14 @@ The ``pygad.GA`` class constructor supports the following parameters: PyGAD `__ section. +- ``random_seed=None``: Added in `PyGAD + 2.18.0 `__. + It defines the random seed to be used by the random function + generators (we use random functions in the NumPy and random modules). + This helps to reproduce the same results by setting the same random + seed (e.g. ``random_seed=2``). If given the value ``None``, then it + has no effect. + The user doesn't have to specify all of these parameters while creating an instance of the GA class. A very important parameter you must care about is ``fitness_func`` which defines the fitness function. @@ -496,6 +519,11 @@ Other Attributes of the selected parents in the last generation. Supported in `PyGAD 2.15.0 `__. +- ``last_generation_elitism``: This attribute holds the elite solutions + of the last generation. It is effective only if the ``keep_elitism`` + parameter has a non-zero value. Supported in `PyGAD + 2.18.0 `__. + Note that the attributes whose names start with ``last_generation_`` are updated after each generation. @@ -970,8 +998,6 @@ Accepts the following parameter: Returns the genetic algorithm instance. -.. _steps-to-use-pyga: - Steps to Use ``pygad`` ====================== @@ -998,7 +1024,7 @@ Let's discuss how to do each of these steps. .. _preparing-the-fitnessfunc-parameter: Preparing the ``fitness_func`` Parameter ----------------------------------------- +----------------------------------------- Even though there are some steps in the genetic algorithm pipeline that can work the same regardless of the problem being solved, one critical step @@ -1087,9 +1113,9 @@ Here is an example for preparing the other parameters: The ``callback_generation`` Parameter ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -This parameter should be replaced by ``on_generation``. 
The +==This parameter should be replaced by ``on_generation``. The ``callback_generation`` parameter will be removed in a later release of -PyGAD. +PyGAD.== In `PyGAD 2.0.0 `__ @@ -1726,6 +1752,205 @@ reached ``127.4`` or if the fitness saturates for ``15`` generations. ga_instance.run() print("Number of generations passed is {generations_completed}".format(generations_completed=ga_instance.generations_completed)) +Elitism Selection +================= + +In `PyGAD +2.18.0 `__, +a new parameter called ``keep_elitism`` is supported. It accepts an +integer that defines the number of elite solutions (i.e. best solutions) +to keep in the next generation. This parameter defaults to ``1``, which +means only the best solution is kept in the next generation. + +In the next example, the ``keep_elitism`` parameter in the constructor +of the ``pygad.GA`` class is set to 2. Thus, the best 2 solutions in +each generation are kept in the next generation. + +.. code:: python + + import numpy + import pygad + + function_inputs = [4,-2,3.5,5,-11,-4.7] + desired_output = 44 + + def fitness_func(solution, solution_idx): + output = numpy.sum(solution*function_inputs) + fitness = 1.0 / numpy.abs(output - desired_output) + return fitness + + ga_instance = pygad.GA(num_generations=2, + num_parents_mating=3, + fitness_func=fitness_func, + num_genes=6, + sol_per_pop=5, + keep_elitism=2) + + ga_instance.run() + +The value passed to the ``keep_elitism`` parameter must satisfy 2 +conditions: + +1. It must be ``>= 0``. + +2. It must be ``<= sol_per_pop``. That is, its value cannot exceed the + number of solutions in the current population. + +In the previous example, if the ``keep_elitism`` parameter is set equal +to the value passed to the ``sol_per_pop`` parameter, which is 5, then +there will be no evolution at all as in the next figure. This is because +all 5 solutions are kept as elites in the next generation and no +offspring will be created. + +.. code:: python + + ... 
+ + ga_instance = pygad.GA(..., + sol_per_pop=5, + keep_elitism=5) + + ga_instance.run() + +.. figure:: https://user-images.githubusercontent.com/16560492/189273225-67ffad41-97ab-45e1-9324-429705e17b20.png + :alt: + +Note that if the ``keep_elitism`` parameter is effective (i.e. is +assigned a positive integer, not zero), then the ``keep_parents`` +parameter will have no effect. Because the default value of the +``keep_elitism`` parameter is 1, the ``keep_parents`` parameter has +no effect by default. The ``keep_parents`` parameter is only effective +when ``keep_elitism=0``. + +Random Seed +=========== + +In `PyGAD +2.18.0 `__, +a new parameter called ``random_seed`` is supported. Its value is used +as a seed for the random function generators. + +PyGAD uses random functions in these 2 libraries: + +1. NumPy + +2. random + +The ``random_seed`` parameter defaults to ``None``, which means no seed +is used. As a result, different random numbers are generated for each +run of PyGAD. + +If this parameter is assigned a proper seed, then the results will be +reproducible. In the next example, the integer 2 is used as a random +seed. + +.. code:: python + + import numpy + import pygad + + function_inputs = [4,-2,3.5,5,-11,-4.7] + desired_output = 44 + + def fitness_func(solution, solution_idx): + output = numpy.sum(solution*function_inputs) + fitness = 1.0 / numpy.abs(output - desired_output) + return fitness + + ga_instance = pygad.GA(num_generations=2, + num_parents_mating=3, + fitness_func=fitness_func, + sol_per_pop=5, + num_genes=6, + random_seed=2) + + ga_instance.run() + best_solution, best_solution_fitness, best_match_idx = ga_instance.best_solution() + print(best_solution) + print(best_solution_fitness) + +This is the best solution found and its fitness value. + +.. code:: + + [ 2.77249188 -4.06570662 0.04196872 -3.47770796 -0.57502138 -3.22775267] + 0.04872203136549972 + +After running the code again, it will find the same result. + +.. 
code:: + + [ 2.77249188 -4.06570662 0.04196872 -3.47770796 -0.57502138 -3.22775267] + 0.04872203136549972 + +Continue without Loosing Progress +================================= + +In `PyGAD +2.18.0 `__, +and thanks to `Felix Bernhard `__ for +opening `this GitHub +issue <https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/123>`__, +the values of these 4 instance attributes are no longer reset after each +call to the ``run()`` method. + +1. ``self.best_solutions`` + +2. ``self.best_solutions_fitness`` + +3. ``self.solutions`` + +4. ``self.solutions_fitness`` + +This helps the user to continue where the last run stopped without +losing the values of these 4 attributes. + +Now, the user can save the model by calling the ``save()`` method. + +.. code:: python + + import pygad + + def fitness_func(solution, solution_idx): + ... + return fitness + + ga_instance = pygad.GA(...) + + ga_instance.run() + + ga_instance.plot_fitness() + + ga_instance.save("pygad_GA") + +Then the saved model is loaded by calling the ``load()`` function. After +calling the ``run()`` method on the loaded instance, the data +in the previous 4 attributes is not reset but extended with the new +data. + +.. code:: python + + import pygad + + def fitness_func(solution, solution_idx): + ... + return fitness + + loaded_ga_instance = pygad.load("pygad_GA") + + loaded_ga_instance.run() + + loaded_ga_instance.plot_fitness() + +The plot created by the ``plot_fitness()`` method will show the data +collected from both runs. + +Note that the 2 attributes (``self.best_solutions`` and +``self.best_solutions_fitness``) only work if the +``save_best_solutions`` parameter is set to ``True``. Also, the 2 +attributes (``self.solutions`` and ``self.solutions_fitness``) only work +if the ``save_solutions`` parameter is ``True``. + Prevent Duplicates in Gene Values ================================= @@ -3161,11 +3386,12 @@ processes are preferred over threads when most of the work is on the CPU. 
Threads are preferred over processes in some situations like doing input/output operations. -*Before releasing*\ `PyGAD -2.17.0 `__\ *,*\ `László -Fazekas `__\ *wrote -an article to parallelize the fitness function with PyGAD. Check -it:*\ `How Genetic Algorithms Can Compete with Gradient Descent and +*Before releasing* `PyGAD +2.17.0 `__\ *,* +`László +Fazekas `__ +*wrote an article to parallelize the fitness function with PyGAD. Check +it:* `How Genetic Algorithms Can Compete with Gradient Descent and Backprop `__. .. _examples-2: diff --git a/docs/source/conf.py b/docs/source/conf.py index e034de2..a70e247 100644 --- a/docs/source/conf.py +++ b/docs/source/conf.py @@ -22,7 +22,7 @@ author = 'Ahmed Fawzy Gad' # The full version, including alpha/beta/rc tags -release = '2.17.0' +release = '2.18.0' master_doc = 'index'
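For readers of the release notes above: the ``keep_elitism`` behavior (and why evolution stalls when it equals ``sol_per_pop``) can be sketched generically. This is a minimal illustration of elitism selection under stated assumptions, not PyGAD's internal implementation; the helper name ``apply_elitism`` is hypothetical.

```python
import numpy

def apply_elitism(population, fitness, offspring, keep_elitism):
    # Hypothetical helper illustrating elitism selection (not PyGAD's code):
    # the best `keep_elitism` solutions survive unchanged, and the rest of
    # the next population is filled from the offspring.
    elite_idx = numpy.argsort(fitness)[::-1][:keep_elitism]  # indices of the best solutions
    elites = population[elite_idx]
    fill = offspring[:population.shape[0] - keep_elitism]
    return numpy.concatenate([elites, fill], axis=0)

# Toy data: 3 one-gene solutions. With keep_elitism equal to the population
# size, `fill` would be empty and no offspring would enter the next
# generation, which is why evolution stalls in that case.
pop = numpy.array([[1.0], [2.0], [3.0]])
fit = numpy.array([0.1, 0.9, 0.5])
children = numpy.array([[9.0], [8.0], [7.0]])
next_pop = apply_elitism(pop, fit, children, keep_elitism=2)  # keeps [2.0] and [3.0]
```

Note how this also explains why ``keep_parents`` loses its effect when elitism is active: the elite slots are filled before any parent-keeping policy could apply.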
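The ``random_seed`` parameter described in the notes works by seeding the two generator families PyGAD draws from (the ``random`` module and NumPy). The sketch below shows the general idea of why re-seeding makes results reproducible; it is not PyGAD's actual code, and ``seed_generators`` is a hypothetical helper.

```python
import random
import numpy

def seed_generators(seed):
    # Seed both sources of randomness mentioned in the release notes:
    # Python's `random` module and NumPy's global generator.
    random.seed(seed)
    numpy.random.seed(seed)

seed_generators(2)
first = (random.random(), numpy.random.rand(3))

seed_generators(2)
second = (random.random(), numpy.random.rand(3))

# Re-seeding with the same value reproduces the exact same number stream,
# which is what makes two PyGAD runs with the same seed find the same result.
assert first[0] == second[0]
assert numpy.array_equal(first[1], second[1])
```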