diff --git a/Documentation/source/advanced_config.rst b/Documentation/source/advanced_config.rst index 0babf822..faaf0b25 100644 --- a/Documentation/source/advanced_config.rst +++ b/Documentation/source/advanced_config.rst @@ -254,7 +254,7 @@ in particular where it creates files within it. *.o (compiled object files) *.mod (mod files) metrics/ - my_program.exe + my_program log.txt The *project workspace* folder takes its name from the project label passed in to the build configuration. diff --git a/Documentation/source/glossary.rst b/Documentation/source/glossary.rst index d0cedfe0..b9ff75cb 100644 --- a/Documentation/source/glossary.rst +++ b/Documentation/source/glossary.rst @@ -29,7 +29,7 @@ Glossary Fab's built-in steps come with sensible defaults so the user doesn't have to write unnecessary config. As an example, the Fortran preprocessor has a default artefact getter which reads *".F90"* files - from the :term:`Artefact Collection` called ``"all_source"``. + from the :term:`Artefact Collection` called ``"INITIAL_SOURCE"``. Artefact getters are derived from :class:`~fab.artefacts.ArtefactsGetter`. @@ -65,7 +65,7 @@ Glossary A folder inside the :term:`Fab Workspace`, containing all source and output from a build config. Root Symbol - The name of a Fortran PROGRAM, or ``"main"`` for C code. Fab builds an exe for every root symbol it's given. + The name of a Fortran PROGRAM, or ``"main"`` for C code. Fab builds an executable for every root symbol it's given. Source Tree The :class:`~fab.steps.analyse.analyse` step produces a dependency tree of the entire project source. diff --git a/Documentation/source/writing_config.rst b/Documentation/source/writing_config.rst index 54a742aa..6ad35a09 100644 --- a/Documentation/source/writing_config.rst +++ b/Documentation/source/writing_config.rst @@ -79,7 +79,7 @@ Please see the documentation for :func:`~fab.steps.find_source_files.find_source including how to exclude certain source code from the build. More grab steps can be found in the :mod:`~fab.steps.grab` module. -After the find_source_files step, there will be a collection called ``"all_source"``, in the artefact store. +After the find_source_files step, there will be a collection called ``"INITIAL_SOURCE"``, in the artefact store. .. [1] See :func:`~fab.steps.c_pragma_injector.c_pragma_injector` for an example of a step which creates artefacts in the source folder. @@ -94,7 +94,7 @@ which must happen before we analyse it. Steps generally create and find artefacts in the :term:`Artefact Store`, arranged into named collections. The :func:`~fab.steps.preprocess.preprocess_fortran` -automatically looks for Fortran source code in a collection named `'all_source'`, +automatically looks for Fortran source code in a collection named `'INITIAL_SOURCE'`, which is the default output from the preceding :funcfind_source_files step. It filters just the (uppercase) ``.F90`` files. @@ -179,7 +179,7 @@ before you run the :func:`~fab.steps.analyse.analyse` step below. After the psyclone step, two new source files will be created for each .x90 file in the `'build_output'` folder. -These two output files will be added under ``"psyclone_output"`` collection to the artefact store. +These two output files will be added under ``FORTRAN_BUILD_FILES`` collection to the artefact store. .. _Analyse Overview: @@ -190,11 +190,10 @@ Analyse We must :func:`~fab.steps.analyse.analyse` the source code to determine which Fortran files to compile, and in which order. 
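Before the analyser runs, the files it will read have already been gathered into artefact collections (described below). A user script can inspect these collections directly if needed; a minimal sketch, assuming ``state`` is the current :class:`~fab.build_config.BuildConfig` from the enclosing ``with BuildConfig(...)`` block:

.. code-block:: python

    from fab.artefacts import ArtefactSet

    # list the Fortran and C files the analyser will consider
    for fpath in sorted(state.artefact_store[ArtefactSet.FORTRAN_BUILD_FILES]):
        print(fpath)
    for fpath in sorted(state.artefact_store[ArtefactSet.C_BUILD_FILES]):
        print(fpath)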
-The Analyse step looks for source to analyse in several collections: +The Analyse step looks for source to analyse in two collections: -* ``.f90`` found in the source -* ``.F90`` we pre-processed into ``.f90`` -* preprocessed c +* ``FORTRAN_BUILD_FILES``, which contains all ``.f90`` found in the source, all ``.F90`` files we pre-processed into ``.f90``, and files created by any additional step (e.g. PSyclone). +* ``C_BUILD_FILES``, which contains all preprocessed C files. .. code-block:: :linenos: @@ -227,14 +226,14 @@ The Analyse step looks for source to analyse in several collections: Here we tell the analyser which :term:`Root Symbol` we want to build into an executable. Alternatively, we can use the ``find_programs`` flag for Fab to discover and build all programs. -After the Analyse step, there will be a collection called ``"build_trees"``, in the artefact store. +After the Analyse step, there will be a collection called ``BUILD_TREES``, in the artefact store. Compile and Link ================ The :func:`~fab.steps.compile_fortran.compile_fortran` step compiles files in -the ``"build_trees"`` collection. The :func:`~fab.steps.link.link_exe` step +the ``BUILD_TREES`` collection. The :func:`~fab.steps.link.link_exe` step then creates the executable. .. code-block:: @@ -269,7 +268,42 @@ then creates the executable. link_exe(state) -After the :func:`~fab.steps.link.link_exe` step, the executable name can be found in a collection called ``"executables"``. +After the :func:`~fab.steps.link.link_exe` step, the executable name can be found in a collection called ``EXECUTABLES``. + +ArtefactStore +============= +Each build configuration contains an artefact store, containing various +sets of artefacts. The artefact sets used by Fab are defined in the +enum :class:`~fab.artefacts.ArtefactSet`. The most important sets are ``FORTRAN_BUILD_FILES`` and +``C_BUILD_FILES``, which will always contain all known source files that +will need to be analysed for dependencies, compiled, and linked. All existing +steps in Fab will make sure to maintain these artefact sets consistently; +for example, if a ``.F90`` file is preprocessed, the ``.F90`` file in +``FORTRAN_BUILD_FILES`` will be replaced with the corresponding preprocessed +``.f90`` file. Similarly, new files (for example, files created by PSyclone) +will be added to ``FORTRAN_BUILD_FILES``. A user script can add its own +artefacts, using strings as keys, if required. + +The exact flow of artefact sets is as follows. Note that any artefact +sets mentioned here can typically be overridden by the user, but then +it is the user's responsibility to maintain the default artefact sets +(or change them all): + +.. + My apologies for the LONG lines, they were the only way I could find + to have properly indented paragraphs :( + +1. :func:`~fab.steps.find_source_files.find_source_files` will add all source files it finds to ``INITIAL_SOURCE`` (by default; this can be overridden by the user). Any ``.F90`` and ``.f90`` file will also be added to ``FORTRAN_BUILD_FILES``, any ``.c`` file to ``C_BUILD_FILES``, and any ``.x90`` or ``.X90`` file to ``X90_BUILD_FILES``. It can be called several times if files from different root directories need to be added, and it will automatically update the ``*_BUILD_FILES`` sets. +2. Any user script that creates new files can add files to ``INITIAL_SOURCE`` if required, but also to the corresponding ``*_BUILD_FILES``. This will happen automatically if :func:`~fab.steps.find_source_files.find_source_files` is called to add these newly created files. +3. 
If :func:`~fab.steps.c_pragma_injector.c_pragma_injector` is called, it will handle all files in ``C_BUILD_FILES``, and will replace all the original C files with the newly created ones. For backward compatibility it will also store the new files in the ``PRAGMAD_C`` set. +4. If :func:`~fab.steps.preprocess.preprocess_c` is called, it will preprocess all files in ``C_BUILD_FILES`` (at this stage the files are typically still in the original source folder; the output files are written to the build folder), and update that artefact set accordingly. For backward compatibility it will also store the preprocessed files in ``PREPROCESSED_C``. +5. If :func:`~fab.steps.preprocess.preprocess_fortran` is called, it will preprocess all files in ``FORTRAN_BUILD_FILES`` that end in ``.F90``, creating new ``.f90`` files in the build folder. These files will be added to ``PREPROCESSED_FORTRAN``. Then the original ``.F90`` files are removed from ``FORTRAN_BUILD_FILES``, and the new preprocessed files (which are in ``PREPROCESSED_FORTRAN``) will be added. Then any ``.f90`` files that are not already in the build folder (for example, files created by a user script) are copied from the original source folder into the build folder, and ``FORTRAN_BUILD_FILES`` is updated to use the files in the new location. +6. If :func:`~fab.steps.psyclone.preprocess_x90` is called, it will similarly preprocess all ``.X90`` files in ``X90_BUILD_FILES``, creating the output files in the build folder, and replacing the files in ``X90_BUILD_FILES``. +7. If :func:`~fab.steps.psyclone.psyclone` is called, it will process all files in ``X90_BUILD_FILES``, add any newly created files to ``FORTRAN_BUILD_FILES``, and remove the processed files from ``X90_BUILD_FILES``. +8. The :func:`~fab.steps.analyse.analyse` step analyses all files in ``FORTRAN_BUILD_FILES`` and ``C_BUILD_FILES``, and adds the resulting dependency trees to ``BUILD_TREES``. +9. The :func:`~fab.steps.compile_c.compile_c` and :func:`~fab.steps.compile_fortran.compile_fortran` steps will compile all files from ``C_BUILD_FILES`` and ``FORTRAN_BUILD_FILES``, and add the resulting object files to ``OBJECT_FILES``. +10. If :func:`~fab.steps.archive_objects.archive_objects` is called, it will create libraries based on ``OBJECT_FILES``, adding the libraries to ``OBJECT_ARCHIVES``. +11. If :func:`~fab.steps.link.link_exe` is called, it will use ``OBJECT_ARCHIVES`` or, if this is empty, ``OBJECT_FILES`` to create the binaries, and add them to ``EXECUTABLES``. Flags diff --git a/run_configs/lfric/atm.py b/run_configs/lfric/atm.py index c297499c..7a085733 100755 --- a/run_configs/lfric/atm.py +++ b/run_configs/lfric/atm.py @@ -1,4 +1,8 @@ #!/usr/bin/env python3 + +'''Example LFRic_atm build script. +''' + import logging from fab.build_config import BuildConfig, AddFlags @@ -16,7 +20,8 @@ from fab.tools import ToolBox from grab_lfric import lfric_source_config, gpl_utils_source_config -from lfric_common import configurator, fparser_workaround_stop_concatenation +from lfric_common import (configurator, fparser_workaround_stop_concatenation, + get_transformation_script) logger = logging.getLogger('fab') @@ -162,26 +167,6 @@ def file_filtering(config): ] -def get_transformation_script(fpath, config): - ''':returns: the transformation script to be used by PSyclone. 
- :rtype: Path - - ''' - optimisation_path = config.source_root / 'optimisation' / 'meto-spice' - for base_path in [config.source_root, config.build_output]: - try: - relative_path = fpath.relative_to(base_path) - except ValueError: - pass - local_transformation_script = optimisation_path / (relative_path.with_suffix('.py')) - if local_transformation_script.exists(): - return local_transformation_script - global_transformation_script = optimisation_path / 'global.py' - if global_transformation_script.exists(): - return global_transformation_script - return "" - - if __name__ == '__main__': lfric_source = lfric_source_config.source_root / 'lfric' gpl_utils_source = gpl_utils_source_config.source_root / 'gpl_utils' diff --git a/run_configs/lfric/gungho.py b/run_configs/lfric/gungho.py index 5454d8ca..2f90a41a 100755 --- a/run_configs/lfric/gungho.py +++ b/run_configs/lfric/gungho.py @@ -4,6 +4,10 @@ # For further details please refer to the file COPYRIGHT # which you should have received as part of this distribution # ############################################################################## +''' +A simple build script for gungho_model +''' + import logging from fab.build_config import BuildConfig @@ -18,31 +22,12 @@ from fab.tools import ToolBox from grab_lfric import lfric_source_config, gpl_utils_source_config -from lfric_common import configurator, fparser_workaround_stop_concatenation +from lfric_common import (configurator, fparser_workaround_stop_concatenation, + get_transformation_script) logger = logging.getLogger('fab') -def get_transformation_script(fpath, config): - ''':returns: the transformation script to be used by PSyclone. - :rtype: Path - - ''' - optimisation_path = config.source_root / 'optimisation' / 'meto-spice' - for base_path in [config.source_root, config.build_output]: - try: - relative_path = fpath.relative_to(base_path) - except ValueError: - pass - local_transformation_script = optimisation_path / (relative_path.with_suffix('.py')) - if local_transformation_script.exists(): - return local_transformation_script - global_transformation_script = optimisation_path / 'global.py' - if global_transformation_script.exists(): - return global_transformation_script - return "" - - if __name__ == '__main__': lfric_source = lfric_source_config.source_root / 'lfric' gpl_utils_source = gpl_utils_source_config.source_root / 'gpl_utils' diff --git a/run_configs/lfric/lfric_common.py b/run_configs/lfric/lfric_common.py index fd4488c6..a330b1d5 100644 --- a/run_configs/lfric/lfric_common.py +++ b/run_configs/lfric/lfric_common.py @@ -1,9 +1,13 @@ import logging import os import shutil +from typing import Optional from pathlib import Path +from fab.artefacts import ArtefactSet +from fab.build_config import BuildConfig from fab.steps import step +from fab.steps.find_source_files import find_source_files from fab.tools import Category, Tool logger = logging.getLogger('fab') @@ -21,7 +25,7 @@ def check_available(self): return True -# todo: is this part of psyclone? if so, put it in the psyclone step module? 
+# ============================================================================ @step def configurator(config, lfric_source: Path, gpl_utils_source: Path, rose_meta_conf: Path, config_dir=None): @@ -29,71 +33,105 @@ def configurator(config, lfric_source: Path, gpl_utils_source: Path, rose_meta_c gen_namelist_tool = lfric_source / 'infrastructure/build/tools/GenerateNamelist' gen_loader_tool = lfric_source / 'infrastructure/build/tools/GenerateLoader' gen_feigns_tool = lfric_source / 'infrastructure/build/tools/GenerateFeigns' - config_dir = config_dir or config.source_root / 'configuration' + config_dir.mkdir(parents=True, exist_ok=True) env = os.environ.copy() rose_lfric_path = gpl_utils_source / 'lib/python' env['PYTHONPATH'] += f':{rose_lfric_path}' - # "rose picker" - # creates rose-meta.json and config_namelists.txt in gungho/source/configuration + # rose picker + # ----------- + # creates rose-meta.json and config_namelists.txt in + # gungho/build logger.info('rose_picker') rose_picker = Script(rose_picker_tool) - rose_picker.run(additional_parameters=[str(rose_meta_conf), - '-directory', str(config_dir), + rose_picker.run(additional_parameters=[rose_meta_conf, + '-directory', config_dir, '-include_dirs', lfric_source], env=env) + rose_meta = config_dir / 'rose-meta.json' - # "build_config_loaders" + # build_config_loaders + # -------------------- # builds a bunch of f90s from the json logger.info('GenerateNamelist') gen_namelist = Script(gen_namelist_tool) - gen_namelist.run(additional_parameters=['-verbose', - str(config_dir / 'rose-meta.json'), - '-directory', str(config_dir)]) + gen_namelist.run(additional_parameters=['-verbose', rose_meta, + '-directory', config_dir], + cwd=config_dir) # create configuration_mod.f90 in source root + # ------------------------------------------- logger.info('GenerateLoader') + names = [name.strip() for name in + open(config_dir / 'config_namelists.txt').readlines()] + configuration_mod_fpath = config_dir / 'configuration_mod.f90' gen_loader = Script(gen_loader_tool) - names = [name.strip() for name in open(config_dir / 'config_namelists.txt').readlines()] - configuration_mod_fpath = config.source_root / 'configuration_mod.f90' gen_loader.run(additional_parameters=[configuration_mod_fpath, *names]) # create feign_config_mod.f90 in source root + # ------------------------------------------ logger.info('GenerateFeigns') - feign_config = Script(gen_feigns_tool) - feign_config_mod_fpath = config.source_root / 'feign_config_mod.f90' - feign_config.run(additional_parameters=[str(config_dir / 'rose-meta.json'), - '-output', feign_config_mod_fpath]) + feign_config_mod_fpath = config_dir / 'feign_config_mod.f90' + gft = Script(gen_feigns_tool) + gft.run(additional_parameters=[rose_meta, + '-output', feign_config_mod_fpath]) - # put the generated source into an artefact - # todo: we shouldn't need to do this, should we? - # it's just going to be found in the source folder with everything else. - config._artefact_store['configurator_output'] = [ - configuration_mod_fpath, - feign_config_mod_fpath - ] + find_source_files(config, source_root=config_dir) +# ============================================================================ @step def fparser_workaround_stop_concatenation(config): """ - fparser can't handle string concat in a stop statement. This step is a workaround. + fparser can't handle string concat in a stop statement. This step is + a workaround. 
https://github.com/stfc/fparser/issues/330 """ - feign_config_mod_fpath = config.source_root / 'feign_config_mod.f90' + feign_path = None + for file_path in config.artefact_store[ArtefactSet.FORTRAN_BUILD_FILES]: + if file_path.name == 'feign_config_mod.f90': + feign_path = file_path + break + else: + raise RuntimeError("Could not find 'feign_config_mod.f90'.") # rename "broken" version - broken_version = feign_config_mod_fpath.with_suffix('.broken') - shutil.move(feign_config_mod_fpath, broken_version) + broken_version = feign_path.with_suffix('.broken') + shutil.move(feign_path, broken_version) # make fixed version bad = "_config: '// &\n 'Unable to close temporary file'" good = "_config: Unable to close temporary file'" - open(feign_config_mod_fpath, 'wt').write( + open(feign_path, 'wt').write( open(broken_version, 'rt').read().replace(bad, good)) + + +# ============================================================================ +def get_transformation_script(fpath: Path, + config: BuildConfig) -> Optional[Path]: + ''':returns: the transformation script to be used by PSyclone. + ''' + + optimisation_path = config.source_root / 'optimisation' / 'meto-spice' + relative_path = None + for base_path in [config.source_root, config.build_output]: + try: + relative_path = fpath.relative_to(base_path) + except ValueError: + pass + if relative_path: + local_transformation_script = (optimisation_path / + (relative_path.with_suffix('.py'))) + if local_transformation_script.exists(): + return local_transformation_script + + global_transformation_script = optimisation_path / 'global.py' + if global_transformation_script.exists(): + return global_transformation_script + return None diff --git a/run_configs/um/build_um.py b/run_configs/um/build_um.py index ce769865..05177bd2 100755 --- a/run_configs/um/build_um.py +++ b/run_configs/um/build_um.py @@ -13,9 +13,8 @@ import re import warnings -from fab.artefacts import CollectionGetter +from fab.artefacts import ArtefactSet, CollectionGetter from fab.build_config import AddFlags, BuildConfig -from fab.constants import PRAGMAD_C from fab.steps import step from fab.steps.analyse import analyse from fab.steps.archive_objects import archive_objects @@ -177,7 +176,7 @@ def replace_in_file(inpath, outpath, find, replace): preprocess_c( state, - source=CollectionGetter(PRAGMAD_C), + source=CollectionGetter(ArtefactSet.C_BUILD_FILES), path_flags=[ # todo: this is a bit "codey" - can we safely give longer strings and split later? AddFlags(match="$source/um/*", flags=[ diff --git a/source/fab/artefacts.py b/source/fab/artefacts.py index 235a91d1..727aa2be 100644 --- a/source/fab/artefacts.py +++ b/source/fab/artefacts.py @@ -4,27 +4,51 @@ # which you should have received as part of this distribution ############################################################################## """ -This module contains :term:`Artefacts Getter` classes which return :term:`Artefact Collections ` -from the :term:`Artefact Store`. +This module contains :term:`Artefacts Getter` classes which return +:term:`Artefact Collections ` from the +:term:`Artefact Store`. -These classes are used by the `run` method of :class:`~fab.steps.Step` classes to retrieve the artefacts -which need to be processed. Most steps have sensible defaults and can be configured with user-defined getters. +These classes are used by the `run` method of :class:`~fab.steps.Step` +classes to retrieve the artefacts which need to be processed. 
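A getter is simply called with the artefact store and returns the artefacts it selects; a minimal sketch (the collection and suffix here are illustrative)::

    from fab.artefacts import ArtefactSet, SuffixFilter

    getter = SuffixFilter(ArtefactSet.FORTRAN_BUILD_FILES, '.F90')
    fortran_files = getter(config.artefact_store)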
Most steps +have sensible defaults and can be configured with user-defined getters. """ + from abc import ABC, abstractmethod +from collections import defaultdict +from enum import auto, Enum from pathlib import Path -from typing import Iterable, Union, Dict, List +from typing import Dict, Iterable, List, Optional, Union -from fab.constants import BUILD_TREES, CURRENT_PREBUILDS from fab.dep_tree import filter_source_tree, AnalysedDependent from fab.util import suffix_filter +class ArtefactSet(Enum): + '''A simple enum with the artefact types used internally in Fab. + ''' + INITIAL_SOURCE = auto() + PREPROCESSED_FORTRAN = auto() + PREPROCESSED_C = auto() + FORTRAN_BUILD_FILES = auto() + C_BUILD_FILES = auto() + X90_BUILD_FILES = auto() + CURRENT_PREBUILDS = auto() + PRAGMAD_C = auto() + BUILD_TREES = auto() + OBJECT_FILES = auto() + OBJECT_ARCHIVES = auto() + EXECUTABLES = auto() + + class ArtefactStore(dict): - '''This object stores artefacts (which can be of any type). Each artefact - is indexed by a string. + '''This object stores sets of artefacts (which can be of any type). + Each artefact is indexed by either an ArtefactSet enum, or a string. ''' + def __init__(self): + '''The constructor calls reset, which will mean all the internal + artefact categories are created.''' super().__init__() self.reset() @@ -32,7 +56,80 @@ def reset(self): '''Clears the artefact store (but does not delete any files). ''' self.clear() - self[CURRENT_PREBUILDS] = set() + for artefact in ArtefactSet: + if artefact in [ArtefactSet.OBJECT_FILES, + ArtefactSet.OBJECT_ARCHIVES]: + # ObjectFiles store a default dictionary (i.e. a non-existing + # key will automatically add an empty `set`) + self[artefact] = defaultdict(set) + else: + self[artefact] = set() + + def add(self, collection: Union[str, ArtefactSet], + files: Union[Path, str, Iterable[Path], Iterable[str]]): + '''Adds the specified artefacts to a collection. The artefact + can be specified as a simple string, a list of string or a set, in + which case all individual entries of the list/set will be added. + :param collection: the name of the collection to add this to. + :param files: the artefacts to add. + ''' + if isinstance(files, list): + files = set(files) + elif not isinstance(files, Iterable): + # We need to use a list, otherwise each character is added + files = set([files]) + + self[collection].update(files) + + def update_dict(self, collection: Union[str, ArtefactSet], + key: str, values: Union[str, Iterable]): + '''For ArtefactSets that are a dictionary of sets: update + the set with the specified values. + :param collection: the name of the collection to add this to. + :param key: the key in the dictionary to update. + :param values: the values to update with. + ''' + self[collection][key].update([values] if isinstance(values, str) + else values) + + def copy_artefacts(self, source: Union[str, ArtefactSet], + dest: Union[str, ArtefactSet], + suffixes: Optional[Union[str, List[str]]] = None): + '''Copies all artefacts from `source` to `destination`. If a + suffix_fiter is specified, only files with the given suffix + will be copied. + + :param source: the source artefact set. + :param dest: the destination artefact set. + :param suffixes: a string or list of strings specifying the + suffixes to copy. 
+ ''' + if suffixes: + suffixes = [suffixes] if isinstance(suffixes, str) else suffixes + self.add(dest, set(suffix_filter(self[source], suffixes))) + else: + self.add(dest, self[source]) + + def replace(self, artefact: Union[str, ArtefactSet], + remove_files: List[Union[str, Path]], + add_files: Union[List[Union[str, Path]], dict]): + '''Replaces artefacts in one artefact set with other artefacts. This + can be used e.g to replace files that have been preprocessed + and renamed. There is no requirement for these lists to have the + same number of elements, nor is there any check if an artefact to + be removed is actually in the artefact set. + + :param artefact: the artefact set to modify. + :param remove_files: files to remove from the artefact set. + :param add_files: files to add to the artefact set. + ''' + + art_set = self[artefact] + if not isinstance(art_set, set): + raise RuntimeError(f"Replacing artefacts in dictionary " + f"'{artefact}' is not supported.") + art_set.difference_update(set(remove_files)) + art_set.update(add_files) class ArtefactsGetter(ABC): @@ -53,14 +150,15 @@ def __call__(self, artefact_store): class CollectionGetter(ArtefactsGetter): """ - A simple artefact getter which returns one :term:`Artefact Collection` from the artefact_store. + A simple artefact getter which returns one :term:`Artefact Collection` + from the artefact_store. Example:: `CollectionGetter('preprocessed_fortran')` """ - def __init__(self, collection_name): + def __init__(self, collection_name: Union[str, ArtefactSet]): """ :param collection_name: The name of the artefact collection to retrieve. @@ -69,40 +167,46 @@ def __init__(self, collection_name): self.collection_name = collection_name def __call__(self, artefact_store): - return artefact_store.get(self.collection_name, []) + return artefact_store.get(self.collection_name, set()) class CollectionConcat(ArtefactsGetter): """ - Returns a concatenated list from multiple :term:`Artefact Collections ` - (each expected to be an iterable). + Returns a concatenated list from multiple + :term:`Artefact Collections ` (each expected to be + an iterable). - An :class:`~fab.artefacts.ArtefactsGetter` can be provided instead of a collection_name. + An :class:`~fab.artefacts.ArtefactsGetter` can be provided instead of a + collection_name. Example:: - # The default source code getter for the Analyse step might look like this. + # The default source code getter for the Analyse step might look + # like this. DEFAULT_SOURCE_GETTER = CollectionConcat([ 'preprocessed_c', 'preprocessed_fortran', - SuffixFilter('all_source', '.f90'), + SuffixFilter(ArtefactSet.INITIAL_SOURCE, '.f90'), ]) """ - def __init__(self, collections: Iterable[Union[str, ArtefactsGetter]]): + def __init__(self, collections: Iterable[Union[ArtefactSet, str, + ArtefactsGetter]]): """ :param collections: - An iterable containing collection names (strings) or other ArtefactsGetters. + An iterable containing collection names (strings) or + other ArtefactsGetters. 
""" self.collections = collections # todo: ensure the labelled values are iterables def __call__(self, artefact_store: ArtefactStore): - # todo: this should be a set, in case a file appears in multiple collections + # todo: this should be a set, in case a file appears in + # multiple collections result = [] for collection in self.collections: - if isinstance(collection, str): + if isinstance(collection, (str, ArtefactSet)): result.extend(artefact_store.get(collection, [])) elif isinstance(collection, ArtefactsGetter): result.extend(collection(artefact_store)) @@ -111,16 +215,18 @@ def __call__(self, artefact_store: ArtefactStore): class SuffixFilter(ArtefactsGetter): """ - Returns the file paths in a :term:`Artefact Collection` (expected to be an iterable), - filtered by suffix. + Returns the file paths in a :term:`Artefact Collection` (expected to be + an iterable), filtered by suffix. Example:: # The default source getter for the FortranPreProcessor step. - DEFAULT_SOURCE = SuffixFilter('all_source', '.F90') + DEFAULT_SOURCE = SuffixFilter(ArtefactSet.INITIAL_SOURCE, '.F90') """ - def __init__(self, collection_name: str, suffix: Union[str, List[str]]): + def __init__(self, + collection_name: Union[str, ArtefactSet], + suffix: Union[str, List[str]]): """ :param collection_name: The name of the artefact collection. @@ -132,7 +238,8 @@ def __init__(self, collection_name: str, suffix: Union[str, List[str]]): self.suffixes = [suffix] if isinstance(suffix, str) else suffix def __call__(self, artefact_store: ArtefactStore): - # todo: returning an empty list is probably "dishonest" if the collection doesn't exist - return None instead? + # todo: returning an empty list is probably "dishonest" if the + # collection doesn't exist - return None instead? fpaths: Iterable[Path] = artefact_store.get(self.collection_name, []) return suffix_filter(fpaths, self.suffixes) @@ -141,32 +248,30 @@ class FilterBuildTrees(ArtefactsGetter): """ Filter build trees by suffix. - Returns one list of files to compile per build tree, of the form Dict[name, List[AnalysedDependent]] - Example:: # The default source getter for the CompileFortran step. DEFAULT_SOURCE_GETTER = FilterBuildTrees(suffix='.f90') + :returns: one list of files to compile per build tree, of the form + Dict[name, List[AnalysedDependent]] + """ - def __init__(self, suffix: Union[str, List[str]], collection_name: str = BUILD_TREES): + def __init__(self, suffix: Union[str, List[str]]): """ :param suffix: A suffix string, or iterable of, including the preceding dot. - :param collection_name: - The name of the artefact collection where we find the source trees. - Defaults to the value in :py:const:`fab.constants.BUILD_TREES`. 
""" - self.collection_name = collection_name self.suffixes = [suffix] if isinstance(suffix, str) else suffix def __call__(self, artefact_store: ArtefactStore): - build_trees = artefact_store[self.collection_name] + build_trees = artefact_store[ArtefactSet.BUILD_TREES] build_lists: Dict[str, List[AnalysedDependent]] = {} for root, tree in build_trees.items(): - build_lists[root] = filter_source_tree(source_tree=tree, suffixes=self.suffixes) + build_lists[root] = filter_source_tree(source_tree=tree, + suffixes=self.suffixes) return build_lists diff --git a/source/fab/build_config.py b/source/fab/build_config.py index f55ef185..614c4328 100644 --- a/source/fab/build_config.py +++ b/source/fab/build_config.py @@ -20,8 +20,8 @@ from string import Template from typing import List, Optional, Iterable -from fab.artefacts import ArtefactStore -from fab.constants import BUILD_OUTPUT, SOURCE_ROOT, PREBUILD, CURRENT_PREBUILDS +from fab.artefacts import ArtefactSet, ArtefactStore +from fab.constants import BUILD_OUTPUT, SOURCE_ROOT, PREBUILD from fab.metrics import send_metric, init_metrics, stop_metrics, metrics_summary from fab.tools.category import Category from fab.tools.tool_box import ToolBox @@ -169,7 +169,7 @@ def add_current_prebuilds(self, artefacts: Iterable[Path]): Mark the given file paths as being current prebuilds, not to be cleaned during housekeeping. """ - self.artefact_store[CURRENT_PREBUILDS].update(artefacts) + self.artefact_store[ArtefactSet.CURRENT_PREBUILDS].update(artefacts) def _run_prep(self): self._init_logging() diff --git a/source/fab/cli.py b/source/fab/cli.py index 5cc40315..07154eec 100644 --- a/source/fab/cli.py +++ b/source/fab/cli.py @@ -11,14 +11,13 @@ from pathlib import Path from typing import Dict, Optional +from fab.artefacts import ArtefactSet, CollectionGetter +from fab.build_config import BuildConfig from fab.steps.analyse import analyse from fab.steps.c_pragma_injector import c_pragma_injector from fab.steps.compile_c import compile_c from fab.steps.link import link_exe from fab.steps.root_inc_files import root_inc_files -from fab.artefacts import CollectionGetter -from fab.build_config import BuildConfig -from fab.constants import PRAGMAD_C from fab.steps.compile_fortran import compile_fortran from fab.steps.find_source_files import find_source_files from fab.steps.grab.folder import grab_folder @@ -52,7 +51,7 @@ def _generic_build_config(folder: Path, kwargs=None) -> BuildConfig: root_inc_files(config) # JULES helper, get rid of this eventually preprocess_fortran(config) c_pragma_injector(config) - preprocess_c(config, source=CollectionGetter(PRAGMAD_C)) + preprocess_c(config, source=CollectionGetter(ArtefactSet.C_BUILD_FILES)) analyse(config, find_programs=True) compile_fortran(config) compile_c(config) diff --git a/source/fab/constants.py b/source/fab/constants.py index 093aefe7..883a5155 100644 --- a/source/fab/constants.py +++ b/source/fab/constants.py @@ -8,23 +8,9 @@ """ - -# Might be better to use enums - - # folders underneath workspace SOURCE_ROOT = "source" BUILD_OUTPUT = "build_output" # prebuild folder name PREBUILD = '_prebuild' - -# names of artefact collections -PROJECT_SOURCE_TREE = 'project source tree' -PRAGMAD_C = 'pragmad_c' -BUILD_TREES = 'build trees' -OBJECT_FILES = 'object files' -OBJECT_ARCHIVES = 'object archives' -EXECUTABLES = 'executables' - -CURRENT_PREBUILDS = 'current prebuilds' diff --git a/source/fab/parse/fortran.py b/source/fab/parse/fortran.py index 6f8b35d3..c5ea7f60 100644 --- a/source/fab/parse/fortran.py +++ 
b/source/fab/parse/fortran.py @@ -298,10 +298,15 @@ def _process_comment(self, analysed_file, obj): if comment[:2] == "!$": # Check if it is a use statement with an OpenMP sentinel: # Use fparser's string reader to discard potential comment - # TODO #13: once fparser supports reading the sentinels, + # TODO #327: once fparser supports reading the sentinels, # this can be removed. + # fparser issue: https://github.com/stfc/fparser/issues/443 reader = FortranStringReader(comment[2:]) - line = reader.next() + try: + line = reader.next() + except StopIteration: + # No other item, ignore + return try: # match returns a 5-tuple, the third one being the module name module_name = Use_Stmt.match(line.strline)[2] diff --git a/source/fab/steps/analyse.py b/source/fab/steps/analyse.py index 26c6cdc8..091da6c0 100644 --- a/source/fab/steps/analyse.py +++ b/source/fab/steps/analyse.py @@ -41,8 +41,7 @@ from typing import Dict, List, Iterable, Set, Optional, Union from fab import FabException -from fab.artefacts import ArtefactsGetter, CollectionConcat, SuffixFilter -from fab.constants import BUILD_TREES +from fab.artefacts import ArtefactsGetter, ArtefactSet, CollectionConcat from fab.dep_tree import extract_sub_tree, validate_dependencies, AnalysedDependent from fab.mo import add_mo_commented_file_deps from fab.parse import AnalysedFile, EmptySourceFile @@ -54,14 +53,8 @@ logger = logging.getLogger(__name__) DEFAULT_SOURCE_GETTER = CollectionConcat([ - SuffixFilter('all_source', '.f90'), - 'preprocessed_c', - 'preprocessed_fortran', - - # todo: this is lfric stuff so might be better placed elsewhere - SuffixFilter('psyclone_output', '.f90'), - 'preprocessed_psyclone', # todo: this is no longer a collection, remove - 'configurator_output', + ArtefactSet.FORTRAN_BUILD_FILES, + ArtefactSet.C_BUILD_FILES, ]) @@ -78,12 +71,13 @@ def analyse( special_measure_analysis_results: Optional[Iterable[FortranParserWorkaround]] = None, unreferenced_deps: Optional[Iterable[str]] = None, ignore_mod_deps: Optional[Iterable[str]] = None, - name='analyser'): + ): """ Produce one or more build trees by analysing source code dependencies. The resulting artefact collection is a mapping from root symbol to build tree. - The name of this artefact collection is taken from :py:const:`fab.constants.BUILD_TREES`. + The name of this artefact collection is taken from + :py:const:`fab.artefacts.ArtefactSet.BUILD_TREES`. If no artefact getter is specified in *source*, a default is used which provides input files from multiple artefact collections, including the default C and Fortran preprocessor outputs @@ -139,28 +133,16 @@ def analyse( fortran_analyser = FortranAnalyser(std=std, ignore_mod_deps=ignore_mod_deps) c_analyser = CAnalyser() - """ - Creates the *build_trees* artefact from the files in `self.source_getter`. - - Does the following, in order: - - Create a hash of every source file. Used to check if it's already been analysed. - - Parse the C and Fortran files to find external symbol definitions and dependencies in each file. - - Analysis results are stored in a csv as-we-go, so analysis can be resumed if interrupted. - - Create a 'symbol table' recording which file each symbol is in. - - Work out the file dependencies from the symbol dependencies. - - At this point we have a source tree for the entire source. - - (Optionally) Extract a sub tree for every root symbol, if provided. For building executables. - - This step uses multiprocessing, unless disabled in the :class:`~fab.steps.Step` class. 
- - :param artefact_store: - Contains artefacts created by previous Steps, and where we add our new artefacts. - This is where the given :class:`~fab.artefacts.ArtefactsGetter` finds the artefacts to process. - :param config: - The :class:`fab.build_config.BuildConfig` object where we can read settings - such as the project workspace folder or the multiprocessing flag. + # Creates the *build_trees* artefact from the files in `self.source_getter`. - """ + # Does the following, in order: + # - Create a hash of every source file. Used to check if it's already been analysed. + # - Parse the C and Fortran files to find external symbol definitions and dependencies in each file. + # - Analysis results are stored in a csv as-we-go, so analysis can be resumed if interrupted. + # - Create a 'symbol table' recording which file each symbol is in. + # - Work out the file dependencies from the symbol dependencies. + # - At this point we have a source tree for the entire source. + # - (Optionally) Extract a sub tree for every root symbol, if provided. For building executables. # todo: code smell - refactor (in another PR to keep things small) fortran_analyser._config = config @@ -206,7 +188,7 @@ def analyse( _add_unreferenced_deps(unreferenced_deps, symbol_table, project_source_tree, build_tree) validate_dependencies(build_tree) - config.artefact_store[BUILD_TREES] = build_trees + config.artefact_store[ArtefactSet.BUILD_TREES] = build_trees def _analyse_dependencies(analysed_files: Iterable[AnalysedDependent]): @@ -317,7 +299,7 @@ def _gen_symbol_table(analysed_files: Iterable[AnalysedDependent]) -> Dict[str, Create a dictionary mapping symbol names to the files in which they appear. """ - symbols: Dict[str, Path] = dict() + symbols: Dict[str, Path] = {} duplicates = [] for analysed_file in analysed_files: for symbol_def in analysed_file.symbol_defs: @@ -330,7 +312,8 @@ def _gen_symbol_table(analysed_files: Iterable[AnalysedDependent]) -> Dict[str, symbols[symbol_def] = analysed_file.fpath if duplicates: - # we don't break the build because these symbols might not be required to build the exe + # we don't break the build because these symbols might not be + # required to build the executable. # todo: put a big warning at the end of the build? err_msg = "\n".join(map(str, duplicates)) warnings.warn(f"Duplicates found while generating symbol table:\n{err_msg}") diff --git a/source/fab/steps/archive_objects.py b/source/fab/steps/archive_objects.py index f4d5efcf..1ebbcdcf 100644 --- a/source/fab/steps/archive_objects.py +++ b/source/fab/steps/archive_objects.py @@ -12,8 +12,8 @@ from string import Template from typing import Optional +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import OBJECT_FILES, OBJECT_ARCHIVES from fab.steps import step from fab.util import log_or_dot from fab.tools import Ar, Category @@ -21,26 +21,30 @@ logger = logging.getLogger(__name__) -DEFAULT_SOURCE_GETTER = CollectionGetter(OBJECT_FILES) +DEFAULT_SOURCE_GETTER = CollectionGetter(ArtefactSet.OBJECT_FILES) -# todo: two diagrams showing the flow of artefacts in the exe and library use cases -# show how the library has a single build target with None as the name. +# todo: two diagrams showing the flow of artefacts in the executables and +# library use cases show how the library has a single build target with None +# as the name. -# todo: all this documentation for such a simple step - should we split it up somehow? 
+# todo: all this documentation for such a simple step - should we split it +# up somehow? @step def archive_objects(config: BuildConfig, source: Optional[ArtefactsGetter] = None, output_fpath=None, - output_collection=OBJECT_ARCHIVES): + output_collection=ArtefactSet.OBJECT_ARCHIVES): """ Create an object archive for every build target, from their object files. - An object archive is a set of object (*.o*) files bundled into a single file, typically with a *.a* extension. + An object archive is a set of object (*.o*) files bundled into a single + file, typically with a *.a* extension. - Expects one or more build targets from its artefact getter, of the form Dict[name, object_files]. - By default, it finds the build targets and their object files in the artefact collection named by + Expects one or more build targets from its artefact getter, of the form + Dict[name, object_files]. By default, it finds the build targets and + their object files in the artefact collection named by :py:const:`fab.constants.COMPILED_FILES`. This step has three use cases: @@ -49,46 +53,54 @@ def archive_objects(config: BuildConfig, * The object archive is a convenience step before linking a **shared object**. * One or more object archives as convenience steps before linking **executables**. - The benefit of creating an object archive before linking is simply to reduce the size - of the linker command, which might otherwise include thousands of .o files, making any error output - difficult to read. You don't have to use this step before linking. - The linker step has a default artefact getter which will work with or without this preceding step. + The benefit of creating an object archive before linking is simply to + reduce the size of the linker command, which might otherwise include + thousands of .o files, making any error output difficult to read. You + don't have to use this step before linking. The linker step has a default + artefact getter which will work with or without this preceding step. **Creating a Static or Shared Library:** - When building a library there is expected to be a single build target with a `None` name. - This typically happens when configuring the :class:`~fab.steps.analyser.Analyser` step *without* a root symbol. - We can assume the list of object files is the entire project source, compiled. + When building a library there is expected to be a single build target + with a `None` name. This typically happens when configuring the + :class:`~fab.steps.analyser.Analyser` step *without* a root symbol. + We can assume the list of object files is the entire project source, + compiled. In this case you must specify an *output_fpath*. **Creating Executables:** - When creating executables, there is expected to be one or more build targets, each with a name. - This typically happens when configuring the :class:`~fab.steps.analyser.Analyser` step *with* a root symbol(s). - We can assume each list of object files is sufficient to build each *.exe*. + When creating executables, there is expected to be one or more build + targets, each with a name. This typically happens when configuring the + :class:`~fab.steps.analyser.Analyser` step *with* a root symbol(s). + We can assume each list of object files is sufficient to build each + executable. - In this case you cannot specify an *output_fpath* path because they are automatically created from the - target name. + In this case you cannot specify an *output_fpath*, because the paths are + automatically created from the target names.
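    Example (a sketch of the *executables* use case, using the default
    arguments; ``state`` is the current build config)::

        # create one object archive per root symbol, then link
        archive_objects(state)
        link_exe(state)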
:param config: - The :class:`fab.build_config.BuildConfig` object where we can read settings - such as the project workspace folder or the multiprocessing flag. + The :class:`fab.build_config.BuildConfig` object where we can read + settings such as the project workspace folder or the multiprocessing flag. :param source: - An :class:`~fab.artefacts.ArtefactsGetter` which give us our lists of objects to archive. - The artefacts are expected to be of the form `Dict[root_symbol_name, list_of_object_files]`. + An :class:`~fab.artefacts.ArtefactsGetter` which give us our lists of + objects to archive. The artefacts are expected to be of the form + `Dict[root_symbol_name, list_of_object_files]`. :param output_fpath: - The file path of the archive file to create. - This string can include templating, where "$output" is replaced with the output folder. + The file path of the archive file to create. This string can include + templating, where "$output" is replaced with the output folder. * Must be specified when building a library file (no build target name). - * Must not be specified when building linker input (one or more build target names). + * Must not be specified when building linker input (one or more build + target names). :param output_collection: The name of the artefact collection to create. Defaults to the name in - :const:`fab.constants.OBJECT_ARCHIVES`. + :const:`fab.artefacts.ArtefactSet.OBJECT_ARCHIVES`. """ - # todo: the output path should not be an abs fpath, it should be relative to the proj folder + # todo: the output path should not be an abs fpath, it should be relative + # to the proj folder source_getter = source or DEFAULT_SOURCE_GETTER ar = config.tool_box[Category.AR] @@ -100,15 +112,15 @@ def archive_objects(config: BuildConfig, target_objects = source_getter(config.artefact_store) assert target_objects.keys() if output_fpath and list(target_objects.keys()) != [None]: - raise ValueError("You must not specify an output path (library) when there are root symbols (exes)") + raise ValueError("You must not specify an output path (library) when " + "there are root symbols (executables)") if not output_fpath and list(target_objects.keys()) == [None]: raise ValueError("You must specify an output path when building a library.") - output_archives = config.artefact_store.setdefault(output_collection, {}) for root, objects in target_objects.items(): if root: - # we're building an object archive for an exe + # we're building an object archive for an executable output_fpath = str(config.build_output / f'{root}.a') else: # we're building a single object archive with a given filename @@ -123,4 +135,5 @@ def archive_objects(config: BuildConfig, except RuntimeError as err: raise RuntimeError(f"error creating object archive:\n{err}") from err - output_archives[root] = [output_fpath] + config.artefact_store.update_dict(output_collection, root, + output_fpath) diff --git a/source/fab/steps/c_pragma_injector.py b/source/fab/steps/c_pragma_injector.py index 623172a2..196d0867 100644 --- a/source/fab/steps/c_pragma_injector.py +++ b/source/fab/steps/c_pragma_injector.py @@ -12,40 +12,49 @@ from typing import Generator, Pattern, Optional, Match from fab import FabException -from fab.constants import PRAGMAD_C +from fab.artefacts import ArtefactSet, ArtefactsGetter, SuffixFilter from fab.steps import run_mp, step -from fab.artefacts import ArtefactsGetter, SuffixFilter -DEFAULT_SOURCE_GETTER = SuffixFilter('all_source', '.c') +DEFAULT_SOURCE_GETTER = SuffixFilter(ArtefactSet.C_BUILD_FILES, '.c') 
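# Usage sketch (illustrative only, not part of this module): calling the
# getter above with a config's artefact store returns just the '.c' paths,
# e.g.
#
#     c_files = DEFAULT_SOURCE_GETTER(config.artefact_store)
#
# which is what the step below does when no 'source' argument is given.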
# todo: test @step -def c_pragma_injector(config, source: Optional[ArtefactsGetter] = None, output_name=None): +def c_pragma_injector(config, source: Optional[ArtefactsGetter] = None, + output_name=None): """ - A build step to inject custom pragmas to mark blocks of user and system include statements. + A build step to inject custom pragmas to mark blocks of user and system + include statements. - By default, reads .c files from the *all_source* artefact and creates the *pragmad_c* artefact. + By default, reads .c files from the *INITIAL_SOURCE* artefact and creates + the *pragmad_c* artefact. - This step does not write to the build output folder, it creates the pragmad c in the same folder as the c file. - This is because a subsequent preprocessing step needs to look in the source folder for header files, + This step does not write to the build output folder, it creates the + pragmad c in the same folder as the c file. This is because a subsequent + preprocessing step needs to look in the source folder for header files, including in paths relative to the c file. :param config: - The :class:`fab.build_config.BuildConfig` object where we can read settings - such as the project workspace folder or the multiprocessing flag. + The :class:`fab.build_config.BuildConfig` object where we can + read settings such as the project workspace folder or the + multiprocessing flag. :param source: - An :class:`~fab.artefacts.ArtefactsGetter` which give us our c files to process. + An :class:`~fab.artefacts.ArtefactsGetter` which give us our c files + to process. :param output_name: - The name of the artefact collection to create in the artefact store, with a sensible default + The name of the artefact collection to create in the artefact store, + with a sensible default """ source_getter = source or DEFAULT_SOURCE_GETTER - output_name = output_name or PRAGMAD_C + output_name = output_name or ArtefactSet.PRAGMAD_C files = source_getter(config.artefact_store) results = run_mp(config, items=files, func=_process_artefact) - config.artefact_store[output_name] = list(results) + config.artefact_store[output_name] = set(results) + config.artefact_store.replace(ArtefactSet.C_BUILD_FILES, + remove_files=files, + add_files=results) def _process_artefact(fpath: Path): diff --git a/source/fab/steps/cleanup_prebuilds.py b/source/fab/steps/cleanup_prebuilds.py index 8d1548b2..fcba7970 100644 --- a/source/fab/steps/cleanup_prebuilds.py +++ b/source/fab/steps/cleanup_prebuilds.py @@ -13,7 +13,7 @@ from pathlib import Path from typing import Dict, Optional, Iterable, Set -from fab.constants import CURRENT_PREBUILDS +from fab.artefacts import ArtefactSet from fab.steps import run_mp, step from fab.util import file_walk, get_prebuild_file_groups @@ -58,12 +58,14 @@ def cleanup_prebuilds( # see what's in the prebuild folder prebuild_files = list(file_walk(config.prebuild_folder)) + current_prebuild = ArtefactSet.CURRENT_PREBUILDS if not prebuild_files: logger.info('no prebuild files found') elif all_unused: num_removed = remove_all_unused( - found_files=prebuild_files, current_files=config.artefact_store[CURRENT_PREBUILDS]) + found_files=prebuild_files, + current_files=config.artefact_store[current_prebuild]) else: # get the file access time for every artefact @@ -71,8 +73,10 @@ def cleanup_prebuilds( dict(zip(prebuild_files, run_mp(config, prebuild_files, get_access_time))) # type: ignore # work out what to delete - to_delete = by_age(older_than, prebuilds_ts, current_files=config.artefact_store[CURRENT_PREBUILDS]) - 
to_delete |= by_version_age(n_versions, prebuilds_ts, current_files=config.artefact_store[CURRENT_PREBUILDS]) + to_delete = by_age(older_than, prebuilds_ts, + current_files=config.artefact_store[current_prebuild]) + to_delete |= by_version_age(n_versions, prebuilds_ts, + current_files=config.artefact_store[current_prebuild]) # delete them all run_mp(config, to_delete, os.remove) diff --git a/source/fab/steps/compile_c.py b/source/fab/steps/compile_c.py index 81e9bef5..8ac03f65 100644 --- a/source/fab/steps/compile_c.py +++ b/source/fab/steps/compile_c.py @@ -9,14 +9,12 @@ """ import logging import os -from collections import defaultdict from dataclasses import dataclass from typing import List, Dict, Optional, Tuple from fab import FabException -from fab.artefacts import ArtefactsGetter, FilterBuildTrees +from fab.artefacts import ArtefactsGetter, ArtefactSet, FilterBuildTrees from fab.build_config import BuildConfig, FlagsConfig -from fab.constants import OBJECT_FILES from fab.metrics import send_metric from fab.parse.c import AnalysedC from fab.steps import check_for_errors, run_mp, step @@ -101,10 +99,9 @@ def store_artefacts(compiled_files: List[CompiledFile], build_lists: Dict[str, L """ # add the new object files to the artefact store, by target lookup = {c.input_fpath: c for c in compiled_files} - object_files = artefact_store.setdefault(OBJECT_FILES, defaultdict(set)) for root, source_files in build_lists.items(): new_objects = [lookup[af.fpath].output_fpath for af in source_files] - object_files[root].update(new_objects) + artefact_store.update_dict(ArtefactSet.OBJECT_FILES, root, new_objects) def _compile_file(arg: Tuple[AnalysedC, MpCommonArgs]): diff --git a/source/fab/steps/compile_fortran.py b/source/fab/steps/compile_fortran.py index 8b3fa632..38146b1b 100644 --- a/source/fab/steps/compile_fortran.py +++ b/source/fab/steps/compile_fortran.py @@ -11,15 +11,14 @@ import logging import os import shutil -from collections import defaultdict from dataclasses import dataclass from itertools import chain from pathlib import Path from typing import List, Set, Dict, Tuple, Optional, Union -from fab.artefacts import ArtefactsGetter, ArtefactStore, FilterBuildTrees +from fab.artefacts import (ArtefactsGetter, ArtefactSet, ArtefactStore, + FilterBuildTrees) from fab.build_config import BuildConfig, FlagsConfig -from fab.constants import OBJECT_FILES from fab.metrics import send_metric from fab.parse.fortran import AnalysedFortran from fab.steps import check_for_errors, run_mp, step @@ -200,10 +199,9 @@ def store_artefacts(compiled_files: Dict[Path, CompiledFile], """ # add the new object files to the artefact store, by target lookup = {c.input_fpath: c for c in compiled_files.values()} - object_files = artefact_store.setdefault(OBJECT_FILES, defaultdict(set)) for root, source_files in build_lists.items(): - new_objects = [lookup[af.fpath].output_fpath for af in source_files] - object_files[root].update(new_objects) + new_objects = {lookup[af.fpath].output_fpath for af in source_files} + artefact_store.update_dict(ArtefactSet.OBJECT_FILES, root, new_objects) def process_file(arg: Tuple[AnalysedFortran, MpCommonArgs]) \ diff --git a/source/fab/steps/find_source_files.py b/source/fab/steps/find_source_files.py index 25191d5f..570b0efc 100644 --- a/source/fab/steps/find_source_files.py +++ b/source/fab/steps/find_source_files.py @@ -10,6 +10,7 @@ import logging from typing import Optional, Iterable +from fab.artefacts import ArtefactSet from fab.steps import step from fab.util import 
file_walk @@ -39,7 +40,8 @@ def check(self, path): class Include(_PathFilter): """ - A path filter which includes matching paths, this convenience class improves config readability. + A path filter which includes matching paths, this convenience class + improves config readability. """ def __init__(self, *filter_strings): @@ -56,7 +58,8 @@ def __str__(self): class Exclude(_PathFilter): """ - A path filter which excludes matching paths, this convenience class improves config readability. + A path filter which excludes matching paths, this convenience class + improves config readability. """ @@ -73,7 +76,8 @@ def __str__(self): @step -def find_source_files(config, source_root=None, output_collection="all_source", +def find_source_files(config, source_root=None, + output_collection=ArtefactSet.INITIAL_SOURCE, path_filters: Optional[Iterable[_PathFilter]] = None): """ Find the files in the source folder, with filtering. @@ -81,9 +85,10 @@ def find_source_files(config, source_root=None, output_collection="all_source", Files can be included or excluded with simple pattern matching. Every file is included by default, unless the filters say otherwise. - Path filters are expected to be provided by the user in an *ordered* collection. - The two convenience subclasses, :class:`~fab.steps.walk_source.Include` and :class:`~fab.steps.walk_source.Exclude`, - improve readability. + Path filters are expected to be provided by the user in an *ordered* + collection. The two convenience subclasses, + :class:`~fab.steps.walk_source.Include` and + :class:`~fab.steps.walk_source.Exclude`, improve readability. Order matters. For example:: @@ -92,14 +97,17 @@ def find_source_files(config, source_root=None, output_collection="all_source", Include('my_folder/my_file.F90'), ] - In the above example, swapping the order would stop the file being included in the build. + In the above example, swapping the order would stop the file being + included in the build. A path matches a filter string simply if it *contains* it, - so the path *my_folder/my_file.F90* would match filters "my_folder", "my_file" and "er/my". + so the path *my_folder/my_file.F90* would match filters + "my_folder", "my_file" and "er/my". :param config: - The :class:`fab.build_config.BuildConfig` object where we can read settings - such as the project workspace folder or the multiprocessing flag. + The :class:`fab.build_config.BuildConfig` object where we can read + settings such as the project workspace folder or the multiprocessing + flag. :param source_root: Optional path to source folder, with a sensible default. :param output_collection: @@ -112,23 +120,16 @@ def find_source_files(config, source_root=None, output_collection="all_source", """ path_filters = path_filters or [] - """ - Recursively get all files in the given folder, with filtering. - - :param artefact_store: - Contains artefacts created by previous Steps, and where we add our new artefacts. - This is where the given :class:`~fab.artefacts.ArtefactsGetter` finds the artefacts to process. - :param config: - The :class:`fab.build_config.BuildConfig` object where we can read settings - such as the project workspace folder or the multiprocessing flag. + # Recursively get all files in the given folder, with filtering. - """ source_root = source_root or config.source_root # file filtering - filtered_fpaths = [] - # todo: we shouldn't need to ignore the prebuild folder here, it's not underneath the source root. 
- for fpath in file_walk(source_root, ignore_folders=[config.prebuild_folder]): + filtered_fpaths = set() + # todo: we shouldn't need to ignore the prebuild folder here, it's not + # underneath the source root. + for fpath in file_walk(source_root, + ignore_folders=[config.prebuild_folder]): wanted = True for path_filter in path_filters: @@ -138,11 +139,25 @@ def find_source_files(config, source_root=None, output_collection="all_source", wanted = res if wanted: - filtered_fpaths.append(fpath) + filtered_fpaths.add(fpath) else: logger.debug(f"excluding {fpath}") if not filtered_fpaths: raise RuntimeError("no source files found after filtering") - config.artefact_store[output_collection] = filtered_fpaths + config.artefact_store.add(output_collection, filtered_fpaths) + + # Now split the files into the various main groups: + # Fortran, C, and PSyclone + config.artefact_store.copy_artefacts(output_collection, + ArtefactSet.FORTRAN_BUILD_FILES, + suffixes=[".f90", ".F90"]) + + config.artefact_store.copy_artefacts(output_collection, + ArtefactSet.C_BUILD_FILES, + suffixes=[".c"]) + + config.artefact_store.copy_artefacts(output_collection, + ArtefactSet.X90_BUILD_FILES, + suffixes=[".x90", ".X90"]) diff --git a/source/fab/steps/link.py b/source/fab/steps/link.py index 693ea0ab..5c6d15ce 100644 --- a/source/fab/steps/link.py +++ b/source/fab/steps/link.py @@ -11,7 +11,7 @@ from string import Template from typing import Optional -from fab.constants import OBJECT_FILES, OBJECT_ARCHIVES, EXECUTABLES +from fab.artefacts import ArtefactSet from fab.steps import step from fab.tools import Category from fab.artefacts import ArtefactsGetter, CollectionGetter @@ -27,8 +27,8 @@ class DefaultLinkerSource(ArtefactsGetter): """ def __call__(self, artefact_store): - return CollectionGetter(OBJECT_ARCHIVES)(artefact_store) \ - or CollectionGetter(OBJECT_FILES)(artefact_store) + return CollectionGetter(ArtefactSet.OBJECT_ARCHIVES)(artefact_store) \ + or CollectionGetter(ArtefactSet.OBJECT_FILES)(artefact_store) @step @@ -62,7 +62,7 @@ def link_exe(config, flags=None, source: Optional[ArtefactsGetter] = None): for root, objects in target_objects.items(): exe_path = config.project_workspace / f'{root}' linker.link(objects, exe_path, flags) - config.artefact_store.setdefault(EXECUTABLES, []).append(exe_path) + config.artefact_store.add(ArtefactSet.EXECUTABLES, exe_path) # todo: the bit about Dict[None, object_files] seems too obscure - try to rethink this. 
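(Editorial note, not part of the patch.) The ``find_source_files`` changes above populate ``INITIAL_SOURCE`` and then use ``copy_artefacts`` with suffix filtering to seed the ``FORTRAN_BUILD_FILES``, ``C_BUILD_FILES`` and ``X90_BUILD_FILES`` sets, while ``link_exe`` now records each executable with ``artefact_store.add``. A minimal sketch of how a user script might inspect these sets after a build — ``config`` is assumed to be a ``fab.build_config.BuildConfig``::

    from fab.artefacts import ArtefactSet

    def report_sources(config):
        # artefact_store maps each ArtefactSet member to a set of Paths
        # (or to a dict for OBJECT_FILES / OBJECT_ARCHIVES).
        store = config.artefact_store
        print(f"all source: {len(store[ArtefactSet.INITIAL_SOURCE])}")
        print(f"Fortran:    {len(store[ArtefactSet.FORTRAN_BUILD_FILES])}")
        print(f"C:          {len(store[ArtefactSet.C_BUILD_FILES])}")
        print(f"x90/X90:    {len(store[ArtefactSet.X90_BUILD_FILES])}")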
diff --git a/source/fab/steps/preprocess.py b/source/fab/steps/preprocess.py index 11777e96..325ada70 100644 --- a/source/fab/steps/preprocess.py +++ b/source/fab/steps/preprocess.py @@ -11,16 +11,16 @@ import shutil from dataclasses import dataclass from pathlib import Path -from typing import Collection, List, Optional, Tuple +from typing import Collection, List, Optional, Tuple, Union +from fab.artefacts import (ArtefactSet, ArtefactsGetter, SuffixFilter, + CollectionGetter) from fab.build_config import BuildConfig, FlagsConfig -from fab.constants import PRAGMAD_C from fab.metrics import send_metric - -from fab.util import log_or_dot_finish, input_to_output_fpath, log_or_dot, suffix_filter, Timer, by_type from fab.steps import check_for_errors, run_mp, step from fab.tools import Category, Cpp, CppFortran, Preprocessor -from fab.artefacts import ArtefactsGetter, SuffixFilter, CollectionGetter +from fab.util import (log_or_dot_finish, input_to_output_fpath, log_or_dot, + suffix_filter, Timer, by_type) logger = logging.getLogger(__name__) @@ -36,7 +36,9 @@ class MpCommonArgs(): def pre_processor(config: BuildConfig, preprocessor: Preprocessor, - files: Collection[Path], output_collection, output_suffix, + files: Collection[Path], + output_collection: Union[str, ArtefactSet], + output_suffix, common_flags: Optional[List[str]] = None, path_flags: Optional[List] = None, name="preprocess"): @@ -87,7 +89,7 @@ def pre_processor(config: BuildConfig, preprocessor: Preprocessor, check_for_errors(results, caller_label=name) log_or_dot_finish(logger) - config.artefact_store[output_collection] = list(by_type(results, Path)) + config.artefact_store.add(output_collection, set(by_type(results, Path))) def process_artefact(arg: Tuple[Path, MpCommonArgs]): @@ -117,7 +119,8 @@ def process_artefact(arg: Tuple[Path, MpCommonArgs]): try: args.preprocessor.preprocess(input_fpath, output_fpath, params) except Exception as err: - raise Exception(f"error preprocessing {input_fpath}:\n{err}") from err + raise Exception(f"error preprocessing {input_fpath}:\n" + f"{err}") from err send_metric(args.name, str(input_fpath), {'time_taken': timer.taken, 'start': timer.start}) return output_fpath @@ -136,11 +139,14 @@ def preprocess_fortran(config: BuildConfig, source: Optional[ArtefactsGetter] = The preprocessor is taken from the `FPP` environment, or falls back to `fpp -P`. - If source is not provided, it defaults to `SuffixFilter('all_source', '.F90')`. + If source is not provided, it defaults to + `SuffixFilter(ArtefactStore.FORTRAN_BUILD_FILES, '.F90')`. """ - source_getter = source or SuffixFilter('all_source', ['.F90', '.f90']) - source_files = source_getter(config.artefact_store) + if source: + source_files = source(config.artefact_store) + else: + source_files = config.artefact_store[ArtefactSet.FORTRAN_BUILD_FILES] F90s = suffix_filter(source_files, '.F90') f90s = suffix_filter(source_files, '.f90') @@ -161,14 +167,21 @@ def preprocess_fortran(config: BuildConfig, source: Optional[ArtefactsGetter] = preprocessor=fpp, common_flags=common_flags, files=F90s, - output_collection='preprocessed_fortran', output_suffix='.f90', + output_collection=ArtefactSet.PREPROCESSED_FORTRAN, + output_suffix='.f90', name='preprocess fortran', **kwargs, ) + config.artefact_store.replace(ArtefactSet.FORTRAN_BUILD_FILES, + remove_files=F90s, + add_files=config.artefact_store[ArtefactSet.PREPROCESSED_FORTRAN]) + # todo: parallel copy? 
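(Editorial note, not part of the patch.) The preprocess steps above keep ``FORTRAN_BUILD_FILES`` consistent by calling ``artefact_store.replace`` to swap each ``.F90`` input for its preprocessed ``.f90`` output. A small sketch of the assumed ``replace`` semantics, mirroring the new unit tests added later in this change::

    from pathlib import Path
    from fab.artefacts import ArtefactSet, ArtefactStore

    store = ArtefactStore()
    store.add(ArtefactSet.FORTRAN_BUILD_FILES, [Path("a.F90"), Path("b.f90")])
    # Swap the raw .F90 for its preprocessed .f90 counterpart.
    store.replace(ArtefactSet.FORTRAN_BUILD_FILES,
                  remove_files=[Path("a.F90")],
                  add_files=[Path("a.f90")])
    assert store[ArtefactSet.FORTRAN_BUILD_FILES] == {Path("a.f90"),
                                                      Path("b.f90")}

Note that ``replace`` is only defined for set-valued collections; the dictionary-valued sets (``OBJECT_FILES``, ``OBJECT_ARCHIVES``) raise a ``RuntimeError`` instead.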
# copy little f90s from source to output folder logger.info(f'Fortran preprocessor copying {len(f90s)} files to build_output') + new_files = [] + remove_files = [] for f90 in f90s: output_path = input_to_output_fpath(config, input_path=f90) if output_path != f90: @@ -176,6 +189,13 @@ def preprocess_fortran(config: BuildConfig, source: Optional[ArtefactsGetter] = output_path.parent.mkdir(parents=True) log_or_dot(logger, f'copying {f90}') shutil.copyfile(str(f90), str(output_path)) + # Only remove and add a file when it is actually copied. + remove_files.append(f90) + new_files.append(output_path) + + config.artefact_store.replace(ArtefactSet.FORTRAN_BUILD_FILES, + remove_files=remove_files, + add_files=new_files) class DefaultCPreprocessorSource(ArtefactsGetter): @@ -186,21 +206,20 @@ class DefaultCPreprocessorSource(ArtefactsGetter): """ def __call__(self, artefact_store): - return CollectionGetter(PRAGMAD_C)(artefact_store) \ - or SuffixFilter('all_source', '.c')(artefact_store) + return CollectionGetter(ArtefactSet.PRAGMAD_C)(artefact_store) \ + or SuffixFilter(ArtefactSet.INITIAL_SOURCE, '.c')(artefact_store) # todo: rename preprocess_c @step -def preprocess_c(config: BuildConfig, source=None, **kwargs): +def preprocess_c(config: BuildConfig, + source: Optional[ArtefactsGetter] = None, **kwargs): """ Wrapper to pre_processor for C files. Params as per :func:`~fab.steps.preprocess._pre_processor`. - - The preprocessor is taken from the `CPP` environment, or falls back to `cpp`. - - If source is not provided, it defaults to :class:`~fab.steps.preprocess.DefaultCPreprocessorSource`. + If source is not provided, it defaults to + :class:`~fab.steps.preprocess.DefaultCPreprocessorSource`. """ source_getter = source or DefaultCPreprocessorSource() @@ -214,7 +233,12 @@ def preprocess_c(config: BuildConfig, source=None, **kwargs): config, preprocessor=cpp, files=source_files, - output_collection='preprocessed_c', output_suffix='.c', + output_collection=ArtefactSet.PREPROCESSED_C, + output_suffix='.c', name='preprocess c', **kwargs, ) + + config.artefact_store.replace(ArtefactSet.C_BUILD_FILES, + remove_files=source_files, + add_files=config.artefact_store[ArtefactSet.PREPROCESSED_C]) diff --git a/source/fab/steps/psyclone.py b/source/fab/steps/psyclone.py index 0db38b3d..5ed53904 100644 --- a/source/fab/steps/psyclone.py +++ b/source/fab/steps/psyclone.py @@ -19,14 +19,15 @@ from fab.build_config import BuildConfig -from fab.artefacts import ArtefactsGetter, CollectionConcat, SuffixFilter +from fab.artefacts import (ArtefactSet, ArtefactsGetter, SuffixFilter) from fab.parse.fortran import FortranAnalyser, AnalysedFortran from fab.parse.x90 import X90Analyser, AnalysedX90 from fab.steps import run_mp, check_for_errors, step from fab.steps.preprocess import pre_processor from fab.tools import Category, Psyclone -from fab.util import log_or_dot, input_to_output_fpath, file_checksum, file_walk, TimerLogger, \ - string_checksum, suffix_filter, by_type, log_or_dot_finish +from fab.util import (log_or_dot, input_to_output_fpath, file_checksum, + file_walk, TimerLogger, string_checksum, suffix_filter, + by_type, log_or_dot_finish) logger = logging.getLogger(__name__) @@ -37,17 +38,22 @@ def preprocess_x90(config, common_flags: Optional[List[str]] = None): # get the tool from FPP fpp = config.tool_box[Category.FORTRAN_PREPROCESSOR] - source_files = SuffixFilter('all_source', '.X90')(config.artefact_store) + source_files = SuffixFilter(ArtefactSet.X90_BUILD_FILES, '.X90')(config.artefact_store) + # Add 
the pre-processed now .x90 files into X90_BUILD_FILES pre_processor( config, preprocessor=fpp, files=source_files, - output_collection='preprocessed_x90', + output_collection=ArtefactSet.X90_BUILD_FILES, output_suffix='.x90', name='preprocess x90', common_flags=common_flags, ) + # Then remove the .X90 files: + config.artefact_store.replace(ArtefactSet.X90_BUILD_FILES, + remove_files=source_files, + add_files=[]) @dataclass @@ -70,10 +76,8 @@ class MpCommonArgs: override_files: List[str] # filenames (not paths) of hand crafted overrides -DEFAULT_SOURCE_GETTER = CollectionConcat([ - 'preprocessed_x90', # any X90 we've preprocessed this run - SuffixFilter('all_source', '.x90'), # any already preprocessed x90 we pulled in -]) +# any already preprocessed x90 we pulled in +DEFAULT_SOURCE_GETTER = SuffixFilter(ArtefactSet.X90_BUILD_FILES, '.x90') @step @@ -140,11 +144,11 @@ def psyclone(config, kernel_roots: Optional[List[Path]] = None, check_for_errors(outputs, caller_label='psyclone') # flatten the list of lists we got back from run_mp - output_files: List[Path] = list(chain(*by_type(outputs, List))) + output_files: Set[Path] = set(chain(*by_type(outputs, List))) prebuild_files: List[Path] = list(chain(*by_type(prebuilds, List))) # record the output files in the artefact store for further processing - config.artefact_store['psyclone_output'] = output_files + config.artefact_store.add(ArtefactSet.FORTRAN_BUILD_FILES, output_files) outputs_str = "\n".join(map(str, output_files)) logger.debug(f'psyclone outputs:\n{outputs_str}\n') diff --git a/source/fab/steps/root_inc_files.py b/source/fab/steps/root_inc_files.py index 9ed53df4..5ab3a58b 100644 --- a/source/fab/steps/root_inc_files.py +++ b/source/fab/steps/root_inc_files.py @@ -4,10 +4,11 @@ # which you should have received as part of this distribution ############################################################################## """ -A helper step to copy .inc files to the root of the build source folder, for easy include by the preprocessor. +A helper step to copy .inc files to the root of the build source folder, +for easy include by the preprocessor. -Currently only used for building JULES, .inc files are due to be removed from dev practices, -at which point this step should be deprecated. +Currently only used for building JULES, .inc files are due to be removed +from dev practices, at which point this step should be deprecated. """ import logging @@ -15,6 +16,7 @@ import warnings from pathlib import Path +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig from fab.steps import step from fab.util import suffix_filter @@ -35,8 +37,9 @@ def root_inc_files(config: BuildConfig): Artefacts created by previous Steps. This is where we find the artefacts to process. :param config: - The :class:`fab.build_config.BuildConfig` object where we can read settings - such as the project workspace folder or the multiprocessing flag. + The :class:`fab.build_config.BuildConfig` object where we can + read settings such as the project workspace folder or the + multiprocessing flag. 
""" @@ -44,14 +47,18 @@ def root_inc_files(config: BuildConfig): build_output: Path = config.build_output build_output.mkdir(parents=True, exist_ok=True) - warnings.warn("RootIncFiles is deprecated as .inc files are due to be removed.", DeprecationWarning) + warnings.warn("RootIncFiles is deprecated as .inc files are due to " + "be removed.", DeprecationWarning) - # inc files all go in the root - they're going to be removed altogether, soon + # inc files all go in the root - they're going to be + # removed altogether, soon inc_copied = set() - for fpath in suffix_filter(config.artefact_store["all_source"], [".inc"]): + initial_source = config.artefact_store[ArtefactSet.INITIAL_SOURCE] + for fpath in suffix_filter(initial_source, [".inc"]): # don't copy from the output root to the output root! - # this is currently unlikely to happen but did in the past, and caused problems. + # this is currently unlikely to happen but did in the past, + # and caused problems. if fpath.parent == build_output: continue diff --git a/source/fab/tools/tool.py b/source/fab/tools/tool.py index 4fe97de4..af9b8bfb 100644 --- a/source/fab/tools/tool.py +++ b/source/fab/tools/tool.py @@ -124,8 +124,9 @@ def run(self, Run the binary as a subprocess. :param additional_parameters: - List of strings to be sent to :func:`subprocess.run` as the - command. + List of strings or paths to be sent to :func:`subprocess.run` + as additional parameters for the command. Any path will be + converted to a normal string. :param env: Optional env for the command. By default it will use the current session's environment. diff --git a/tests/system_tests/CFortranInterop/test_CFortranInterop.py b/tests/system_tests/CFortranInterop/test_CFortranInterop.py index cc5632ae..d667506b 100644 --- a/tests/system_tests/CFortranInterop/test_CFortranInterop.py +++ b/tests/system_tests/CFortranInterop/test_CFortranInterop.py @@ -6,8 +6,8 @@ import subprocess from pathlib import Path +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import EXECUTABLES from fab.steps.analyse import analyse from fab.steps.c_pragma_injector import c_pragma_injector from fab.steps.compile_c import compile_c @@ -44,10 +44,10 @@ def test_CFortranInterop(tmp_path): # '/lib/x86_64-linux-gnu/libgfortran.so.5', # ] - assert len(config.artefact_store[EXECUTABLES]) == 1 + assert len(config.artefact_store[ArtefactSet.EXECUTABLES]) == 1 # run - command = [str(config.artefact_store[EXECUTABLES][0])] + command = [str(list(config.artefact_store[ArtefactSet.EXECUTABLES])[0])] res = subprocess.run(command, capture_output=True) output = res.stdout.decode() assert output == ''.join(open(PROJECT_SOURCE / 'expected.exec.txt').readlines()) diff --git a/tests/system_tests/CUserHeader/test_CUserHeader.py b/tests/system_tests/CUserHeader/test_CUserHeader.py index 98d2ccb5..8c3878b0 100644 --- a/tests/system_tests/CUserHeader/test_CUserHeader.py +++ b/tests/system_tests/CUserHeader/test_CUserHeader.py @@ -6,8 +6,8 @@ import subprocess from pathlib import Path +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import EXECUTABLES from fab.steps.analyse import analyse from fab.steps.c_pragma_injector import c_pragma_injector from fab.steps.compile_c import compile_c @@ -34,10 +34,10 @@ def test_CUseHeader(tmp_path): compile_c(config, common_flags=['-c', '-std=c99']) link_exe(config, flags=['-lgfortran']) - assert len(config.artefact_store[EXECUTABLES]) == 1 + assert 
len(config.artefact_store[ArtefactSet.EXECUTABLES]) == 1 # run - command = [str(config.artefact_store[EXECUTABLES][0])] + command = [str(list(config.artefact_store[ArtefactSet.EXECUTABLES])[0])] res = subprocess.run(command, capture_output=True) output = res.stdout.decode() assert output == ''.join(open(PROJECT_SOURCE / 'expected.exec.txt').readlines()) diff --git a/tests/system_tests/FortranDependencies/test_FortranDependencies.py b/tests/system_tests/FortranDependencies/test_FortranDependencies.py index 6ee57e3e..98aff404 100644 --- a/tests/system_tests/FortranDependencies/test_FortranDependencies.py +++ b/tests/system_tests/FortranDependencies/test_FortranDependencies.py @@ -6,8 +6,8 @@ import subprocess from pathlib import Path +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import EXECUTABLES from fab.parse.fortran import AnalysedFortran from fab.steps.analyse import analyse from fab.steps.compile_c import compile_c @@ -35,11 +35,11 @@ def test_fortran_dependencies(tmp_path): compile_fortran(config, common_flags=['-c']) link_exe(config, flags=['-lgfortran']) - assert len(config.artefact_store[EXECUTABLES]) == 2 + assert len(config.artefact_store[ArtefactSet.EXECUTABLES]) == 2 # run both exes output = set() - for exe in config.artefact_store[EXECUTABLES]: + for exe in config.artefact_store[ArtefactSet.EXECUTABLES]: res = subprocess.run(str(exe), capture_output=True) output.add(res.stdout.decode()) @@ -51,28 +51,28 @@ def test_fortran_dependencies(tmp_path): # check the analysis results assert AnalysedFortran.load(config.prebuild_folder / 'first.193489053.an') == AnalysedFortran( - fpath=config.source_root / 'first.f90', file_hash=193489053, + fpath=config.build_output / 'first.f90', file_hash=193489053, program_defs={'first'}, module_defs=None, symbol_defs={'first'}, module_deps={'greeting_mod', 'constants_mod'}, symbol_deps={'greeting_mod', 'constants_mod', 'greet'}) assert AnalysedFortran.load(config.prebuild_folder / 'two.2557739057.an') == AnalysedFortran( - fpath=config.source_root / 'two.f90', file_hash=2557739057, + fpath=config.build_output / 'two.f90', file_hash=2557739057, program_defs={'second'}, module_defs=None, symbol_defs={'second'}, module_deps={'constants_mod', 'bye_mod'}, symbol_deps={'constants_mod', 'bye_mod', 'farewell'}) assert AnalysedFortran.load(config.prebuild_folder / 'greeting_mod.62446538.an') == AnalysedFortran( - fpath=config.source_root / 'greeting_mod.f90', file_hash=62446538, + fpath=config.build_output / 'greeting_mod.f90', file_hash=62446538, module_defs={'greeting_mod'}, symbol_defs={'greeting_mod'}, module_deps={'constants_mod'}, symbol_deps={'constants_mod'}) assert AnalysedFortran.load(config.prebuild_folder / 'bye_mod.3332267073.an') == AnalysedFortran( - fpath=config.source_root / 'bye_mod.f90', file_hash=3332267073, + fpath=config.build_output / 'bye_mod.f90', file_hash=3332267073, module_defs={'bye_mod'}, symbol_defs={'bye_mod'}, module_deps={'constants_mod'}, symbol_deps={'constants_mod'}) assert AnalysedFortran.load(config.prebuild_folder / 'constants_mod.233796393.an') == AnalysedFortran( - fpath=config.source_root / 'constants_mod.f90', file_hash=233796393, + fpath=config.build_output / 'constants_mod.f90', file_hash=233796393, module_defs={'constants_mod'}, symbol_defs={'constants_mod'}, module_deps=None, symbol_deps=None) diff --git a/tests/system_tests/FortranPreProcess/test_FortranPreProcess.py b/tests/system_tests/FortranPreProcess/test_FortranPreProcess.py index cd22f528..2081e9de 
100644 --- a/tests/system_tests/FortranPreProcess/test_FortranPreProcess.py +++ b/tests/system_tests/FortranPreProcess/test_FortranPreProcess.py @@ -6,8 +6,8 @@ import subprocess from pathlib import Path +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import EXECUTABLES from fab.steps.analyse import analyse from fab.steps.compile_fortran import compile_fortran from fab.steps.find_source_files import find_source_files @@ -37,15 +37,16 @@ def build(fab_workspace, fpp_flags=None): def test_FortranPreProcess(tmp_path): # stay - stay_config = build(fab_workspace=tmp_path, fpp_flags=['-P', '-DSHOULD_I_STAY=yes']) + stay_config = build(fab_workspace=tmp_path, + fpp_flags=['-P', '-DSHOULD_I_STAY=yes']) - stay_exe = stay_config.artefact_store[EXECUTABLES][0] + stay_exe = list(stay_config.artefact_store[ArtefactSet.EXECUTABLES])[0] stay_res = subprocess.run(str(stay_exe), capture_output=True) assert stay_res.stdout.decode().strip() == 'I should stay' # go go_config = build(fab_workspace=tmp_path, fpp_flags=['-P']) - go_exe = go_config.artefact_store[EXECUTABLES][0] + go_exe = list(go_config.artefact_store[ArtefactSet.EXECUTABLES])[0] go_res = subprocess.run(str(go_exe), capture_output=True) assert go_res.stdout.decode().strip() == 'I should go now' diff --git a/tests/system_tests/MinimalC/test_MinimalC.py b/tests/system_tests/MinimalC/test_MinimalC.py index 4d32751e..471e48b0 100644 --- a/tests/system_tests/MinimalC/test_MinimalC.py +++ b/tests/system_tests/MinimalC/test_MinimalC.py @@ -6,8 +6,8 @@ import subprocess from pathlib import Path +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import EXECUTABLES from fab.steps.analyse import analyse from fab.steps.c_pragma_injector import c_pragma_injector from fab.steps.compile_c import compile_c @@ -34,10 +34,10 @@ def test_minimal_c(tmp_path): compile_c(config, common_flags=['-c', '-std=c99']) link_exe(config) - assert len(config.artefact_store[EXECUTABLES]) == 1 + assert len(config.artefact_store[ArtefactSet.EXECUTABLES]) == 1 # run - command = [str(config.artefact_store[EXECUTABLES][0])] + command = [str(list(config.artefact_store[ArtefactSet.EXECUTABLES])[0])] res = subprocess.run(command, capture_output=True) output = res.stdout.decode() assert output == 'Hello world!' diff --git a/tests/system_tests/MinimalFortran/test_MinimalFortran.py b/tests/system_tests/MinimalFortran/test_MinimalFortran.py index 4d0efaab..71e58ae4 100644 --- a/tests/system_tests/MinimalFortran/test_MinimalFortran.py +++ b/tests/system_tests/MinimalFortran/test_MinimalFortran.py @@ -6,8 +6,8 @@ import subprocess from pathlib import Path +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import EXECUTABLES from fab.steps.analyse import analyse from fab.steps.compile_fortran import compile_fortran from fab.steps.find_source_files import find_source_files @@ -34,10 +34,10 @@ def test_minimal_fortran(tmp_path): compile_fortran(config, common_flags=['-c']) link_exe(config, flags=['-lgfortran']) - assert len(config.artefact_store[EXECUTABLES]) == 1 + assert len(config.artefact_store[ArtefactSet.EXECUTABLES]) == 1 # run - command = [str(config.artefact_store[EXECUTABLES][0])] + command = [str(list(config.artefact_store[ArtefactSet.EXECUTABLES])[0])] res = subprocess.run(command, capture_output=True) output = res.stdout.decode() assert output.strip() == 'Hello world!' 
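(Editorial note, not part of the patch.) Because ``EXECUTABLES`` is now a set rather than a list, the system tests above pick an element with ``list(...)[0]``; with a single executable, ``next(iter(...))`` would work equally well, e.g. (assuming the same ``config`` and imports as the tests)::

    exe = next(iter(config.artefact_store[ArtefactSet.EXECUTABLES]))
    res = subprocess.run([str(exe)], capture_output=True)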
diff --git a/tests/system_tests/incremental_fortran/test_incremental_fortran.py b/tests/system_tests/incremental_fortran/test_incremental_fortran.py index b7b9b4a1..bc4c39eb 100644 --- a/tests/system_tests/incremental_fortran/test_incremental_fortran.py +++ b/tests/system_tests/incremental_fortran/test_incremental_fortran.py @@ -6,8 +6,9 @@ import pytest +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import PREBUILD, CURRENT_PREBUILDS, BUILD_OUTPUT +from fab.constants import PREBUILD, BUILD_OUTPUT from fab.steps.analyse import analyse from fab.steps.cleanup_prebuilds import cleanup_prebuilds from fab.steps.compile_fortran import compile_fortran @@ -21,7 +22,7 @@ PROJECT_LABEL = 'tiny_project' -class TestIncremental(object): +class TestIncremental(): """ Checks: - basic Fortran project build @@ -223,7 +224,7 @@ def assert_one_artefact(self, pb_keys, prebuild_groups, prebuild_folder, assert clean_hashes[prebuild_folder / pb_fpath] == rebuild_hashes[prebuild_folder / pb_fpath] -class TestCleanupPrebuilds(object): +class TestCleanupPrebuilds(): # Test cleanup of the incremental build artefacts in_out = [ @@ -253,10 +254,11 @@ def test_clean(self, tmp_path, kwargs, expect): def test_prune_unused(self, tmp_path): # pruning everything not current + current_prebuilds = ArtefactSet.CURRENT_PREBUILDS with BuildConfig(project_label=PROJECT_LABEL, tool_box=ToolBox(), fab_workspace=tmp_path, multiprocessing=False) as config: - config._artefact_store = {CURRENT_PREBUILDS: { + config._artefact_store = {current_prebuilds: { tmp_path / PROJECT_LABEL / BUILD_OUTPUT / PREBUILD / 'a.123.foo', tmp_path / PROJECT_LABEL / BUILD_OUTPUT / PREBUILD / 'a.456.foo', }} diff --git a/tests/unit_tests/steps/test_archive_objects.py b/tests/unit_tests/steps/test_archive_objects.py index 805459e3..d366f422 100644 --- a/tests/unit_tests/steps/test_archive_objects.py +++ b/tests/unit_tests/steps/test_archive_objects.py @@ -10,13 +10,13 @@ from unittest import mock from unittest.mock import call +import pytest + +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig -from fab.constants import OBJECT_FILES, OBJECT_ARCHIVES from fab.steps.archive_objects import archive_objects from fab.tools import Category, ToolBox -import pytest - class TestArchiveObjects: '''Test the achive step. 
@@ -28,8 +28,10 @@ def test_for_exes(self): targets = ['prog1', 'prog2'] config = BuildConfig('proj', ToolBox()) - config._artefact_store = {OBJECT_FILES: {target: [f'{target}.o', 'util.o'] - for target in targets}} + for target in targets: + config.artefact_store.update_dict( + ArtefactSet.OBJECT_FILES, target, + set([f'{target}.o', 'util.o'])) mock_result = mock.Mock(returncode=0, return_value=123) with mock.patch('fab.tools.tool.subprocess.run', @@ -48,8 +50,8 @@ def test_for_exes(self): mock_run_command.assert_has_calls(expected_calls) # ensure the correct artefacts were created - assert config.artefact_store[OBJECT_ARCHIVES] == { - target: [str(config.build_output / f'{target}.a')] for target in targets} + assert config.artefact_store[ArtefactSet.OBJECT_ARCHIVES] == { + target: set([str(config.build_output / f'{target}.a')]) for target in targets} def test_for_library(self): '''As used when building an object archive or archiving before linking @@ -57,7 +59,8 @@ def test_for_library(self): ''' config = BuildConfig('proj', ToolBox()) - config._artefact_store = {OBJECT_FILES: {None: ['util1.o', 'util2.o']}} + config.artefact_store.update_dict( + ArtefactSet.OBJECT_FILES, None, {'util1.o', 'util2.o'}) mock_result = mock.Mock(returncode=0, return_value=123) with mock.patch('fab.tools.tool.subprocess.run', @@ -71,8 +74,8 @@ def test_for_library(self): capture_output=True, env=None, cwd=None, check=False) # ensure the correct artefacts were created - assert config.artefact_store[OBJECT_ARCHIVES] == { - None: [str(config.build_output / 'mylib.a')]} + assert config.artefact_store[ArtefactSet.OBJECT_ARCHIVES] == { + None: set([str(config.build_output / 'mylib.a')])} def test_incorrect_tool(self): '''Test that an incorrect archive tool is detected diff --git a/tests/unit_tests/steps/test_cleanup_prebuilds.py b/tests/unit_tests/steps/test_cleanup_prebuilds.py index 99a26952..ba075401 100644 --- a/tests/unit_tests/steps/test_cleanup_prebuilds.py +++ b/tests/unit_tests/steps/test_cleanup_prebuilds.py @@ -10,7 +10,7 @@ import pytest -from fab.constants import CURRENT_PREBUILDS +from fab.artefacts import ArtefactSet from fab.steps.cleanup_prebuilds import by_age, by_version_age, cleanup_prebuilds, remove_all_unused from fab.util import get_prebuild_file_groups @@ -18,10 +18,11 @@ class TestCleanupPrebuilds(object): def test_init_no_args(self): + current_prebuilds = ArtefactSet.CURRENT_PREBUILDS with mock.patch('fab.steps.cleanup_prebuilds.file_walk', return_value=[Path('foo.o')]), \ pytest.warns(UserWarning, match="_metric_send_conn not set, cannot send metrics"): with mock.patch('fab.steps.cleanup_prebuilds.remove_all_unused') as mock_remove_all_unused: - cleanup_prebuilds(config=mock.Mock(artefact_store={CURRENT_PREBUILDS: [Path('bar.o')]})) + cleanup_prebuilds(config=mock.Mock(artefact_store={current_prebuilds: [Path('bar.o')]})) mock_remove_all_unused.assert_called_once_with(found_files=[Path('foo.o')], current_files=[Path('bar.o')]) def test_init_bad_args(self): diff --git a/tests/unit_tests/steps/test_compile_c.py b/tests/unit_tests/steps/test_compile_c.py index 93419b41..b5e65624 100644 --- a/tests/unit_tests/steps/test_compile_c.py +++ b/tests/unit_tests/steps/test_compile_c.py @@ -13,8 +13,8 @@ import pytest +from fab.artefacts import ArtefactSet from fab.build_config import AddFlags, BuildConfig -from fab.constants import BUILD_TREES, OBJECT_FILES from fab.parse.c import AnalysedC from fab.steps.compile_c import _get_obj_combo_hash, _compile_file, compile_c from fab.tools import Category, 
Flags @@ -30,7 +30,8 @@ def fixture_content(tmp_path, tool_box): fab_workspace=tmp_path) analysed_file = AnalysedC(fpath=Path(f'{config.source_root}/foo.c'), file_hash=0) - config._artefact_store[BUILD_TREES] = {None: {analysed_file.fpath: analysed_file}} + config._artefact_store[ArtefactSet.BUILD_TREES] = \ + {None: {analysed_file.fpath: analysed_file}} expect_hash = 7435424994 return config, analysed_file, expect_hash @@ -89,7 +90,7 @@ def test_vanilla(self, content): send_metric.assert_called_once() # ensure it created the correct artefact collection - assert config.artefact_store[OBJECT_FILES] == { + assert config.artefact_store[ArtefactSet.OBJECT_FILES] == { None: {config.prebuild_folder / f'foo.{expect_hash:x}.o', } } diff --git a/tests/unit_tests/steps/test_compile_fortran.py b/tests/unit_tests/steps/test_compile_fortran.py index 5fc6c629..aab44747 100644 --- a/tests/unit_tests/steps/test_compile_fortran.py +++ b/tests/unit_tests/steps/test_compile_fortran.py @@ -5,8 +5,8 @@ import pytest +from fab.artefacts import ArtefactSet, ArtefactStore from fab.build_config import BuildConfig, FlagsConfig -from fab.constants import BUILD_TREES, OBJECT_FILES from fab.parse.fortran import AnalysedFortran from fab.steps.compile_fortran import ( compile_pass, get_compile_next, @@ -28,7 +28,7 @@ def fixture_analysed_files(): @pytest.fixture(name="artefact_store") def fixture_artefact_store(analysed_files): build_tree = {af.fpath: af for af in analysed_files} - artefact_store = {BUILD_TREES: {None: build_tree}} + artefact_store = {ArtefactSet.BUILD_TREES: {None: build_tree}} return artefact_store @@ -134,16 +134,15 @@ def test_vanilla(self): } # where it stores the results - artefact_store = {} + artefact_store = ArtefactStore() - store_artefacts(compiled_files=compiled_files, build_lists=build_lists, artefact_store=artefact_store) + store_artefacts(compiled_files=compiled_files, build_lists=build_lists, + artefact_store=artefact_store) - assert artefact_store == { - OBJECT_FILES: { + assert artefact_store[ArtefactSet.OBJECT_FILES] == { 'root1': {Path('root1.o'), Path('dep1.o')}, 'root2': {Path('root2.o'), Path('dep2.o')}, } - } # This avoids pylint warnings about Redefining names from outer scope diff --git a/tests/unit_tests/steps/test_link.py b/tests/unit_tests/steps/test_link.py index 60a69a7a..f015bb27 100644 --- a/tests/unit_tests/steps/test_link.py +++ b/tests/unit_tests/steps/test_link.py @@ -7,7 +7,7 @@ from types import SimpleNamespace from unittest import mock -from fab.constants import OBJECT_FILES +from fab.artefacts import ArtefactSet, ArtefactStore from fab.steps.link import link_exe from fab.tools import Linker @@ -21,9 +21,10 @@ def test_run(self, tool_box): config = SimpleNamespace( project_workspace=Path('workspace'), - artefact_store={OBJECT_FILES: {'foo': {'foo.o', 'bar.o'}}}, + artefact_store=ArtefactStore(), tool_box=tool_box ) + config.artefact_store[ArtefactSet.OBJECT_FILES] = {'foo': {'foo.o', 'bar.o'}} with mock.patch('os.getenv', return_value='-L/foo1/lib -L/foo2/lib'): # We need to create a linker here to pick up the env var: diff --git a/tests/unit_tests/steps/test_link_shared_object.py b/tests/unit_tests/steps/test_link_shared_object.py index 117971d1..224dda19 100644 --- a/tests/unit_tests/steps/test_link_shared_object.py +++ b/tests/unit_tests/steps/test_link_shared_object.py @@ -11,7 +11,7 @@ from types import SimpleNamespace from unittest import mock -from fab.constants import OBJECT_FILES +from fab.artefacts import ArtefactSet, ArtefactStore from fab.steps.link 
import link_shared_object from fab.tools import Linker @@ -25,9 +25,11 @@ def test_run(tool_box): config = SimpleNamespace( project_workspace=Path('workspace'), build_output=Path("workspace"), - artefact_store={OBJECT_FILES: {None: {'foo.o', 'bar.o'}}}, + artefact_store=ArtefactStore(), tool_box=tool_box ) + config.artefact_store[ArtefactSet.OBJECT_FILES] = \ + {None: {'foo.o', 'bar.o'}} with mock.patch('os.getenv', return_value='-L/foo1/lib -L/foo2/lib'): # We need to create a linker here to pick up the env var: diff --git a/tests/unit_tests/steps/test_root_inc_files.py b/tests/unit_tests/steps/test_root_inc_files.py index 50466c19..6c0e94b9 100644 --- a/tests/unit_tests/steps/test_root_inc_files.py +++ b/tests/unit_tests/steps/test_root_inc_files.py @@ -3,6 +3,7 @@ import pytest +from fab.artefacts import ArtefactSet from fab.build_config import BuildConfig from fab.steps.root_inc_files import root_inc_files from fab.tools import ToolBox @@ -15,38 +16,44 @@ def test_vanilla(self): inc_files = [Path('/foo/source/bar.inc')] config = BuildConfig('proj', ToolBox()) - config.artefact_store['all_source'] = inc_files + config.artefact_store[ArtefactSet.INITIAL_SOURCE] = inc_files with mock.patch('fab.steps.root_inc_files.shutil') as mock_shutil: with mock.patch('fab.steps.root_inc_files.Path.mkdir'), \ - pytest.warns(UserWarning, match="_metric_send_conn not set, cannot send metrics"): + pytest.warns(UserWarning, match="_metric_send_conn not set, " + "cannot send metrics"): root_inc_files(config) - mock_shutil.copy.assert_called_once_with(inc_files[0], config.build_output) + mock_shutil.copy.assert_called_once_with(inc_files[0], + config.build_output) def test_skip_output_folder(self): # ensure it doesn't try to copy a file in the build output config = BuildConfig('proj', ToolBox()) - inc_files = [Path('/foo/source/bar.inc'), config.build_output / 'fab.inc'] - config.artefact_store['all_source'] = inc_files + inc_files = [Path('/foo/source/bar.inc'), + config.build_output / 'fab.inc'] + config.artefact_store[ArtefactSet.INITIAL_SOURCE] = inc_files with mock.patch('fab.steps.root_inc_files.shutil') as mock_shutil: with mock.patch('fab.steps.root_inc_files.Path.mkdir'), \ - pytest.warns(UserWarning, match="_metric_send_conn not set, cannot send metrics"): + pytest.warns(UserWarning, match="_metric_send_conn not set, " + "cannot send metrics"): root_inc_files(config) - mock_shutil.copy.assert_called_once_with(inc_files[0], config.build_output) + mock_shutil.copy.assert_called_once_with(inc_files[0], + config.build_output) def test_name_clash(self): # ensure raises an exception if there is a name clash inc_files = [Path('/foo/source/bar.inc'), Path('/foo/sauce/bar.inc')] config = BuildConfig('proj', ToolBox()) - config.artefact_store['all_source'] = inc_files + config.artefact_store[ArtefactSet.INITIAL_SOURCE] = inc_files with pytest.raises(FileExistsError): with mock.patch('fab.steps.root_inc_files.shutil'): with mock.patch('fab.steps.root_inc_files.Path.mkdir'), \ pytest.warns(DeprecationWarning, - match="RootIncFiles is deprecated as .inc files are due to be removed."): + match="RootIncFiles is deprecated as .inc " + "files are due to be removed."): root_inc_files(config) diff --git a/tests/unit_tests/test_artefacts.py b/tests/unit_tests/test_artefacts.py index cd011143..06f6037b 100644 --- a/tests/unit_tests/test_artefacts.py +++ b/tests/unit_tests/test_artefacts.py @@ -1,10 +1,95 @@ +'''Tests the artefacts file. 
+''' + from unittest import mock from unittest.mock import call - +from pathlib import Path import pytest -from fab.artefacts import ArtefactStore, ArtefactsGetter, FilterBuildTrees -from fab.constants import BUILD_TREES, CURRENT_PREBUILDS +from fab.artefacts import (ArtefactSet, ArtefactStore, ArtefactsGetter, + CollectionConcat, CollectionGetter, + FilterBuildTrees, SuffixFilter) + + +def test_artefact_store() -> None: + '''Tests the ArtefactStore class.''' + artefact_store = ArtefactStore() + assert len(artefact_store) == len(ArtefactSet) + assert isinstance(artefact_store, dict) + for artefact in ArtefactSet: + if artefact in [ArtefactSet.OBJECT_FILES, + ArtefactSet.OBJECT_ARCHIVES]: + assert isinstance(artefact_store[artefact], dict) + else: + assert isinstance(artefact_store[artefact], set) + + +def test_artefact_store_copy() -> None: + '''Tests the add and copy operations.''' + artefact_store = ArtefactStore() + # We need paths for suffix filtering, so create some + a = Path("a.f90") + b = Path("b.F90") + c = Path("c.f90") + d = Path("d.F90.nocopy") + e = Path("e.f90.donotcopyeither") + # Try adding a single path, a set and a list: + artefact_store.add(ArtefactSet.INITIAL_SOURCE, a) + artefact_store.copy_artefacts(ArtefactSet.INITIAL_SOURCE, + ArtefactSet.CURRENT_PREBUILDS) + assert artefact_store[ArtefactSet.CURRENT_PREBUILDS] == set([a]) + artefact_store.add(ArtefactSet.INITIAL_SOURCE, [b, c]) + artefact_store.add(ArtefactSet.INITIAL_SOURCE, set([d, e])) + assert (artefact_store[ArtefactSet.INITIAL_SOURCE] == + set([a, b, c, d, e])) + + # Make sure that the previous copy did not get modified: + assert artefact_store[ArtefactSet.CURRENT_PREBUILDS] == set([a]) + artefact_store.copy_artefacts(ArtefactSet.INITIAL_SOURCE, + ArtefactSet.CURRENT_PREBUILDS) + assert (artefact_store[ArtefactSet.CURRENT_PREBUILDS] == + set([a, b, c, d, e])) + # Now copy with suffix filtering: + artefact_store.copy_artefacts(ArtefactSet.INITIAL_SOURCE, + ArtefactSet.FORTRAN_BUILD_FILES, + suffixes=[".F90", ".f90"]) + assert artefact_store[ArtefactSet.FORTRAN_BUILD_FILES] == set([a, b, c]) + + # Make sure filtering is case sensitive + artefact_store.copy_artefacts(ArtefactSet.INITIAL_SOURCE, + ArtefactSet.C_BUILD_FILES, + suffixes=[".f90"]) + assert artefact_store[ArtefactSet.C_BUILD_FILES] == set([a, c]) + + +def test_artefact_store_update_dict() -> None: + '''Tests the update_dict function.''' + artefact_store = ArtefactStore() + artefact_store.update_dict(ArtefactSet.OBJECT_FILES, "a", [Path("AA")]) + assert artefact_store[ArtefactSet.OBJECT_FILES] == {"a": {Path("AA")}} + artefact_store.update_dict(ArtefactSet.OBJECT_FILES, + "b", set([Path("BB")])) + assert (artefact_store[ArtefactSet.OBJECT_FILES] == {"a": {Path("AA")}, + "b": {Path("BB")}}) + + +def test_artefact_store_replace() -> None: + '''Tests the replace function.''' + artefact_store = ArtefactStore() + artefact_store.add(ArtefactSet.INITIAL_SOURCE, [Path("a"), Path("b"), + Path("c")]) + artefact_store.replace(ArtefactSet.INITIAL_SOURCE, + remove_files=[Path("a"), Path("b")], + add_files=[Path("B")]) + assert artefact_store[ArtefactSet.INITIAL_SOURCE] == set([Path("B"), + Path("c")]) + + # Test the behaviour for dictionaries + with pytest.raises(RuntimeError) as err: + artefact_store.replace(ArtefactSet.OBJECT_FILES, + remove_files=[Path("a")], add_files=["c"]) + assert ("Replacing artefacts in dictionary 'ArtefactSet.OBJECT_FILES' " + "is not supported" in str(err.value)) def test_artefacts_getter(): @@ -43,13 +128,15 @@ def __call__(self, 
artefact_store): class TestFilterBuildTrees(): + '''Tests for FilterBuildTrees.''' @pytest.fixture - def artefact_store(self): + def artefact_store(self) -> ArtefactStore: '''A fixture that returns an ArtefactStore with some elements.''' artefact_store = ArtefactStore() - artefact_store[BUILD_TREES] = {'tree1': {'a.foo': None, + build_trees = ArtefactSet.BUILD_TREES + artefact_store[build_trees] = {'tree1': {'a.foo': None, 'b.foo': None, 'c.bar': None, }, 'tree2': {'d.foo': None, @@ -58,34 +145,56 @@ def artefact_store(self): } return artefact_store - def test_single_suffix(self, artefact_store): - # ensure the artefact getter passes through the trees properly to the filter func + def test_single_suffix(self, artefact_store) -> None: + '''Ensure the artefact getter passes through the trees properly to + the filter func.''' # run the artefact getter - filter_build_trees = FilterBuildTrees('.foo', BUILD_TREES) - with mock.patch('fab.artefacts.filter_source_tree') as mock_filter_func: + filter_build_trees = FilterBuildTrees('.foo') + with mock.patch('fab.artefacts.filter_source_tree') as mock_filter: filter_build_trees(artefact_store) - mock_filter_func.assert_has_calls([ - call(source_tree=artefact_store[BUILD_TREES]['tree1'], suffixes=['.foo']), - call(source_tree=artefact_store[BUILD_TREES]['tree2'], suffixes=['.foo']), + build_trees = ArtefactSet.BUILD_TREES + mock_filter.assert_has_calls([ + call(source_tree=artefact_store[build_trees]['tree1'], + suffixes=['.foo']), + call(source_tree=artefact_store[build_trees]['tree2'], + suffixes=['.foo']), ]) - def test_multiple_suffixes(self, artefact_store): - # test it works with multiple suffixes provided - filter_build_trees = FilterBuildTrees(['.foo', '.bar'], BUILD_TREES) - with mock.patch('fab.artefacts.filter_source_tree') as mock_filter_func: + def test_multiple_suffixes(self, artefact_store) -> None: + '''Test it works with multiple suffixes provided.''' + filter_build_trees = FilterBuildTrees(['.foo', '.bar']) + with mock.patch('fab.artefacts.filter_source_tree') as mock_filter: filter_build_trees(artefact_store) - mock_filter_func.assert_has_calls([ - call(source_tree=artefact_store[BUILD_TREES]['tree1'], suffixes=['.foo', '.bar']), - call(source_tree=artefact_store[BUILD_TREES]['tree2'], suffixes=['.foo', '.bar']), + build_trees = ArtefactSet.BUILD_TREES + mock_filter.assert_has_calls([ + call(source_tree=artefact_store[build_trees]['tree1'], + suffixes=['.foo', '.bar']), + call(source_tree=artefact_store[build_trees]['tree2'], + suffixes=['.foo', '.bar']), ]) -def test_artefact_store(): - '''Tests the ArtefactStore class.''' +def test_collection_getter() -> None: + '''Test CollectionGetter.''' artefact_store = ArtefactStore() - assert len(artefact_store) == 1 - assert isinstance(artefact_store, dict) - assert CURRENT_PREBUILDS in artefact_store + artefact_store.add(ArtefactSet.INITIAL_SOURCE, ["a", "b", "c"]) + cg = CollectionGetter(ArtefactSet.INITIAL_SOURCE) + assert artefact_store[ArtefactSet.INITIAL_SOURCE] == cg(artefact_store) + + +def test_collection_concat(): + '''Test CollectionContact functionality.''' + getter = CollectionConcat(collections=[ + 'fooz', + SuffixFilter('barz', '.c') + ]) + + result = getter(artefact_store={ + 'fooz': ['foo1', 'foo2'], + 'barz': [Path('bar.a'), Path('bar.b'), Path('bar.c')], + }) + + assert result == ['foo1', 'foo2', Path('bar.c')] diff --git a/tests/unit_tests/test_util.py b/tests/unit_tests/test_util.py index e5f2ad65..b5a46f66 100644 --- a/tests/unit_tests/test_util.py +++ 
b/tests/unit_tests/test_util.py @@ -3,7 +3,7 @@ import pytest -from fab.artefacts import CollectionConcat, SuffixFilter +from fab.artefacts import SuffixFilter from fab.util import input_to_output_fpath, suffix_filter, file_walk @@ -18,30 +18,14 @@ def fpaths(): ] -class Test_suffix_filter(object): +class Test_suffix_filter(): def test_vanilla(self, fpaths): result = suffix_filter(fpaths=fpaths, suffixes=['.F90', '.f90']) assert result == [Path('foo.F90'), Path('foo.f90')] -class TestCollectionConcat(object): - - def test_vanilla(self): - getter = CollectionConcat(collections=[ - 'fooz', - SuffixFilter('barz', '.c') - ]) - - result = getter(artefact_store={ - 'fooz': ['foo1', 'foo2'], - 'barz': [Path('bar.a'), Path('bar.b'), Path('bar.c')], - }) - - assert result == ['foo1', 'foo2', Path('bar.c')] - - -class TestSuffixFilter(object): +class TestSuffixFilter(): def test_constructor_suffix_scalar(self): getter = SuffixFilter('barz', '.c') @@ -54,7 +38,7 @@ def test_constructor_suffix_vector(self): assert result == [Path('bar.b'), Path('bar.c')] -class Test_file_walk(object): +class Test_file_walk(): @pytest.fixture def files(self, tmp_path):