🔀 MERGE: v1.6.5 release (#5075)

chrisjsewell authored Aug 13, 2021
2 parents ec3984d + 626a482 commit ec97fcf
Showing 42 changed files with 888 additions and 263 deletions.
5 changes: 5 additions & 0 deletions .pre-commit-config.yaml
@@ -1,3 +1,8 @@
ci:
autoupdate_schedule: monthly
autofix_prs: true
skip: [mypy, pylint, dm-generate-all, pyproject, dependencies, verdi-autodocs, version-number]

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v2.5.0
30 changes: 30 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,35 @@
# Changelog

## v1.6.5 - 2021-08-13

[full changelog](https://github.com/aiidateam/aiida-core/compare/v1.6.4...v1.6.5)

This patch release contains a number of helpful bug fixes and improvements.

### Improvements 👌

- Add support for the `ProxyJump` SSH config option for setting up an arbitrary number of proxy jumps without additional processes by creating TCP channels over existing SSH connections.
This provides improved control over the lifetime of the different connections.
See [SSH configuration](docs/source/howto/ssh.rst) for further details. [[#4951]](https://github.com/aiidateam/aiida-core/pull/4951)
- Allow numpy arrays to be serialized to a process checkpoint. [[#4730]](https://github.com/aiidateam/aiida-core/pull/4730)
- Add the `_merge` method to `ProcessBuilder`, to update the builder with a nested dictionary. [[#4983]](https://github.com/aiidateam/aiida-core/pull/4983)
- `verdi setup`: Set the default database hostname to `localhost`. [[#4908]](https://github.com/aiidateam/aiida-core/pull/4908)
- Allow a `Node` instance to be constructed with a specific `User`. [[#4977]](https://github.com/aiidateam/aiida-core/pull/4977)
- Minimize database logs of failed schema version retrievals. [[#5056]](https://github.com/aiidateam/aiida-core/pull/5056)
- Remove duplicate call of normal `callback` for `InteractiveOption`. [[#5064]](https://github.com/aiidateam/aiida-core/pull/5064)
- Update requirement `pyyaml~=5.4`, which contains critical security fixes. [[#5060]](https://github.com/aiidateam/aiida-core/pull/5060)
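The `ProxyJump` support above maps onto the standard OpenSSH client option of the same name. A minimal sketch of an `~/.ssh/config` entry for a machine behind one gateway (all hostnames and the user name are placeholders):

```
Host supercomputer
    HostName supercomputer.example.com
    User max
    ProxyJump max@gateway.example.com
```

Several hops can be chained by giving `ProxyJump` a comma-separated list of jump hosts.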

### Bug Fixes 🐛

- Fix regression issue with `__contains__` operator in `LinkManager`, when using double underscores, e.g. for `'some__nested__namespace' in calc.inputs`. [[#5067]](https://github.com/aiidateam/aiida-core/pull/5067)
- Stop deprecation warning being shown when tab-completing incoming and outgoing node links. [[#5011]](https://github.com/aiidateam/aiida-core/pull/5011)
- Stop possible command hints being shown when attempting to tab complete `verdi` commands that do not exist. [[#5012]](https://github.com/aiidateam/aiida-core/pull/5012)
- Do not use `get_detailed_job_info` when retrieving a calculation job, if no job id is set. [[#4967]](https://github.com/aiidateam/aiida-core/pull/4967)
- Fix a race condition when two processes try to create the same `Folder`/`SandboxFolder`. [[#4912]](https://github.com/aiidateam/aiida-core/pull/4912)
- Return the whole nested namespace when using `BaseRestartWorkChain.result`. [[#4961]](https://github.com/aiidateam/aiida-core/pull/4961)
- Use `numpy.nanmin` and `numpy.nanmax` for computing y-limits of `BandsData` matplotlib methods. [[#5024]](https://github.com/aiidateam/aiida-core/pull/5024)
- Use sanitized job title with `SgeScheduler` scheduler. [[#4994]](https://github.com/aiidateam/aiida-core/pull/4994)
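The `LinkManager` regression fix above concerns membership tests whose key contains double underscores, which denote nested namespaces. A standalone sketch of that nested lookup on plain dictionaries (the `contains_nested` helper is illustrative, not aiida's actual implementation):

```python
def contains_nested(namespace, key):
    """Check membership where '__' separates nested namespace levels."""
    head, sep, tail = key.partition('__')
    if head not in namespace:
        return False
    if not sep:  # no separator left: plain top-level membership
        return True
    sub = namespace[head]
    # Only descend when the intermediate value is itself a mapping.
    return isinstance(sub, dict) and contains_nested(sub, tail)


inputs = {'some': {'nested': {'namespace': 42}}}
assert contains_nested(inputs, 'some__nested__namespace')
assert not contains_nested(inputs, 'some__nested__missing')
```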

## v1.6.4 - 2021-06-23

[full changelog](https://github.com/aiidateam/aiida-core/compare/v1.6.3...v1.6.4)
2 changes: 1 addition & 1 deletion aiida/__init__.py
@@ -31,7 +31,7 @@
'For further information please visit http://www.aiida.net/. All rights reserved.'
)
__license__ = 'MIT license, see LICENSE.txt file.'
__version__ = '1.6.4'
__version__ = '1.6.5'
__authors__ = 'The AiiDA team.'
__paper__ = (
'S. P. Huber et al., "AiiDA 1.0, a scalable computational infrastructure for automated reproducible workflows and '
18 changes: 10 additions & 8 deletions aiida/backends/djsite/manager.py
@@ -87,14 +87,16 @@ def get_schema_generation_database(self):
backend = get_manager()._load_backend(schema_check=False) # pylint: disable=protected-access

try:
result = backend.execute_raw(r"""SELECT tval FROM db_dbsetting WHERE key = 'schema_generation';""")
except ProgrammingError:
result = backend.execute_raw(r"""SELECT val FROM db_dbsetting WHERE key = 'schema_generation';""")

try:
return str(int(result[0][0]))
except (IndexError, TypeError, ValueError):
except ProgrammingError:
# If this value does not exist, the schema has to correspond to the first generation which didn't actually
# record its value explicitly in the database until ``aiida-core>=1.0.0``.
return '1'
else:
try:
return str(int(result[0][0]))
except (IndexError, ValueError, TypeError):
return '1'

def get_schema_version_database(self):
"""Return the database schema version.
@@ -107,9 +109,9 @@
backend = get_manager()._load_backend(schema_check=False) # pylint: disable=protected-access

try:
result = backend.execute_raw(r"""SELECT tval FROM db_dbsetting WHERE key = 'db|schemaversion';""")
except ProgrammingError:
result = backend.execute_raw(r"""SELECT val FROM db_dbsetting WHERE key = 'db|schemaversion';""")
except ProgrammingError:
result = backend.execute_raw(r"""SELECT tval FROM db_dbsetting WHERE key = 'db|schemaversion';""")
return result[0][0]

def set_schema_version_database(self, version):
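The reordered queries above also move the integer parsing into an `else` clause, so only the raw query is guarded by `except ProgrammingError` and parsing failures are handled separately. A standalone sketch of the pattern (`fetch` and `MissingTable` are stand-ins for the backend query and sqlalchemy's `ProgrammingError`):

```python
class MissingTable(Exception):
    """Stand-in for the ProgrammingError raised when the setting table is absent."""


def get_schema_generation(fetch):
    """Return the schema generation, defaulting to '1' when it was never recorded."""
    try:
        result = fetch()
    except MissingTable:
        # First-generation schemas never recorded the value explicitly.
        return '1'
    else:
        # Failures here are *not* swallowed by the except clause above.
        try:
            return str(int(result[0][0]))
        except (IndexError, ValueError, TypeError):
            return '1'


def query_missing():
    raise MissingTable()


assert get_schema_generation(lambda: [['2']]) == '2'
assert get_schema_generation(lambda: []) == '1'
assert get_schema_generation(query_missing) == '1'
```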
22 changes: 16 additions & 6 deletions aiida/cmdline/commands/cmd_verdi.py
@@ -8,8 +8,8 @@
# For further information please visit http://www.aiida.net #
###########################################################################
"""The main `verdi` click group."""

import difflib

import click

from aiida import __version__
@@ -52,23 +52,33 @@ def get_command(self, ctx, cmd_name):
"""
cmd = click.Group.get_command(self, ctx, cmd_name)

# return the exact match
# If we match an actual command, simply return the match
if cmd is not None:
return cmd

# If this command is called during tab-completion, we do not want to print an error message if the command can't
# be found, but instead we want to simply return here. However, in a normal command execution, we do want to
# execute the rest of this method to try and match commands that are similar in order to provide the user with
# some hints. The problem is that there is no one canonical way to determine whether the invocation is due to a
# normal command execution or a tab-complete operation. The `resilient_parsing` attribute of the `Context` is
# designed to allow things like tab-completion, however, it is not the only purpose. For now this is our best
# bet though to detect a tab-complete event. When `resilient_parsing` is switched on, we assume a tab-complete
# and do nothing in case the command name does not match an actual command.
if ctx.resilient_parsing:
return

if int(cmd_name.lower().encode('utf-8').hex(), 16) == 0x6769757365707065:
import base64
import gzip
click.echo(gzip.decompress(base64.b85decode(GIU.encode('utf-8'))).decode('utf-8'))
return None

# we might get better results with the Levenshtein distance
# or more advanced methods implemented in FuzzyWuzzy or similar libs,
# but this is an easy win for now
# We might get better results with the Levenshtein distance or more advanced methods implemented in FuzzyWuzzy
# or similar libs, but this is an easy win for now.
matches = difflib.get_close_matches(cmd_name, self.list_commands(ctx), cutoff=0.5)

if not matches:
# single letters are sometimes not matched, try with a simple startswith
# Single letters are sometimes not matched so also try with a simple startswith
matches = [c for c in sorted(self.list_commands(ctx)) if c.startswith(cmd_name)][:3]

if matches:
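The command-hint logic above uses only the standard library; a quick illustration of `difflib.get_close_matches` and the `startswith` fallback, with an abbreviated command list:

```python
import difflib

commands = ['status', 'setup', 'shell', 'process', 'profile']

# A near-miss query yields suggestions, best match first.
matches = difflib.get_close_matches('statsu', commands, cutoff=0.5)
assert matches[0] == 'status'

# Single letters often match nothing at this cutoff, hence the
# simple startswith fallback.
assert difflib.get_close_matches('p', commands, cutoff=0.5) == []
assert [c for c in sorted(commands) if c.startswith('p')][:3] == ['process', 'profile']
```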
2 changes: 1 addition & 1 deletion aiida/cmdline/params/options/__init__.py
@@ -249,7 +249,7 @@ def decorator(command):
'--db-host',
type=types.HostnameType(),
help='Database server host. Leave empty for "peer" authentication.',
default=DEFAULT_DBINFO['host']
default='localhost'
)

DB_PORT = OverridableOption(
2 changes: 1 addition & 1 deletion aiida/cmdline/params/options/commands/setup.py
@@ -259,7 +259,7 @@ def get_quicksetup_password(ctx, param, value): # pylint: disable=unused-argume

SETUP_DATABASE_HOSTNAME = QUICKSETUP_DATABASE_HOSTNAME.clone(
prompt='Database host',
contextual_default=functools.partial(get_profile_attribute_default, ('database_hostname', DEFAULT_DBINFO['host'])),
contextual_default=functools.partial(get_profile_attribute_default, ('database_hostname', 'localhost')),
cls=options.interactive.InteractiveOption
)

10 changes: 5 additions & 5 deletions aiida/cmdline/params/options/interactive.py
@@ -271,11 +271,11 @@ def prompt_callback(self, ctx, param, value):

# If we are here, we are in interactive mode and the parameter is not specified
# We enter the prompt loop
value = self.prompt_loop(ctx, param, value)
else:
# There is a prompt_fn function and it returned False (i.e. we should not ask for this value)
# We then set the value to None
value = None
return self.prompt_loop(ctx, param, value)

# There is a prompt_fn function and it returned False (i.e. we should not ask for this value)
# We then set the value to None
value = None

# And then we call the callback
return self.after_callback(ctx, param, value)
9 changes: 3 additions & 6 deletions aiida/common/folders.py
@@ -342,8 +342,7 @@ def create(self):
It is always safe to call it, it will do nothing if the folder
already exists.
"""
if not self.exists():
os.makedirs(self.abspath, mode=self.mode_dir)
os.makedirs(self.abspath, mode=self.mode_dir, exist_ok=True)

def replace_with_folder(self, srcdir, move=False, overwrite=False):
"""This routine copies or moves the source folder 'srcdir' to the local folder pointed to by this Folder.
@@ -370,8 +369,7 @@

# Create parent dir, if needed, with the right mode
pardir = os.path.dirname(self.abspath)
if not os.path.exists(pardir):
os.makedirs(pardir, mode=self.mode_dir)
os.makedirs(pardir, mode=self.mode_dir, exist_ok=True)

if move:
shutil.move(srcdir, self.abspath)
@@ -417,8 +415,7 @@ def __init__(self, sandbox_in_repo=True):
# First check if the sandbox folder already exists
if sandbox_in_repo:
sandbox = os.path.join(get_profile().repository_path, 'sandbox')
if not os.path.exists(sandbox):
os.makedirs(sandbox)
os.makedirs(sandbox, exist_ok=True)
abspath = tempfile.mkdtemp(dir=sandbox)
else:
abspath = tempfile.mkdtemp()
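The folder changes above replace the racy check-then-create idiom with `os.makedirs(..., exist_ok=True)`, which tolerates a directory being created concurrently by another process. A minimal demonstration (note that the `mode` argument is ignored when the directory already exists):

```python
import os
import tempfile

base = tempfile.mkdtemp()
target = os.path.join(base, 'sandbox')

# Check-then-create is racy: another process may create the directory
# between an exists() check and makedirs(), making the latter raise.
os.makedirs(target)

# With exist_ok=True a second call is a no-op instead of FileExistsError.
os.makedirs(target, exist_ok=True)
assert os.path.isdir(target)
```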
2 changes: 1 addition & 1 deletion aiida/engine/persistence.py
@@ -121,7 +121,7 @@ def load_checkpoint(self, pid: Hashable, tag: Optional[str] = None) -> plumpy.pe
raise PersistenceError(f'Calculation<{calculation.pk}> does not have a saved checkpoint')

try:
bundle = serialize.deserialize(checkpoint)
bundle = serialize.deserialize_unsafe(checkpoint)
except Exception:
raise PersistenceError(f'Failed to load the checkpoint for process<{pid}>: {traceback.format_exc()}')

58 changes: 48 additions & 10 deletions aiida/engine/processes/builder.py
@@ -41,7 +41,7 @@ def __init__(self, port_namespace: PortNamespace) -> None:
self._valid_fields = []
self._data = {}

# The name and port objects have to be passed to the defined functions as defaults for
# their arguments, because this way the content at the time of defining the method is
# saved. If they are used directly in the body, it will try to capture the value from
# its enclosing scope at the time of being called.
@@ -83,16 +83,28 @@ def __setattr__(self, attr: str, value: Any) -> None:
else:
try:
port = self._port_namespace[attr]
except KeyError:
except KeyError as exception:
if not self._port_namespace.dynamic:
raise AttributeError(f'Unknown builder parameter: {attr}')
raise AttributeError(f'Unknown builder parameter: {attr}') from exception
port = None # type: ignore[assignment]
else:
value = port.serialize(value) # type: ignore[union-attr]
validation_error = port.validate(value)
if validation_error:
raise ValueError(f'invalid attribute value {validation_error.message}')

self._data[attr] = value
# If the attribute that is being set corresponds to a port that is a ``PortNamespace`` we need to make sure
# that the nested value remains a ``ProcessBuilderNamespace``. Otherwise, the nested namespaces will become
# plain dictionaries and no longer have the properties of the ``ProcessBuilderNamespace`` that provide all
# the autocompletion and validation when values are being set. Therefore we first construct a new instance
# of a ``ProcessBuilderNamespace`` for the port of the attribute that is being set and than iteratively set
# all the values within the mapping that is being assigned to the attribute.
if isinstance(port, PortNamespace):
self._data[attr] = ProcessBuilderNamespace(port)
for sub_key, sub_value in value.items():
setattr(self._data[attr], sub_key, sub_value)
else:
self._data[attr] = value

def __repr__(self):
return self._data.__repr__()
@@ -119,19 +131,45 @@ def __delitem__(self, item):
def __delattr__(self, item):
self._data.__delitem__(item)

def _update(self, *args, **kwds):
"""Update the values of the builder namespace passing a mapping as argument or individual keyword value pairs.
def _recursive_merge(self, dictionary, key, value):
"""Recursively merge the contents of ``dictionary`` setting its ``key`` to ``value``."""
if isinstance(value, collections.abc.Mapping):
for inner_key, inner_value in value.items():
self._recursive_merge(dictionary[key], inner_key, inner_value)
else:
dictionary[key] = value

def _merge(self, *args, **kwds):
"""Merge the content of a dictionary or keyword arguments into the namespace.
.. note:: This method differs in behavior from ``_update`` in that ``_merge`` will recursively update the
existing dictionary with the one that is specified in the arguments. The ``_update`` method will merge only
the keys on the top level, but any lower lying nested namespace will be replaced entirely.
The method is prefixed with an underscore in order to not reserve the name for a potential port.
:param args: a single mapping that should be mapped on the namespace.
:param kwds: keyword value pairs that should be mapped onto the ports.
"""
if len(args) > 1:
raise TypeError(f'update expected at most 1 arguments, got {int(len(args))}')

if args:
for key, value in args[0].items():
self._recursive_merge(self, key, value)

for key, value in kwds.items():
self._recursive_merge(self, key, value)

def _update(self, *args, **kwds):
"""Update the values of the builder namespace passing a mapping as argument or individual keyword value pairs.
The method functions just as `collections.abc.MutableMapping.update` and is merely prefixed with an underscore
in order to not reserve the name for a potential port.
:param args: a single mapping that should be mapped on the namespace.
:param kwds: keyword value pairs that should be mapped onto the ports.
"""
if args:
for key, value in args[0].items():
if isinstance(value, collections.abc.Mapping):
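A standalone sketch of the recursive merge semantics that `_merge` introduces, using plain dictionaries instead of `ProcessBuilderNamespace` (illustrative only, not aiida's class):

```python
import collections.abc


def recursive_merge(dictionary, key, value):
    """Set ``dictionary[key]`` to ``value``, recursing into mappings instead of replacing them."""
    if isinstance(value, collections.abc.Mapping):
        for inner_key, inner_value in value.items():
            recursive_merge(dictionary[key], inner_key, inner_value)
    else:
        dictionary[key] = value


builder = {'base': {'parameters': {'kpoints': [2, 2, 2], 'ecutwfc': 30}}}
recursive_merge(builder, 'base', {'parameters': {'ecutwfc': 40}})

# Unlike a plain top-level update, the sibling key 'kpoints' survives.
assert builder == {'base': {'parameters': {'kpoints': [2, 2, 2], 'ecutwfc': 40}}}
```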
4 changes: 4 additions & 0 deletions aiida/engine/processes/calcjobs/tasks.py
@@ -256,6 +256,10 @@ async def do_retrieve():
scheduler = node.computer.get_scheduler() # type: ignore[union-attr]
scheduler.set_transport(transport)

if node.get_job_id() is None:
logger.warning(f'there is no job id for CalcJobNode<{node.pk}>: skipping `get_detailed_job_info`')
return execmanager.retrieve_calculation(node, transport, retrieved_temporary_folder)

try:
detailed_job_info = scheduler.get_detailed_job_info(node.get_job_id())
except FeatureNotAvailable:
2 changes: 1 addition & 1 deletion aiida/engine/processes/process.py
@@ -604,7 +604,7 @@ def decode_input_args(self, encoded: str) -> Dict[str, Any]: # pylint: disable=
:param encoded: encoded (serialized) inputs
:return: The decoded input args
"""
return serialize.deserialize(encoded)
return serialize.deserialize_unsafe(encoded)

def update_node_state(self, state: plumpy.process_states.State) -> None:
self.update_outputs()
6 changes: 4 additions & 2 deletions aiida/engine/processes/workchains/restart.py
@@ -303,11 +303,13 @@ def results(self) -> Optional['ExitCode']:

self.report(f'work chain completed after {self.ctx.iteration} iterations')

exposed_outputs = self.exposed_outputs(node, self.process_class)

for name, port in self.spec().outputs.items():

try:
output = node.get_outgoing(link_label_filter=name).one().node
except ValueError:
output = exposed_outputs[name]
except KeyError:
if port.required:
self.report(f"required output '{name}' was not an output of {self.ctx.process_name}<{node.pk}>")
else:
2 changes: 1 addition & 1 deletion aiida/manage/manager.py
@@ -243,7 +243,7 @@ def create_communicator(
if with_orm:
from aiida.orm.utils import serialize
encoder = functools.partial(serialize.serialize, encoding='utf-8')
decoder = serialize.deserialize
decoder = serialize.deserialize_unsafe
else:
# used by verdi status to get a communicator without needing to load the dbenv
from aiida.common import json
7 changes: 7 additions & 0 deletions aiida/manage/tests/pytest_fixtures.py
@@ -70,6 +70,13 @@ def clear_database_before_test(aiida_profile):
yield


@pytest.fixture(scope='class')
def clear_database_before_test_class(aiida_profile):
"""Clear the database before a test class."""
aiida_profile.reset_db()
yield


@pytest.fixture(scope='function')
def temporary_event_loop():
"""Create a temporary loop for independent test case"""
4 changes: 2 additions & 2 deletions aiida/orm/nodes/data/array/bands.py
@@ -784,9 +784,9 @@ def _matplotlib_get_dict(

# axis limits
if y_max_lim is None:
y_max_lim = numpy.array(bands).max()
y_max_lim = numpy.nanmax(bands)
if y_min_lim is None:
y_min_lim = numpy.array(bands).min()
y_min_lim = numpy.nanmin(bands)
x_min_lim = min(x) # this isn't a numpy array, but a list
x_max_lim = max(x)
all_data['x_min_lim'] = x_min_lim
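The switch to the NaN-aware reductions above matters when a bands array contains NaN entries: plain `max`/`min` propagate NaN, producing unusable plot limits. Assuming `numpy` is installed:

```python
import numpy

bands = numpy.array([[1.0, 2.0], [numpy.nan, 3.0]])

# The old y-limit computation propagates NaN...
assert numpy.isnan(bands.max())

# ...while the NaN-aware reductions ignore the missing entries.
assert numpy.nanmin(bands) == 1.0
assert numpy.nanmax(bands) == 3.0
```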
2 changes: 1 addition & 1 deletion aiida/orm/nodes/node.py
@@ -185,7 +185,7 @@ def __init__(
raise ValueError('the computer is not stored')

computer = computer.backend_entity if computer else None
user = user.backend_entity if user else User.objects(backend).get_default()
user = user if user else User.objects(backend).get_default()

if user is None:
raise ValueError('the user cannot be None')