Hmmm - I had a go at #220 and #227. I added tests for both in `test_formatter.py`, and the new code passes them. A few thoughts & comments:
`black` now (i.e., for python 3.12) just ignores f-strings, i.e. doesn't reformat them, i.e. will let you do both (e.g., `f"{1 + 2}"` and `f"{1+2}"`).

One way to mimic this with `tokenize` in python 3.12 would be to basically ignore everything between `tokenize.FSTRING_START` and `tokenize.FSTRING_END`, extract the literal f-string that occurs there as a string, and just add that string to the buffer. That breaks our whole 'philosophy' though, of parsing tokens one by one and processing them through the `spacing_triggers` dictionary (in `syntax.py`).
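For illustration, here is a minimal sketch of that "pass f-strings through verbatim" idea (not the approach taken here). It assumes the `FSTRING_START`/`FSTRING_END` tokens added in Python 3.12, with a fallback for earlier versions where an f-string arrives as a single `STRING` token; only plain `f"..."`/`f'...'` prefixes are handled:

```python
import io
import token
import tokenize

# FSTRING_START / FSTRING_END only exist from Python 3.12 onwards.
FSTRING_START = getattr(token, "FSTRING_START", None)
FSTRING_END = getattr(token, "FSTRING_END", None)


def extract_fstrings(source: str) -> list[str]:
    """Return the literal text of each top-level f-string in `source`.

    On Python >= 3.12 an f-string is a run of FSTRING_START ... FSTRING_END
    tokens; before 3.12 it is a single STRING token.
    """
    lines = source.splitlines(keepends=True)
    found, depth, start = [], 0, None
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if FSTRING_START is not None and tok.type == FSTRING_START:
            depth += 1  # depth handles f-strings nested inside f-strings
            if depth == 1:
                start = tok.start
        elif FSTRING_END is not None and tok.type == FSTRING_END:
            depth -= 1
            if depth == 0:
                found.append(_text_between(lines, start, tok.end))
        elif (tok.type == token.STRING and depth == 0
              and tok.string[:2].lower() in ('f"', "f'")):
            # Pre-3.12 path: the whole f-string is one STRING token.
            found.append(tok.string)
    return found


def _text_between(lines, start, end):
    """Slice the source text between two (row, col) token positions."""
    (srow, scol), (erow, ecol) = start, end
    if srow == erow:
        return lines[srow - 1][scol:ecol]
    chunk = [lines[srow - 1][scol:]] + lines[srow:erow - 1] + [lines[erow - 1][:ecol]]
    return "".join(chunk)
```

The extracted literal could then be appended to the output buffer untouched, which is exactly the token-by-token processing we'd be giving up.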
I now check whether we are inside an f-string, both when performing python code parsing (in `parser.py`) and snakemake parameter parsing (in `syntax.py`); if we are, we use the 'reduced spacing' syntax (dictionary `fstring_spacing_triggers` inside `syntax.py`).
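To make the "two trigger tables" idea concrete, here is a hypothetical, heavily reduced sketch; the names mirror the description above, but the real dictionaries in `syntax.py` are richer:

```python
import token

# Hypothetical stand-ins for the tables in syntax.py: outside f-strings,
# pad these operators with spaces; inside f-strings, pad nothing.
spacing_triggers = {token.OP: {"+", "-", "*", "=", "==", "<", ">"}}
fstring_spacing_triggers = {token.OP: set()}


def active_triggers(in_fstring: bool) -> dict:
    """Pick the spacing table based on whether we are inside an f-string."""
    return fstring_spacing_triggers if in_fstring else spacing_triggers


def wants_space(tok_type: int, tok_string: str, in_fstring: bool) -> bool:
    """True if this token should be padded with spaces in the current context."""
    return tok_string in active_triggers(in_fstring).get(tok_type, set())
```

The point of keeping two tables rather than a boolean bypass is that the reduced table can still trigger spacing for a chosen subset of tokens later, if we ever want that.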
As a result, we reformat `f"{1 + 2}"` as `f"{1+2}"` inside f-strings, from python >= 3.12. Before 3.12, we accept both `f"{1 + 2}"` and `f"{1+2}"`; after, we reformat to the latter. I think that's fine, as `black` is fine with either; we aren't, because we parse f-strings rather than ignoring them.

Let's see how robust this is as we move forwards...
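As a toy illustration of the resulting before/after behaviour, a regex sketch (not snakefmt's token-based implementation; nested braces and format specs are deliberately not handled):

```python
import re


def tighten_fstring(literal: str) -> str:
    """Collapse spaces around binary operators inside the {...} replacement
    fields of an f-string literal, mimicking the 'reduced spacing' outcome.
    Regex sketch only: flat fields, no nested braces or format specs.
    """
    def tighten(match: re.Match) -> str:
        # Strip whitespace around +, -, *, /, % within the field only.
        expr = re.sub(r"\s*([+\-*/%])\s*", r"\1", match.group(1))
        return "{" + expr + "}"

    return re.sub(r"\{([^{}]*)\}", tighten, literal)
```

For example, `tighten_fstring('f"{1 + 2}"')` yields the `f"{1+2}"` form described above, while text outside the replacement fields is left alone.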