Commit

Merge pull request #16851 from afuetterer/fix-typos
chore: fix typos
nsoranzo authored Oct 13, 2023
2 parents 57985f1 + 347e7b8 commit 80cc5dd
Showing 22 changed files with 33 additions and 31 deletions.
6 changes: 3 additions & 3 deletions client/src/schema/schema.ts
@@ -6914,7 +6914,7 @@ export interface components {
 * Content
 * @description Depending on the `source` it can be:
 * - The encoded id of the source library dataset
-* - The encoded id of the the HDA
+* - The encoded id of the HDA
 *
 * @example 0123456789ABCDEF
 */
@@ -18140,10 +18140,10 @@ export interface operations {
 * : The stored workflow's owner's username. (The tag `u` can be used a short hand alias for this tag to filter on this attribute.)
 *
 * `is:published`
-* : Include only published workflows in the final result. Be sure the the query parameter `show_published` is set to `true` if to include all published workflows and not just the requesting user's.
+* : Include only published workflows in the final result. Be sure the query parameter `show_published` is set to `true` if to include all published workflows and not just the requesting user's.
 *
 * `is:share_with_me`
-* : Include only workflows shared with the requesting user. Be sure the the query parameter `show_shared` is set to `true` if to include shared workflows.
+* : Include only workflows shared with the requesting user. Be sure the query parameter `show_shared` is set to `true` if to include shared workflows.
 *
 * ## Free Text
 *
2 changes: 1 addition & 1 deletion doc/source/admin/container_resolvers.rst
@@ -364,7 +364,7 @@ admins might want to control.
 
 .. note::
 
-For the the execution of jobs Galaxy already implement the `support for using
+For the execution of jobs Galaxy already implement the `support for using
 tarballs of container images
 <https://github.com/galaxyproject/galaxy/blob/c517e805771cc16807dfe675075a13fe6343f01f/lib/galaxy/tool_util/deps/container_classes.py#L319>`_.
 from ``container_image_cache_path`` (set in galaxy.yml) or the destination
4 changes: 2 additions & 2 deletions doc/source/admin/jobs.md
@@ -38,7 +38,7 @@ workers
 
 The `<handlers>` configuration elements defines which Galaxy server processes (when [running multiple server processes](scaling.md)) should be used for running jobs, and how to group those processes.
 
-The handlers configuration may define a ``default`` attribute. This is the the handler(s) that should be used if no explicit handler is defined for a job. If unset, any untagged handlers will be used by default.
+The handlers configuration may define a ``default`` attribute. This is the handler(s) that should be used if no explicit handler is defined for a job. If unset, any untagged handlers will be used by default.
 
 The collection contains `<handler>` elements.
 
@@ -316,7 +316,7 @@ The above examples demonstrate that the dynamic job destination framework will p
 
 Also available though less likely useful are ``job_id``.
 
-The above examples demonstrated mapping one tool to one function. Multiple tools may be mapped to the same function, by specifying a function the the dynamic destination:
+The above examples demonstrated mapping one tool to one function. Multiple tools may be mapped to the same function, by specifying a function the dynamic destination:
 
 ```xml
 <destination id="blast_dynamic" runner="dynamic">
2 changes: 1 addition & 1 deletion doc/source/admin/scaling.md
@@ -534,7 +534,7 @@ Environment=VIRTUAL_ENV=/srv/galaxy/venv PATH=/srv/galaxy/venv/bin:/usr/local/sb
 WantedBy=multi-user.target
 ```
 
-We can now enable and start the the Galaxy services with systemd:
+We can now enable and start the Galaxy services with systemd:
 
 ```console
 # systemctl enable galaxy
2 changes: 1 addition & 1 deletion lib/galaxy/dependencies/update_lint_requirements.sh
@@ -3,7 +3,7 @@
 set -e
 
 # This script updates the pinned requirements for linting.
-# The lint requirements are split from the the other ones due to incompatible
+# The lint requirements are split from the other ones due to incompatible
 # dependencies: flake8 requires importlib-metadata 4.2, while the Galaxy's core
 # dependencies depend on importlib-metadata 4.13.
 
2 changes: 1 addition & 1 deletion lib/galaxy/jobs/runners/pbs.py
@@ -122,7 +122,7 @@ def url_to_destination(self, url):
 if not url:
 return
 
-# Determine the the PBS server
+# Determine the PBS server
 url_split = url.split("/")
 server = url_split[2]
 if server == "":
2 changes: 1 addition & 1 deletion lib/galaxy/managers/library_datasets.py
@@ -252,7 +252,7 @@ def serialize(self, trans, ld):
 rval["is_unrestricted"] = trans.app.security_agent.dataset_is_public(ldda.dataset)
 rval["tags"] = trans.tag_handler.get_tags_list(ldda.tags)
 
-# Manage dataset permission is always attached to the dataset itself, not the the ld or ldda to maintain consistency
+# Manage dataset permission is always attached to the dataset itself, not the ld or ldda to maintain consistency
 rval["can_user_manage"] = trans.user_is_admin or trans.app.security_agent.can_manage_dataset(
 current_user_roles, ldda.dataset
 )
2 changes: 1 addition & 1 deletion lib/galaxy/managers/workflows.py
@@ -339,7 +339,7 @@ def check_security(self, trans, has_workflow, check_ownership=True, check_access
 
 # If given an invocation verify ownership of invocation
 if isinstance(has_workflow, model.WorkflowInvocation):
-# We use the the owner of the history that is associated to the invocation as a proxy
+# We use the owner of the history that is associated to the invocation as a proxy
 # for the owner of the invocation.
 if trans.user != has_workflow.history.user and not trans.user_is_admin:
 raise exceptions.ItemOwnershipException()
2 changes: 1 addition & 1 deletion lib/galaxy/schema/schema.py
@@ -3494,7 +3494,7 @@ class MaterializeDatasetInstanceAPIRequest(Model):
 description=(
 "Depending on the `source` it can be:\n"
 "- The encoded id of the source library dataset\n"
-"- The encoded id of the the HDA\n"
+"- The encoded id of the HDA\n"
 ),
 )
 
2 changes: 1 addition & 1 deletion lib/galaxy/schema/tasks.py
@@ -107,7 +107,7 @@ class MaterializeDatasetInstanceTaskRequest(BaseModel):
 description=(
 "Depending on the `source` it can be:\n"
 "- The encoded id of the source library dataset\n"
-"- The encoded id of the the HDA\n"
+"- The encoded id of the HDA\n"
 ),
 )
 
@@ -71,7 +71,7 @@ def build_repository_dependency_relationships(self, repo_info_dicts, tool_shed_r
 components_list = repository_util.extract_components_from_tuple(repository_components_tuple)
 d_toolshed, d_name, d_owner, d_changeset_revision = components_list[0:4]
 for tsr in tool_shed_repositories:
-# Get the the tool_shed_repository defined by name, owner and changeset_revision. This is
+# Get the tool_shed_repository defined by name, owner and changeset_revision. This is
 # the repository that will be dependent upon each of the tool shed repositories contained in
 # val. We'll need to check tool_shed_repository.tool_shed as well if/when repository dependencies
 # across tool sheds is supported.
@@ -98,7 +98,7 @@ def build_repository_dependency_relationships(self, repo_info_dicts, tool_shed_r
 rd_prior_installation_required,
 rd_only_if_compiling_contained_td,
 ) = common_util.parse_repository_dependency_tuple(repository_dependency_components_list)
-# Get the the tool_shed_repository defined by rd_name, rd_owner and rd_changeset_revision. This
+# Get the tool_shed_repository defined by rd_name, rd_owner and rd_changeset_revision. This
 # is the repository that will be required by the current d_repository.
 # TODO: Check tool_shed_repository.tool_shed as well when repository dependencies across tool sheds is supported.
 for tsr in tool_shed_repositories:
2 changes: 1 addition & 1 deletion lib/galaxy/tool_util/deps/resolvers/__init__.py
@@ -32,7 +32,7 @@ class DependencyResolver(Dictifiable, metaclass=ABCMeta):
 "can_uninstall_dependencies",
 "read_only",
 ]
-# A "simple" dependency is one that does not depend on the the tool
+# A "simple" dependency is one that does not depend on the tool
 # resolving the dependency. Classic tool shed dependencies are non-simple
 # because the repository install context is used in dependency resolution
 # so the same requirement tags in different tools will have very different
4 changes: 2 additions & 2 deletions lib/galaxy/webapps/galaxy/api/workflows.py
@@ -1174,11 +1174,11 @@ def __encode_invocation(self, invocation, **kwd):
 IndexQueryTag("user", "The stored workflow's owner's username.", "u"),
 IndexQueryTag(
 "is:published",
-"Include only published workflows in the final result. Be sure the the query parameter `show_published` is set to `true` if to include all published workflows and not just the requesting user's.",
+"Include only published workflows in the final result. Be sure the query parameter `show_published` is set to `true` if to include all published workflows and not just the requesting user's.",
 ),
 IndexQueryTag(
 "is:share_with_me",
-"Include only workflows shared with the requesting user. Be sure the the query parameter `show_shared` is set to `true` if to include shared workflows.",
+"Include only workflows shared with the requesting user. Be sure the query parameter `show_shared` is set to `true` if to include shared workflows.",
 ),
 ]
 
4 changes: 1 addition & 3 deletions lib/galaxy/webapps/galaxy/controllers/dataset.py
@@ -815,9 +815,7 @@ def _purge(self, trans, dataset_id):
 if hda.dataset.user_can_purge:
 try:
 hda.dataset.full_delete()
-trans.log_event(
-f"Dataset id {hda.dataset.id} has been purged upon the the purge of HDA id {hda.id}"
-)
+trans.log_event(f"Dataset id {hda.dataset.id} has been purged upon the purge of HDA id {hda.id}")
 trans.sa_session.add(hda.dataset)
 except Exception:
 log.exception(f"Unable to purge dataset ({hda.dataset.id}) on purge of HDA ({hda.id}):")
2 changes: 1 addition & 1 deletion lib/galaxy/webapps/galaxy/controllers/history.py
@@ -658,7 +658,7 @@ def purge_deleted_datasets(self, trans):
 try:
 hda.dataset.full_delete()
 trans.log_event(
-f"Dataset id {hda.dataset.id} has been purged upon the the purge of HDA id {hda.id}"
+f"Dataset id {hda.dataset.id} has been purged upon the purge of HDA id {hda.id}"
 )
 trans.sa_session.add(hda.dataset)
 except Exception:
2 changes: 1 addition & 1 deletion lib/galaxy_test/selenium/test_workflow_editor.py
@@ -1175,7 +1175,7 @@ def workflow_editor_source_sink_terminal_ids(self, source, sink):
 def workflow_editor_add_input(self, item_name="data_input"):
 editor = self.components.workflow_editor
 
-# Make sure we're on the the workflow editor and not clicking the main tool panel.
+# Make sure we're on the workflow editor and not clicking the main tool panel.
 editor.canvas_body.wait_for_visible()
 
 editor.tool_menu.wait_for_visible()
@@ -6,14 +6,16 @@
 )
 
 matplotlib_repository_name = "package_matplotlib_1_2_0170"
-matplotlib_repository_description = "Contains a tool dependency definition that downloads and compiles version 1.2.x of the the python matplotlib package."
+matplotlib_repository_description = (
+"Contains a tool dependency definition that downloads and compiles version 1.2.x of the python matplotlib package."
+)
 matplotlib_repository_long_description = (
 "This repository is intended to be defined as a complex repository dependency within a separate repository."
 )
 
 numpy_repository_name = "package_numpy_1_7_0170"
 numpy_repository_description = (
-"Contains a tool dependency definition that downloads and compiles version 1.7 of the the python numpy package."
+"Contains a tool dependency definition that downloads and compiles version 1.7 of the python numpy package."
 )
 numpy_repository_long_description = (
 "This repository is intended to be defined as a complex repository dependency within a separate repository."
@@ -6,14 +6,16 @@
 )
 
 matplotlib_repository_name = "package_matplotlib_1_2_0170"
-matplotlib_repository_description = "Contains a tool dependency definition that downloads and compiles version 1.2.x of the the python matplotlib package."
+matplotlib_repository_description = (
+"Contains a tool dependency definition that downloads and compiles version 1.2.x of the python matplotlib package."
+)
 matplotlib_repository_long_description = (
 "This repository is intended to be defined as a complex repository dependency within a separate repository."
 )
 
 numpy_repository_name = "package_numpy_1_7_0170"
 numpy_repository_description = (
-"Contains a tool dependency definition that downloads and compiles version 1.7 of the the python numpy package."
+"Contains a tool dependency definition that downloads and compiles version 1.7 of the python numpy package."
 )
 numpy_repository_long_description = (
 "This repository is intended to be defined as a complex repository dependency within a separate repository."
2 changes: 1 addition & 1 deletion test/unit/app/tools/test_parameter_parsing.py
@@ -50,7 +50,7 @@ class TestParameterParsing(BaseParameterTestCase):
 those tests may need to be updated anyway.
 It occurs to me that rewriting this stuff to test to_dict would
-be much better - since that is a public API of the the tools.
+be much better - since that is a public API of the tools.
 """
 
 def test_parse_help_and_label(self):
@@ -636,7 +636,7 @@ nglims_config_file = tool-data/nglims.yaml
 # set that here.
 #dynamic_proxy_session_map=database/session_map.sqlite
 
-# Set the port and IP for the the dynamic proxy to bind to, this must match
+# Set the port and IP for the dynamic proxy to bind to, this must match
 # the external configuration if dynamic_proxy_manage is False.
 #dynamic_proxy_bind_port=8800
 #dynamic_proxy_bind_ip=0.0.0.0
@@ -636,7 +636,7 @@ nglims_config_file = tool-data/nglims.yaml
 # set that here.
 #dynamic_proxy_session_map=database/session_map.sqlite
 
-# Set the port and IP for the the dynamic proxy to bind to, this must match
+# Set the port and IP for the dynamic proxy to bind to, this must match
 # the external configuration if dynamic_proxy_manage is False.
 #dynamic_proxy_bind_port=8800
 #dynamic_proxy_bind_ip=0.0.0.0
2 changes: 1 addition & 1 deletion tools/sr_mapping/PerM.xml
@@ -352,7 +352,7 @@ The command-line options::
 --ignoreQS Ignore the quality scores in fastq or QUAL files.
 --seed {F2 | S11 | F3 | F4} Specify the seed pattern, which has a specific full sensitivity. Check the algorithm page (link below) for seed patterns to balance the sensitivity and running time.
 --readFormat {fasta | fastq | csfasta | csfastq} Read in reads in the specified format, instead of guessing according to the extension name.
---delimiter CHAR Which is a character used as the delimiter to separate the the read id, and the additional info in the line with ">" in fasta or csfasta.
+--delimiter CHAR Which is a character used as the delimiter to separate the read id, and the additional info in the line with ">" in fasta or csfasta.
 
 Paired reads options::
 
