From e778db59ce79e642cb26aafe483d0d16e2eabb0f Mon Sep 17 00:00:00 2001
From: Manuel Kaufmann
Date: Thu, 4 Jan 2024 11:58:46 +0100
Subject: [PATCH] New notification system: implementation (#10922)

* Add some comments to understand the scope of the changes
* Add more comments about the plan
* Delete old code
  We are not sending any kind of notification from the Django admin.
  We used this in the past, but we are not using it anymore.
  I'm deleting it now to avoid re-implementing it with the new notification system.
* Add more comments to organize the work on the new notification system
* Notification: initial modelling
* Define `max_length=128`, which is required for SQLite (tests)
* Initial implementation for the build notifications API endpoint
* API endpoint to create build notifications
* Use APIv2 for the internal endpoint called from the builders
* Use APIv2 to create notifications from the builder
* Check the project slug when attaching to a Build/Project instance
* Add extra TODO comments
* Initial migration from Exceptions to Messages
* Disable `Notification.format_values` for now
  I'll come back to this later.
* Define all the notifications
* Messages for symlink exceptions
* Simplify how `NOTIFICATION_MESSAGES` is built
* Add `Notification.format_values` field
  This allows storing the values required for this notification at render time.
* Notification with `format_values` passed when the exception is raised
* Use `get_display_icon` to show the proper icon in the serializer
* Proper relation name
* Initial work to replace site notifications
* We are not going to create Notifications via APIv3 for now
* Send `format_values` to the exception
* Small changes and comments added
* Initial work to migrate an old SiteNotification to the new system
* Create a notification `registry` to register messages by application
* Use a custom `models.Manager` to implement de-duplication
  Use `Notification.objects.add()` to create a notification with de-duplication
  capabilities. If the notification already exists, it updates all the fields with
  the new values and sets the `modified` datetime to now.
  This commit also migrates the "Your primary email address is not verified"
  notification to the new system, since it was the one I was working on when I
  implemented the de-duplication logic.
* Small refactor to email notifications
* Render build/project notifications from Django templates
  Use a small hack on the build detail page to refresh it once the build has
  finished. This way, we don't have to re-implement the new notification system
  in the old Knockout JavaScript code and can focus only on the new dashboard.
* Attach "build.commands" in beta notification
* Initial migration pattern for YAML config errors
* Pass default settings for notifications
* Organization disabled notification
* Email not verified notification
* Project skipped notification
* Missing file for NotificationQuerySet
* Migrate `oauth` app notifications to the new system
* Add small comment about notification migration
* Rename import
* Fix queryset when creating a notification via `.add()`
* Remove `SiteNotification` completely
* Rename file and remove `__init__` magic
* Remove `django-message-extends` completely
  We are not going to use it anymore with the new notification system.
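To illustrate the de-duplication behaviour described above, the sketch below shows roughly what a `Notification.objects.add()` built on a custom queryset could look like. It is only an illustration under assumptions: the field names (`attached_to_content_type`, `attached_to_id`, `message_id`, `modified`) follow the model introduced in this patch, but the exact lookup and update logic shown here is not the code from this PR.

from django.contrib.contenttypes.models import ContentType
from django.db import models
from django.utils import timezone


class NotificationQuerySet(models.QuerySet):

    """Sketch of a de-duplicating ``add()``; attach with ``NotificationQuerySet.as_manager()``."""

    def add(self, *, attached_to, message_id, **kwargs):
        content_type = ContentType.objects.get_for_model(attached_to)
        notification = self.filter(
            attached_to_content_type=content_type,
            attached_to_id=attached_to.pk,
            message_id=message_id,
        ).first()
        if notification:
            # Re-use the existing notification: update its fields and bump ``modified``.
            self.filter(pk=notification.pk).update(modified=timezone.now(), **kwargs)
            notification.refresh_from_db()
            return notification
        # No duplicate found: create a new notification attached to the object.
        return self.create(
            attached_to_content_type=content_type,
            attached_to_id=attached_to.pk,
            message_id=message_id,
            **kwargs,
        )

Called as, for example, `Notification.objects.add(attached_to=build, message_id="some-message-id", dismissable=False)`; the keyword names here are illustrative.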
* Note about user notifications in base template
* Remove commented code for debugging
* Typo (from review)
* Migrate `ConfigError` to the new notification system
* Refactor `icon` and `icon_type` into `icon_classes`
* Remove unused setting
* Update config file test cases to match changes
* Mark tests that depend on search to be skipped
* Cancel build test fixed
* Update APIv3 test cases
* Down to a few tests failing
* Raise exception properly by passing `message_id`
* More tests updated/fixed
* BuildUser/BuildAdd error
* Instantiate `BuildDirector` on `before_start`
  It makes more sense to be there than under `execute()`, since it's a
  pre-requirement. It also helps when testing.
* Update test case to pass
* Update more tests related to notifications
* Use `.add` for de-duplication
* Move the link to the message itself.
* Create message IDs property
* Use IDs to avoid notifications
* Update comments
* Attach organization on retry
* Test we are hitting APIv2 for notifications
* Remove old comment
* Remove temporary debugging link
* Remove old comment
* Update send build status celery tests
* Explain how to show notifications attached to users
* Better comment in template
* Lint
* Minor changes
* Refactor `ValidationError` to raise a proper exception/notification
* Update tests for config after refactor
* Pass `user_notifications` to templates
* More updates on config tests
* Remove debugging code
* Add context processor for user notifications
* Don't retrieve notifications for anonymous users
* Use proper exception
* Apply suggestions from code review
  Co-authored-by: Anthony
* Remove old TODO messages
* Remove `deprecated_` utils and notifications
  They are not useful anymore since it's not possible to build without a
  config file or with a v1 config file.
* Use the proper HTML/CSS for user notifications
* Message for deploy key (required in commercial)
* Migrate Docker errors to the new system
* Migrate more messages to the new notifications system
* Typo
* Fix error type
* Remove showing errors from `build.error`
  We are using the `Notification`s attached to the object now. These old
  errors are migrated to the new system by a one-time script.
* Typo in copy
* Revert "Remove showing errors from `build.error`"
  This reverts commit d4951a83d226d9b6c539cc53166e5c9d53e426f7.
  We decided to keep this code around for now.
  See https://github.com/readthedocs/readthedocs.org/issues/10980#issuecomment-1876785304
* Comment explaining the decision about `Build.error` HTML+KO code
* Use `textwrap.dedent` + `.strip()` for all messages
  Closes #10987
* Avoid breaking at rendering time
  If the message ID is not found, we return a generic message to avoid
  breaking the request, and we log an internal error so we can fix the issue.
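A rough sketch of the per-application message registry and the render-time fallback mentioned in the bullets above. The `Message` fields mirror the ones used by the messages defined later in this patch, but the registry internals and the generic fallback shown here are assumptions, not the exact implementation.

from dataclasses import dataclass

import structlog

log = structlog.get_logger(__name__)

ERROR = "error"  # stand-in for the constant from readthedocs.notifications.constants


@dataclass
class Message:
    id: str
    header: str
    body: str
    type: str


class MessagesRegistry:
    def __init__(self):
        self.messages = {}

    def add(self, messages):
        # Each application registers its own list of messages,
        # e.g. ``registry.add(messages)`` in readthedocs/config/notifications.py.
        for message in messages:
            if message.id in self.messages:
                raise ValueError(f"Message '{message.id}' is already registered.")
            self.messages[message.id] = message

    def get(self, message_id):
        message = self.messages.get(message_id)
        if message is None:
            # Avoid breaking at rendering time: log an internal error and
            # return a generic message instead of raising an exception.
            log.error("Notification message not found.", message_id=message_id)
            message = Message(
                id="generic-message-not-found",
                header="Unknown problem",
                body="There was a problem with this notification.",
                type=ERROR,
            )
        return message


registry = MessagesRegistry()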
* Apply feedback from review --------- Co-authored-by: Anthony --- common | 2 +- readthedocs/api/v2/serializers.py | 54 ++ readthedocs/api/v2/urls.py | 2 + readthedocs/api/v2/views/model_views.py | 38 ++ readthedocs/api/v3/serializers.py | 85 +++ readthedocs/api/v3/tests/mixins.py | 1 + .../responses/projects-builds-detail.json | 3 +- .../v3/tests/responses/projects-detail.json | 1 + .../projects-versions-builds-list_POST.json | 1 + readthedocs/api/v3/urls.py | 19 +- readthedocs/api/v3/views.py | 29 + readthedocs/builds/models.py | 40 +- .../builds/static-src/builds/js/detail.js | 21 +- readthedocs/builds/static/builds/js/detail.js | 2 +- readthedocs/builds/tasks.py | 24 +- readthedocs/builds/views.py | 4 +- readthedocs/config/__init__.py | 2 + readthedocs/config/config.py | 305 +++-------- readthedocs/config/exceptions.py | 37 ++ readthedocs/config/notifications.py | 350 ++++++++++++ readthedocs/config/tests/test_config.py | 317 ++++++----- readthedocs/config/tests/test_validation.py | 38 +- readthedocs/config/validation.py | 100 ++-- readthedocs/core/admin.py | 27 - readthedocs/core/context_processors.py | 36 ++ readthedocs/core/notifications.py | 28 + readthedocs/core/tasks.py | 12 - readthedocs/core/tests/test_contact.py | 27 +- .../core/tests/test_filesystem_utils.py | 4 +- readthedocs/core/utils/__init__.py | 19 +- readthedocs/core/utils/contact.py | 67 +-- readthedocs/core/utils/filesystem.py | 12 +- readthedocs/doc_builder/backends/mkdocs.py | 8 +- readthedocs/doc_builder/backends/sphinx.py | 5 +- readthedocs/doc_builder/director.py | 29 + readthedocs/doc_builder/environments.py | 23 +- readthedocs/doc_builder/exceptions.py | 195 ++----- readthedocs/domains/notifications.py | 47 +- readthedocs/domains/tasks.py | 24 +- .../pending_domain_configuration_site.html | 8 - readthedocs/domains/tests/test_tasks.py | 6 +- readthedocs/notifications/__init__.py | 22 - readthedocs/notifications/backends.py | 113 ---- readthedocs/notifications/constants.py | 35 +- readthedocs/notifications/email.py | 96 ++++ readthedocs/notifications/forms.py | 40 -- readthedocs/notifications/messages.py | 513 ++++++++++++++++++ .../notifications/migrations/0001_initial.py | 76 +++ .../0002_notification_format_values.py | 17 + .../notifications/migrations/__init__.py | 0 readthedocs/notifications/models.py | 97 ++++ readthedocs/notifications/notification.py | 185 ------- readthedocs/notifications/querysets.py | 34 ++ readthedocs/notifications/storages.py | 159 ------ readthedocs/notifications/urls.py | 17 - readthedocs/notifications/views.py | 120 ---- readthedocs/oauth/notifications.py | 165 +++--- readthedocs/oauth/tasks.py | 73 ++- readthedocs/projects/admin.py | 20 - readthedocs/projects/apps.py | 2 + readthedocs/projects/exceptions.py | 55 +- readthedocs/projects/models.py | 11 + readthedocs/projects/notifications.py | 157 +++++- readthedocs/projects/tasks/builds.py | 100 ++-- readthedocs/projects/tasks/utils.py | 14 +- readthedocs/projects/tests/mockers.py | 5 + .../projects/tests/test_build_tasks.py | 61 ++- readthedocs/projects/views/private.py | 17 +- readthedocs/rtd_tests/tests/test_backend.py | 8 +- readthedocs/rtd_tests/tests/test_builds.py | 18 - readthedocs/rtd_tests/tests/test_celery.py | 35 +- .../rtd_tests/tests/test_doc_builder.py | 22 +- .../rtd_tests/tests/test_imported_file.py | 2 + .../rtd_tests/tests/test_notifications.py | 200 +------ readthedocs/rtd_tests/tests/test_project.py | 19 +- readthedocs/settings/base.py | 15 +- readthedocs/subscriptions/notifications.py | 63 +-- 
readthedocs/subscriptions/tests/test_tasks.py | 37 +- readthedocs/templates/base.html | 16 + .../templates/builds/build_detail.html | 72 +-- .../templates/core/project_bar_base.html | 21 +- readthedocs/urls.py | 1 - readthedocs/vcs_support/backends/bzr.py | 6 +- readthedocs/vcs_support/backends/git.py | 11 +- readthedocs/vcs_support/backends/hg.py | 12 +- readthedocs/vcs_support/backends/svn.py | 7 +- readthedocs/vcs_support/base.py | 6 +- requirements/deploy.txt | 14 +- requirements/docker.txt | 14 +- requirements/pip.in | 2 +- requirements/pip.txt | 11 +- requirements/testing.txt | 18 +- 92 files changed, 2746 insertions(+), 2140 deletions(-) create mode 100644 readthedocs/config/exceptions.py create mode 100644 readthedocs/config/notifications.py create mode 100644 readthedocs/core/notifications.py delete mode 100644 readthedocs/domains/templates/domains/notifications/pending_domain_configuration_site.html delete mode 100644 readthedocs/notifications/__init__.py delete mode 100644 readthedocs/notifications/backends.py create mode 100644 readthedocs/notifications/email.py delete mode 100644 readthedocs/notifications/forms.py create mode 100644 readthedocs/notifications/messages.py create mode 100644 readthedocs/notifications/migrations/0001_initial.py create mode 100644 readthedocs/notifications/migrations/0002_notification_format_values.py create mode 100644 readthedocs/notifications/migrations/__init__.py create mode 100644 readthedocs/notifications/models.py delete mode 100644 readthedocs/notifications/notification.py create mode 100644 readthedocs/notifications/querysets.py delete mode 100644 readthedocs/notifications/storages.py delete mode 100644 readthedocs/notifications/urls.py delete mode 100644 readthedocs/notifications/views.py diff --git a/common b/common index cadcca43d33..9e0ca9f3e99 160000 --- a/common +++ b/common @@ -1 +1 @@ -Subproject commit cadcca43d33cd9211810873632136bed098970c0 +Subproject commit 9e0ca9f3e99a14d22ac804422094513a468fcf3a diff --git a/readthedocs/api/v2/serializers.py b/readthedocs/api/v2/serializers.py index 4756eceb675..560ecc440d2 100644 --- a/readthedocs/api/v2/serializers.py +++ b/readthedocs/api/v2/serializers.py @@ -2,11 +2,15 @@ from allauth.socialaccount.models import SocialAccount +from django.core.exceptions import ObjectDoesNotExist +from django.utils.translation import gettext as _ +from generic_relations.relations import GenericRelatedField from rest_framework import serializers from readthedocs.api.v2.utils import normalize_build_command from readthedocs.builds.models import Build, BuildCommandResult, Version from readthedocs.core.resolver import Resolver +from readthedocs.notifications.models import Notification from readthedocs.oauth.models import RemoteOrganization, RemoteRepository from readthedocs.projects.models import Domain, Project @@ -378,3 +382,53 @@ def get_username(self, obj): or obj.extra_data.get("login") # FIXME: which one is GitLab? ) + + +class NotificationAttachedToRelatedField(serializers.RelatedField): + + """ + Attached to related field for Notifications. + + Used together with ``rest-framework-generic-relations`` to accept multiple object types on ``attached_to``. + + See https://github.com/LilyFoote/rest-framework-generic-relations + """ + + default_error_messages = { + "required": _("This field is required."), + "does_not_exist": _("Object does not exist."), + "incorrect_type": _( + "Incorrect type. Expected URL string, received {data_type}." 
+ ), + } + + def to_representation(self, value): + return f"{self.queryset.model._meta.model_name}/{value.pk}" + + def to_internal_value(self, data): + # TODO: handle exceptions + model, pk = data.strip("/").split("/") + if self.queryset.model._meta.model_name != model: + self.fail("incorrect_type") + + try: + return self.queryset.get(pk=pk) + except (ObjectDoesNotExist, ValueError, TypeError): + self.fail("does_not_exist") + + +class NotificationSerializer(serializers.ModelSerializer): + # Accept different object types (Project, Build, User, etc) depending on what the notification is attached to. + # The client has to send a value like "/". + # Example: "build/3522" will attach the notification to the Build object with id 3522 + attached_to = GenericRelatedField( + { + Build: NotificationAttachedToRelatedField(queryset=Build.objects.all()), + Project: NotificationAttachedToRelatedField(queryset=Project.objects.all()), + }, + required=True, + ) + + class Meta: + model = Notification + exclude = ["attached_to_id", "attached_to_content_type"] diff --git a/readthedocs/api/v2/urls.py b/readthedocs/api/v2/urls.py index 9739ba8569b..42b354d00be 100644 --- a/readthedocs/api/v2/urls.py +++ b/readthedocs/api/v2/urls.py @@ -12,6 +12,7 @@ BuildCommandViewSet, BuildViewSet, DomainViewSet, + NotificationViewSet, ProjectViewSet, RemoteOrganizationViewSet, RemoteRepositoryViewSet, @@ -25,6 +26,7 @@ router.register(r"version", VersionViewSet, basename="version") router.register(r"project", ProjectViewSet, basename="project") router.register(r"domain", DomainViewSet, basename="domain") +router.register(r"notifications", NotificationViewSet, basename="notifications") router.register( r"remote/org", RemoteOrganizationViewSet, diff --git a/readthedocs/api/v2/views/model_views.py b/readthedocs/api/v2/views/model_views.py index aafd209ae5c..8498cd26af2 100644 --- a/readthedocs/api/v2/views/model_views.py +++ b/readthedocs/api/v2/views/model_views.py @@ -18,6 +18,7 @@ from readthedocs.api.v2.utils import normalize_build_command from readthedocs.builds.constants import INTERNAL from readthedocs.builds.models import Build, BuildCommandResult, Version +from readthedocs.notifications.models import Notification from readthedocs.oauth.models import RemoteOrganization, RemoteRepository from readthedocs.oauth.services import registry from readthedocs.projects.models import Domain, Project @@ -29,6 +30,7 @@ BuildCommandSerializer, BuildSerializer, DomainSerializer, + NotificationSerializer, ProjectAdminSerializer, ProjectSerializer, RemoteOrganizationSerializer, @@ -361,6 +363,42 @@ def get_queryset_for_api_key(self, api_key): return self.model.objects.filter(build__project=api_key.project) +class NotificationViewSet(DisableListEndpoint, CreateModelMixin, UserSelectViewSet): + + """ + Create a notification attached to an object (User, Project, Build, Organization). + + This endpoint is currently used only internally by the builder. + Notifications are attached to `Build` objects only when using this endpoint. + This limitation will change in the future when re-implementing this on APIv3 if neeed. 
+ """ + + parser_classes = [JSONParser, MultiPartParser] + permission_classes = [HasBuildAPIKey] + renderer_classes = (JSONRenderer,) + serializer_class = NotificationSerializer + model = Notification + + def perform_create(self, serializer): + """Restrict creation to notifications attached to the project's builds from the api key.""" + attached_to = serializer.validated_data["attached_to"] + + build_api_key = self.request.build_api_key + + project_slug = None + if isinstance(attached_to, Build): + project_slug = attached_to.project.slug + elif isinstance(attached_to, Project): + project_slug = attached_to.slug + + # Limit the permissions to create a notification on this object only if the API key + # is attached to the related project + if not project_slug or build_api_key.project.slug != project_slug: + raise PermissionDenied() + + return super().perform_create(serializer) + + class DomainViewSet(DisableListEndpoint, UserSelectViewSet): permission_classes = [ReadOnlyPermission] renderer_classes = (JSONRenderer,) diff --git a/readthedocs/api/v3/serializers.py b/readthedocs/api/v3/serializers.py index e0c9237388d..21b1c30eced 100644 --- a/readthedocs/api/v3/serializers.py +++ b/readthedocs/api/v3/serializers.py @@ -15,6 +15,7 @@ from readthedocs.core.resolver import Resolver from readthedocs.core.utils import slugify from readthedocs.core.utils.extend import SettingsOverrideObject +from readthedocs.notifications.models import Notification from readthedocs.oauth.models import RemoteOrganization, RemoteRepository from readthedocs.organizations.models import Organization, Team from readthedocs.projects.constants import ( @@ -60,10 +61,33 @@ class Meta: fields = [] +# TODO: decide whether or not include a `_links` field on the object +# +# This also includes adding `/api/v3/notifications/` endpoint, +# which I'm not sure it's useful at this point. 
+# +# class NotificationLinksSerializer(BaseLinksSerializer): +# _self = serializers.SerializerMethodField() +# attached_to = serializers.SerializerMethodField() + +# def get__self(self, obj): +# path = reverse( +# "notifications-detail", +# kwargs={ +# "pk": obj.pk, +# }, +# ) +# return self._absolute_url(path) + +# def get_attached_to(self, obj): +# return None + + class BuildLinksSerializer(BaseLinksSerializer): _self = serializers.SerializerMethodField() version = serializers.SerializerMethodField() project = serializers.SerializerMethodField() + notifications = serializers.SerializerMethodField() def get__self(self, obj): path = reverse( @@ -96,6 +120,16 @@ def get_project(self, obj): ) return self._absolute_url(path) + def get_notifications(self, obj): + path = reverse( + "project-builds-notifications-list", + kwargs={ + "parent_lookup_project__slug": obj.project.slug, + "parent_lookup_build__id": obj.pk, + }, + ) + return self._absolute_url(path) + class BuildURLsSerializer(BaseLinksSerializer, serializers.Serializer): build = serializers.URLField(source="get_full_url") @@ -190,6 +224,57 @@ def get_success(self, obj): return None +class NotificationMessageSerializer(serializers.Serializer): + id = serializers.SlugField() + header = serializers.CharField(source="get_rendered_header") + body = serializers.CharField(source="get_rendered_body") + type = serializers.CharField() + icon_classes = serializers.CharField(source="get_display_icon_classes") + + class Meta: + fields = [ + "id", + "header", + "body", + "type", + "icon_classes", + ] + + +class NotificationCreateSerializer(serializers.ModelSerializer): + class Meta: + model = Notification + fields = [ + "message_id", + "dismissable", + "news", + "state", + ] + + +class NotificationSerializer(serializers.ModelSerializer): + message = NotificationMessageSerializer(source="get_message") + attached_to_content_type = serializers.SerializerMethodField() + # TODO: review these fields + # _links = BuildLinksSerializer(source="*") + # urls = BuildURLsSerializer(source="*") + + class Meta: + model = Notification + fields = [ + "id", + "state", + "dismissable", + "news", + "attached_to_content_type", + "attached_to_id", + "message", + ] + + def get_attached_to_content_type(self, obj): + return obj.attached_to_content_type.name + + class VersionLinksSerializer(BaseLinksSerializer): _self = serializers.SerializerMethodField() builds = serializers.SerializerMethodField() diff --git a/readthedocs/api/v3/tests/mixins.py b/readthedocs/api/v3/tests/mixins.py index 8dfb7cf1b7d..3736cf30a32 100644 --- a/readthedocs/api/v3/tests/mixins.py +++ b/readthedocs/api/v3/tests/mixins.py @@ -26,6 +26,7 @@ ) class APIEndpointMixin(TestCase): fixtures = [] + maxDiff = None # So we get an actual diff when it fails def setUp(self): self.created = make_aware(datetime.datetime(2019, 4, 29, 10, 0, 0)) diff --git a/readthedocs/api/v3/tests/responses/projects-builds-detail.json b/readthedocs/api/v3/tests/responses/projects-builds-detail.json index 7f3b319f6e9..996bcef6e61 100644 --- a/readthedocs/api/v3/tests/responses/projects-builds-detail.json +++ b/readthedocs/api/v3/tests/responses/projects-builds-detail.json @@ -8,7 +8,8 @@ "_links": { "_self": "https://readthedocs.org/api/v3/projects/project/builds/1/", "project": "https://readthedocs.org/api/v3/projects/project/", - "version": "https://readthedocs.org/api/v3/projects/project/versions/v1.0/" + "version": "https://readthedocs.org/api/v3/projects/project/versions/v1.0/", + "notifications": 
"https://readthedocs.org/api/v3/projects/project/builds/1/notifications/" }, "urls": { "build": "https://readthedocs.org/projects/project/builds/1/", diff --git a/readthedocs/api/v3/tests/responses/projects-detail.json b/readthedocs/api/v3/tests/responses/projects-detail.json index 1b21a984bd2..a5266b94564 100644 --- a/readthedocs/api/v3/tests/responses/projects-detail.json +++ b/readthedocs/api/v3/tests/responses/projects-detail.json @@ -23,6 +23,7 @@ "id": 1, "_links": { "_self": "https://readthedocs.org/api/v3/projects/project/builds/1/", + "notifications": "https://readthedocs.org/api/v3/projects/project/builds/1/notifications/", "project": "https://readthedocs.org/api/v3/projects/project/", "version": "https://readthedocs.org/api/v3/projects/project/versions/v1.0/" }, diff --git a/readthedocs/api/v3/tests/responses/projects-versions-builds-list_POST.json b/readthedocs/api/v3/tests/responses/projects-versions-builds-list_POST.json index 987d72de209..18f392601a5 100644 --- a/readthedocs/api/v3/tests/responses/projects-versions-builds-list_POST.json +++ b/readthedocs/api/v3/tests/responses/projects-versions-builds-list_POST.json @@ -8,6 +8,7 @@ "id": 2, "_links": { "_self": "https://readthedocs.org/api/v3/projects/project/builds/2/", + "notifications": "https://readthedocs.org/api/v3/projects/project/builds/2/notifications/", "project": "https://readthedocs.org/api/v3/projects/project/", "version": "https://readthedocs.org/api/v3/projects/project/versions/v1.0/" }, diff --git a/readthedocs/api/v3/urls.py b/readthedocs/api/v3/urls.py index 310f1b00e88..528f576e0f1 100644 --- a/readthedocs/api/v3/urls.py +++ b/readthedocs/api/v3/urls.py @@ -3,6 +3,7 @@ BuildsCreateViewSet, BuildsViewSet, EnvironmentVariablesViewSet, + NotificationsViewSet, ProjectsViewSet, RedirectsViewSet, RemoteOrganizationViewSet, @@ -12,7 +13,6 @@ VersionsViewSet, ) - router = DefaultRouterWithNesting() # allows /api/v3/projects/ @@ -63,13 +63,28 @@ # allows /api/v3/projects/pip/builds/ # allows /api/v3/projects/pip/builds/1053/ -projects.register( +builds = projects.register( r"builds", BuildsViewSet, basename="projects-builds", parents_query_lookups=["project__slug"], ) +# NOTE: we are only listing notifications on APIv3 for now. +# The front-end will use this endpoint. +# allows /api/v3/projects/pip/builds/1053/notifications/ +builds.register( + r"notifications", + NotificationsViewSet, + basename="project-builds-notifications", + parents_query_lookups=["project__slug", "build__id"], +) + +# TODO: create an APIv3 endpoint to PATCH Build/Project notifications. +# This way the front-end can mark them as READ/DISMISSED. +# +# TODO: create an APIv3 endpoint to list notifications for Projects. 
+ # allows /api/v3/projects/pip/redirects/ # allows /api/v3/projects/pip/redirects/1053/ projects.register( diff --git a/readthedocs/api/v3/views.py b/readthedocs/api/v3/views.py index 4df4d7a2b35..3196980dca1 100644 --- a/readthedocs/api/v3/views.py +++ b/readthedocs/api/v3/views.py @@ -1,4 +1,5 @@ import django_filters.rest_framework as filters +from django.contrib.contenttypes.models import ContentType from django.db.models import Exists, OuterRef from rest_flex_fields import is_expanded from rest_flex_fields.views import FlexFieldsMixin @@ -23,6 +24,7 @@ from readthedocs.builds.models import Build, Version from readthedocs.core.utils import trigger_build from readthedocs.core.utils.extend import SettingsOverrideObject +from readthedocs.notifications.models import Notification from readthedocs.oauth.models import ( RemoteOrganization, RemoteRepository, @@ -62,6 +64,7 @@ BuildCreateSerializer, BuildSerializer, EnvironmentVariableSerializer, + NotificationSerializer, OrganizationSerializer, ProjectCreateSerializer, ProjectSerializer, @@ -381,6 +384,32 @@ def create(self, request, **kwargs): # pylint: disable=arguments-differ return Response(data=data, status=code) +class NotificationsViewSet( + APIv3Settings, + NestedViewSetMixin, + ProjectQuerySetMixin, + FlexFieldsMixin, + ReadOnlyModelViewSet, +): + model = Notification + lookup_field = "pk" + lookup_url_kwarg = "notification_pk" + serializer_class = NotificationSerializer + queryset = Notification.objects.all() + # filterset_class = BuildFilter + + # http://chibisov.github.io/drf-extensions/docs/#usage-with-generic-relations + def get_queryset(self): + queryset = self.queryset.filter( + attached_to_content_type=ContentType.objects.get_for_model(Build) + ) + + # TODO: make sure if this particular filter should be applied here or somewhere else. + return queryset.filter( + attached_to_id=self.kwargs.get("parent_lookup_build__id") + ) + + class RedirectsViewSet( APIv3Settings, NestedViewSetMixin, diff --git a/readthedocs/builds/models.py b/readthedocs/builds/models.py index 82646cf5532..32f6b70afff 100644 --- a/readthedocs/builds/models.py +++ b/readthedocs/builds/models.py @@ -7,6 +7,7 @@ import regex import structlog from django.conf import settings +from django.contrib.contenttypes.fields import GenericRelation from django.db import models from django.urls import reverse from django.utils import timezone @@ -58,8 +59,8 @@ get_vcs_url, ) from readthedocs.builds.version_slug import VersionSlugField -from readthedocs.config import LATEST_CONFIGURATION_VERSION from readthedocs.core.utils import extract_valid_attributes_for_model, trigger_build +from readthedocs.notifications.models import Notification from readthedocs.projects.constants import ( BITBUCKET_COMMIT_URL, BITBUCKET_URL, @@ -833,6 +834,13 @@ class Build(models.Model): blank=True, ) + notifications = GenericRelation( + Notification, + related_query_name="build", + content_type_field="attached_to_content_type", + object_id_field="attached_to_id", + ) + # Managers objects = BuildQuerySet.as_manager() # Only include BRANCH, TAG, UNKNOWN type Version builds. @@ -1095,36 +1103,6 @@ def can_rebuild(self): def external_version_name(self): return external_version_name(self) - def deprecated_config_used(self): - """ - Check whether this particular build is using a deprecated config file. - - When using v1 or not having a config file at all, it returns ``True``. - Returns ``False`` only when it has a config file and it is using v2. 
- - Note we are using this to communicate deprecation of v1 file and not using a config file. - See https://github.com/readthedocs/readthedocs.org/issues/10342 - """ - if not self.config: - return True - - return int(self.config.get("version", "1")) != LATEST_CONFIGURATION_VERSION - - def deprecated_build_image_used(self): - """ - Check whether this particular build is using the deprecated "build.image" config. - - Note we are using this to communicate deprecation of "build.image". - See https://github.com/readthedocs/meta/discussions/48 - """ - if not self.config: - # Don't notify users without a config file. - # We hope they will migrate to `build.os` in the process of adding a `.readthedocs.yaml` - return False - - build_config_key = self.config.get("build", {}) - return "image" in build_config_key - def reset(self): """ Reset the build so it can be re-used when re-trying. diff --git a/readthedocs/builds/static-src/builds/js/detail.js b/readthedocs/builds/static-src/builds/js/detail.js index 60ae8c2c6b5..3cb170a881b 100644 --- a/readthedocs/builds/static-src/builds/js/detail.js +++ b/readthedocs/builds/static-src/builds/js/detail.js @@ -28,6 +28,7 @@ function BuildCommand(data) { function BuildDetailView(instance) { var self = this; var instance = instance || {}; + var poll_api_counts = 0; /* Instance variables */ self.state = ko.observable(instance.state); @@ -63,10 +64,8 @@ function BuildDetailView(instance) { self.legacy_output(true); }; + function poll_api() { - if (self.finished()) { - return; - } $.getJSON('/api/v2/build/' + instance.id + '/', function (data) { self.state(data.state); self.state_display(data.state_display); @@ -77,6 +76,9 @@ function BuildDetailView(instance) { self.commit(data.commit); self.docs_url(data.docs_url); self.commit_url(data.commit_url); + + poll_api_counts = poll_api_counts + 1; + var n; for (n in data.commands) { var command = data.commands[n]; @@ -92,6 +94,19 @@ function BuildDetailView(instance) { } }); + if (self.finished()) { + // HACK: this is a small hack to avoid re-implementing the new notification system + // in the old dashboard with Knockout. + // The notifications are rendered properly via Django template in a static way. + // So, we refresh the page once the build has finished to make Django render the notifications. + // We use a check of 1 API poll for those builds that are already finished when opened. + // The new dashboard will implement the new notification system in a nicer way using APIv3. + if (poll_api_counts !== 1) { + location.reload(); + } + return; + } + setTimeout(poll_api, 2000); } diff --git a/readthedocs/builds/static/builds/js/detail.js b/readthedocs/builds/static/builds/js/detail.js index 8662e9f08e4..7cd3809b6ad 100644 --- a/readthedocs/builds/static/builds/js/detail.js +++ b/readthedocs/builds/static/builds/js/detail.js @@ -1 +1 @@ -require=function r(s,n,u){function i(t,e){if(!n[t]){if(!s[t]){var o="function"==typeof require&&require;if(!e&&o)return o(t,!0);if(c)return c(t,!0);throw(e=new Error("Cannot find module '"+t+"'")).code="MODULE_NOT_FOUND",e}o=n[t]={exports:{}},s[t][0].call(o.exports,function(e){return i(s[t][1][e]||e)},o,o.exports,r,s,n,u)}return n[t].exports}for(var c="function"==typeof require&&require,e=0;e{directory}. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.KEY_NOT_SUPPORTED_IN_VERSION, + header=_("Configuration key not supported in this version"), + body=_( + textwrap.dedent( + """ + The {key} configuration option is not supported in this version. 
+ """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.PYTHON_SYSTEM_PACKAGES_REMOVED, + header=_("Invalid configuration key"), + body=_( + textwrap.dedent( + """ + The configuration key python.system_packages has been deprecated and removed. + Refer to https://blog.readthedocs.com/drop-support-system-packages/ to read more + about this change and how to upgrade your config file." + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.PYTHON_USE_SYSTEM_SITE_PACKAGES_REMOVED, + header=_("Invalid configuration key"), + body=_( + textwrap.dedent( + """ + The configuration key python.use_system_site_packages has been deprecated and removed. + Refer to https://blog.readthedocs.com/drop-support-system-packages/ to read more + about this change and how to upgrade your config file." + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.INVALID_VERSION, + header=_("Invalid configuration version"), + body=_( + textwrap.dedent( + """ + Invalid version of the configuration file. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.GENERIC_INVALID_CONFIG_KEY, + header=_("Invalid configuration option"), + body=_( + textwrap.dedent( + """ + Invalid configuration option: {key}. + + Read the Docs configuration file: {source_file}. + + {error_message} + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.NOT_BUILD_TOOLS_OR_COMMANDS, + header=_("Invalid configuration option: build"), + body=_( + textwrap.dedent( + """ + At least one item should be provided in "tools" or "commands". + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.BUILD_JOBS_AND_COMMANDS, + header=_("Invalid configuration option"), + body=_( + textwrap.dedent( + """ + The keys build.jobs and build.commands can't be used together. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.APT_INVALID_PACKAGE_NAME_PREFIX, + header=_("Invalid APT package name"), + body=_( + textwrap.dedent( + """ + The name of the packages (e.g. {package}) can't start with {prefix} + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.APT_INVALID_PACKAGE_NAME, + header=_("Invalid APT package name"), + body=_( + textwrap.dedent( + """ + The name of the package {pacakge} is invalid. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.USE_PIP_FOR_EXTRA_REQUIREMENTS, + header=_("Invalid Python install method"), + body=_( + textwrap.dedent( + """ + You need to install your project with pip to use extra_requirements. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.PIP_PATH_OR_REQUIREMENT_REQUIRED, + header=_("Invalid configuration key"), + body=_( + textwrap.dedent( + """ + path or requirements key is required for python.install. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.SPHINX_MKDOCS_CONFIG_TOGETHER, + header=_("Invalid configuration key"), + body=_( + textwrap.dedent( + """ + You can not have sphinx and mkdocs keys at the same time. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.SUBMODULES_INCLUDE_EXCLUDE_TOGETHER, + header=_("Invalid configuration key"), + body=_( + textwrap.dedent( + """ + You can not have exclude and include submodules at the same time. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.INVALID_KEY_NAME, + header=_("Invalid configuration key: {key}"), + body=_( + textwrap.dedent( + """ + Make sure the key name {key} is correct. 
+ """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigError.SYNTAX_INVALID, + header=_("Invalid syntax in configuration file"), + body=_( + textwrap.dedent( + """ + Error while parsing {filename}. + + {error_message} + """ + ).strip(), + ), + type=ERROR, + ), +] +registry.add(messages) + +# Validation errors +messages = [ + Message( + id=ConfigValidationError.INVALID_BOOL, + header=_("Config file validation error"), + body=_( + textwrap.dedent( + """ + Config validation error in {key}. + Expected one of (0, 1, true, false), got {value}. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigValidationError.INVALID_CHOICE, + header=_("Config file validation error"), + body=_( + textwrap.dedent( + """ + Config validation error in {key}. + Expected one of ({choices}), got {value}. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigValidationError.INVALID_DICT, + header=_("Config file validation error"), + body=_( + textwrap.dedent( + """ + Config validation error in {key}. + Expected a dictionary, got {value}. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigValidationError.INVALID_PATH, + header=_("Config file validation error"), + body=_( + textwrap.dedent( + """ + Config validation error in {key}. + The path {value} does not exist. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigValidationError.INVALID_PATH_PATTERN, + header=_("Config file validation error"), + body=_( + textwrap.dedent( + """ + Config validation error in {key}. + The path {value} is not a valid path pattern. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigValidationError.INVALID_STRING, + header=_("Config file validation error"), + body=_( + textwrap.dedent( + """ + Config validation error in {key}. + Expected a string, got {value}. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigValidationError.INVALID_LIST, + header=_("Config file validation error"), + body=_( + textwrap.dedent( + """ + Config validation error in {key}. + Expected a list, got {value}. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ConfigValidationError.VALUE_NOT_FOUND, + header=_("Config file validation error"), + body=_( + textwrap.dedent( + """ + Config validation error in {key}. + Value {value} not found. 
+ """ + ).strip(), + ), + type=ERROR, + ), +] +registry.add(messages) diff --git a/readthedocs/config/tests/test_config.py b/readthedocs/config/tests/test_config.py index 8e73b6d8a9a..c5963bf7ef4 100644 --- a/readthedocs/config/tests/test_config.py +++ b/readthedocs/config/tests/test_config.py @@ -8,32 +8,15 @@ from django.test import override_settings from pytest import raises -from readthedocs.config import ( - ALL, - PIP, - SETUPTOOLS, - BuildConfigV2, - ConfigError, - DefaultConfigFileNotFound, - InvalidConfig, - load, -) -from readthedocs.config.config import ( - CONFIG_FILE_REQUIRED, - CONFIG_FILENAME_REGEX, - CONFIG_REQUIRED, - CONFIG_SYNTAX_INVALID, - INVALID_KEY, - INVALID_NAME, - VERSION_INVALID, -) +from readthedocs.config import ALL, PIP, SETUPTOOLS, BuildConfigV2, load +from readthedocs.config.config import CONFIG_FILENAME_REGEX +from readthedocs.config.exceptions import ConfigError, ConfigValidationError from readthedocs.config.models import ( BuildJobs, BuildWithOs, PythonInstall, PythonInstallRequirements, ) -from readthedocs.config.validation import VALUE_NOT_FOUND, ValidationError from .utils import apply_fs @@ -73,11 +56,10 @@ def get_build_config(config, source_file="readthedocs.yml", validate=False): def test_load_no_config_file(tmpdir, files): apply_fs(tmpdir, files) base = str(tmpdir) - with raises(DefaultConfigFileNotFound) as e: + with raises(ConfigError) as e: with override_settings(DOCROOT=tmpdir): load(base, {}) - assert e.value.code == CONFIG_FILE_REQUIRED - + assert e.value.message_id == ConfigError.DEFAULT_PATH_NOT_FOUND def test_load_empty_config_file(tmpdir): apply_fs( @@ -124,7 +106,7 @@ def test_load_unknow_version(tmpdir): with raises(ConfigError) as excinfo: with override_settings(DOCROOT=tmpdir): load(base, {}) - assert excinfo.value.code == VERSION_INVALID + assert excinfo.value.message_id == ConfigError.INVALID_VERSION def test_load_raise_exception_invalid_syntax(tmpdir): @@ -146,7 +128,7 @@ def test_load_raise_exception_invalid_syntax(tmpdir): with raises(ConfigError) as excinfo: with override_settings(DOCROOT=tmpdir): load(base, {}) - assert excinfo.value.code == CONFIG_SYNTAX_INVALID + assert excinfo.value.message_id == ConfigError.SYNTAX_INVALID def test_load_non_default_filename(tmpdir): @@ -232,14 +214,6 @@ def test_version(self): build = get_build_config({}) assert build.version == "2" - def test_correct_error_when_source_is_dir(self, tmpdir): - build = get_build_config({}, source_file=str(tmpdir)) - with raises(InvalidConfig) as excinfo: - build.error(key="key", message="Message", code="code") - # We don't have any extra information about - # the source_file. 
- assert str(excinfo.value) == 'Invalid configuration option "key": Message' - def test_formats_check_valid(self): build = get_build_config({"formats": ["htmlzip", "pdf", "epub"]}) build.validate() @@ -248,17 +222,19 @@ def test_formats_check_valid(self): @pytest.mark.parametrize("value", [3, "invalid", {"other": "value"}]) def test_formats_check_invalid_value(self, value): build = get_build_config({"formats": value}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "formats" + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST + assert excinfo.value.format_values.get("key") == "formats" def test_formats_check_invalid_type(self): build = get_build_config( {"formats": ["htmlzip", "invalid", "epub"]}, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "formats" + assert excinfo.value.message_id == ConfigValidationError.INVALID_CHOICE + assert excinfo.value.format_values.get("key") == "formats" def test_formats_default_value(self): build = get_build_config({}) @@ -309,36 +285,41 @@ def test_conda_check_valid(self, tmpdir): @pytest.mark.parametrize("value", [3, [], "invalid"]) def test_conda_check_invalid_value(self, value): build = get_build_config({"conda": value}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "conda" + assert excinfo.value.message_id == ConfigValidationError.INVALID_DICT + assert excinfo.value.format_values.get("key") == "conda" @pytest.mark.parametrize("value", [3, [], {}]) def test_conda_check_invalid_file_value(self, value): build = get_build_config({"conda": {"file": value}}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "conda.environment" + assert excinfo.value.message_id == ConfigValidationError.VALUE_NOT_FOUND + assert excinfo.value.format_values.get("key") == "conda.environment" def test_conda_check_file_required(self): build = get_build_config({"conda": {"no-file": "other"}}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "conda.environment" + assert excinfo.value.message_id == ConfigValidationError.VALUE_NOT_FOUND + assert excinfo.value.format_values.get("key") == "conda.environment" @pytest.mark.parametrize("value", [3, [], "invalid"]) def test_build_check_invalid_type(self, value): build = get_build_config({"build": value}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build" + assert excinfo.value.message_id == ConfigValidationError.INVALID_DICT + assert excinfo.value.format_values.get("key") == "build" @pytest.mark.parametrize("value", [3, [], {}]) def test_build_image_check_invalid_type(self, value): build = get_build_config({"build": {"image": value}}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.os" + assert excinfo.value.message_id == ConfigValidationError.VALUE_NOT_FOUND + assert excinfo.value.format_values.get("key") == "build.os" @pytest.mark.parametrize("value", ["", None, "latest"]) def test_new_build_config_invalid_os(self, value): @@ -350,9 +331,10 @@ def test_new_build_config_invalid_os(self, value): }, }, ) - with raises(InvalidConfig) as excinfo: + with 
raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.os" + assert excinfo.value.message_id == ConfigValidationError.INVALID_CHOICE + assert excinfo.value.format_values.get("key") == "build.os" @pytest.mark.parametrize( "value", ["", None, "python", ["python", "nodejs"], {}, {"cobol": "99"}] @@ -366,9 +348,17 @@ def test_new_build_config_invalid_tools(self, value): }, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.tools" + + # TODO: split this test to check specific errors now we have better messages + assert excinfo.value.message_id in ( + ConfigError.NOT_BUILD_TOOLS_OR_COMMANDS, + ConfigValidationError.INVALID_DICT, + ConfigValidationError.VALUE_NOT_FOUND, + ConfigValidationError.INVALID_CHOICE, + ) + assert excinfo.value.format_values.get("key") in ("build.tools", "build") def test_new_build_config_invalid_tools_version(self): build = get_build_config( @@ -379,9 +369,10 @@ def test_new_build_config_invalid_tools_version(self): }, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.tools.python" + assert excinfo.value.message_id == ConfigValidationError.INVALID_CHOICE + assert excinfo.value.format_values.get("key") == "build.tools.python" def test_new_build_config(self): build = get_build_config( @@ -410,9 +401,10 @@ def test_new_build_config_conflict_with_build_image(self): }, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.image" + assert excinfo.value.message_id == ConfigError.INVALID_KEY_NAME + assert excinfo.value.format_values.get("key") == "build.image" def test_new_build_config_conflict_with_build_python_version(self): build = get_build_config( @@ -424,9 +416,10 @@ def test_new_build_config_conflict_with_build_python_version(self): "python": {"version": "3.8"}, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.version" + assert excinfo.value.message_id == ConfigError.INVALID_KEY_NAME + assert excinfo.value.format_values.get("key") == "python.version" def test_commands_build_config_tools_and_commands_valid(self): """ @@ -459,9 +452,10 @@ def test_build_jobs_without_build_os_is_invalid(self): }, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.os" + assert excinfo.value.message_id == ConfigValidationError.VALUE_NOT_FOUND + assert excinfo.value.format_values.get("key") == "build.os" def test_commands_build_config_invalid_command(self): build = get_build_config( @@ -473,9 +467,10 @@ def test_commands_build_config_invalid_command(self): }, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.commands" + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST + assert excinfo.value.format_values.get("key") == "build.commands" def test_commands_build_config_invalid_no_os(self): build = get_build_config( @@ -485,9 +480,10 @@ def test_commands_build_config_invalid_no_os(self): }, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.os" + assert excinfo.value.message_id == ConfigValidationError.VALUE_NOT_FOUND + assert 
excinfo.value.format_values.get("key") == "build.os" def test_commands_build_config_valid(self): """It's valid to build with just build.os and build.commands.""" @@ -516,9 +512,10 @@ def test_jobs_build_config_invalid_jobs(self, value): }, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.jobs" + assert excinfo.value.message_id == ConfigValidationError.INVALID_CHOICE + assert excinfo.value.format_values.get("key") == "build.jobs" @pytest.mark.parametrize("value", ["", None, "echo 123", 42]) def test_jobs_build_config_invalid_job_commands(self, value): @@ -533,9 +530,10 @@ def test_jobs_build_config_invalid_job_commands(self, value): }, }, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.jobs.pre_install" + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST + assert excinfo.value.format_values.get("key") == "build.jobs.pre_install" def test_jobs_build_config(self): build = get_build_config( @@ -622,9 +620,10 @@ def test_build_apt_packages_invalid_type(self, value): } } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "build.apt_packages" + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST + assert excinfo.value.format_values.get("key") == "build.apt_packages" @pytest.mark.parametrize( "error_index, value", @@ -662,24 +661,32 @@ def test_build_apt_packages_invalid_value(self, error_index, value): } } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == f"build.apt_packages.{error_index}" - assert excinfo.value.code == INVALID_NAME + assert excinfo.value.message_id in ( + ConfigError.APT_INVALID_PACKAGE_NAME, + ConfigError.APT_INVALID_PACKAGE_NAME_PREFIX, + ) + assert ( + excinfo.value.format_values.get("key") + == f"build.apt_packages.{error_index}" + ) @pytest.mark.parametrize("value", [3, [], "invalid"]) def test_python_check_invalid_types(self, value): build = get_build_config({"python": value}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python" + assert excinfo.value.message_id == ConfigValidationError.INVALID_DICT + assert excinfo.value.format_values.get("key") == "python" @pytest.mark.parametrize("value", [[], {}, "3", "3.10"]) def test_python_version_check_invalid_types(self, value): build = get_build_config({"python": {"version": value}}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.version" + assert excinfo.value.message_id == ConfigError.INVALID_KEY_NAME + assert excinfo.value.format_values.get("key") == "python.version" def test_python_install_default_value(self): build = get_build_config({}) @@ -723,9 +730,10 @@ def test_python_install_method_check_invalid(self, value, tmpdir): }, source_file=str(tmpdir.join('readthedocs.yml')), ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.install.0.method" + assert excinfo.value.message_id == ConfigValidationError.INVALID_CHOICE + assert excinfo.value.format_values.get("key") == "python.install.0.method" def test_python_install_requirements_check_valid(self, tmpdir): apply_fs(tmpdir, {"requirements.txt": ""}) @@ -757,9 
+765,10 @@ def test_python_install_requirements_does_not_allow_null(self, tmpdir): }, source_file=str(tmpdir.join('readthedocs.yml')), ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.install.0.requirements" + assert excinfo.value.message_id == ConfigValidationError.INVALID_STRING + assert excinfo.value.format_values.get("key") == "python.install.0.requirements" def test_python_install_requirements_error_msg(self, tmpdir): build = get_build_config( @@ -775,13 +784,12 @@ def test_python_install_requirements_error_msg(self, tmpdir): }, source_file=str(tmpdir.join('readthedocs.yml')), ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert ( - str(excinfo.value) - == 'Invalid configuration option "python.install[0].requirements": expected string' - ) + assert str(excinfo.value) == "Build user exception" + # assert registry.get() + # == 'Invalid configuration option "python.install[0].requirements": expected string' def test_python_install_requirements_does_not_allow_empty_string(self, tmpdir): build = get_build_config( @@ -797,9 +805,10 @@ def test_python_install_requirements_does_not_allow_empty_string(self, tmpdir): }, source_file=str(tmpdir.join('readthedocs.yml')), ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.install.0.requirements" + assert excinfo.value.message_id == ConfigValidationError.INVALID_PATH + assert excinfo.value.format_values.get("key") == "python.install.0.requirements" @pytest.mark.parametrize("value", [3, [], {}]) def test_python_install_requirements_check_invalid_types(self, value, tmpdir): @@ -816,9 +825,10 @@ def test_python_install_requirements_check_invalid_types(self, value, tmpdir): }, source_file=str(tmpdir.join('readthedocs.yml')), ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.install.0.requirements" + assert excinfo.value.message_id == ConfigValidationError.INVALID_STRING + assert excinfo.value.format_values.get("key") == "python.install.0.requirements" def test_python_install_path_is_required(self, tmpdir): build = get_build_config( @@ -833,10 +843,10 @@ def test_python_install_path_is_required(self, tmpdir): }, source_file=str(tmpdir.join('readthedocs.yml')), ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.install.0" - assert excinfo.value.code == CONFIG_REQUIRED + assert excinfo.value.message_id == ConfigError.PIP_PATH_OR_REQUIREMENT_REQUIRED + assert excinfo.value.format_values.get("key") == "python.install.0" def test_python_install_pip_check_valid(self, tmpdir): build = get_build_config( @@ -895,9 +905,10 @@ def test_python_install_check_invalid_type(self, value): build = get_build_config( {"python": {"install": value}}, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.install" + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST + assert excinfo.value.format_values.get("key") == "python.install" def test_python_install_extra_requirements_and_pip(self, tmpdir): build = get_build_config( @@ -934,9 +945,9 @@ def test_python_install_extra_requirements_and_setuptools(self, tmpdir): }, source_file=str(tmpdir.join('readthedocs.yml')), ) - with 
raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.install.0.extra_requirements" + assert excinfo.value.message_id == ConfigError.USE_PIP_FOR_EXTRA_REQUIREMENTS @pytest.mark.parametrize("value", [2, "invalid", {}, "", None]) def test_python_install_extra_requirements_check_type(self, value, tmpdir): @@ -954,9 +965,13 @@ def test_python_install_extra_requirements_check_type(self, value, tmpdir): }, source_file=str(tmpdir.join('readthedocs.yml')), ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "python.install.0.extra_requirements" + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST + assert ( + excinfo.value.format_values.get("key") + == "python.install.0.extra_requirements" + ) def test_python_install_extra_requirements_allow_empty(self, tmpdir): build = get_build_config( @@ -1024,9 +1039,10 @@ def test_python_install_several_respects_order(self, tmpdir): @pytest.mark.parametrize("value", [[], True, 0, "invalid"]) def test_sphinx_validate_type(self, value): build = get_build_config({"sphinx": value}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "sphinx" + assert excinfo.value.message_id == ConfigValidationError.INVALID_DICT + assert excinfo.value.format_values.get("key") == "sphinx" def test_sphinx_is_default_doc_type(self): build = get_build_config({}) @@ -1055,9 +1071,10 @@ def test_sphinx_builder_check_valid(self, value, expected): @pytest.mark.parametrize("value", [[], True, 0, "invalid"]) def test_sphinx_builder_check_invalid(self, value): build = get_build_config({"sphinx": {"builder": value}}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "sphinx.builder" + assert excinfo.value.message_id == ConfigValidationError.INVALID_CHOICE + assert excinfo.value.format_values.get("key") == "sphinx.builder" def test_sphinx_builder_default(self): build = get_build_config({}) @@ -1089,9 +1106,9 @@ def test_sphinx_cant_be_used_with_mkdocs(self, tmpdir): }, source_file=str(tmpdir.join("readthedocs.yml")), ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "." 
+ assert excinfo.value.message_id == ConfigError.SPHINX_MKDOCS_CONFIG_TOGETHER def test_sphinx_configuration_allow_null(self): build = get_build_config( @@ -1110,9 +1127,10 @@ def test_sphinx_configuration_validate_type(self, value): build = get_build_config( {"sphinx": {"configuration": value}}, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "sphinx.configuration" + assert excinfo.value.message_id == ConfigValidationError.INVALID_STRING + assert excinfo.value.format_values.get("key") == "sphinx.configuration" @pytest.mark.parametrize("value", [True, False]) def test_sphinx_fail_on_warning_check_valid(self, value): @@ -1123,9 +1141,10 @@ def test_sphinx_fail_on_warning_check_valid(self, value): @pytest.mark.parametrize("value", [[], "invalid", 5]) def test_sphinx_fail_on_warning_check_invalid(self, value): build = get_build_config({"sphinx": {"fail_on_warning": value}}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "sphinx.fail_on_warning" + assert excinfo.value.message_id == ConfigValidationError.INVALID_BOOL + assert excinfo.value.format_values.get("key") == "sphinx.fail_on_warning" def test_sphinx_fail_on_warning_check_default(self): build = get_build_config({}) @@ -1135,9 +1154,10 @@ def test_sphinx_fail_on_warning_check_default(self): @pytest.mark.parametrize("value", [[], True, 0, "invalid"]) def test_mkdocs_validate_type(self, value): build = get_build_config({"mkdocs": value}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "mkdocs" + assert excinfo.value.message_id == ConfigValidationError.INVALID_DICT + assert excinfo.value.format_values.get("key") == "mkdocs" def test_mkdocs_default(self): build = get_build_config({}) @@ -1174,9 +1194,10 @@ def test_mkdocs_configuration_validate_type(self, value): build = get_build_config( {"mkdocs": {"configuration": value}}, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "mkdocs.configuration" + assert excinfo.value.message_id == ConfigValidationError.INVALID_STRING + assert excinfo.value.format_values.get("key") == "mkdocs.configuration" @pytest.mark.parametrize("value", [True, False]) def test_mkdocs_fail_on_warning_check_valid(self, value): @@ -1191,9 +1212,10 @@ def test_mkdocs_fail_on_warning_check_invalid(self, value): build = get_build_config( {"mkdocs": {"fail_on_warning": value}}, ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "mkdocs.fail_on_warning" + assert excinfo.value.message_id == ConfigValidationError.INVALID_BOOL + assert excinfo.value.format_values.get("key") == "mkdocs.fail_on_warning" def test_mkdocs_fail_on_warning_check_default(self): build = get_build_config( @@ -1212,9 +1234,10 @@ def test_submodule_defaults(self): @pytest.mark.parametrize("value", [[], "invalid", 0]) def test_submodules_check_invalid_type(self, value): build = get_build_config({"submodules": value}) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "submodules" + assert excinfo.value.message_id == ConfigValidationError.INVALID_DICT + assert excinfo.value.format_values.get("key") == "submodules" def test_submodules_include_check_valid(self): build = get_build_config( @@ -1238,9 
+1261,10 @@ def test_submodules_include_check_invalid(self, value): }, } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "submodules.include" + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST + assert excinfo.value.format_values.get("key") == "submodules.include" def test_submodules_include_allows_all_keyword(self): build = get_build_config( @@ -1277,9 +1301,10 @@ def test_submodules_exclude_check_invalid(self, value): }, } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "submodules.exclude" + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST + assert excinfo.value.format_values.get("key") == "submodules.exclude" def test_submodules_exclude_allows_all_keyword(self): build = get_build_config( @@ -1303,9 +1328,11 @@ def test_submodules_cant_exclude_and_include(self): }, } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "submodules" + assert ( + excinfo.value.message_id == ConfigError.SUBMODULES_INCLUDE_EXCLUDE_TOGETHER + ) def test_submodules_can_exclude_include_be_empty(self): build = get_build_config( @@ -1346,9 +1373,10 @@ def test_submodules_recursive_check_invalid(self, value): }, } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "submodules.recursive" + assert excinfo.value.message_id == ConfigValidationError.INVALID_BOOL + assert excinfo.value.format_values.get("key") == "submodules.recursive" def test_submodules_recursive_explicit_default(self): build = get_build_config( @@ -1384,9 +1412,10 @@ def test_search_invalid_type(self, value): "search": value, } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "search" + assert excinfo.value.message_id == ConfigValidationError.INVALID_DICT + assert excinfo.value.format_values.get("key") == "search" @pytest.mark.parametrize( 'value', @@ -1413,9 +1442,18 @@ def test_search_ranking_invalid_type(self, value): "search": {"ranking": value}, } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "search.ranking" + + # TODO: these test should be split to validate the exact ``message_id`` + assert excinfo.value.message_id in ( + ConfigValidationError.INVALID_DICT, + ConfigValidationError.INVALID_CHOICE, + ConfigValidationError.INVALID_PATH_PATTERN, + ConfigValidationError.INVALID_STRING, + ) + + assert excinfo.value.format_values.get("key") == "search.ranking" @pytest.mark.parametrize("value", list(range(-10, 10 + 1))) def test_search_valid_ranking(self, value): @@ -1476,9 +1514,13 @@ def test_search_ignore_invalid_type(self, value): "search": {"ignore": value}, } ) - with raises(InvalidConfig) as excinfo: + with raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == "search.ignore" + assert excinfo.value.message_id in ( + ConfigValidationError.INVALID_LIST, + ConfigValidationError.INVALID_STRING, + ) + assert excinfo.value.format_values.get("key") == "search.ignore" @pytest.mark.parametrize( "path, expected", @@ -1549,10 +1591,13 @@ def test_search_ignore_valid_type(self, path, expected): ) def test_strict_validation(self, value, key): build = get_build_config(value) - with raises(InvalidConfig) as excinfo: + with 
raises(ConfigError) as excinfo: build.validate() - assert excinfo.value.key == key - assert excinfo.value.code == INVALID_KEY + assert excinfo.value.message_id in ( + ConfigError.INVALID_KEY_NAME, + ConfigValidationError.INVALID_BOOL, + ) + assert excinfo.value.format_values.get("key") == key @pytest.mark.parametrize( 'value,expected', [ @@ -1596,10 +1641,10 @@ def test_pop_config_default(self): def test_pop_config_raise_exception(self): build = get_build_config({}) - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: build.pop_config("build.invalid", raise_ex=True) - assert excinfo.value.value == "invalid" - assert excinfo.value.code == VALUE_NOT_FOUND + assert excinfo.value.format_values.get("value") == "invalid" + assert excinfo.value.message_id == ConfigValidationError.VALUE_NOT_FOUND def test_as_dict_new_build_config(self, tmpdir): build = get_build_config( diff --git a/readthedocs/config/tests/test_validation.py b/readthedocs/config/tests/test_validation.py index 3a3426ce9ee..c3585b180f2 100644 --- a/readthedocs/config/tests/test_validation.py +++ b/readthedocs/config/tests/test_validation.py @@ -1,11 +1,7 @@ from pytest import raises +from readthedocs.config.exceptions import ConfigValidationError from readthedocs.config.validation import ( - INVALID_BOOL, - INVALID_CHOICE, - INVALID_LIST, - INVALID_STRING, - ValidationError, validate_bool, validate_choice, validate_list, @@ -28,9 +24,9 @@ def test_it_accepts_1(self): assert validate_bool(1) is True def test_it_fails_on_string(self): - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: validate_bool("random string") - assert excinfo.value.code == INVALID_BOOL + assert excinfo.value.message_id == ConfigValidationError.INVALID_BOOL class TestValidateChoice: @@ -38,14 +34,14 @@ def test_it_accepts_valid_choice(self): result = validate_choice("choice", ("choice", "another_choice")) assert result == "choice" - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: validate_choice("c", "abc") - assert excinfo.value.code == INVALID_LIST + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST def test_it_rejects_invalid_choice(self): - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: validate_choice("not-a-choice", ("choice", "another_choice")) - assert excinfo.value.code == INVALID_CHOICE + assert excinfo.value.message_id == ConfigValidationError.INVALID_CHOICE class TestValidateList: @@ -62,14 +58,14 @@ def iterator(): result = validate_list(iterator()) assert result == ["choice"] - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: validate_choice("c", "abc") - assert excinfo.value.code == INVALID_LIST + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST def test_it_rejects_string_types(self): - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: validate_list("choice") - assert excinfo.value.code == INVALID_LIST + assert excinfo.value.message_id == ConfigValidationError.INVALID_LIST class TestValidatePath: @@ -91,9 +87,9 @@ def test_it_returns_relative_path(self, tmpdir): assert path == "a directory" def test_it_only_accepts_strings(self): - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: validate_path(None, "") - assert excinfo.value.code == INVALID_STRING + assert excinfo.value.message_id == ConfigValidationError.INVALID_STRING 
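# Reviewer note (not part of this patch): the updated assertions in these test files rely on
# the new exception contract -- validators now raise ``ConfigValidationError`` carrying a
# ``message_id`` constant and a ``format_values`` dict, replacing the old ``code``/``key``
# attributes. A minimal, self-contained sketch of that contract, using only names that
# appear elsewhere in this patch:
from readthedocs.config.exceptions import ConfigValidationError
from readthedocs.config.validation import validate_bool

try:
    validate_bool("random string")
except ConfigValidationError as exc:
    # ``message_id`` identifies the registered message; ``format_values`` carries the
    # values used to render the header/body at display time.
    assert exc.message_id == ConfigValidationError.INVALID_BOOL
    assert exc.format_values.get("value") == "random string"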
class TestValidateString: @@ -106,11 +102,11 @@ def test_it_accepts_nonunicode(self): assert isinstance(result, str) def test_it_rejects_float(self): - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: validate_string(123.456) - assert excinfo.value.code == INVALID_STRING + assert excinfo.value.message_id == ConfigValidationError.INVALID_STRING def test_it_rejects_none(self): - with raises(ValidationError) as excinfo: + with raises(ConfigValidationError) as excinfo: validate_string(None) - assert excinfo.value.code == INVALID_STRING + assert excinfo.value.message_id == ConfigValidationError.INVALID_STRING diff --git a/readthedocs/config/validation.py b/readthedocs/config/validation.py index 0b8bb894efd..fd08955f0f3 100644 --- a/readthedocs/config/validation.py +++ b/readthedocs/config/validation.py @@ -2,66 +2,47 @@ import os -INVALID_BOOL = "invalid-bool" -INVALID_CHOICE = "invalid-choice" -INVALID_LIST = "invalid-list" -INVALID_DICT = "invalid-dictionary" -INVALID_PATH = "invalid-path" -INVALID_PATH_PATTERN = "invalid-path-pattern" -INVALID_STRING = "invalid-string" -VALUE_NOT_FOUND = "value-not-found" - - -class ValidationError(Exception): - - """Base error for validations.""" - - messages = { - INVALID_BOOL: "expected one of (0, 1, true, false), got {value}", - INVALID_CHOICE: "expected one of ({choices}), got {value}", - INVALID_DICT: "{value} is not a dictionary", - INVALID_PATH: "path {value} does not exist", - INVALID_PATH_PATTERN: "{value} isn't a valid path pattern", - INVALID_STRING: "expected string", - INVALID_LIST: "expected list", - VALUE_NOT_FOUND: "{value} not found", - } - - def __init__(self, value, code, format_kwargs=None): - self.value = value - self.code = code - defaults = { - "value": value, - } - if format_kwargs is not None: - defaults.update(format_kwargs) - message = self.messages[code].format(**defaults) - super().__init__(message) +from .exceptions import ConfigValidationError def validate_list(value): """Check if ``value`` is an iterable.""" if isinstance(value, (dict, str)): - raise ValidationError(value, INVALID_LIST) + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_LIST, + format_values={ + "value": value, + }, + ) if not hasattr(value, "__iter__"): - raise ValidationError(value, INVALID_LIST) + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_LIST, + format_values={ + "value": value, + }, + ) return list(value) def validate_dict(value): """Check if ``value`` is a dictionary.""" if not isinstance(value, dict): - raise ValidationError(value, INVALID_DICT) + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_DICT, + format_values={ + "value": value, + }, + ) def validate_choice(value, choices): """Check that ``value`` is in ``choices``.""" choices = validate_list(choices) if value not in choices: - raise ValidationError( - value, - INVALID_CHOICE, - { + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_CHOICE, + format_values={ + "value": value, "choices": ", ".join(map(str, choices)), }, ) @@ -71,7 +52,12 @@ def validate_choice(value, choices): def validate_bool(value): """Check that ``value`` is an boolean value.""" if value not in (0, 1, False, True): - raise ValidationError(value, INVALID_BOOL) + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_BOOL, + format_values={ + "value": value, + }, + ) return bool(value) @@ -79,7 +65,12 @@ def validate_path(value, base_path): """Check that ``value`` is a valid 
path name and normamlize it.""" string_value = validate_string(value) if not string_value: - raise ValidationError(value, INVALID_PATH) + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_PATH, + format_values={ + "value": value, + }, + ) full_path = os.path.join(base_path, string_value) rel_path = os.path.relpath(full_path, base_path) return rel_path @@ -98,16 +89,31 @@ def validate_path_pattern(value): path = "/" + path.lstrip("/") path = os.path.normpath(path) if not os.path.isabs(path): - raise ValidationError(value, INVALID_PATH_PATTERN) + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_PATH_PATTERN, + format_values={ + "value": value, + }, + ) # Remove ``/`` from the path once is validated. path = path.lstrip("/") if not path: - raise ValidationError(value, INVALID_PATH_PATTERN) + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_PATH_PATTERN, + format_values={ + "value": value, + }, + ) return path def validate_string(value): """Check that ``value`` is a string type.""" if not isinstance(value, str): - raise ValidationError(value, INVALID_STRING) + raise ConfigValidationError( + message_id=ConfigValidationError.INVALID_STRING, + format_values={ + "value": value, + }, + ) return str(value) diff --git a/readthedocs/core/admin.py b/readthedocs/core/admin.py index c01c00d4e4d..e488758d093 100644 --- a/readthedocs/core/admin.py +++ b/readthedocs/core/admin.py @@ -10,8 +10,6 @@ from django.utils.html import format_html from django.utils.safestring import mark_safe from django.utils.translation import gettext_lazy as _ -from messages_extends.admin import MessageAdmin -from messages_extends.models import Message from rest_framework.authtoken.admin import TokenAdmin from readthedocs.core.history import ExtraSimpleHistoryAdmin @@ -127,30 +125,5 @@ class UserProfileAdmin(ExtraSimpleHistoryAdmin): raw_id_fields = ("user",) -class MessageAdminExtra(MessageAdmin): - list_display = [ - "user", - "organizations", - "message", - "created", - "read", - ] - list_filter = [ - "read", - ] - search_fields = [ - "user__username", - "message", - "user__organizationowner__organization__slug", - ] - - def organizations(self, obj): - return ", ".join( - organization.slug for organization in obj.user.owner_organizations.all() - ) - - admin.site.unregister(User) admin.site.register(User, UserAdminExtra) -admin.site.unregister(Message) -admin.site.register(Message, MessageAdminExtra) diff --git a/readthedocs/core/context_processors.py b/readthedocs/core/context_processors.py index c7c8aa2b6df..99b75fccaf2 100644 --- a/readthedocs/core/context_processors.py +++ b/readthedocs/core/context_processors.py @@ -1,9 +1,19 @@ """Template context processors for core app.""" from django.conf import settings +from django.contrib.contenttypes.models import ContentType def readthedocs_processor(request): + """ + Context processor to include global settings to templates. + + Note that we are not using the ``request`` object at all here. + It's preferable to keep it like that since it's used from places where there is no request. + + If you need to add something that depends on the request, + create a new context processor. + """ exports = { "PUBLIC_DOMAIN": settings.PUBLIC_DOMAIN, "PRODUCTION_DOMAIN": settings.PRODUCTION_DOMAIN, @@ -18,3 +28,29 @@ def readthedocs_processor(request): "PUBLIC_API_URL": settings.PUBLIC_API_URL, } return exports + + +def user_notifications(request): + """ + Context processor to include user's notification to templates. 
+ + We can't use ``request.user.notifications.all()`` because we are not using a ``CustomUser``. + If we want to go that route, we should define a ``CustomUser`` and define a ``GenericRelation``. + + See https://docs.djangoproject.com/en/4.2/ref/settings/#std-setting-AUTH_USER_MODEL + """ + + # Import here due to circular import + from readthedocs.notifications.models import Notification + + user_notifications = Notification.objects.none() + if request.user.is_authenticated: + content_type = ContentType.objects.get_for_model(request.user) + user_notifications = Notification.objects.filter( + attached_to_content_type=content_type, + attached_to_id=request.user.pk, + ) + + return { + "user_notifications": user_notifications, + } diff --git a/readthedocs/core/notifications.py b/readthedocs/core/notifications.py new file mode 100644 index 00000000000..f38e37264c5 --- /dev/null +++ b/readthedocs/core/notifications.py @@ -0,0 +1,28 @@ +"""Read the Docs notifications.""" + +import textwrap + +from django.utils.translation import gettext_lazy as _ + +from readthedocs.notifications.constants import WARNING +from readthedocs.notifications.messages import Message, registry + +MESSAGE_EMAIL_VALIDATION_PENDING = "core:email:validation-pending" +messages = [ + Message( + id=MESSAGE_EMAIL_VALIDATION_PENDING, + header=_("Email address not verified"), + body=_( + textwrap.dedent( + """ + Your primary email address is not verified. + Please verify it here. + """ + ).strip(), + ), + type=WARNING, + ), +] + + +registry.add(messages) diff --git a/readthedocs/core/tasks.py b/readthedocs/core/tasks.py index ce4e046f608..157d7c91822 100644 --- a/readthedocs/core/tasks.py +++ b/readthedocs/core/tasks.py @@ -6,8 +6,6 @@ import structlog from django.conf import settings from django.core.mail import EmailMultiAlternatives -from django.utils import timezone -from messages_extends.models import Message as PersistentMessage from readthedocs.worker import app @@ -51,16 +49,6 @@ def send_email_task( msg.send() -@app.task(queue="web") -def clear_persistent_messages(): - # Delete all expired message_extend's messages - log.info("Deleting all expired message_extend's messages") - expired_messages = PersistentMessage.objects.filter( - expires__lt=timezone.now(), - ) - expired_messages.delete() - - @app.task(queue="web") def cleanup_pidbox_keys(): """ diff --git a/readthedocs/core/tests/test_contact.py b/readthedocs/core/tests/test_contact.py index becb6e8aa70..8b4b6462db1 100644 --- a/readthedocs/core/tests/test_contact.py +++ b/readthedocs/core/tests/test_contact.py @@ -5,7 +5,6 @@ from django_dynamic_fixture import get from readthedocs.core.utils.contact import contact_users -from readthedocs.notifications.backends import SiteBackend User = get_user_model() @@ -17,14 +16,12 @@ def setUp(self): self.user_three = get(User, username="test3", email="three@test.com") @mock.patch("readthedocs.core.utils.contact.send_mail") - @mock.patch.object(SiteBackend, "send") - def test_contact_users_dryrun(self, send_notification, send_mail): + def test_contact_users_dryrun(self, send_mail): self.assertEqual(User.objects.all().count(), 3) resp = contact_users( users=User.objects.all(), email_subject="Subject", email_content="Content", - notification_content="Notification", dryrun=True, ) self.assertEqual( @@ -34,29 +31,18 @@ def test_contact_users_dryrun(self, send_notification, send_mail): "sent": {"one@test.com", "two@test.com", "three@test.com"}, "failed": set(), }, - "notification": { - "sent": { - self.user.username, - 
self.user_two.username, - self.user_three.username, - }, - "failed": set(), - }, }, ) - self.assertEqual(send_notification.call_count, 0) self.assertEqual(send_mail.call_count, 0) @mock.patch("readthedocs.core.utils.contact.send_mail") - @mock.patch.object(SiteBackend, "send") - def test_contact_users_not_dryrun(self, send_notification, send_mail): + def test_contact_users_not_dryrun(self, send_mail): self.assertEqual(User.objects.all().count(), 3) resp = contact_users( users=User.objects.all(), email_subject="Subject", email_content="Content", - notification_content="Notification", dryrun=False, ) self.assertEqual( @@ -66,16 +52,7 @@ def test_contact_users_not_dryrun(self, send_notification, send_mail): "sent": {"one@test.com", "two@test.com", "three@test.com"}, "failed": set(), }, - "notification": { - "sent": { - self.user.username, - self.user_two.username, - self.user_three.username, - }, - "failed": set(), - }, }, ) - self.assertEqual(send_notification.call_count, 3) self.assertEqual(send_mail.call_count, 3) diff --git a/readthedocs/core/tests/test_filesystem_utils.py b/readthedocs/core/tests/test_filesystem_utils.py index 6afe31edd51..6c744a975f8 100644 --- a/readthedocs/core/tests/test_filesystem_utils.py +++ b/readthedocs/core/tests/test_filesystem_utils.py @@ -7,7 +7,7 @@ from readthedocs.core.utils.filesystem import safe_copytree, safe_open, safe_rmtree from readthedocs.doc_builder.exceptions import ( - FileTooLarge, + BuildUserError, SymlinkOutsideBasePath, UnsupportedSymlinkFileError, ) @@ -110,7 +110,7 @@ def test_open_large_file(self): file_a.write_bytes(b"0" * (1024 * 2)) with override_settings(DOCROOT=docroot_path): - with pytest.raises(FileTooLarge): + with pytest.raises(BuildUserError): safe_open(file_a, max_size_bytes=1024) def test_write_file(self): diff --git a/readthedocs/core/utils/__init__.py b/readthedocs/core/utils/__init__.py index f7b821f11b2..7e30678fd63 100644 --- a/readthedocs/core/utils/__init__.py +++ b/readthedocs/core/utils/__init__.py @@ -18,6 +18,7 @@ EXTERNAL, ) from readthedocs.doc_builder.exceptions import BuildCancelled, BuildMaxConcurrencyError +from readthedocs.notifications.models import Notification from readthedocs.worker import app log = structlog.get_logger(__name__) @@ -153,10 +154,13 @@ def prepare_build( # and the user will be alerted in the UI from the Error below. 
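# Reviewer note (illustrative sketch, not part of this patch): in the hunk below, the
# concurrency-limit and cancellation states are attached to the build as ``Notification``
# objects instead of being stored on ``build.error``. They can later be read back through
# the generic content-type relation, mirroring the filter used by the new
# ``user_notifications`` context processor earlier in this patch:
from django.contrib.contenttypes.models import ContentType
from readthedocs.notifications.models import Notification

content_type = ContentType.objects.get_for_model(build)
build_notifications = Notification.objects.filter(
    attached_to_content_type=content_type,
    attached_to_id=build.pk,
)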
options["countdown"] = settings.RTD_BUILDS_RETRY_DELAY options["max_retries"] = settings.RTD_BUILDS_MAX_RETRIES - build.error = BuildMaxConcurrencyError.message.format( - limit=max_concurrent_builds, + + Notification.objects.add( + message_id=BuildMaxConcurrencyError.LIMIT_REACHED, + attached_to=build, + dismissable=False, + format_values={"limit": max_concurrent_builds}, ) - build.save() _, build_api_key = BuildAPIKey.objects.create_key(project=project) @@ -244,7 +248,14 @@ def cancel_build(build): terminate = False build.state = BUILD_STATE_CANCELLED build.success = False - build.error = BuildCancelled.message + + # Add a notification for this build + Notification.objects.add( + message_id=BuildCancelled.CANCELLED_BY_USER, + attached_to=build, + dismissable=False, + ) + build.length = 0 build.save() else: diff --git a/readthedocs/core/utils/contact.py b/readthedocs/core/utils/contact.py index eccb022b569..3fd20c4a5f1 100644 --- a/readthedocs/core/utils/contact.py +++ b/readthedocs/core/utils/contact.py @@ -4,71 +4,53 @@ from django.conf import settings from django.core.mail import send_mail from django.template import Context, Engine -from messages_extends import constants as message_constants -from readthedocs.notifications import SiteNotification -from readthedocs.notifications.backends import SiteBackend log = structlog.get_logger(__name__) +# TODO: re-implement sending notifications to users. +# This needs more thinking because the notifications where "Generic", +# and now we need to register a ``Message`` with its ID. def contact_users( users, email_subject=None, email_content=None, from_email=None, - notification_content=None, - sticky_notification=False, context_function=None, dryrun=True, ): """ - Send an email or a sticky notification to a list of users. + Send an email to a list of users. :param users: Queryset of Users. :param string email_subject: Email subject :param string email_content: Email content (markdown) :param string from_email: Email to sent from (Test Support ) - :param string notification_content: Content for the sticky notification (markdown) :param context_function: A callable that will receive an user - and return a dict of additional context to be used in the email/notification content - :param bool dryrun: If `True` don't sent the email or notification, just logs the content + and return a dict of additional context to be used in the email content + :param bool dryrun: If `True` don't sent the email, just logs the content - The `email_content` and `notification_content` contents will be rendered using - a template with the following context:: + The `email_content` contents will be rendered using a template with the following context:: { 'user': , 'production_uri': https://readthedocs.org, } - :returns: A dictionary with a list of sent/failed emails/notifications. + :returns: A dictionary with a list of sent/failed emails. 
""" from_email = from_email or settings.DEFAULT_FROM_EMAIL context_function = context_function or (lambda user: {}) sent_emails = set() failed_emails = set() - sent_notifications = set() - failed_notifications = set() - - backend = SiteBackend(request=None) engine = Engine.get_default() - notification_template = engine.from_string(notification_content or "") email_template = engine.from_string(email_content or "") email_txt_template = engine.get_template("core/email/common.txt") email_html_template = engine.get_template("core/email/common.html") - class TempNotification(SiteNotification): - if sticky_notification: - success_level = message_constants.SUCCESS_PERSISTENT - - def render(self, *args, **kwargs): - return markdown.markdown( - notification_template.render(Context(self.get_context_data())) - ) - total = users.count() for count, user in enumerate(users.iterator(), start=1): context = { @@ -77,35 +59,6 @@ def render(self, *args, **kwargs): } context.update(context_function(user)) - if notification_content: - notification = TempNotification( - user=user, - success=True, - extra_context=context, - ) - try: - if not dryrun: - backend.send(notification) - else: - # Check we can render the notification with the context properly - log.debug( - "Rendered notification.", - notification=markdown.markdown( - notification_template.render(Context(context)) - ), - ) - except Exception: - log.exception("Notification failed to send") - failed_notifications.add(user.username) - else: - log.info( - "Successfully sent notification.", - user_username=user.username, - count=count, - total=total, - ) - sent_notifications.add(user.username) - if email_subject: emails = list( user.emailaddress_set.filter(verified=True) @@ -148,8 +101,4 @@ def render(self, *args, **kwargs): "sent": sent_emails, "failed": failed_emails, }, - "notification": { - "sent": sent_notifications, - "failed": failed_emails, - }, } diff --git a/readthedocs/core/utils/filesystem.py b/readthedocs/core/utils/filesystem.py index d98f7deb90d..fe63dc769b9 100644 --- a/readthedocs/core/utils/filesystem.py +++ b/readthedocs/core/utils/filesystem.py @@ -13,8 +13,8 @@ from django.core.exceptions import SuspiciousFileOperation from readthedocs.doc_builder.exceptions import ( + BuildUserError, FileIsNotRegularFile, - FileTooLarge, SymlinkOutsideBasePath, UnsupportedSymlinkFileError, ) @@ -77,11 +77,11 @@ def safe_open( ) if path.exists() and not path.is_file(): - raise FileIsNotRegularFile() + raise FileIsNotRegularFile(FileIsNotRegularFile.SYMLINK_USED) if not allow_symlinks and path.is_symlink(): - log.info("Skipping file becuase it's a symlink.") - raise UnsupportedSymlinkFileError() + log.info("Skipping file because it's a symlink.") + raise UnsupportedSymlinkFileError(UnsupportedSymlinkFileError.SYMLINK_USED) # Expand symlinks. resolved_path = path.resolve() @@ -90,14 +90,14 @@ def safe_open( file_size = resolved_path.stat().st_size if file_size > max_size_bytes: log.info("File is too large.", size_bytes=file_size) - raise FileTooLarge() + raise BuildUserError(BuildUserError.FILE_TOO_LARGE) if allow_symlinks and base_path: base_path = Path(base_path).absolute() if not resolved_path.is_relative_to(base_path): # Trying to path traversal via a symlink, sneaky! 
log.info("Path traversal via symlink.") - raise SymlinkOutsideBasePath() + raise SymlinkOutsideBasePath(SymlinkOutsideBasePath.SYMLINK_USED) _assert_path_is_inside_docroot(resolved_path) diff --git a/readthedocs/doc_builder/backends/mkdocs.py b/readthedocs/doc_builder/backends/mkdocs.py index 7f36b49eafd..6fef3009818 100644 --- a/readthedocs/doc_builder/backends/mkdocs.py +++ b/readthedocs/doc_builder/backends/mkdocs.py @@ -102,7 +102,10 @@ def load_yaml_config(self): ) if not result: raise UserFileNotFound( - UserFileNotFound.FILE_NOT_FOUND.format(self.yaml_file) + message_id=UserFileNotFound.FILE_NOT_FOUND, + format_values={ + "filename": self.yaml_file, + }, ) config = yaml_load_safely(result) @@ -124,8 +127,7 @@ def load_yaml_config(self): mark.column + 1, ) raise MkDocsYAMLParseError( - 'Your mkdocs.yml could not be loaded, ' - 'possibly due to a syntax error{note}'.format(note=note), + MkDocsYAMLParseError.SYNTAX_ERROR, ) from exc def append_conf(self): diff --git a/readthedocs/doc_builder/backends/sphinx.py b/readthedocs/doc_builder/backends/sphinx.py index 35c7d7503fd..b4d2ddc6988 100644 --- a/readthedocs/doc_builder/backends/sphinx.py +++ b/readthedocs/doc_builder/backends/sphinx.py @@ -18,7 +18,6 @@ from readthedocs.builds import utils as version_utils from readthedocs.builds.models import APIVersion from readthedocs.core.utils.filesystem import safe_open -from readthedocs.doc_builder.exceptions import PDFNotFound from readthedocs.projects.constants import OLD_LANGUAGES_CODE_MAPPING, PUBLIC from readthedocs.projects.exceptions import ProjectConfigurationError, UserFileNotFound from readthedocs.projects.models import Feature @@ -502,7 +501,7 @@ def build(self): tex_files = glob(os.path.join(self.absolute_host_output_dir, "*.tex")) if not tex_files: - raise BuildUserError("No TeX files were found.") + raise BuildUserError(message_id=BuildUserError.TEX_FILE_NOT_FOUND) # Run LaTeX -> PDF conversions success = self._build_latexmk(self.project_path) @@ -581,7 +580,7 @@ def _post_build(self): """Internal post build to cleanup PDF output directory and leave only one .pdf file.""" if not self.pdf_file_name: - raise PDFNotFound() + raise BuildUserError(BuildUserError.PDF_NOT_FOUND) # TODO: merge this with ePUB since it's pretty much the same temp_pdf_file = f"/tmp/{self.project.slug}-{self.version.slug}.pdf" diff --git a/readthedocs/doc_builder/director.py b/readthedocs/doc_builder/director.py index fcbb4c47c2e..b8b9aaa2a04 100644 --- a/readthedocs/doc_builder/director.py +++ b/readthedocs/doc_builder/director.py @@ -443,6 +443,11 @@ def check_old_output_directory(self): def run_build_commands(self): """Runs each build command in the build environment.""" + + self.attach_notification( + message_id=BuildUserError.BUILD_COMMANDS_IN_BETA, + ) + reshim_commands = ( {"pip", "install"}, {"conda", "create"}, @@ -767,3 +772,27 @@ def store_readthedocs_build_yaml(self): # It will be saved when the API is hit. # This data will be used by the `/_/readthedocs-config.json` API endpoint. self.data.version.build_data = data + + def attach_notification( + self, + message_id, + format_values=None, + state="unread", + dismissable=False, + news=False, + ): + """Attach a notification to build in progress using the APIv2.""" + + format_values = format_values or {} + # NOTE: we are using APIv2 here because it uses BuildAPIKey authentication, + # which is not currently supported by APIv3. 
+ self.data.api_client.notifications.post( + { + "attached_to": f'build/{self.data.build["id"]}', + "message_id": message_id, + "state": state, # Optional + "dismissable": dismissable, + "news": news, + "format_values": format_values, + } + ) diff --git a/readthedocs/doc_builder/environments.py b/readthedocs/doc_builder/environments.py index 609fc234abc..9950498f315 100644 --- a/readthedocs/doc_builder/environments.py +++ b/readthedocs/doc_builder/environments.py @@ -31,7 +31,7 @@ DOCKER_VERSION, RTD_SKIP_BUILD_EXIT_CODE, ) -from .exceptions import BuildAppError, BuildUserError, BuildUserSkip +from .exceptions import BuildAppError, BuildUserError log = structlog.get_logger(__name__) @@ -527,13 +527,13 @@ def run_command_class( version_slug=self.version.slug if self.version else "", ) elif build_cmd.exit_code == RTD_SKIP_BUILD_EXIT_CODE: - raise BuildUserSkip() + raise BuildUserError(BuildUserError.SKIPPED_EXIT_CODE_183) else: # TODO: for now, this still outputs a generic error message # that is the same across all commands. We could improve this # with more granular error messages that vary by the command # being run. - raise BuildUserError() + raise BuildUserError(BuildUserError.GENERIC) return build_cmd @@ -776,22 +776,17 @@ def raise_container_error(self, state): """ if state is not None and state.get("Running") is False: if state.get("ExitCode") == DOCKER_TIMEOUT_EXIT_CODE: - raise BuildUserError( - _("Build exited due to time out"), - ) + raise BuildUserError(message_id=BuildUserError.BUILD_TIME_OUT) if state.get("OOMKilled", False): - raise BuildUserError( - _("Build exited due to excessive memory consumption"), - ) + raise BuildUserError(message_id=BuildUserError.BUILD_EXCESSIVE_MEMORY) if state.get("Error"): raise BuildAppError( - ( - _("Build exited due to unknown error: {0}").format( - state.get("Error") - ), - ) + message_id=BuildAppError.BUILD_DOCKER_UNKNOWN_ERROR, + format_values={ + "message": state.get("Error"), + }, ) def create_container(self): diff --git a/readthedocs/doc_builder/exceptions.py b/readthedocs/doc_builder/exceptions.py index 9597fb9e7ce..b13e9ba2b74 100644 --- a/readthedocs/doc_builder/exceptions.py +++ b/readthedocs/doc_builder/exceptions.py @@ -1,156 +1,86 @@ -"""Exceptions raised when building documentation.""" +""" +Exceptions raised when building documentation. -from django.utils.translation import gettext_noop +Exceptions defined here have different categories based on + + - responsibility (user or application) + - special cases (they need to be handled in a special way, e.g. concurrency limit reached) + - grouped by topic (e.g. MkDocs errors) + +All these exceptions should only define the "message id" under one of these categories. +Then the header/body texts should be defined in ``readthedocs/notifications/messages.py``.
+""" -from readthedocs.builds.constants import BUILD_STATE_CANCELLED -from readthedocs.projects.constants import BUILD_COMMANDS_OUTPUT_PATH_HTML class BuildBaseException(Exception): - message = None - status_code = None - def __init__(self, message=None, **kwargs): - self.status_code = ( - kwargs.pop( - "status_code", - None, - ) - or self.status_code - or 1 - ) - self.message = message or self.message or self.get_default_message() - super().__init__(message, **kwargs) + default_message = "Build user exception" - def get_default_message(self): - return self.message + def __init__(self, message_id, format_values=None, **kwargs): + self.message_id = message_id + self.format_values = format_values + super().__init__(self.default_message, **kwargs) class BuildAppError(BuildBaseException): - GENERIC_WITH_BUILD_ID = gettext_noop( - "There was a problem with Read the Docs while building your documentation. " - "Please try again later. " - "If this problem persists, " - "report this error to us with your build id ({build_id}).", - ) + default_message = "Build app exception" + GENERIC_WITH_BUILD_ID = "build:app:generic-with-build-id" + BUILDS_DISABLED = "build:app:project-builds-disabled" + BUILD_DOCKER_UNKNOWN_ERROR = "build:app:docker:unknown-error" + BUILD_TERMINATED_DUE_INACTIVITY = "build:app:terminated-due-inactivity" -class BuildUserError(BuildBaseException): - GENERIC = gettext_noop( - "We encountered a problem with a command while building your project. " - "To resolve this error, double check your project configuration and installed " - "dependencies are correct and have not changed recently." - ) - BUILD_COMMANDS_WITHOUT_OUTPUT = gettext_noop( - f'No "{BUILD_COMMANDS_OUTPUT_PATH_HTML}" folder was created during this build.' - ) - BUILD_OUTPUT_IS_NOT_A_DIRECTORY = gettext_noop( - 'Build output directory for format "{artifact_type}" is not a directory.' - ) - BUILD_OUTPUT_HAS_0_FILES = gettext_noop( - 'Build output directory for format "{artifact_type}" does not contain any files. ' - "It seems the build process created the directory but did not save any file to it." - ) - BUILD_OUTPUT_HAS_MULTIPLE_FILES = gettext_noop( - 'Build output directory for format "{artifact_type}" contains multiple files ' - "and it is not currently supported. " - 'Please, remove all the files but the "{artifact_type}" you want to upload.' - ) - BUILD_OUTPUT_HTML_NO_INDEX_FILE = gettext_noop( - "Your documentation did not generate an 'index.html' at its root directory. " - "This is required for documentation serving at the root URL for this version." - ) - BUILD_OUTPUT_OLD_DIRECTORY_USED = gettext_noop( - "Some files were detected in an unsupported output path, '_build/html'. " - "Ensure your project is configured to use the output path " - "'$READTHEDOCS_OUTPUT/html' instead." - ) - NO_CONFIG_FILE_DEPRECATED = gettext_noop( - "The configuration file required to build documentation is missing from your project. " - "Add a configuration file to your project to make it build successfully. " - "Read more at https://docs.readthedocs.io/en/stable/config-file/v2.html" - ) - BUILD_IMAGE_CONFIG_KEY_DEPRECATED = gettext_noop( - 'The configuration key "build.image" is deprecated. ' - 'Use "build.os" instead to continue building your project. ' - "Read more at https://docs.readthedocs.io/en/stable/config-file/v2.html#build-os" - ) - BUILD_OS_REQUIRED = gettext_noop( - 'The configuration key "build.os" is required to build your documentation. 
' - "Read more at https://docs.readthedocs.io/en/stable/config-file/v2.html#build-os" - ) - - -class BuildUserSkip(BuildUserError): - message = gettext_noop("This build was manually skipped using a command exit code.") - state = BUILD_STATE_CANCELLED - - -class ProjectBuildsSkippedError(BuildUserError): - message = gettext_noop("Builds for this project are temporarily disabled") - - -class YAMLParseError(BuildUserError): - GENERIC_WITH_PARSE_EXCEPTION = gettext_noop( - "Problem in your project's configuration. {exception}", - ) +class BuildUserError(BuildBaseException): + GENERIC = "build:user:generic" + SKIPPED_EXIT_CODE_183 = "build:user:exit-code-183" + MAX_CONCURRENCY = "build:user:max-concurrency" -class BuildMaxConcurrencyError(BuildUserError): - message = gettext_noop( - "Concurrency limit reached ({limit}), retrying in 5 minutes." - ) + BUILD_COMMANDS_WITHOUT_OUTPUT = "build:user:output:no-html" + BUILD_OUTPUT_IS_NOT_A_DIRECTORY = "build:user:output:is-no-a-directory" + BUILD_OUTPUT_HAS_0_FILES = "build:user:output:has-0-files" + BUILD_OUTPUT_HAS_NO_PDF_FILES = "build:user:output:has-no-pdf-files" + BUILD_OUTPUT_HAS_MULTIPLE_FILES = "build:user:output:has-multiple-files" + BUILD_OUTPUT_HTML_NO_INDEX_FILE = "build:user:output:html-no-index-file" + BUILD_OUTPUT_OLD_DIRECTORY_USED = "build:user:output:old-directory-used" + FILE_TOO_LARGE = "build:user:output:file-too-large" + TEX_FILE_NOT_FOUND = "build:user:tex-file-not-found" + NO_CONFIG_FILE_DEPRECATED = "build:user:config:no-config-file" + BUILD_IMAGE_CONFIG_KEY_DEPRECATED = "build:user:config:build-image-deprecated" + BUILD_OS_REQUIRED = "build:user:config:build-os-required" -class BuildCancelled(BuildUserError): - message = gettext_noop("Build cancelled by user.") - state = BUILD_STATE_CANCELLED + BUILD_COMMANDS_IN_BETA = "build:user:build-commands-config-key-in-beta" + BUILD_TIME_OUT = "build:user:time-out" + BUILD_EXCESSIVE_MEMORY = "build:user:excessive-memory" -class PDFNotFound(BuildUserError): - message = gettext_noop( - 'PDF file was not generated/found in "_readthedocs/pdf" output directory.' - ) +class BuildMaxConcurrencyError(BuildUserError): + LIMIT_REACHED = "build:user:concurrency-limit-reached" -class MkDocsYAMLParseError(BuildUserError): - GENERIC_WITH_PARSE_EXCEPTION = gettext_noop( - "Problem parsing MkDocs YAML configuration. {exception}", - ) - - INVALID_DOCS_DIR_CONFIG = gettext_noop( - 'The "docs_dir" config from your MkDocs YAML config file has to be a ' - "string with relative or absolute path.", - ) - - INVALID_DOCS_DIR_PATH = gettext_noop( - 'The "docs_dir" config from your MkDocs YAML config file does not ' - "contain a valid path.", - ) - - INVALID_EXTRA_CONFIG = gettext_noop( - 'The "{config}" config from your MkDocs YAML config file has to be a ' - "list of relative paths.", - ) - - EMPTY_CONFIG = gettext_noop( - "Please make sure the MkDocs YAML configuration file is not empty.", - ) - NOT_FOUND = gettext_noop( - "A configuration file was not found. " - 'Make sure you have a "mkdocs.yml" file in your repository.', - ) - - CONFIG_NOT_DICT = gettext_noop( - "Your MkDocs YAML config file is incorrect. 
" - "Please follow the user guide https://www.mkdocs.org/user-guide/configuration/ " - "to configure the file properly.", - ) +class BuildCancelled(BuildUserError): + CANCELLED_BY_USER = "build:user:cancelled" +class MkDocsYAMLParseError(BuildUserError): + GENERIC_WITH_PARSE_EXCEPTION = "build:user:mkdocs:yaml-parse" + INVALID_DOCS_DIR_CONFIG = "build:user:mkdocs:invalid-dir-config" + INVALID_DOCS_DIR_PATH = "build:user:mkdocs:invalid-dir-path" + INVALID_EXTRA_CONFIG = "build:user:mkdocs:invalid-extra-config" + EMPTY_CONFIG = "build:user:mkdocs:empty-config" + NOT_FOUND = "build:user:mkdocs:config-not-found" + CONFIG_NOT_DICT = "build:user:mkdocs:invalid-yaml" + SYNTAX_ERROR = "build:user:mkdocs:syntax-error" + + +# NOTE: there is no need to have three different error classes for this. +# We can merge all of them in one, always raise the same exception with different messages. +# # TODO: improve messages for symlink errors with a more detailed error and include the `filepath`. class UnsupportedSymlinkFileError(BuildUserError): - message = gettext_noop("Symlinks are not fully supported") + SYMLINK_USED = "build:user:symlink:used" class FileIsNotRegularFile(UnsupportedSymlinkFileError): @@ -159,10 +89,3 @@ class FileIsNotRegularFile(UnsupportedSymlinkFileError): class SymlinkOutsideBasePath(UnsupportedSymlinkFileError): pass - - -class FileTooLarge(BuildUserError): - message = gettext_noop( - "A file from your build process is too large to be processed by Read the Docs. " - "Please ensure no files generated are larger than 1GB." - ) diff --git a/readthedocs/domains/notifications.py b/readthedocs/domains/notifications.py index ae787b0d2b8..57a3694e9d6 100644 --- a/readthedocs/domains/notifications.py +++ b/readthedocs/domains/notifications.py @@ -1,12 +1,51 @@ """Notifications related to custom domains.""" +import textwrap -from readthedocs.notifications import Notification -from readthedocs.notifications.constants import REQUIREMENT +from django.utils.translation import gettext_noop as _ +from readthedocs.notifications.constants import INFO +from readthedocs.notifications.email import EmailNotification +from readthedocs.notifications.messages import Message, registry -class PendingCustomDomainValidation(Notification): + +class PendingCustomDomainValidation(EmailNotification): app_templates = "domains" context_object_name = "domain" name = "pending_domain_configuration" subject = "Pending configuration of custom domain {{ domain.domain }}" - level = REQUIREMENT + + +MESSAGE_DOMAIN_VALIDATION_PENDING = "project:domain:validation-pending" +MESSAGE_DOMAIN_VALIDATION_EXPIRED = "project:domain:validation-expired" +messages = [ + Message( + id=MESSAGE_DOMAIN_VALIDATION_PENDING, + header=_("Pending configuration of custom domain: {domain}"), + body=_( + textwrap.dedent( + """ + The configuration of your custom domain {domain} is pending. + Go to the domain page and follow the instructions to complete it. + """ + ).strip(), + ), + type=INFO, + ), + # TODO: the custom domain expired notification requires a periodic task to + # remove the old notification and create a new one pointing to this + # ``message_id`` + Message( + id=MESSAGE_DOMAIN_VALIDATION_EXPIRED, + header=_("Validation of custom domain expired: {domain}"), + body=_( + textwrap.dedent( + """ + The validation period of your custom domain {domain} has ended. + Go to the domain page and click on "Save" to restart the process. 
+ """ + ).strip(), + ), + type=INFO, + ), +] +registry.add(messages) diff --git a/readthedocs/domains/tasks.py b/readthedocs/domains/tasks.py index a8ad7d9495e..2a12bd0956c 100644 --- a/readthedocs/domains/tasks.py +++ b/readthedocs/domains/tasks.py @@ -1,11 +1,15 @@ """Tasks related to custom domains.""" from django.conf import settings -from django.http import HttpRequest +from django.urls import reverse from django.utils import timezone from readthedocs.core.permissions import AdminPermission -from readthedocs.domains.notifications import PendingCustomDomainValidation +from readthedocs.domains.notifications import ( + MESSAGE_DOMAIN_VALIDATION_PENDING, + PendingCustomDomainValidation, +) +from readthedocs.notifications.models import Notification from readthedocs.projects.models import Domain from readthedocs.worker import app @@ -28,10 +32,24 @@ def email_pending_custom_domains(number_of_emails=3): validation_process_start__date__in=dates ) for domain in queryset: + # NOTE: this site notification was attach to every single user. + # The new behavior is to attach it to the project. + # + # We send an email notification to all the project's admins. + Notification.objects.add( + message_id=MESSAGE_DOMAIN_VALIDATION_PENDING, + attached_to=domain.project, + format_values={ + "domain": domain.domain, + "domain_url": reverse( + "projects_domains_edit", args=[domain.project.slug, domain.pk] + ), + }, + ) + for user in AdminPermission.admins(domain.project): notification = PendingCustomDomainValidation( context_object=domain, - request=HttpRequest(), user=user, ) notification.send() diff --git a/readthedocs/domains/templates/domains/notifications/pending_domain_configuration_site.html b/readthedocs/domains/templates/domains/notifications/pending_domain_configuration_site.html deleted file mode 100644 index ce862872a90..00000000000 --- a/readthedocs/domains/templates/domains/notifications/pending_domain_configuration_site.html +++ /dev/null @@ -1,8 +0,0 @@ -{% url "projects_domains_edit" domain.project.slug domain.pk as domain_url %} -{% if domain.validation_process_expired %} - The validation period of your custom domain {{ domain.domain }} has ended. - Go to the domain page and click on "Save" to restart the process. -{% else %} - The configuration of your custom domain {{ domain.domain }} is pending. - Go to the domain page and follow the instructions to complete it. 
-{% endif %} diff --git a/readthedocs/domains/tests/test_tasks.py b/readthedocs/domains/tests/test_tasks.py index 8e3ba95b0dd..8e1b0b47da9 100644 --- a/readthedocs/domains/tests/test_tasks.py +++ b/readthedocs/domains/tests/test_tasks.py @@ -72,7 +72,7 @@ def setUp(self): ) self.domain_expired.save() - @mock.patch("readthedocs.notifications.backends.send_email") + @mock.patch("readthedocs.notifications.email.send_email") def test_email_pending_emails(self, send_email): subject = "Pending configuration of custom domain" email_pending_custom_domains.delay(number_of_emails=3) @@ -88,7 +88,7 @@ def test_email_pending_emails(self, send_email): self.assertTrue(kwargs["subject"].startswith(subject)) self.assertIn(self.domain_recently_expired.domain, kwargs["subject"]) - @mock.patch("readthedocs.notifications.backends.send_email") + @mock.patch("readthedocs.notifications.email.send_email") def test_dont_send_email_on_given_days(self, send_email): now = timezone.now() days = [5, 8, 14, 16, 29, 31] @@ -98,7 +98,7 @@ def test_dont_send_email_on_given_days(self, send_email): email_pending_custom_domains.delay(number_of_emails=3) send_email.assert_not_called() - @mock.patch("readthedocs.notifications.backends.send_email") + @mock.patch("readthedocs.notifications.email.send_email") def test_send_email_on_given_days(self, send_email): now = timezone.now() days = [7, 15, 30] diff --git a/readthedocs/notifications/__init__.py b/readthedocs/notifications/__init__.py deleted file mode 100644 index 7b4c69b05c6..00000000000 --- a/readthedocs/notifications/__init__.py +++ /dev/null @@ -1,22 +0,0 @@ -""" -Extensions to Django messages to support notifications to users. - -Notifications are important communications to users that need to be as visible -as possible. We support different backends to make notifications visible in -different ways. For example, they might be e-mailed to users as well as -displayed on the site. - -This app builds on `django-messages-extends`_ to provide persistent messages -on the site. - -.. _`django-messages-extends`: https://github.com - /AliLozano/django-messages-extends/ -""" -from .notification import Notification, SiteNotification -from .backends import send_notification - -__all__ = ( - "Notification", - "SiteNotification", - "send_notification", -) diff --git a/readthedocs/notifications/backends.py b/readthedocs/notifications/backends.py deleted file mode 100644 index b496d040f48..00000000000 --- a/readthedocs/notifications/backends.py +++ /dev/null @@ -1,113 +0,0 @@ -""" -Pluggable backends for the delivery of notifications. - -Delivery of notifications to users depends on a list of backends configured in -Django settings. For example, they might be e-mailed to users as well as -displayed on the site. -""" - -from django.conf import settings -from django.http import HttpRequest -from django.utils.module_loading import import_string -from messages_extends.constants import INFO_PERSISTENT - -from readthedocs.core.utils import send_email - -from .constants import HTML, LEVEL_MAPPING, REQUIREMENT, TEXT - - -def send_notification(request, notification): - """ - Send notifications through all backends defined by settings. - - Backends should be listed in the settings ``NOTIFICATION_BACKENDS``, which - should be a list of class paths to be loaded, using the standard Django - string module loader. 
- """ - for cls_name in settings.NOTIFICATION_BACKENDS: - backend = import_string(cls_name)(request) - backend.send(notification) - - -class Backend: - def __init__(self, request): - self.request = request - - def send(self, notification): - pass - - -class EmailBackend(Backend): - - """ - Send templated notification emails through our standard email backend. - - The content body is first rendered from an on-disk template, then passed - into the standard email templates as a string. - - If the notification is set to ``send_email=False``, this backend will exit - early from :py:meth:`send`. - """ - - name = "email" - - def send(self, notification): - if not notification.send_email: - return - # FIXME: if the level is an ERROR an email is received and sometimes - # it's not necessary. This behavior should be clearly documented in the - # code - if notification.level >= REQUIREMENT: - template = notification.get_template_names( - backend_name=self.name, source_format=TEXT - ) - template_html = notification.get_template_names( - backend_name=self.name, source_format=HTML - ) - send_email( - recipient=notification.user.email, - subject=notification.get_subject(), - template=template, - template_html=template_html, - context=notification.get_context_data(), - ) - - -class SiteBackend(Backend): - - """ - Add messages through Django messages application. - - This uses persistent messageing levels provided by :py:mod:`message_extends` - and stores persistent messages in the database. - """ - - name = "site" - - def send(self, notification): - # Instead of calling the standard messages.add method, this instead - # manipulates the storage directly. This is because we don't have a - # request object and need to mock one out to fool the message storage - # into saving a message for a separate user. 
- cls_name = settings.MESSAGE_STORAGE - cls = import_string(cls_name) - req = HttpRequest() - setattr(req, "session", "") - storage = cls(req) - - # Use the method defined by the notification or map a simple level to a - # persistent one otherwise - if hasattr(notification, "get_message_level"): - level = notification.get_message_level() - else: - level = LEVEL_MAPPING.get(notification.level, INFO_PERSISTENT) - - storage.add( - level=level, - message=notification.render( - backend_name=self.name, - source_format=HTML, - ), - extra_tags=notification.extra_tags, - user=notification.user, - ) diff --git a/readthedocs/notifications/constants.py b/readthedocs/notifications/constants.py index 375702b0998..84a3f3af1e1 100644 --- a/readthedocs/notifications/constants.py +++ b/readthedocs/notifications/constants.py @@ -1,7 +1,5 @@ """Notification constants.""" -from messages_extends import constants as message_constants - TEXT = "txt" HTML = "html" @@ -10,25 +8,16 @@ REQUIREMENT = 2 ERROR = 3 -LEVEL_MAPPING = { - INFO: message_constants.INFO_PERSISTENT, - WARNING: message_constants.WARNING_PERSISTENT, - REQUIREMENT: message_constants.WARNING_PERSISTENT, - ERROR: message_constants.ERROR_PERSISTENT, -} - -# Message levels to save the message into the database and mark as read -# immediately after retrieved (one-time shown message) -DEBUG_NON_PERSISTENT = 100 -INFO_NON_PERSISTENT = 101 -SUCCESS_NON_PERSISTENT = 102 -WARNING_NON_PERSISTENT = 103 -ERROR_NON_PERSISTENT = 104 +# TODO: use Enum here for these ones +# Status +UNREAD = "unread" +READ = "read" +DISMISSED = "dismissed" +CANCELLED = "cancelled" -NON_PERSISTENT_MESSAGE_LEVELS = ( - DEBUG_NON_PERSISTENT, - INFO_NON_PERSISTENT, - SUCCESS_NON_PERSISTENT, - WARNING_NON_PERSISTENT, - ERROR_NON_PERSISTENT, -) +# Type +ERROR = "error" +WARNING = "warning" +INFO = "info" +NOTE = "note" +TIP = "tip" diff --git a/readthedocs/notifications/email.py b/readthedocs/notifications/email.py new file mode 100644 index 00000000000..c4fa36cf230 --- /dev/null +++ b/readthedocs/notifications/email.py @@ -0,0 +1,96 @@ +"""Email notifications.""" + +import structlog +from django.conf import settings +from django.db import models +from django.template import Context, Template +from django.template.loader import render_to_string + +from readthedocs.core.context_processors import readthedocs_processor +from readthedocs.core.utils import send_email + +from . import constants + +log = structlog.get_logger(__name__) + + +class EmailNotification: + + """ + An unsent notification linked to an object. + + This class provides an interface to construct notification messages by + rendering Django templates and send them via email. + + Call .send() to send the email notification. 
+ """ + + name = None + context_object_name = "object" + app_templates = None + subject = None + user = None + extra_tags = "" + + def __init__(self, context_object, user, extra_context=None): + self.context_object = context_object + self.extra_context = extra_context or {} + self.user = user + + def get_subject(self): + template = Template(self.subject) + return template.render(context=Context(self.get_context_data())) + + def get_context_data(self): + context = { + self.context_object_name: self.context_object, + "production_uri": "{scheme}://{host}".format( + scheme="https", + host=settings.PRODUCTION_DOMAIN, + ), + } + context.update(self.extra_context) + context.update(readthedocs_processor(None)) + return context + + def get_template_names(self, source_format=constants.HTML): + names = [] + if self.context_object and isinstance(self.context_object, models.Model): + meta = self.context_object._meta + names.append( + "{app}/notifications/{name}_{backend}.{source_format}".format( + app=self.app_templates or meta.app_label, + name=self.name or meta.model_name, + backend="email", + source_format=source_format, + ), + ) + return names + + raise AttributeError("Template for this email not found.") + + def render(self, source_format=constants.HTML): + return render_to_string( + template_name=self.get_template_names( + source_format=source_format, + ), + context=self.get_context_data(), + ) + + def send(self): + """ + Send templated notification emails through our standard email backend. + + The content body is first rendered from an on-disk template, then passed + into the standard email templates as a string. + """ + + template = self.get_template_names(source_format=constants.TEXT) + template_html = self.get_template_names(source_format=constants.HTML) + send_email( + recipient=self.user.email, + subject=self.get_subject(), + template=template, + template_html=template_html, + context=self.get_context_data(), + ) diff --git a/readthedocs/notifications/forms.py b/readthedocs/notifications/forms.py deleted file mode 100644 index b42f3699502..00000000000 --- a/readthedocs/notifications/forms.py +++ /dev/null @@ -1,40 +0,0 @@ -"""HTML forms for sending notifications.""" -from django import forms -from django.utils.translation import gettext_lazy as _ - - -class SendNotificationForm(forms.Form): - - """ - Send notification form. 
- - Used for sending a notification to a list of users from admin pages - - Fields: - - _selected_action - This is required for the admin intermediate form to submit - - source - Source notification class to use, referenced by name - - :param notification_classes: List of notification sources to display - :type notification_classes: list - """ - - _selected_action = forms.CharField(widget=forms.MultipleHiddenInput) - - source = forms.ChoiceField(label=_("Notification"), choices=[]) - - def __init__(self, *args, **kwargs): - self.notification_classes = kwargs.pop("notification_classes", []) - super().__init__(*args, **kwargs) - self.fields["source"].choices = [ - (cls.name, cls.name) for cls in self.notification_classes - ] - - def clean_source(self): - """Get the source class from the class name.""" - source = self.cleaned_data["source"] - classes = {cls.name: cls for cls in self.notification_classes} - return classes.get(source, None) diff --git a/readthedocs/notifications/messages.py b/readthedocs/notifications/messages.py new file mode 100644 index 00000000000..2f0d01a5bbe --- /dev/null +++ b/readthedocs/notifications/messages.py @@ -0,0 +1,513 @@ +import textwrap + +from django.utils.translation import gettext_noop as _ + +from readthedocs.doc_builder.exceptions import ( + BuildAppError, + BuildCancelled, + BuildMaxConcurrencyError, + BuildUserError, + MkDocsYAMLParseError, +) +from readthedocs.projects.constants import BUILD_COMMANDS_OUTPUT_PATH_HTML + +from .constants import ERROR, INFO, NOTE, TIP, WARNING + + +class Message: + def __init__(self, id, header, body, type, icon_classes=None): + self.id = id + self.header = header + self.body = body + self.type = type # (ERROR, WARNING, INFO, NOTE, TIP) + self.icon_classes = icon_classes + self.format_values = {} + + def __repr__(self): + return f"" + + def __str__(self): + return f"Message: {self.id} | {self.header}" + + def set_format_values(self, format_values): + self.format_values = format_values or {} + + def get_display_icon_classes(self): + if self.icon_classes: + return self.icon_classes + + # Default classes that apply to all the notifications + classes = [ + "fas", + ] + + if self.type == ERROR: + classes.append("fa-circle-xmark") + if self.type == WARNING: + classes.append("fa-circle-exclamation") + if self.type == INFO: + classes.append("fa-circle-info") + if self.type == NOTE: + classes.append("fa-circle-info") + if self.type == TIP: + classes.append("fa-circle-info") + + return " ".join(classes) + + def get_rendered_header(self): + return self.header.format(**self.format_values) + + def get_rendered_body(self): + return self.body.format(**self.format_values) + + +# TODO: review the copy of these notifications/messages on PR review and adapt them. +# Most of them are copied from what we had in `readthedocs.doc_builder.exceptions` +# and slightly adapted to have a header and a better body. +BUILD_MESSAGES = [ + Message( + id=BuildAppError.GENERIC_WITH_BUILD_ID, + header=_("Unknown problem"), + # Note the message receives the instance it's attached to + # and could be use it to inject related data + body=_( + textwrap.dedent( + """ + There was a problem with Read the Docs while building your documentation. + Please try again later. + If this problem persists, + report this error to us with your build id ({instance.pk}). 
+ """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildAppError.BUILD_TERMINATED_DUE_INACTIVITY, + header=_("Build terminated due inactivity"), + body=_( + textwrap.dedent( + """ + This build was terminated due to inactivity. + If you continue to encounter this error, + file a support request and reference this build id ({instance.pk}). + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.GENERIC, + header=_("Unknown problem"), + body=_( + textwrap.dedent( + """ + We encountered a problem with a command while building your project. + To resolve this error, double check your project configuration and installed + dependencies are correct and have not changed recently. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildMaxConcurrencyError.LIMIT_REACHED, + header=_("Maximum concurrency limit reached."), + body=_( + textwrap.dedent( + """ + Concurrency limit reached ({limit}), retrying in 5 minutes. + """ + ).strip(), + ), + type=INFO, + ), + Message( + id=BuildCancelled.CANCELLED_BY_USER, + header=_("Build cancelled manually."), + body=_( + textwrap.dedent( + """ + The user has cancelled this build. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.SKIPPED_EXIT_CODE_183, + header=_("Build skipped manually."), + body=_( + textwrap.dedent( + """ + This build was skipped because + one of the commands exited with code 183 + """ + ).strip(), + ), + type=INFO, + ), + Message( + id=BuildUserError.BUILD_TIME_OUT, + header=_("Build exited due to time out."), + body=_( + textwrap.dedent( + """ + Build exited due to time out. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_EXCESSIVE_MEMORY, + header=_("Build exited due to excessive memory consumption."), + body=_( + textwrap.dedent( + """ + Build exited due to excessive memory consumption. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildAppError.BUILD_DOCKER_UNKNOWN_ERROR, + header=_("Build exited due to unknown error."), + body=_( + textwrap.dedent( + """ + Build exited due to unknown error: {message} + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildAppError.BUILDS_DISABLED, + header=_("Builds are temporary disabled for this project."), + body=_( + textwrap.dedent( + """ + This is due to excessive usage of our resources. + Please, contact our support team if you think this is a mistake + and builds should be re-enabled. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.MAX_CONCURRENCY, + header=_("Concurrency limit reached"), + body=_( + textwrap.dedent( + """ + Your project, organization, or user is currently building the maximum concurrency builds allowed ({limit}). + It will automatically retry in 5 minutes. + """ + ).strip(), + ), + type=WARNING, + ), + Message( + id=BuildUserError.BUILD_COMMANDS_WITHOUT_OUTPUT, + header=_("No HTML content found"), + body=_( + textwrap.dedent( + f""" + No "{BUILD_COMMANDS_OUTPUT_PATH_HTML}" folder was created during this build. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_OUTPUT_IS_NOT_A_DIRECTORY, + header=_("Build output directory is not a directory"), + body=_( + textwrap.dedent( + """ + Build output directory for format "{artifact_type}" is not a directory. 
+ """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_OUTPUT_HAS_0_FILES, + header=_("Build output directory doesn't contain any file"), + body=_( + textwrap.dedent( + """ + Build output directory for format "{artifact_type}" does not contain any files. + It seems the build process created the directory but did not save any file to it. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_OUTPUT_HAS_MULTIPLE_FILES, + header=_("Build output directory contains multiple files"), + body=_( + textwrap.dedent( + """ + Build output directory for format "{artifact_type}" contains multiple files + and it is not currently supported. + Please, remove all the files but the "{artifact_type}" you want to upload. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_OUTPUT_HTML_NO_INDEX_FILE, + header=_("Index file is not present in HTML output directory"), + body=_( + textwrap.dedent( + """ + Your documentation did not generate an 'index.html' at its root directory. + This is required for documentation serving at the root URL for this version. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_OUTPUT_OLD_DIRECTORY_USED, + header=_("Your project is outputing files in an old directory"), + body=_( + textwrap.dedent( + """ + Some files were detected in an unsupported output path, '_build/html'. + Ensure your project is configured to use the output path + '$READTHEDOCS_OUTPUT/html' instead. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.NO_CONFIG_FILE_DEPRECATED, + header=_("Your project doesn't have a .readthedocs.yaml file"), + body=_( + textwrap.dedent( + """ + The configuration file required to build documentation is missing from your project. + Add a configuration file to your project to make it build successfully. + Read more at https://docs.readthedocs.io/en/stable/config-file/v2.html + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_IMAGE_CONFIG_KEY_DEPRECATED, + header=_("Configuration key build.image is deprecated"), + body=_( + textwrap.dedent( + """ + The configuration key "build.image" is deprecated. + Use "build.os" instead to continue building your project. + Read more at https://docs.readthedocs.io/en/stable/config-file/v2.html#build-os + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_OS_REQUIRED, + header=_("Configuration key build.os is required"), + body=_( + textwrap.dedent( + """ + The configuration key "build.os" is required to build your documentation. + Read more at https://docs.readthedocs.io/en/stable/config-file/v2.html#build-os + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.FILE_TOO_LARGE, + header=_("There is at least one file that exceeds the size limit"), + body=_( + textwrap.dedent( + """ + A file from your build process is too large to be processed by Read the Docs. + Please ensure no generated files are larger than 1GB. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_OUTPUT_HAS_NO_PDF_FILES, + header=_("There is no PDF file in output directory"), + body=_( + textwrap.dedent( + f""" + PDF file was not generated/found in "{BUILD_COMMANDS_OUTPUT_PATH_HTML}/pdf" output directory. 
+ """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=BuildUserError.BUILD_COMMANDS_IN_BETA, + header=_("Config key build.commands is in beta"), + body=_( + textwrap.dedent( + """ + The "build.commands" feature is in beta, and could have backwards incompatible changes while in beta. + Read more at our documentation to find out its limitations and potential issues. + """ + ).strip(), + ), + type=INFO, + ), + Message( + id=BuildUserError.TEX_FILE_NOT_FOUND, + header=_("No TeX files were found"), + body=_( + textwrap.dedent( + """ + No TeX files were found. + """ + ).strip(), + ), + type=ERROR, + ), +] + +BUILD_MKDOCS_MESSAGES = [ + Message( + id=MkDocsYAMLParseError.GENERIC_WITH_PARSE_EXCEPTION, + header=_(""), + body=_( + textwrap.dedent( + """ + Problem parsing MkDocs YAML configuration. {exception} + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MkDocsYAMLParseError.INVALID_DOCS_DIR_CONFIG, + header=_(""), + body=_( + textwrap.dedent( + """ + The "docs_dir" config from your MkDocs YAML config file has to be a + string with relative or absolute path. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MkDocsYAMLParseError.INVALID_DOCS_DIR_PATH, + header=_(""), + body=_( + textwrap.dedent( + """ + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MkDocsYAMLParseError.INVALID_EXTRA_CONFIG, + header=_(""), + body=_( + textwrap.dedent( + """ + The "{config}" config from your MkDocs YAML config file has to be a + list of relative paths. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MkDocsYAMLParseError.EMPTY_CONFIG, + header=_(""), + body=_( + textwrap.dedent( + """ + Please make sure the MkDocs YAML configuration file is not empty. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MkDocsYAMLParseError.NOT_FOUND, + header=_(""), + body=_( + textwrap.dedent( + """ + A configuration file was not found. + Make sure you have a "mkdocs.yml" file in your repository. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MkDocsYAMLParseError.CONFIG_NOT_DICT, + header=_(""), + body=_( + textwrap.dedent( + """ + Your MkDocs YAML config file is incorrect. + Please follow the user guide https://www.mkdocs.org/user-guide/configuration/ + to configure the file properly. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MkDocsYAMLParseError.SYNTAX_ERROR, + header=_("Syntax error in mkdocs.yml"), + body=_( + textwrap.dedent( + """ + Your mkdocs.yml could not be loaded, + possibly due to a syntax error. + """ + ).strip(), + ), + type=ERROR, + ), +] + + +class MessagesRegistry: + def __init__(self): + self.messages = {} + + def add(self, messages): + if not isinstance(messages, list): + if not isinstance(messages, Message): + raise ValueError( + "A message should be instance of Message or a list of Messages." 
+ ) + + messages = [messages] + + for message in messages: + if message.id in self.messages: + raise ValueError("A message with the same 'id' is already registered.") + self.messages[message.id] = message + + def get(self, message_id): + return self.messages.get(message_id) + + +registry = MessagesRegistry() +registry.add(BUILD_MKDOCS_MESSAGES) +registry.add(BUILD_MESSAGES) diff --git a/readthedocs/notifications/migrations/0001_initial.py b/readthedocs/notifications/migrations/0001_initial.py new file mode 100644 index 00000000000..2b4dc2e142c --- /dev/null +++ b/readthedocs/notifications/migrations/0001_initial.py @@ -0,0 +1,76 @@ +# Generated by Django 4.2.6 on 2023-11-23 17:26 + +import django.db.models.deletion +import django_extensions.db.fields +from django.db import migrations, models + + +class Migration(migrations.Migration): + initial = True + + dependencies = [ + ("contenttypes", "0002_remove_content_type_name"), + ] + + operations = [ + migrations.CreateModel( + name="Notification", + fields=[ + ( + "id", + models.AutoField( + auto_created=True, + primary_key=True, + serialize=False, + verbose_name="ID", + ), + ), + ( + "created", + django_extensions.db.fields.CreationDateTimeField( + auto_now_add=True, verbose_name="created" + ), + ), + ( + "modified", + django_extensions.db.fields.ModificationDateTimeField( + auto_now=True, verbose_name="modified" + ), + ), + ("message_id", models.CharField(max_length=128)), + ( + "state", + models.CharField( + choices=[ + ("unread", "unread"), + ("read", "read"), + ("dismissed", "dismissed"), + ("cancelled", "cancelled"), + ], + db_index=True, + default="unread", + max_length=128, + ), + ), + ("dismissable", models.BooleanField(default=False)), + ( + "news", + models.BooleanField( + default=False, help_text="Show under bell icon" + ), + ), + ("attached_to_id", models.PositiveIntegerField()), + ( + "attached_to_content_type", + models.ForeignKey( + on_delete=django.db.models.deletion.CASCADE, + to="contenttypes.contenttype", + ), + ), + ], + options={ + "get_latest_by": "modified", + "abstract": False, + }, + ), + ] diff --git a/readthedocs/notifications/migrations/0002_notification_format_values.py b/readthedocs/notifications/migrations/0002_notification_format_values.py new file mode 100644 index 00000000000..1bd97356bbc --- /dev/null +++ b/readthedocs/notifications/migrations/0002_notification_format_values.py @@ -0,0 +1,17 @@ +# Generated by Django 4.2.7 on 2023-11-29 15:44 + +from django.db import migrations, models + + +class Migration(migrations.Migration): + dependencies = [ + ("notifications", "0001_initial"), + ] + + operations = [ + migrations.AddField( + model_name="notification", + name="format_values", + field=models.JSONField(blank=True, null=True), + ), + ] diff --git a/readthedocs/notifications/migrations/__init__.py b/readthedocs/notifications/migrations/__init__.py new file mode 100644 index 00000000000..e69de29bb2d diff --git a/readthedocs/notifications/models.py b/readthedocs/notifications/models.py new file mode 100644 index 00000000000..7db34818f24 --- /dev/null +++ b/readthedocs/notifications/models.py @@ -0,0 +1,97 @@ +import textwrap + +import structlog +from django.contrib.contenttypes.fields import GenericForeignKey +from django.contrib.contenttypes.models import ContentType +from django.db import models +from django.utils.translation import gettext_noop as _ +from django_extensions.db.models import TimeStampedModel + +from readthedocs.core.context_processors import readthedocs_processor + +from .constants import
CANCELLED, DISMISSED, READ, UNREAD, WARNING +from .messages import Message, registry +from .querysets import NotificationQuerySet + +log = structlog.get_logger(__name__) + + +class Notification(TimeStampedModel): + # Message identifier + message_id = models.CharField(max_length=128) + + # UNREAD: the notification was not shown to the user + # READ: the notifiation was shown + # DISMISSED: the notification was shown and the user dismissed it + # CANCELLED: removed automatically because the user has done the action required (e.g. paid the subscription) + state = models.CharField( + choices=[ + (UNREAD, UNREAD), + (READ, READ), + (DISMISSED, DISMISSED), + (CANCELLED, CANCELLED), + ], + default=UNREAD, + max_length=128, + db_index=True, + ) + + # Makes the notification imposible to dismiss (useful for Build notifications) + dismissable = models.BooleanField(default=False) + + # Show the notification under the bell icon for the user + news = models.BooleanField(default=False, help_text=_("Show under bell icon")) + + # Notification attached to Organization, Project, Build or User + # + # Uses ContentType for this. + # https://docs.djangoproject.com/en/4.2/ref/contrib/contenttypes/#generic-relations + # + attached_to_content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE) + attached_to_id = models.PositiveIntegerField() + attached_to = GenericForeignKey("attached_to_content_type", "attached_to_id") + + # Store values known at creation time that are required to render the final message + format_values = models.JSONField(null=True, blank=True) + + # Use a custom manager with an ``.add()`` method that deduplicates + # notifications attached to the same object. + objects = NotificationQuerySet.as_manager() + + def __str__(self): + return self.message_id + + def get_message(self): + message = registry.get(self.message_id) + if message is None: + # Log the error and return an unknown error to avoid breaking the response. + log.error( + "The message ID retrieved is not in our registry anymore.", + message_id=self.message_id, + ) + return Message( + id="unknown-message", + header=_("Unknown message"), + body=_( + textwrap.dedent( + """ + It seems it was an old message, + and is not in our registry anymore. + """ + ).strip(), + ), + type=WARNING, + ) + + # Pass the instance attached to this notification + all_format_values = { + "instance": self.attached_to, + } + + # Always include global variables + all_format_values.update(readthedocs_processor(None)) + + # Pass the values stored in the database + all_format_values.update(self.format_values or {}) + message.set_format_values(all_format_values) + return message diff --git a/readthedocs/notifications/notification.py b/readthedocs/notifications/notification.py deleted file mode 100644 index 4c13569923f..00000000000 --- a/readthedocs/notifications/notification.py +++ /dev/null @@ -1,185 +0,0 @@ -"""Support for templating of notifications.""" - -import structlog -from django.conf import settings -from django.db import models -from django.http import HttpRequest -from django.template import Context, Template -from django.template.loader import render_to_string - -from readthedocs.core.context_processors import readthedocs_processor - -from . import constants -from .backends import send_notification - -log = structlog.get_logger(__name__) - - -class Notification: - - """ - An unsent notification linked to an object. - - This class provides an interface to construct notification messages by - rendering Django templates. 
The ``Notification`` itself is not expected - to be persisted by the backends. - - Call .send() to send the notification. - """ - - name = None - context_object_name = "object" - app_templates = None - level = constants.INFO - subject = None - user = None - send_email = True - extra_tags = "" - - def __init__(self, context_object, request=None, user=None, extra_context=None): - self.object = context_object - self.request = request or HttpRequest() - self.extra_context = extra_context or {} - self.user = user - if self.user is None: - self.user = request.user - - def get_subject(self): - template = Template(self.subject) - return template.render(context=Context(self.get_context_data())) - - def get_context_data(self): - context = { - self.context_object_name: self.object, - "request": self.request, - "production_uri": "{scheme}://{host}".format( - scheme="https", - host=settings.PRODUCTION_DOMAIN, - ), - } - context.update(self.extra_context) - context.update(readthedocs_processor(self.request)) - return context - - def get_template_names(self, backend_name, source_format=constants.HTML): - names = [] - if self.object and isinstance(self.object, models.Model): - meta = self.object._meta - names.append( - "{app}/notifications/{name}_{backend}.{source_format}".format( - app=self.app_templates or meta.app_label, - name=self.name or meta.model_name, - backend=backend_name, - source_format=source_format, - ), - ) - return names - - raise AttributeError() - - def render(self, backend_name, source_format=constants.HTML): - return render_to_string( - template_name=self.get_template_names( - backend_name=backend_name, - source_format=source_format, - ), - context=self.get_context_data(), - ) - - def send(self): - """ - Trigger notification send through all notification backends. - - In order to limit which backends a notification will send out from, - override this method and duplicate the logic from - :py:func:`send_notification`, taking care to limit which backends are - avoided. - """ - send_notification(self.request, self) - - -class SiteNotification(Notification): - - """ - Simple notification to show *only* on site messages. - - ``success_message`` and ``failure_message`` can be a simple string or a - dictionary with different messages depending on the reason of the failure / - success. The message is selected by using ``reason`` to get the proper - value. - - The notification is tied to the ``user`` and it could be sticky, persistent - or normal --this depends on the ``success_level`` and ``failure_level``. - - .. note:: - - ``send_email`` is forced to False to not send accidental emails when - only a simple site notification is needed. 
- """ - - send_email = False - - success_message = None - failure_message = None - - success_level = constants.SUCCESS_NON_PERSISTENT - failure_level = constants.ERROR_NON_PERSISTENT - - def __init__( - self, - user, - success, - reason=None, - context_object=None, - request=None, - extra_context=None, - ): - self.object = context_object - - self.user = user or request.user - # Fake the request in case the notification is instantiated from a place - # without access to the request object (Celery task, for example) - self.request = request or HttpRequest() - self.request.user = user - - self.success = success - self.reason = reason - super().__init__(context_object, request, user, extra_context) - - def get_message_level(self): - if self.success: - return self.success_level - return self.failure_level - - def get_message(self, success): - if success: - message = self.success_message - else: - message = self.failure_message - - msg = "" # default message in case of error - if isinstance(message, dict): - if self.reason: - if self.reason in message.keys(): - msg = message.get(self.reason) - else: - # log the error but not crash - log.error( - "Notification has no key for messages", - notification=self.__class__.__name__, - key=self.reason, - message="success" if self.success else "failure", - ) - else: - log.error( - "{message} is a dictionary but no reason was provided", - notification=self.__class__.__name__, - message="success" if self.success else "failure", - ) - else: - msg = message - - return Template(msg).render(context=Context(self.get_context_data())) - - def render(self, *args, **kwargs): - return self.get_message(self.success) diff --git a/readthedocs/notifications/querysets.py b/readthedocs/notifications/querysets.py new file mode 100644 index 00000000000..787ecf17197 --- /dev/null +++ b/readthedocs/notifications/querysets.py @@ -0,0 +1,34 @@ +from django.contrib.contenttypes.models import ContentType +from django.db import models +from django.utils import timezone + + +class NotificationQuerySet(models.QuerySet): + def add(self, *args, **kwargs): + """ + Create a notification without duplicating it. + + If a notification with the same ``message_id`` is already attached to the object, + its ``modified_at`` timestamp is updated. + + Otherwise, a new notification object is created for this object. 
+ """ + + message_id = kwargs.get("message_id") + attached_to = kwargs.pop("attached_to") + + content_type = ContentType.objects.get_for_model(attached_to) + notification = self.filter( + attached_to_content_type=content_type, + attached_to_id=attached_to.id, + message_id=message_id, + ).first() + + if notification: + self.filter(pk=notification.pk).update( + *args, modified=timezone.now(), **kwargs + ) + notification.refresh_from_db() + return notification + + return super().create(*args, attached_to=attached_to, **kwargs) diff --git a/readthedocs/notifications/storages.py b/readthedocs/notifications/storages.py deleted file mode 100644 index ce9e58aa346..00000000000 --- a/readthedocs/notifications/storages.py +++ /dev/null @@ -1,159 +0,0 @@ -"""Customised storage for notifications.""" - -from django.contrib.messages.storage.base import Message -from django.db.models import Q -from django.utils.safestring import mark_safe -from messages_extends.constants import PERSISTENT_MESSAGE_LEVELS -from messages_extends.models import Message as PersistentMessage -from messages_extends.storages import FallbackStorage, PersistentStorage - -from .constants import NON_PERSISTENT_MESSAGE_LEVELS - -try: - from django.utils import timezone -except ImportError: - from datetime import datetime as timezone - - -class FallbackUniqueStorage(FallbackStorage): - - """ - Persistent message fallback storage, but only stores unique notifications. - - This loops through all backends to find messages to store, but will skip - this step if the message already exists for the user in the database. - - Deduplication is important here, as a persistent message may ask a user to - perform a specific action, such as change a build option. Duplicated - messages would lead to confusing UX, where a duplicate message may not be - dismissed when the prescribed action is taken. Instead of detecting - duplication while triggering the message, we handle this at the storage - level. - - This class also assumes that notifications set as persistent messages are - more likely to have HTML that should be marked safe. If the level matches a - persistent message level, mark the message text as safe so that we can - render the text in templates. - """ - - storages_names = ( - "readthedocs.notifications.storages.NonPersistentStorage", - "messages_extends.storages.StickyStorage", - "messages_extends.storages.PersistentStorage", - "django.contrib.messages.storage.cookie.CookieStorage", - "django.contrib.messages.storage.session.SessionStorage", - ) - - def _get(self, *args, **kwargs): - # The database backend for persistent messages doesn't support setting - # messages with ``mark_safe``, therefore, we need to do it broadly here. - messages, all_ret = super()._get(self, *args, **kwargs) - - safe_messages = [] - for message in messages: - # Handle all message types, if the message is persistent, take - # special action. 
As the default message handler, this will also - # process ephemeral messages - if ( - message.level - in PERSISTENT_MESSAGE_LEVELS + NON_PERSISTENT_MESSAGE_LEVELS - ): - message_pk = message.pk - message = Message( - message.level, - mark_safe(message.message), - message.extra_tags, - ) - message.pk = message_pk - safe_messages.append(message) - return safe_messages, all_ret - - def add(self, level, message, extra_tags="", *args, **kwargs): # noqa - user = kwargs.get("user") or self.request.user - if not user.is_anonymous: - persist_messages = PersistentMessage.objects.filter( - message=message, - user=user, - read=False, - ) - if persist_messages.exists(): - return - super().add(level, message, extra_tags, *args, **kwargs) - - -class NonPersistentStorage(PersistentStorage): - - """ - Save one time (non-pesistent) messages in the database. - - Messages are saved into the database. ``user`` object is mandatory but - ``request`` is needed. - """ - - # Non-persistent level numbers start at 100 and it's used to check if it's - # an actionable message or not - level = 100 - - def _message_queryset(self, include_read=False): - """Return a queryset of non persistent messages for the request user.""" - expire = timezone.now() - - qs = ( - PersistentMessage.objects.filter(user=self.get_user()) - .filter(Q(expires=None) | Q(expires__gt=expire)) - .filter(level__in=NON_PERSISTENT_MESSAGE_LEVELS) - ) - if not include_read: - qs = qs.exclude(read=True) - - # We need to save the objects returned by the query before updating it, - # otherwise they are marked as ``read`` and not returned when executed - result = list(qs) - - # Mark non-persistent messages as read when show them - qs.update(read=True) - - return result - - # These methods (_store, process_message) are copied from - # https://github.com/AliLozano/django-messages-extends/blob/master/messages_extends/storages.py - # and replaced PERSISTENT_MESSAGE_LEVELS by NON_PERSISTENT_MESSAGE_LEVELS - - def _store(self, messages, response, *args, **kwargs): - # There are already saved. - return [ - message - for message in messages - if message.level not in NON_PERSISTENT_MESSAGE_LEVELS - ] - - def process_message(self, message, *args, **kwargs): - """ - Convert messages to models and save them. 
- - If its level is into non-persistent levels, convert the message to - models and save it - """ - if message.level not in NON_PERSISTENT_MESSAGE_LEVELS: - return message - - user = kwargs.get("user") or self.get_user() - - try: - anonymous = user.is_anonymous - except TypeError: - anonymous = user.is_anonymous - if anonymous: - raise NotImplementedError( - "Persistent message levels cannot be used for anonymous users.", - ) - message_persistent = PersistentMessage() - message_persistent.level = message.level - message_persistent.message = message.message - message_persistent.extra_tags = message.extra_tags - message_persistent.user = user - - if "expires" in kwargs: - message_persistent.expires = kwargs["expires"] - message_persistent.save() - return None diff --git a/readthedocs/notifications/urls.py b/readthedocs/notifications/urls.py deleted file mode 100644 index 8659fc17b19..00000000000 --- a/readthedocs/notifications/urls.py +++ /dev/null @@ -1,17 +0,0 @@ -"""Renames for messages_extends URLs.""" - -from django.urls import path -from messages_extends.views import message_mark_all_read, message_mark_read - -urlpatterns = [ - path( - "dismiss//", - message_mark_read, - name="message_mark_read", - ), - path( - "dismiss/all/", - message_mark_all_read, - name="message_mark_all_read", - ), -] diff --git a/readthedocs/notifications/views.py b/readthedocs/notifications/views.py deleted file mode 100644 index 8b3cd23c725..00000000000 --- a/readthedocs/notifications/views.py +++ /dev/null @@ -1,120 +0,0 @@ -"""Django views for the notifications app.""" -from django.contrib import messages -from django.http import HttpResponseRedirect -from django.views.generic import FormView - -from .forms import SendNotificationForm - - -class SendNotificationView(FormView): - - """ - Form view for sending notifications to users from admin pages. - - Accepts the following additional parameters: - - :param queryset: Queryset to use to determine the users to send emails to - :param action_name: Name of the action to pass to the form template, - determines the action to pass back to the admin view - :param notification_classes: List of :py:class:`Notification` classes to - display in the form - """ - - form_class = SendNotificationForm - template_name = "notifications/send_notification_form.html" - action_name = "send_email" - notification_classes = [] - - def get_form_kwargs(self): - """ - Override form kwargs based on input fields. - - The admin posts to this view initially, so detect the send button on - form post variables. Drop additional fields if we see the send button. 
- """ - kwargs = super().get_form_kwargs() - kwargs["notification_classes"] = self.notification_classes - if "send" not in self.request.POST: - kwargs.pop("data", None) - kwargs.pop("files", None) - return kwargs - - def get_initial(self): - """Add selected ids to initial form data.""" - initial = super().get_initial() - initial["_selected_action"] = self.request.POST.getlist("_selected_action") - return initial - - def form_valid(self, form): - """If form is valid, send notification to recipients.""" - count = 0 - notification_cls = form.cleaned_data["source"] - for obj in self.get_queryset().all(): - for recipient in self.get_object_recipients(obj): - notification = notification_cls( - context_object=obj, - request=self.request, - user=recipient, - ) - notification.send() - count += 1 - if count == 0: - self.message_user("No recipients to send to", level=messages.ERROR) - else: - self.message_user("Queued {} messages".format(count)) - return HttpResponseRedirect(self.request.get_full_path()) - - def get_object_recipients(self, obj): - """ - Iterate over queryset objects and return User objects. - - This allows for non-User querysets to pass back a list of Users to send - to. By default, assume we're working with :py:class:`User` objects and - just yield the single object. - - For example, this could be made to return project owners with:: - - for owner in AdminPermission.members(project): - yield owner - - :param obj: object from queryset, type is dependent on model class - :rtype: django.contrib.auth.models.User - """ - yield obj - - def get_queryset(self): - return self.kwargs.get("queryset") - - def get_context_data(self, **kwargs): - """Return queryset in context.""" - context = super().get_context_data(**kwargs) - recipients = [] - for obj in self.get_queryset().all(): - recipients.extend(self.get_object_recipients(obj)) - context["recipients"] = recipients - context["action_name"] = self.action_name - return context - - def message_user( - self, - message, - level=messages.INFO, - extra_tags="", - fail_silently=False, - ): - """ - Implementation of. 
- - :py:meth:`django.contrib.admin.options.ModelAdmin.message_user` - - Send message through messages framework - """ - # TODO generalize this or check if implementation in ModelAdmin is - # usable here - messages.add_message( - self.request, - level, - message, - extra_tags=extra_tags, - fail_silently=fail_silently, - ) diff --git a/readthedocs/oauth/notifications.py b/readthedocs/oauth/notifications.py index 0246606885f..257c8cfd39f 100644 --- a/readthedocs/oauth/notifications.py +++ b/readthedocs/oauth/notifications.py @@ -1,79 +1,98 @@ -from django.urls import reverse -from django.utils.translation import gettext_lazy as _ -from messages_extends.constants import ERROR_PERSISTENT +"""Notifications related to OAuth.""" + +import textwrap -from readthedocs.notifications import SiteNotification -from readthedocs.notifications.constants import ERROR_NON_PERSISTENT +from django.utils.translation import gettext_lazy as _ +from readthedocs.notifications.constants import ERROR, INFO +from readthedocs.notifications.messages import Message, registry -class AttachWebhookNotification(SiteNotification): - # Fail reasons - NO_PERMISSIONS = "no_permissions" - NO_ACCOUNTS = "no_accounts" +MESSAGE_OAUTH_WEBHOOK_NO_PERMISSIONS = "oauth:webhook:no-permissions" +MESSAGE_OAUTH_WEBHOOK_NO_ACCOUNT = "oauth:webhook:no-account" +MESSAGE_OAUTH_WEBHOOK_INVALID = "oauth:webhook:invalid" +MESSAGE_OAUTH_BUILD_STATUS_FAILURE = "oauth:status:send-failed" +MESSAGE_OAUTH_DEPLOY_KEY_ATTACHED_SUCCESSFULY = "oauth:deploy-key:attached-successfully" +MESSAGE_OAUTH_DEPLOY_KEY_ATTACHED_FAILED = "oauth:deploy-key:attached-failed" - context_object_name = "provider" - success_message = _("Webhook successfully added.") - failure_message = { - NO_PERMISSIONS: _( - 'Could not add webhook for {{ project.name }}. Make sure you have the correct {{ provider.name }} permissions.', # noqa +messages = [ + Message( + id=MESSAGE_OAUTH_WEBHOOK_NO_ACCOUNT, + header=_("Unable to attach webhook to this project"), + body=_( + textwrap.dedent( + """ + Could not add webhook for {instance.name}. + Please connect your {provider_name} account. + """ + ).strip(), ), - NO_ACCOUNTS: _( - 'Could not add webhook for {{ project.name }}. Please connect your {{ provider.name }} account.', # noqa + type=ERROR, + ), + Message( + id=MESSAGE_OAUTH_WEBHOOK_NO_PERMISSIONS, + header=_("Unable to attach webhook to this project"), + body=_( + textwrap.dedent( + """ + Could not add webhook for {instance.name}. + Make sure you have the correct {provider_name} permissions. + """ + ).strip(), ), - } - - def get_context_data(self): - context = super().get_context_data() - project = self.extra_context.get("project") - context.update( - { - "url_connect_account": reverse( - "projects_integrations", - args=[project.slug], - ), - "url_docs_webhook": "http://docs.readthedocs.io/en/latest/webhooks.html", # noqa - } - ) - return context - - -class InvalidProjectWebhookNotification(SiteNotification): - context_object_name = "project" - failure_level = ERROR_PERSISTENT - failure_message = _( - "The project {{ project.name }} doesn't have a valid webhook set up, " - "commits won't trigger new builds for this project. 
" - "See the project integrations for more information.", - ) # noqa - - def get_context_data(self): - context = super().get_context_data() - context.update( - { - "url_integrations": reverse( - "projects_integrations", - args=[self.object.slug], - ), - } - ) - return context - - -class GitBuildStatusFailureNotification(SiteNotification): - context_object_name = "project" - failure_level = ERROR_NON_PERSISTENT - failure_message = _( - "Could not send {{ provider_name }} build status report for {{ project.name }}. " - "Make sure you have the correct {{ provider_name }} repository permissions and " - 'your {{ provider_name }} account ' - "is connected to Read the Docs." - ) # noqa - - def get_context_data(self): - context = super().get_context_data() - context.update( - { - "url_connect_account": reverse("socialaccount_connections"), - } - ) - return context + type=ERROR, + ), + Message( + id=MESSAGE_OAUTH_WEBHOOK_INVALID, + header=_("The project doesn't have a valid webhook set up"), + body=_( + textwrap.dedent( + """ + The project {instance.name} doesn't have a valid webhook set up, + commits won't trigger new builds for this project. + See the project integrations for more information. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MESSAGE_OAUTH_BUILD_STATUS_FAILURE, + header=_("{provier_name} build status report failed"), + body=_( + textwrap.dedent( + """ + Could not send {provider_name} build status report for {instance.name}. + Make sure you have the correct {provider_name} repository permissions and + your {provider_name} account + is connected to Read the Docs. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=MESSAGE_OAUTH_DEPLOY_KEY_ATTACHED_SUCCESSFULY, + header=_("Deploy key added successfully"), + body=_( + textwrap.dedent( + """ + Successfully added deploy key to {provider_name} project. + """ + ).strip(), + ), + type=INFO, + ), + Message( + id=MESSAGE_OAUTH_DEPLOY_KEY_ATTACHED_FAILED, + header=_("Failed to add deploy key to project"), + body=_( + textwrap.dedent( + """ + Failed to add deploy key to {provider_name} project, ensure you have the correct permissions and try importing again. 
+ """ + ).strip(), + ), + type=ERROR, + ), +] +registry.add(messages) diff --git a/readthedocs/oauth/tasks.py b/readthedocs/oauth/tasks.py index a49bf924987..be7d83c00b3 100644 --- a/readthedocs/oauth/tasks.py +++ b/readthedocs/oauth/tasks.py @@ -6,13 +6,16 @@ from allauth.socialaccount.providers import registry as allauth_registry from django.contrib.auth.models import User from django.db.models.functions import ExtractIsoWeekDay +from django.urls import reverse from django.utils import timezone from readthedocs.core.permissions import AdminPermission from readthedocs.core.utils.tasks import PublicTask, user_id_matches_or_superuser +from readthedocs.notifications.models import Notification from readthedocs.oauth.notifications import ( - AttachWebhookNotification, - InvalidProjectWebhookNotification, + MESSAGE_OAUTH_WEBHOOK_INVALID, + MESSAGE_OAUTH_WEBHOOK_NO_ACCOUNT, + MESSAGE_OAUTH_WEBHOOK_NO_PERMISSIONS, ) from readthedocs.oauth.services.base import SyncServiceError from readthedocs.oauth.utils import SERVICE_MAP @@ -166,17 +169,22 @@ def attach_webhook(project_pk, user_pk, integration=None): if not project or not user: return False - project_notification = InvalidProjectWebhookNotification( - context_object=project, - user=user, - success=False, - ) if integration: service = SERVICE_MAP.get(integration.integration_type) if not service: log.warning("There are no registered services in the application.") - project_notification.send() + Notification.objects.add( + message_id=MESSAGE_OAUTH_WEBHOOK_INVALID, + attached_to=project, + dismissable=True, + format_values={ + "url_integrations": reverse( + "projects_integrations", + args=[project.slug], + ), + }, + ) return None else: for service_cls in registry: @@ -185,23 +193,27 @@ def attach_webhook(project_pk, user_pk, integration=None): break else: log.warning("There are no registered services in the application.") - project_notification.send() + Notification.objects.add( + message_id=MESSAGE_OAUTH_WEBHOOK_INVALID, + attached_to=project, + dismissable=True, + format_values={ + "url_integrations": reverse( + "projects_integrations", + args=[project.slug], + ), + }, + ) return None provider = allauth_registry.by_id(service.adapter.provider_id) - notification = AttachWebhookNotification( - context_object=provider, - extra_context={"project": project}, - user=user, - success=None, - ) user_accounts = service.for_user(user) for account in user_accounts: success, __ = account.setup_webhook(project, integration=integration) if success: - notification.success = True - notification.send() + # NOTE: do we want to communicate that we connect the webhook here? 
+ # messages.add_message(request, "Webhook successfully added.") project.has_valid_webhook = True project.save() @@ -209,12 +221,27 @@ def attach_webhook(project_pk, user_pk, integration=None): # No valid account found if user_accounts: - notification.success = False - notification.reason = AttachWebhookNotification.NO_PERMISSIONS + Notification.objects.add( + message_id=MESSAGE_OAUTH_WEBHOOK_NO_PERMISSIONS, + dismissable=True, + attached_to=project, + format_values={ + "provider_name": provider.name, + "url_docs_webhook": "https://docs.readthedocs.io/page/webhooks.html", + }, + ) else: - notification.success = False - notification.reason = AttachWebhookNotification.NO_ACCOUNTS + Notification.objects.add( + message_id=MESSAGE_OAUTH_WEBHOOK_NO_ACCOUNT, + dismissable=True, + attached_to=project, + format_values={ + "provider_name": provider.name, + "url_connect_account": reverse( + "project_integrations", + args=[project.slug], + ), + }, + ) - project_notification.send() - notification.send() return False diff --git a/readthedocs/projects/admin.py b/readthedocs/projects/admin.py index 5a0bd4a005e..63956565657 100644 --- a/readthedocs/projects/admin.py +++ b/readthedocs/projects/admin.py @@ -10,7 +10,6 @@ from readthedocs.builds.models import Version from readthedocs.core.history import ExtraSimpleHistoryAdmin, set_change_reason from readthedocs.core.utils import trigger_build -from readthedocs.notifications.views import SendNotificationView from readthedocs.projects.tasks.search import reindex_version from readthedocs.redirects.models import Redirect @@ -29,7 +28,6 @@ WebHook, WebHookEvent, ) -from .notifications import ResourceUsageNotification from .tag_utils import import_tags from .tasks.utils import clean_project_resources @@ -50,16 +48,6 @@ def has_delete_permission(self, request, obj=None): return False -class ProjectSendNotificationView(SendNotificationView): - notification_classes = [ - ResourceUsageNotification, - ] - - def get_object_recipients(self, obj): - for owner in obj.users.all(): - yield owner - - class ProjectRelationshipInline(admin.TabularInline): """Project inline relationship view for :py:class:`ProjectAdmin`.""" @@ -260,7 +248,6 @@ class ProjectAdmin(ExtraSimpleHistoryAdmin): ) raw_id_fields = ("users", "main_language_project", "remote_repository") actions = [ - "send_owner_email", "ban_owner", "run_spam_rule_checks", "build_default_version", @@ -277,13 +264,6 @@ def matching_spam_rules(self, obj): def feature_flags(self, obj): return "\n".join([str(f.get_feature_display()) for f in obj.features]) - @admin.action(description="Notify project owners") - def send_owner_email(self, request, queryset): - view = ProjectSendNotificationView.as_view( - action_name="send_owner_email", - ) - return view(request, queryset=queryset) - def run_spam_rule_checks(self, request, queryset): """Run all the spam checks on this project.""" if "readthedocsext.spamfighting" not in settings.INSTALLED_APPS: diff --git a/readthedocs/projects/apps.py b/readthedocs/projects/apps.py index c1154b6c5ad..f4a1525c190 100644 --- a/readthedocs/projects/apps.py +++ b/readthedocs/projects/apps.py @@ -7,6 +7,8 @@ class ProjectsConfig(AppConfig): name = "readthedocs.projects" def ready(self): + # Load and register notification messages for this application + import readthedocs.projects.notifications # noqa import readthedocs.projects.tasks.builds # noqa import readthedocs.projects.tasks.search # noqa import readthedocs.projects.tasks.utils # noqa diff --git a/readthedocs/projects/exceptions.py 
b/readthedocs/projects/exceptions.py index a48106e7ed0..fa15a79a2fe 100644 --- a/readthedocs/projects/exceptions.py +++ b/readthedocs/projects/exceptions.py @@ -1,7 +1,5 @@ """Project exceptions.""" -from django.conf import settings -from django.utils.translation import gettext_noop as _ from readthedocs.doc_builder.exceptions import BuildAppError, BuildUserError @@ -10,60 +8,23 @@ class ProjectConfigurationError(BuildUserError): """Error raised trying to configure a project for build.""" - NOT_FOUND = _( - "A configuration file was not found. " - "Make sure you have a conf.py file in your repository.", - ) - - MULTIPLE_CONF_FILES = _( - "We found more than one conf.py and are not sure which one to use. " - "Please, specify the correct file under the Advanced settings tab " - "in the project's Admin.", - ) + NOT_FOUND = "project:sphinx:conf-py-not-found" + MULTIPLE_CONF_FILES = "project:sphinx:multiple-conf-py-files-found" class UserFileNotFound(BuildUserError): - FILE_NOT_FOUND = _("The file {} doesn't exist. Make sure it's a valid file path.") + FILE_NOT_FOUND = "project:file:not-found" class RepositoryError(BuildUserError): """Failure during repository operation.""" - PRIVATE_ALLOWED = _( - "There was a problem connecting to your repository, " - "ensure that your repository URL is correct.", - ) - PRIVATE_NOT_ALLOWED = _( - "There was a problem connecting to your repository, " - "ensure that your repository URL is correct and your repository is public. " - "Private repositories are not supported.", - ) - DUPLICATED_RESERVED_VERSIONS = _( - "You can not have two versions with the name latest or stable." - " Ensure you don't have both a branch and a tag with this name." - ) - - FAILED_TO_CHECKOUT = _("Failed to checkout revision: {}") - - GENERIC_ERROR = _( - "There was a problem cloning your repository. " - "Please check the command output for more information.", - ) - - # NOTE: we are not using `@property` here because Python 3.8 does not - # suport `@property` together with `@classmethod`. On Python >= 3.9, we - # could call `RepositoryError.CLONE_ERROR` without parenthesis and it will - # work. However, for now, we are just using a class method and calling it - # as a function/method. 
- @classmethod - def CLONE_ERROR(cls): # noqa: N802 - if settings.ALLOW_PRIVATE_REPOS: - return cls.PRIVATE_ALLOWED - return cls.PRIVATE_NOT_ALLOWED - - def get_default_message(self): - return self.GENERIC_ERROR + CLONE_ERROR_WITH_PRIVATE_REPO_ALLOWED = "project:repository:private-clone-error" + CLONE_ERROR_WITH_PRIVATE_REPO_NOT_ALLOWED = "project:repository:public-clone-error" + DUPLICATED_RESERVED_VERSIONS = "project:repository:duplicated-reserved-versions" + FAILED_TO_CHECKOUT = "project:repository:checkout-failed" + GENERIC = "project:repository:generic-error" class SyncRepositoryLocked(BuildAppError): diff --git a/readthedocs/projects/models.py b/readthedocs/projects/models.py index d27e0fafb2d..5dea927ed40 100644 --- a/readthedocs/projects/models.py +++ b/readthedocs/projects/models.py @@ -36,6 +36,7 @@ from readthedocs.core.utils import extract_valid_attributes_for_model, slugify from readthedocs.core.utils.url import unsafe_join_url_path from readthedocs.domains.querysets import DomainQueryset +from readthedocs.notifications.models import Notification as NewNotification from readthedocs.projects import constants from readthedocs.projects.exceptions import ProjectConfigurationError from readthedocs.projects.managers import HTMLFileManager @@ -537,6 +538,13 @@ class Project(models.Model): blank=True, ) + notifications = GenericRelation( + NewNotification, + related_query_name="project", + content_type_field="attached_to_content_type", + object_id_field="attached_to_id", + ) + # TODO: remove the following fields since they all are going to be ignored # by the application when we start requiring a ``.readthedocs.yaml`` file. # These fields are: @@ -1588,6 +1596,9 @@ def processed_json(self): class Notification(TimeStampedModel): + + """WebHook / Email notification attached to a Project.""" + # TODO: Overridden from TimeStampedModel just to allow null values, # remove after deploy. created = CreationDateTimeField( diff --git a/readthedocs/projects/notifications.py b/readthedocs/projects/notifications.py index 81a5178dca9..d8973e5b6e7 100644 --- a/readthedocs/projects/notifications.py +++ b/readthedocs/projects/notifications.py @@ -1,28 +1,135 @@ -"""Project notifications.""" +"""Notifications related to projects.""" +import textwrap -from django.urls import reverse -from django.utils.translation import gettext_lazy as _ -from messages_extends.constants import ERROR_PERSISTENT +from django.utils.translation import gettext_noop as _ -from readthedocs.notifications import Notification, SiteNotification -from readthedocs.notifications.constants import REQUIREMENT +from readthedocs.notifications.constants import ERROR, INFO +from readthedocs.notifications.messages import Message, registry +from readthedocs.projects.exceptions import ( + ProjectConfigurationError, + RepositoryError, + UserFileNotFound, +) - -class ResourceUsageNotification(Notification): - name = "resource_usage" - context_object_name = "project" - subject = "Builds for {{ project.name }} are using too many resources" - level = REQUIREMENT - - -class EmailConfirmNotification(SiteNotification): - failure_level = ERROR_PERSISTENT - failure_message = _( - "Your primary email address is not verified. 
" - 'Please verify it here.', - ) - - def get_context_data(self): - context = super().get_context_data() - context.update({"account_email_url": reverse("account_email")}) - return context +MESSAGE_PROJECT_SKIP_BUILDS = "project:invalid:skip-builds" +messages = [ + Message( + id=MESSAGE_PROJECT_SKIP_BUILDS, + header=_("Build skipped for this project"), + body=_( + textwrap.dedent( + """ + Your project is currently disabled for abuse of the system. + Please make sure it isn't using unreasonable amounts of resources or triggering lots of builds in a short amount of time. + Please contact support to get your project re-enabled. + """ + ).strip(), + ), + type=INFO, + ), + Message( + id=RepositoryError.GENERIC, + header=_("Error while cloning the repository"), + body=_( + textwrap.dedent( + """ + There was a problem cloning your repository. + Please check the command output for more information. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=RepositoryError.DUPLICATED_RESERVED_VERSIONS, + header=_("Duplicated reserved versions"), + body=_( + textwrap.dedent( + """ + You can not have two versions with the name latest or stable. + Ensure you don't have both a branch and a tag with this name. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=RepositoryError.FAILED_TO_CHECKOUT, + header=_("Error while checking out the repository"), + body=_( + textwrap.dedent( + """ + Failed to checkout revision: {revision} + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=RepositoryError.CLONE_ERROR_WITH_PRIVATE_REPO_ALLOWED, + header=_("Error while cloning the repository"), + body=_( + textwrap.dedent( + """ + There was a problem connecting to your repository, + ensure that your repository URL is correct. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=RepositoryError.CLONE_ERROR_WITH_PRIVATE_REPO_NOT_ALLOWED, + header=_("Error while cloning the repository"), + body=_( + textwrap.dedent( + """ + There was a problem connecting to your repository, + ensure that your repository URL is correct and your repository is public. + Private repositories are not supported. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ProjectConfigurationError.NOT_FOUND, + header=_("Sphinx configuration file is missing"), + body=_( + textwrap.dedent( + """ + A configuration file was not found. + Make sure you have a conf.py file in your repository. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=ProjectConfigurationError.MULTIPLE_CONF_FILES, + header=_("Multiple Sphinx configuration files found"), + body=_( + textwrap.dedent( + """ + We found more than one conf.py and are not sure which one to use. + Please, specify the correct file under the Advanced settings tab + in the project's Admin. + """ + ).strip(), + ), + type=ERROR, + ), + Message( + id=UserFileNotFound.FILE_NOT_FOUND, + header=_("Expected file not found"), + body=_( + textwrap.dedent( + """ + The file {filename} doesn't exist. Make sure it's a valid file path. 
+ """ + ).strip(), + ), + type=ERROR, + ), +] +registry.add(messages) diff --git a/readthedocs/projects/tasks/builds.py b/readthedocs/projects/tasks/builds.py index 19a0e508a85..ccd7ef219a6 100644 --- a/readthedocs/projects/tasks/builds.py +++ b/readthedocs/projects/tasks/builds.py @@ -37,8 +37,8 @@ from readthedocs.builds.models import APIVersion, Build from readthedocs.builds.signals import build_complete from readthedocs.builds.utils import memcache_lock -from readthedocs.config import ConfigError from readthedocs.config.config import BuildConfigV2 +from readthedocs.config.exceptions import ConfigError from readthedocs.doc_builder.director import BuildDirector from readthedocs.doc_builder.environments import ( DockerBuildEnvironment, @@ -49,10 +49,7 @@ BuildCancelled, BuildMaxConcurrencyError, BuildUserError, - BuildUserSkip, MkDocsYAMLParseError, - ProjectBuildsSkippedError, - YAMLParseError, ) from readthedocs.projects.models import Feature from readthedocs.storage import build_media_storage @@ -292,12 +289,9 @@ class UpdateDocsTask(SyncRepositoryMixin, Task): # All exceptions generated by a user miss-configuration should be listed # here. Actually, every subclass of ``BuildUserError``. throws = ( - ProjectBuildsSkippedError, ConfigError, - YAMLParseError, BuildCancelled, BuildUserError, - BuildUserSkip, RepositoryError, MkDocsYAMLParseError, ProjectConfigurationError, @@ -305,16 +299,14 @@ class UpdateDocsTask(SyncRepositoryMixin, Task): # Do not send notifications on failure builds for these exceptions. exceptions_without_notifications = ( - BuildCancelled, - BuildMaxConcurrencyError, - BuildUserSkip, - ProjectBuildsSkippedError, + BuildCancelled.CANCELLED_BY_USER, + BuildUserError.MAX_CONCURRENCY, + BuildUserError.SKIPPED_EXIT_CODE_183, + BuildAppError.BUILDS_DISABLED, ) # Do not send external build status on failure builds for these exceptions. 
- exceptions_without_external_build_status = ( - BuildMaxConcurrencyError, - ) + exceptions_without_external_build_status = (BuildUserError.MAX_CONCURRENCY,) acks_late = True track_started = True @@ -339,7 +331,7 @@ def sigint_received(*args, **kwargs): log.warning('Ignoring cancelling the build at "Uploading" state.') return - raise BuildCancelled + raise BuildCancelled(message_id=BuildCancelled.CANCELLED_BY_USER) # Do not send the SIGTERM signal to children (pip is automatically killed when # receives SIGTERM and make the build to fail one command and stop build) @@ -373,16 +365,17 @@ def _check_concurrency_limit(self): log.info("Concurrency limit reached, retrying task.") self.retry( exc=BuildMaxConcurrencyError( - BuildMaxConcurrencyError.message.format( - limit=max_concurrent_builds, - ) + BuildMaxConcurrencyError.LIMIT_REACHED, + format_values={ + "limit": max_concurrent_builds, + }, ) ) def _check_project_disabled(self): if self.data.project.skip: log.warning('Project build skipped.') - raise ProjectBuildsSkippedError + raise BuildAppError(BuildAppError.BUILDS_DISABLED) def before_start(self, task_id, args, kwargs): # Create the object to store all the task-related data @@ -417,6 +410,10 @@ def before_start(self, task_id, args, kwargs): # because they just build the latest commit for that version self.data.build_commit = kwargs.get('build_commit') + self.data.build_director = BuildDirector( + data=self.data, + ) + log.bind( # NOTE: ``self.data.build`` is just a regular dict, not an APIBuild :'( builder=self.data.build['builder'], @@ -475,46 +472,37 @@ def on_failure(self, exc, task_id, args, kwargs, einfo): 'id': self.data.build_pk, } - # TODO: handle this `ConfigError` as a `BuildUserError` in the - # following block - if isinstance(exc, ConfigError): - self.data.build['error'] = str( - YAMLParseError( - YAMLParseError.GENERIC_WITH_PARSE_EXCEPTION.format( - exception=str(exc), - ), - ), - ) # Known errors in our application code (e.g. we couldn't connect to # Docker API). Report a generic message to the user. - elif isinstance(exc, BuildAppError): - self.data.build['error'] = BuildAppError.GENERIC_WITH_BUILD_ID.format( - build_id=self.data.build['id'], - ) + if isinstance(exc, BuildAppError): + message_id = BuildAppError.GENERIC_WITH_BUILD_ID + # Known errors in the user's project (e.g. invalid config file, invalid # repository, command failed, etc). Report the error back to the user - # using the `message` and `state` attributes from the exception itself. - # Otherwise, use a generic message and default state. + # by creating a notification attached to the build + # Otherwise, use a notification with a generic message. elif isinstance(exc, BuildUserError): - if hasattr(exc, 'message') and exc.message is not None: - self.data.build['error'] = exc.message + if hasattr(exc, "message_id") and exc.message_id is not None: + message_id = exc.message_id else: - self.data.build['error'] = BuildUserError.GENERIC + message_id = BuildUserError.GENERIC - if hasattr(exc, "state"): - self.data.build["state"] = exc.state else: # We don't know what happened in the build. Log the exception and - # report a generic message to the user. + # report a generic notification to the user. # Note we are using `log.error(exc_info=...)` instead of `log.exception` # because this is not executed inside a try/except block. 
log.error("Build failed with unhandled exception.", exc_info=exc) - self.data.build["error"] = BuildAppError.GENERIC_WITH_BUILD_ID.format( - build_id=self.data.build["id"], - ) + message_id = BuildAppError.GENERIC_WITH_BUILD_ID + + # Grab the format values from the exception in case it contains + format_values = exc.format_values if hasattr(exc, "format_values") else None + + # Attach the notification to the build + self.data.build_director.attach_notification(message_id, format_values) # Send notifications for unhandled errors - if not isinstance(exc, self.exceptions_without_notifications): + if message_id not in self.exceptions_without_notifications: self.send_notifications( self.data.version_pk, self.data.build['id'], @@ -527,15 +515,16 @@ def on_failure(self, exc, task_id, args, kwargs, einfo): # Oh, I think this is to differentiate a task triggered with # `Build.commit` than a one triggered just with the `Version` to build # the _latest_ commit of it - if self.data.build_commit and not isinstance( - exc, self.exceptions_without_external_build_status + if ( + self.data.build_commit + and message_id not in self.exceptions_without_external_build_status ): version_type = None if self.data.version: version_type = self.data.version.type status = BUILD_STATUS_FAILURE - if isinstance(exc, BuildUserSkip): + if message_id == BuildUserError.SKIPPED_EXIT_CODE_183: # The build was skipped by returning the magic exit code, # marked as CANCELLED, but communicated to GitHub as successful. # This is because the PR has to be available for merging when the build @@ -594,9 +583,10 @@ def get_valid_artifact_types(self): output_format=artifact_type, ) raise BuildUserError( - BuildUserError.BUILD_OUTPUT_IS_NOT_A_DIRECTORY.format( - artifact_type=artifact_type - ) + BuildUserError.BUILD_OUTPUT_IS_NOT_A_DIRECTORY, + format_values={ + "artifact_type": artifact_type, + }, ) # Check if there are multiple files on artifact directories. 
@@ -694,7 +684,9 @@ def on_retry(self, exc, task_id, args, kwargs, einfo): project_slug=self.data.project.slug, version_slug=self.data.version.slug, ) - self.data.build['error'] = exc.message + self.data.director.attach_notification( + BuildMaxConcurrencyError.LIMIT_REACHED + ) self.update_build(state=BUILD_STATE_TRIGGERED) def after_return(self, status, retval, task_id, args, kwargs, einfo): @@ -762,10 +754,6 @@ def update_build(self, state=None): log.exception("Error while updating the build object.", state=state) def execute(self): - self.data.build_director = BuildDirector( - data=self.data, - ) - # Clonning self.update_build(state=BUILD_STATE_CLONING) diff --git a/readthedocs/projects/tasks/utils.py b/readthedocs/projects/tasks/utils.py index 45056023204..e157f14051b 100644 --- a/readthedocs/projects/tasks/utils.py +++ b/readthedocs/projects/tasks/utils.py @@ -8,7 +8,6 @@ from django.conf import settings from django.db.models import Q from django.utils import timezone -from django.utils.translation import gettext_lazy as _ from readthedocs.builds.constants import ( BUILD_FINAL_STATES, @@ -18,6 +17,8 @@ from readthedocs.builds.models import Build from readthedocs.builds.tasks import send_build_status from readthedocs.core.utils.filesystem import safe_rmtree +from readthedocs.doc_builder.exceptions import BuildAppError +from readthedocs.notifications.models import Notification from readthedocs.storage import build_media_storage from readthedocs.worker import app @@ -129,12 +130,13 @@ def finish_inactive_builds(): build.success = False build.state = BUILD_STATE_CANCELLED - build.error = _( - "This build was terminated due to inactivity. If you " - "continue to encounter this error, file a support " - "request with and reference this build id ({}).".format(build.pk), - ) build.save() + + Notification.objects.add( + message_id=BuildAppError.BUILD_TERMINATED_DUE_INACTIVITY, + attached_to=build, + ) + builds_finished.append(build.pk) projects_finished.add(build.project.slug) diff --git a/readthedocs/projects/tests/mockers.py b/readthedocs/projects/tests/mockers.py index 1149b4016d8..dc4e9a23fe6 100644 --- a/readthedocs/projects/tests/mockers.py +++ b/readthedocs/projects/tests/mockers.py @@ -242,3 +242,8 @@ def _mock_api(self): f"{settings.SLUMBER_API_HOST}/api/v2/revoke/", status_code=204, ) + + self.requestsmock.post( + f"{settings.SLUMBER_API_HOST}/api/v2/notifications/", + status_code=204, + ) diff --git a/readthedocs/projects/tests/test_build_tasks.py b/readthedocs/projects/tests/test_build_tasks.py index de60cd4d5f3..10578bc8ed4 100644 --- a/readthedocs/projects/tests/test_build_tasks.py +++ b/readthedocs/projects/tests/test_build_tasks.py @@ -13,10 +13,11 @@ EXTERNAL, ) from readthedocs.builds.models import Build -from readthedocs.config import ALL, ConfigError +from readthedocs.config import ALL from readthedocs.config.config import BuildConfigV2 +from readthedocs.config.exceptions import ConfigError from readthedocs.config.tests.test_config import get_build_config -from readthedocs.doc_builder.exceptions import BuildAppError +from readthedocs.doc_builder.exceptions import BuildUserError from readthedocs.projects.exceptions import RepositoryError from readthedocs.projects.models import EnvironmentVariable, Project, WebHookEvent from readthedocs.projects.tasks.builds import sync_repository_task, update_docs_task @@ -637,11 +638,10 @@ def test_failed_build( # Force an exception from the execution of the task. 
We don't really # care "where" it was raised: setup, build, syncing directories, etc - execute.side_effect = Exception('Force and exception here.') + execute.side_effect = BuildUserError(message_id=BuildUserError.GENERIC) self._trigger_update_docs_task() - # It has to be called twice, ``before_start`` and ``after_return`` clean_build.assert_has_calls([ mock.call(mock.ANY), # the argument is an APIVersion @@ -670,15 +670,27 @@ def test_failed_build( # and the task won't be run. assert not BuildData.objects.all().exists() + notification_request = self.requests_mock.request_history[-3] + assert notification_request._request.method == "POST" + assert notification_request.path == "/api/v2/notifications/" + assert notification_request.json() == { + "attached_to": f"build/{self.build.pk}", + "message_id": BuildUserError.GENERIC, + "state": "unread", + "dismissable": False, + "news": False, + "format_values": {}, + } + # Test we are updating the DB by calling the API with the updated build object # The second last one should be the PATCH for the build - api_request = self.requests_mock.request_history[-2] - assert api_request._request.method == "PATCH" - assert api_request.path == "/api/v2/build/1/" - assert api_request.json() == { + build_status_request = self.requests_mock.request_history[-2] + assert build_status_request._request.method == "PATCH" + assert build_status_request.path == "/api/v2/build/1/" + assert build_status_request.json() == { "builder": mock.ANY, "commit": self.build.commit, - "error": BuildAppError.GENERIC_WITH_BUILD_ID.format(build_id=self.build.pk), + "error": "", # We are not sending ``error`` anymore "id": self.build.pk, "length": mock.ANY, "state": "finished", @@ -1851,21 +1863,32 @@ class TestBuildTaskExceptionHandler(BuildEnvironmentBase): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_config_file_exception(self, load_yaml_config): load_yaml_config.side_effect = ConfigError( - code="invalid", message="Invalid version in config file." + message_id=ConfigError.INVALID_VERSION, ) self._trigger_update_docs_task() - # This is a known exceptions. We hit the API saving the correct error - # in the Build object. In this case, the "error message" coming from - # the exception will be shown to the user - api_request = self.requests_mock.request_history[-2] - assert api_request._request.method == "PATCH" - assert api_request.path == "/api/v2/build/1/" - assert api_request.json() == { + # This is a known exceptions. We hit the notification API to attach a + # notification to this particular build. + notification_request = self.requests_mock.request_history[-3] + assert notification_request._request.method == "POST" + assert notification_request.path == "/api/v2/notifications/" + assert notification_request.json() == { + "attached_to": f"build/{self.build.pk}", + "message_id": ConfigError.INVALID_VERSION, + "state": "unread", + "dismissable": False, + "news": False, + "format_values": {}, + } + + build_status_request = self.requests_mock.request_history[-2] + assert build_status_request._request.method == "PATCH" + assert build_status_request.path == "/api/v2/build/1/" + assert build_status_request.json() == { "id": 1, "state": "finished", "commit": "a1b2c3", - "error": "Problem in your project's configuration. 
Invalid version in config file.", + "error": "", # We not sending "error" anymore "success": False, "builder": mock.ANY, "length": 0, @@ -1926,4 +1949,4 @@ def test_check_duplicate_reserved_version_latest(self, on_failure, verbose_name) exception = on_failure.call_args[0][0] assert isinstance(exception, RepositoryError) == True - assert exception.message == RepositoryError.DUPLICATED_RESERVED_VERSIONS + assert exception.message_id == RepositoryError.DUPLICATED_RESERVED_VERSIONS diff --git a/readthedocs/projects/views/private.py b/readthedocs/projects/views/private.py index 9c895d8f9e0..4068063a937 100644 --- a/readthedocs/projects/views/private.py +++ b/readthedocs/projects/views/private.py @@ -39,8 +39,10 @@ ) from readthedocs.core.history import UpdateChangeReasonPostView from readthedocs.core.mixins import ListViewWithForm, PrivateViewMixin +from readthedocs.core.notifications import MESSAGE_EMAIL_VALIDATION_PENDING from readthedocs.integrations.models import HttpExchange, Integration from readthedocs.invitations.models import Invitation +from readthedocs.notifications.models import Notification from readthedocs.oauth.services import registry from readthedocs.oauth.tasks import attach_webhook from readthedocs.oauth.utils import update_webhook @@ -71,7 +73,6 @@ ProjectRelationship, WebHook, ) -from readthedocs.projects.notifications import EmailConfirmNotification from readthedocs.projects.tasks.utils import clean_project_resources from readthedocs.projects.utils import get_csv_file from readthedocs.projects.views.base import ProjectAdminMixin @@ -113,17 +114,23 @@ def get_context_data(self, **kwargs): def validate_primary_email(self, user): """ - Sends a persistent error notification. + Sends a dismissable site notification to this user. Checks if the user has a primary email or if the primary email - is verified or not. Sends a persistent error notification if + is verified or not. Sends a dismissable notification if either of the condition is False. 
""" email_qs = user.emailaddress_set.filter(primary=True) email = email_qs.first() if not email or not email.verified: - notification = EmailConfirmNotification(user=user, success=False) - notification.send() + Notification.objects.add( + attached_to=user, + message_id=MESSAGE_EMAIL_VALIDATION_PENDING, + dismissable=True, + format_values={ + "account_email_url": reverse("account_email"), + }, + ) def get_queryset(self): sort = self.request.GET.get('sort') diff --git a/readthedocs/rtd_tests/tests/test_backend.py b/readthedocs/rtd_tests/tests/test_backend.py index 720276d7fde..1f66323f8d7 100644 --- a/readthedocs/rtd_tests/tests/test_backend.py +++ b/readthedocs/rtd_tests/tests/test_backend.py @@ -160,8 +160,8 @@ def test_git_checkout_invalid_revision(self): with self.assertRaises(RepositoryError) as e: repo.checkout(version) self.assertEqual( - str(e.exception), - RepositoryError.FAILED_TO_CHECKOUT.format(version), + e.exception.message_id, + RepositoryError.FAILED_TO_CHECKOUT, ) def test_check_for_submodules(self): @@ -408,8 +408,8 @@ def test_checkout_invalid_revision(self): with self.assertRaises(RepositoryError) as e: repo.checkout(version) self.assertEqual( - str(e.exception), - RepositoryError.FAILED_TO_CHECKOUT.format(version), + e.exception.message_id, + RepositoryError.FAILED_TO_CHECKOUT, ) def test_parse_tags(self): diff --git a/readthedocs/rtd_tests/tests/test_builds.py b/readthedocs/rtd_tests/tests/test_builds.py index 088e229a7c9..a0c2fa171f5 100644 --- a/readthedocs/rtd_tests/tests/test_builds.py +++ b/readthedocs/rtd_tests/tests/test_builds.py @@ -248,24 +248,6 @@ def test_build_is_stale(self): self.assertTrue(build_two.is_stale) self.assertFalse(build_three.is_stale) - def test_deprecated_config_used(self): - now = timezone.now() - - build = get( - Build, - project=self.project, - version=self.version, - date=now - datetime.timedelta(minutes=8), - state="finished", - ) - - self.assertTrue(build.deprecated_config_used()) - - build.config = {"version": 2} - build.save() - - self.assertFalse(build.deprecated_config_used()) - def test_build_is_external(self): # Turn the build version to EXTERNAL type. 
self.version.type = EXTERNAL diff --git a/readthedocs/rtd_tests/tests/test_celery.py b/readthedocs/rtd_tests/tests/test_celery.py index b7bd05cfa0a..54df2b23db6 100644 --- a/readthedocs/rtd_tests/tests/test_celery.py +++ b/readthedocs/rtd_tests/tests/test_celery.py @@ -5,12 +5,13 @@ from django.contrib.auth.models import User from django.test import TestCase from django_dynamic_fixture import get -from messages_extends.models import Message from readthedocs.builds import tasks as build_tasks from readthedocs.builds.constants import BUILD_STATUS_SUCCESS, EXTERNAL, LATEST from readthedocs.builds.models import Build, Version +from readthedocs.notifications.models import Notification from readthedocs.oauth.models import RemoteRepository, RemoteRepositoryRelation +from readthedocs.oauth.notifications import MESSAGE_OAUTH_BUILD_STATUS_FAILURE from readthedocs.projects.models import Project @@ -105,7 +106,7 @@ def test_send_build_status_with_remote_repo_github(self, send_build_status): external_build.commit, BUILD_STATUS_SUCCESS, ) - self.assertEqual(Message.objects.filter(user=self.eric).count(), 0) + self.assertEqual(Notification.objects.count(), 0) @patch("readthedocs.oauth.services.github.GitHubService.send_build_status") def test_send_build_status_with_social_account_github(self, send_build_status): @@ -125,7 +126,7 @@ def test_send_build_status_with_social_account_github(self, send_build_status): external_build.commit, BUILD_STATUS_SUCCESS, ) - self.assertEqual(Message.objects.filter(user=self.eric).count(), 0) + self.assertEqual(Notification.objects.count(), 0) @patch("readthedocs.oauth.services.github.GitHubService.send_build_status") def test_send_build_status_no_remote_repo_or_social_account_github( @@ -140,7 +141,17 @@ def test_send_build_status_no_remote_repo_or_social_account_github( ) send_build_status.assert_not_called() - self.assertEqual(Message.objects.filter(user=self.eric).count(), 1) + self.assertEqual(Notification.objects.count(), 1) + notification = Notification.objects.all().first() + self.assertEqual(notification.message_id, MESSAGE_OAUTH_BUILD_STATUS_FAILURE) + self.assertEqual(notification.attached_to, self.project) + self.assertEqual( + notification.format_values, + { + "provider_name": "GitHub", + "url_connect_account": "/accounts/social/connections/", + }, + ) @patch("readthedocs.oauth.services.gitlab.GitLabService.send_build_status") def test_send_build_status_with_remote_repo_gitlab(self, send_build_status): @@ -168,7 +179,7 @@ def test_send_build_status_with_remote_repo_gitlab(self, send_build_status): external_build.commit, BUILD_STATUS_SUCCESS, ) - self.assertEqual(Message.objects.filter(user=self.eric).count(), 0) + self.assertEqual(Notification.objects.count(), 0) @patch("readthedocs.oauth.services.gitlab.GitLabService.send_build_status") def test_send_build_status_with_social_account_gitlab(self, send_build_status): @@ -188,7 +199,7 @@ def test_send_build_status_with_social_account_gitlab(self, send_build_status): external_build.commit, BUILD_STATUS_SUCCESS, ) - self.assertEqual(Message.objects.filter(user=self.eric).count(), 0) + self.assertEqual(Notification.objects.count(), 0) @patch("readthedocs.oauth.services.gitlab.GitLabService.send_build_status") def test_send_build_status_no_remote_repo_or_social_account_gitlab( @@ -203,4 +214,14 @@ def test_send_build_status_no_remote_repo_or_social_account_gitlab( ) send_build_status.assert_not_called() - self.assertEqual(Message.objects.filter(user=self.eric).count(), 1) + 
self.assertEqual(Notification.objects.count(), 1) + notification = Notification.objects.all().first() + self.assertEqual(notification.message_id, MESSAGE_OAUTH_BUILD_STATUS_FAILURE) + self.assertEqual(notification.attached_to, self.project) + self.assertEqual( + notification.format_values, + { + "provider_name": "GitLab", + "url_connect_account": "/accounts/social/connections/", + }, + ) diff --git a/readthedocs/rtd_tests/tests/test_doc_builder.py b/readthedocs/rtd_tests/tests/test_doc_builder.py index 23fae392bc9..08004d6b0ad 100644 --- a/readthedocs/rtd_tests/tests/test_doc_builder.py +++ b/readthedocs/rtd_tests/tests/test_doc_builder.py @@ -113,11 +113,14 @@ def test_project_without_conf_py( build_env=self.build_env, python_env=python_env, ) - with pytest.raises( - ProjectConfigurationError, match=ProjectConfigurationError.NOT_FOUND - ): + with self.assertRaises(ProjectConfigurationError) as e: base_sphinx.append_conf() + self.assertEqual( + e.exception.message_id, + ProjectConfigurationError.NOT_FOUND, + ) + @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.docs_dir") @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.get_config_params") @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.run") @@ -615,11 +618,10 @@ def test_empty_yaml_config(self, checkout_path): python_env=python_env, ) - with self.assertRaisesMessage( - MkDocsYAMLParseError, MkDocsYAMLParseError.EMPTY_CONFIG - ): + with self.assertRaises(MkDocsYAMLParseError) as exc: with override_settings(DOCROOT=tmpdir): self.searchbuilder.append_conf() + self.assertEqual(exc.exception.message_id, MkDocsYAMLParseError.EMPTY_CONFIG) @patch("readthedocs.projects.models.Project.checkout_path") def test_yaml_config_not_returns_dict(self, checkout_path): @@ -644,8 +646,10 @@ def test_yaml_config_not_returns_dict(self, checkout_path): python_env=python_env, ) - with self.assertRaisesMessage( - MkDocsYAMLParseError, MkDocsYAMLParseError.CONFIG_NOT_DICT - ): + with self.assertRaises(MkDocsYAMLParseError) as e: with override_settings(DOCROOT=tmpdir): self.searchbuilder.append_conf() + self.assertEqual( + e.exception.message_id, + MkDocsYAMLParseError.CONFIG_NOT_DICT, + ) diff --git a/readthedocs/rtd_tests/tests/test_imported_file.py b/readthedocs/rtd_tests/tests/test_imported_file.py index 7ea1b83ee0c..86b60c4907e 100644 --- a/readthedocs/rtd_tests/tests/test_imported_file.py +++ b/readthedocs/rtd_tests/tests/test_imported_file.py @@ -1,5 +1,6 @@ import os +import pytest from django.conf import settings from django.core.files.storage import get_storage_class from django.test import TestCase @@ -13,6 +14,7 @@ base_dir = os.path.dirname(os.path.dirname(__file__)) +@pytest.mark.search @override_settings(ELASTICSEARCH_DSL_AUTOSYNC=True) class ImportedFileTests(TestCase): fixtures = ["eric", "test_data"] diff --git a/readthedocs/rtd_tests/tests/test_notifications.py b/readthedocs/rtd_tests/tests/test_notifications.py index d6f648799a2..135a67c5000 100644 --- a/readthedocs/rtd_tests/tests/test_notifications.py +++ b/readthedocs/rtd_tests/tests/test_notifications.py @@ -4,52 +4,40 @@ from unittest import mock import django_dynamic_fixture as fixture -from allauth.account.models import EmailAddress -from django.contrib.auth.models import AnonymousUser, User +from django.contrib.auth.models import User from django.test import TestCase from django.test.utils import override_settings -from messages_extends.models import Message as PersistentMessage from readthedocs.builds.models import Build -from readthedocs.notifications 
import Notification, SiteNotification -from readthedocs.notifications.backends import EmailBackend, SiteBackend -from readthedocs.notifications.constants import ( - ERROR, - INFO_NON_PERSISTENT, - WARNING_NON_PERSISTENT, -) +from readthedocs.notifications.email import EmailNotification @override_settings( - NOTIFICATION_BACKENDS=[ - "readthedocs.notifications.backends.EmailBackend", - "readthedocs.notifications.backends.SiteBackend", - ], PRODUCTION_DOMAIN="readthedocs.org", SUPPORT_EMAIL="support@readthedocs.org", ) -@mock.patch("readthedocs.notifications.notification.render_to_string") -@mock.patch.object(Notification, "send") +@mock.patch("readthedocs.notifications.email.render_to_string") +@mock.patch.object(EmailNotification, "send") class NotificationTests(TestCase): def test_notification_custom(self, send, render_to_string): render_to_string.return_value = "Test" - class TestNotification(Notification): + class TestNotification(EmailNotification): name = "foo" subject = "This is {{ foo.id }}" context_object_name = "foo" + user = fixture.get(User) build = fixture.get(Build) - req = mock.MagicMock() - notify = TestNotification(context_object=build, request=req) + notify = TestNotification(context_object=build, user=user) self.assertEqual( - notify.get_template_names("email"), + notify.get_template_names("html"), ["builds/notifications/foo_email.html"], ) self.assertEqual( - notify.get_template_names("site"), - ["builds/notifications/foo_site.html"], + notify.get_template_names("txt"), + ["builds/notifications/foo_email.txt"], ) self.assertEqual( notify.get_subject(), @@ -60,7 +48,6 @@ class TestNotification(Notification): { "foo": build, "production_uri": "https://readthedocs.org", - "request": req, # readthedocs_processor context "DASHBOARD_ANALYTICS_CODE": mock.ANY, "DO_NOT_TRACK_ENABLED": mock.ANY, @@ -76,32 +63,29 @@ class TestNotification(Notification): }, ) - notify.render("site") + notify.render("html") render_to_string.assert_has_calls( [ mock.call( context=mock.ANY, - template_name=["builds/notifications/foo_site.html"], + template_name=["builds/notifications/foo_email.html"], ), ] ) class NotificationBackendTests(TestCase): - @mock.patch("readthedocs.notifications.backends.send_email") + @mock.patch("readthedocs.notifications.email.send_email") def test_email_backend(self, send_email): - class TestNotification(Notification): + class TestNotification(EmailNotification): name = "foo" subject = "This is {{ foo.id }}" context_object_name = "foo" - level = ERROR build = fixture.get(Build) - req = mock.MagicMock() user = fixture.get(User) - notify = TestNotification(context_object=build, request=req, user=user) - backend = EmailBackend(request=req) - backend.send(notify) + notify = TestNotification(context_object=build, user=user) + notify.send() send_email.assert_has_calls( [ @@ -114,155 +98,3 @@ class TestNotification(Notification): ), ] ) - - @mock.patch("readthedocs.notifications.notification.render_to_string") - def test_message_backend(self, render_to_string): - render_to_string.return_value = "Test" - - class TestNotification(Notification): - name = "foo" - subject = "This is {{ foo.id }}" - context_object_name = "foo" - - build = fixture.get(Build) - user = fixture.get(User) - req = mock.MagicMock() - notify = TestNotification(context_object=build, request=req, user=user) - backend = SiteBackend(request=req) - backend.send(notify) - - self.assertEqual(render_to_string.call_count, 1) - self.assertEqual(PersistentMessage.objects.count(), 1) - - message = 
PersistentMessage.objects.first() - self.assertEqual(message.user, user) - - @mock.patch("readthedocs.notifications.notification.render_to_string") - def test_message_anonymous_user(self, render_to_string): - """Anonymous user still throwns exception on persistent messages.""" - render_to_string.return_value = "Test" - - class TestNotification(Notification): - name = "foo" - subject = "This is {{ foo.id }}" - context_object_name = "foo" - - build = fixture.get(Build) - user = AnonymousUser() - req = mock.MagicMock() - notify = TestNotification(context_object=build, request=req, user=user) - backend = SiteBackend(request=req) - - self.assertEqual(PersistentMessage.objects.count(), 0) - - # We should never be adding persistent messages for anonymous users. - # Make sure message_extends still throws an exception here - with self.assertRaises(NotImplementedError): - backend.send(notify) - - self.assertEqual(render_to_string.call_count, 1) - self.assertEqual(PersistentMessage.objects.count(), 0) - - @mock.patch("readthedocs.notifications.backends.send_email") - def test_non_persistent_message(self, send_email): - class TestNotification(SiteNotification): - name = "foo" - success_message = "Test success message" - success_level = INFO_NON_PERSISTENT - - user = fixture.get(User) - # Setting the primary and verified email address of the user - email = fixture.get(EmailAddress, user=user, primary=True, verified=True) - - n = TestNotification(user, True) - backend = SiteBackend(request=None) - - self.assertEqual(PersistentMessage.objects.count(), 0) - backend.send(n) - # No email is sent for non persistent messages - send_email.assert_not_called() - self.assertEqual(PersistentMessage.objects.count(), 1) - self.assertEqual(PersistentMessage.objects.filter(read=False).count(), 1) - - self.client.force_login(user) - response = self.client.get("/dashboard/") - self.assertContains(response, "Test success message") - self.assertEqual(PersistentMessage.objects.count(), 1) - self.assertEqual(PersistentMessage.objects.filter(read=True).count(), 1) - - response = self.client.get("/dashboard/") - self.assertNotContains(response, "Test success message") - - -@override_settings( - PRODUCTION_DOMAIN="readthedocs.org", - SUPPORT_EMAIL="support@readthedocs.org", -) -class SiteNotificationTests(TestCase): - class TestSiteNotification(SiteNotification): - name = "foo" - success_message = "simple success message" - failure_message = { - 1: "simple failure message", - 2: "{{ object.name }} object name", - "three": "{{ object.name }} and {{ other.name }} render", - } - success_level = INFO_NON_PERSISTENT - failure_level = WARNING_NON_PERSISTENT - - def setUp(self): - self.user = fixture.get(User) - self.context = {"other": {"name": "other name"}} - self.n = self.TestSiteNotification( - self.user, - True, - context_object={"name": "object name"}, - extra_context=self.context, - ) - - def test_context_data(self): - context = { - "object": {"name": "object name"}, - "request": mock.ANY, - "production_uri": "https://readthedocs.org", - "other": {"name": "other name"}, - # readthedocs_processor context - "DASHBOARD_ANALYTICS_CODE": mock.ANY, - "DO_NOT_TRACK_ENABLED": mock.ANY, - "GLOBAL_ANALYTICS_CODE": mock.ANY, - "PRODUCTION_DOMAIN": "readthedocs.org", - "PUBLIC_DOMAIN": mock.ANY, - "PUBLIC_API_URL": mock.ANY, - "SITE_ROOT": mock.ANY, - "SUPPORT_EMAIL": "support@readthedocs.org", - "TEMPLATE_ROOT": mock.ANY, - "USE_PROMOS": mock.ANY, - "USE_ORGANIZATIONS": mock.ANY, - } - self.assertEqual(self.n.get_context_data(), 
context) - - def test_message_level(self): - self.n.success = True - self.assertEqual(self.n.get_message_level(), INFO_NON_PERSISTENT) - - self.n.success = False - self.assertEqual(self.n.get_message_level(), WARNING_NON_PERSISTENT) - - def test_message(self): - self.n.reason = 1 - self.assertEqual(self.n.get_message(True), "simple success message") - self.n.reason = "three" - self.assertEqual(self.n.get_message(True), "simple success message") - - self.n.reason = 1 - self.assertEqual(self.n.get_message(False), "simple failure message") - self.n.reason = 2 - self.assertEqual(self.n.get_message(False), "object name object name") - self.n.reason = "three" - self.assertEqual(self.n.get_message(False), "object name and other name render") - - # Invalid reason - self.n.reason = None - with mock.patch("readthedocs.notifications.notification.log") as mock_log: - self.assertEqual(self.n.get_message(False), "") - mock_log.error.assert_called_once() diff --git a/readthedocs/rtd_tests/tests/test_project.py b/readthedocs/rtd_tests/tests/test_project.py index 776dbf44908..30336ce2547 100644 --- a/readthedocs/rtd_tests/tests/test_project.py +++ b/readthedocs/rtd_tests/tests/test_project.py @@ -88,11 +88,12 @@ def test_multiple_conf_file_one_doc_in_path(self, find_method): def test_conf_file_not_found(self, find_method, full_find_method): find_method.return_value = [] full_find_method.return_value = [] - with self.assertRaisesMessage( - ProjectConfigurationError, - ProjectConfigurationError.NOT_FOUND, - ) as cm: + with self.assertRaises(ProjectConfigurationError) as e: self.pip.conf_file() + self.assertEqual( + e.exception.message_id, + ProjectConfigurationError.NOT_FOUND, + ) @patch('readthedocs.projects.models.Project.find') def test_multiple_conf_files(self, find_method): @@ -101,12 +102,14 @@ def test_multiple_conf_files(self, find_method): '/home/docs/rtfd/code/readthedocs.org/user_builds/pip/checkouts/multi-conf.py/src/sub/conf.py', '/home/docs/rtfd/code/readthedocs.org/user_builds/pip/checkouts/multi-conf.py/src/sub/src/conf.py', ] - with self.assertRaisesMessage( - ProjectConfigurationError, - ProjectConfigurationError.MULTIPLE_CONF_FILES, - ) as cm: + with self.assertRaises(ProjectConfigurationError) as e: self.pip.conf_file() + self.assertEqual( + e.exception.message_id, + ProjectConfigurationError.MULTIPLE_CONF_FILES, + ) + def test_get_storage_path(self): for type_ in MEDIA_TYPES: self.assertEqual( diff --git a/readthedocs/settings/base.py b/readthedocs/settings/base.py index 90bc763f4aa..24fb48d56b1 100644 --- a/readthedocs/settings/base.py +++ b/readthedocs/settings/base.py @@ -224,11 +224,11 @@ def INSTALLED_APPS(self): # noqa 'rest_framework', 'rest_framework.authtoken', "rest_framework_api_key", + "generic_relations", 'corsheaders', 'annoying', 'django_extensions', 'crispy_forms', - 'messages_extends', 'django_elasticsearch_dsl', 'django_filters', 'polymorphic', @@ -349,13 +349,6 @@ def MIDDLEWARE(self): }, ] - MESSAGE_STORAGE = 'readthedocs.notifications.storages.FallbackUniqueStorage' - - NOTIFICATION_BACKENDS = [ - 'readthedocs.notifications.backends.EmailBackend', - 'readthedocs.notifications.backends.SiteBackend', - ] - # Paths SITE_ROOT = os.path.dirname( os.path.dirname(os.path.dirname(os.path.abspath(__file__)))) @@ -423,6 +416,7 @@ def TEMPLATES(self): 'django.template.context_processors.request', # Read the Docs processor 'readthedocs.core.context_processors.readthedocs_processor', + 'readthedocs.core.context_processors.user_notifications', ], }, }, @@ -486,11 +480,6 @@ def 
TEMPLATES(self): 'schedule': crontab(minute='*/15'), 'options': {'queue': 'web'}, }, - 'every-three-hour-clear-persistent-messages': { - 'task': 'readthedocs.core.tasks.clear_persistent_messages', - 'schedule': crontab(minute=0, hour='*/3'), - 'options': {'queue': 'web'}, - }, 'every-day-delete-old-search-queries': { 'task': 'readthedocs.search.tasks.delete_old_search_queries_from_db', 'schedule': crontab(minute=0, hour=0), diff --git a/readthedocs/subscriptions/notifications.py b/readthedocs/subscriptions/notifications.py index 0bbc52d538a..3ec27f094ff 100644 --- a/readthedocs/subscriptions/notifications.py +++ b/readthedocs/subscriptions/notifications.py @@ -1,12 +1,13 @@ """Organization level notifications.""" +import textwrap -from django.urls import reverse +from django.utils.translation import gettext_noop as _ from djstripe import models as djstripe -from messages_extends.constants import WARNING_PERSISTENT -from readthedocs.notifications import Notification, SiteNotification -from readthedocs.notifications.constants import REQUIREMENT +from readthedocs.notifications.constants import INFO +from readthedocs.notifications.email import EmailNotification +from readthedocs.notifications.messages import Message, registry from readthedocs.organizations.models import Organization from readthedocs.subscriptions.constants import DISABLE_AFTER_DAYS @@ -16,16 +17,15 @@ class SubscriptionNotificationMixin: """Force to read templates from the subscriptions app.""" app_templates = "subscriptions" + context_object_name = "organization" -class TrialEndingNotification(SubscriptionNotificationMixin, Notification): +class TrialEndingNotification(SubscriptionNotificationMixin, EmailNotification): """Trial is ending, nudge user towards subscribing.""" name = "trial_ending" - context_object_name = "organization" subject = "Your trial is ending soon" - level = REQUIREMENT @staticmethod def for_subscriptions(): @@ -36,17 +36,17 @@ def for_subscriptions(): ) -class SubscriptionRequiredNotification(SubscriptionNotificationMixin, Notification): +class SubscriptionRequiredNotification( + SubscriptionNotificationMixin, EmailNotification +): """Trial has ended, push user into subscribing.""" name = "subscription_required" - context_object_name = "organization" subject = "We hope you enjoyed your trial of Read the Docs!" - level = REQUIREMENT -class SubscriptionEndedNotification(SubscriptionNotificationMixin, Notification): +class SubscriptionEndedNotification(SubscriptionNotificationMixin, EmailNotification): """ Subscription has ended. @@ -56,12 +56,12 @@ class SubscriptionEndedNotification(SubscriptionNotificationMixin, Notification) """ name = "subscription_ended" - context_object_name = "organization" subject = "Your subscription to Read the Docs has ended" - level = REQUIREMENT -class OrganizationDisabledNotification(SubscriptionNotificationMixin, Notification): +class OrganizationDisabledNotification( + SubscriptionNotificationMixin, EmailNotification +): """ Subscription has ended a month ago. 
@@ -72,9 +72,7 @@ class OrganizationDisabledNotification(SubscriptionNotificationMixin, Notificati """ name = "organization_disabled" - context_object_name = "organization" subject = "Your Read the Docs organization will be disabled soon" - level = REQUIREMENT days_after_end = DISABLE_AFTER_DAYS @@ -87,20 +85,19 @@ def for_organizations(cls): return organizations -class OrganizationDisabledSiteNotification( - SubscriptionNotificationMixin, SiteNotification -): - success_message = 'The organization "{{ object.name }}" is currently disabled. You need to renew your subscription to keep using Read the Docs.' # noqa - success_level = WARNING_PERSISTENT - - def get_context_data(self): - context = super().get_context_data() - context.update( - { - "url": reverse( - "subscription_detail", - args=[self.object.slug], - ), - } - ) - return context +MESSAGE_ORGANIZATION_DISABLED = "organization:disabled" +messages = [ + Message( + id=MESSAGE_ORGANIZATION_DISABLED, + header=_("Your organization has been disabled"), + body=_( + textwrap.dedent( + """ + The organization "{instance.name}" is currently disabled. You need to renew your subscription to keep using Read the Docs + """ + ).strip(), + ), + type=INFO, + ), +] +registry.add(messages) diff --git a/readthedocs/subscriptions/tests/test_tasks.py b/readthedocs/subscriptions/tests/test_tasks.py index 13c71050f19..102a74be035 100644 --- a/readthedocs/subscriptions/tests/test_tasks.py +++ b/readthedocs/subscriptions/tests/test_tasks.py @@ -20,10 +20,9 @@ RTD_ALLOW_ORGANIZATIONS=True, RTD_ORG_DEFAULT_STRIPE_SUBSCRIPTION_PRICE="trialing", ) -@mock.patch("readthedocs.notifications.backends.send_email") -@mock.patch("readthedocs.notifications.storages.FallbackUniqueStorage") +@mock.patch("readthedocs.notifications.email.send_email") class DailyEmailTests(TestCase): - def test_trial_ending(self, mock_storage_class, mock_send_email): + def test_trial_ending(self, mock_send_email): """Trial ending daily email.""" now = timezone.now() @@ -54,22 +53,8 @@ def test_trial_ending(self, mock_storage_class, mock_send_email): created=now - timedelta(days=25), ) - mock_storage = mock.Mock() - mock_storage_class.return_value = mock_storage - daily_email() - self.assertEqual(mock_storage.add.call_count, 1) - mock_storage.add.assert_has_calls( - [ - mock.call( - message=mock.ANY, - extra_tags="", - level=31, - user=owner1, - ), - ] - ) self.assertEqual(mock_send_email.call_count, 1) mock_send_email.assert_has_calls( [ @@ -83,9 +68,7 @@ def test_trial_ending(self, mock_storage_class, mock_send_email): ] ) - def test_organizations_to_be_disable_email( - self, mock_storage_class, mock_send_email - ): + def test_organizations_to_be_disabled_email(self, mock_send_email): """Subscription ended ``DISABLE_AFTER_DAYS`` days ago daily email.""" customer1 = get(djstripe.Customer) latest_invoice1 = get( @@ -127,22 +110,8 @@ def test_organizations_to_be_disable_email( stripe_subscription=sub2, ) - mock_storage = mock.Mock() - mock_storage_class.return_value = mock_storage - daily_email() - self.assertEqual(mock_storage.add.call_count, 1) - mock_storage.add.assert_has_calls( - [ - mock.call( - message=mock.ANY, - extra_tags="", - level=31, - user=owner1, - ), - ] - ) self.assertEqual(mock_send_email.call_count, 1) mock_send_email.assert_has_calls( [ diff --git a/readthedocs/templates/base.html b/readthedocs/templates/base.html index 06a6700694d..7bd99f15458 100644 --- a/readthedocs/templates/base.html +++ b/readthedocs/templates/base.html @@ -103,8 +103,24 @@
+ {% comment %} + ``user_notifications`` comes from a Context Processor. + We need to use a CustomUser to have access to "user.notifications" + See https://docs.djangoproject.com/en/4.2/ref/settings/#std-setting-AUTH_USER_MODEL + {% endcomment %} + {% if user_notifications %} +
    + {% for notification in user_notifications %} +
  • + {{ notification.get_message.get_rendered_body|safe }} +
  • + {% endfor %} +
+ {% endif %} + {% block notify %} + {# TODO: migrate these notifications to new system #} {% if messages %}
    {% for message in messages %} diff --git a/readthedocs/templates/builds/build_detail.html b/readthedocs/templates/builds/build_detail.html index 540c8806e54..0f35005e633 100644 --- a/readthedocs/templates/builds/build_detail.html +++ b/readthedocs/templates/builds/build_detail.html @@ -150,6 +150,7 @@
{% if request.user|is_admin:project %} + {# TODO: migrate these "build ideas" to the new notification system #} {% if not build.success and "setup.py install" in build.commands.last.output %}

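The settings hunk above registers a `readthedocs.core.context_processors.user_notifications` context processor, and `base.html` renders whatever it provides. That processor is not part of the hunks shown here, so the following is only a hedged sketch of its likely shape; the `user.notifications` reverse relation is an assumption taken from the template comment above.

# Hypothetical sketch -- the actual processor lives in
# readthedocs/core/context_processors.py and is not shown in this patch.
def user_notifications(request):
    user = getattr(request, "user", None)
    if user is None or not user.is_authenticated:
        # Anonymous users get no notifications.
        return {"user_notifications": []}

    # Assumption: the custom user model exposes a ``notifications`` reverse
    # relation, as hinted by the base.html comment above.
    return {"user_notifications": user.notifications.all()}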
@@ -161,45 +162,22 @@

{% endif %} + {% endif %} - {# This message is not dynamic and only appears when loading the page after the build has finished #} - {% if build.finished and build.deprecated_build_image_used %} -
-

- {% blocktrans trimmed with config_file_link="https://blog.readthedocs.com/use-build-os-config/" %} - Your builds will stop working soon!
- "build.image" config key is deprecated and it will be removed soon. - Read our blog post to know how to migrate to new key "build.os" - and ensure your project continues building successfully. - {% endblocktrans %} -

-
- {% endif %} - {% if build.finished and build.deprecated_config_used %} -
-

- {% blocktrans trimmed with config_file_link="https://blog.readthedocs.com/migrate-configuration-v2/" %} - Your builds will stop working soon!
- Configuration files will soon be required by projects, and will no longer be optional. - Read our blog post to create one - and ensure your project continues building successfully. - {% endblocktrans %} -

-
- {% endif %} - {% endif %} + {% comment %} + Show non-error notifications as "build-ideas" (e.g. "build.commands is in beta") - {% if build.finished and build.config.build.commands %} -
-

- {% blocktrans trimmed with config_file_link="https://docs.readthedocs.io/page/build-customization.html#override-the-build-process" %} - The "build.commands" feature is in beta, and could have backwards incompatible changes while in beta. - Read more at our documentation to find out its limitations and potential issues. - {% endblocktrans %} -

-
+ This is temporary until we render these notifications from the front-end. + {% endcomment %} + {% for notification in build.notifications.all %} + {% if notification.get_message.type != "error" %} +

+ {{ notification.get_message.get_rendered_body|safe }} +

{% endif %} + {% endfor %} + {% if build.output %} {# If we have build output, this is an old build #} @@ -233,6 +211,30 @@

{% trans "Build Errors" %}

{% else %} {# Show new command output if lacking build.output #} + {% comment %} + Show error notifications as "build-error" (e.g. "syntax error in YAML") + + This is temporary until we render these notifications from the front-end. + {% endcomment %} + {% for notification in build.notifications.all %} + {% if notification.get_message.type == "error" %} +
+

{{ notification.get_message.type|title }}

+

{{ notification.get_message.get_rendered_body|safe }}

+
+ {% endif %} + {% endfor %} + + {% comment %} + After deploying the new notification system (`Notification` model), + we won't be populating the `Build.error` field anymore, so these errors won't be shown. + However, old builds will still need to render the `Build.error` message, + and that's why we are keeping this HTML + KO code here. + + In the future, we can decide to fully migrate `Build.error` -> `Notification` and remove this + block completely. Read more at + https://github.com/readthedocs/readthedocs.org/issues/10980#issuecomment-1876785304 + {% endcomment %}
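For reference, the `subscriptions/notifications.py` hunk above shows the pattern an application follows to declare and register its messages. A hedged example of registering one more message: the ID and wording below are invented, everything else mirrors that hunk.

# Hypothetical example mirroring the Message/registry usage in this patch.
import textwrap

from django.utils.translation import gettext_noop as _

from readthedocs.notifications.constants import INFO
from readthedocs.notifications.messages import Message, registry

MESSAGE_EXAMPLE_FEATURE_BETA = "example:feature:beta"  # invented message ID
messages = [
    Message(
        id=MESSAGE_EXAMPLE_FEATURE_BETA,
        header=_("This feature is in beta"),
        body=_(
            textwrap.dedent(
                """
                The feature enabled on "{instance.name}" is in beta and may change without notice.
                """
            ).strip(),
        ),
        type=INFO,
    ),
]
registry.add(messages)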