
automate spam detection for Job, Event, Codebase and MemberProfile using an external LLM service #772

Open

asuworks wants to merge 6 commits into main from 283-spam-detection-with-ollama-on-jetstream2
Conversation

asuworks (Contributor) commented Nov 16, 2024

This PR attempts to automate the spam detection process for Job, Event, Codebase and MemberProfile objects using an external LLM service.

LLM Spam Detection Process

  1. A SpamModeration record with status SCHEDULED_FOR_CHECK is created for every new Job, Event, and Codebase submission, and for every new User (the SpamModeration object is attached to the associated MemberProfile); a sketch of this step follows below.
  2. A decoupled external service, asuworks/comses.spamcheck, queries for these SpamModeration objects (api/spam/get-latest-batch/), analyzes them for spam, and submits a spam report to api/spam/update for each of them.
  3. The handler for api/spam/update on the CoMSES side updates the corresponding SpamModeration object according to the LLM report from the external service.
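For illustration, step 1 might look roughly like the following on the Django side. This is a minimal sketch: the SpamModeration model, its Status choices, the generic content_object field, and the import paths are assumptions based on the description above, not the PR's exact code.

```python
# Hypothetical sketch: schedule an LLM spam check whenever a Job is created.
# Model, field, and import names are assumptions, not the PR's exact code.
from django.db.models.signals import post_save
from django.dispatch import receiver

from core.models import Job, SpamModeration  # assumed import paths


@receiver(post_save, sender=Job)
def schedule_spam_check(sender, instance, created, **kwargs):
    if created:
        SpamModeration.objects.create(
            content_object=instance,  # assumed generic relation to the submission
            status=SpamModeration.Status.SCHEDULED_FOR_CHECK,
        )
```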

Starting the LLM Spam Detection Process

The external service asuworks/comses.spamcheck is deployed on an existing JetStream2 instance. The following management command unshelves the instance before triggering the spam check workflow and shelves it again automatically once the workflow is done:

./manage.py curator_llm_spam_check

# The following flags are available for troubleshooting purposes:
--skip-changing-instance-state  # skip changing the state of the JetStream2 instance
--skip-shelving-when-done       # unshelve the JetStream2 instance and execute the `CheckSpamWorkflow`, but do not shelve it afterwards
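As a rough sketch (not the PR's exact implementation), the command and its troubleshooting flags could be wired up like this; the helper method names are placeholders:

```python
# Hypothetical skeleton of the curator_llm_spam_check management command.
# Helper methods are placeholders for the actual OpenStack and workflow logic.
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Unshelve the JetStream2 instance, run the LLM spam check workflow, then shelve it again"

    def add_arguments(self, parser):
        parser.add_argument(
            "--skip-changing-instance-state",
            action="store_true",
            help="skip changing the state of the JetStream2 instance",
        )
        parser.add_argument(
            "--skip-shelving-when-done",
            action="store_true",
            help="unshelve the instance and run CheckSpamWorkflow, but leave it running afterwards",
        )

    def handle(self, *args, **options):
        if not options["skip_changing_instance_state"]:
            self.unshelve_instance()    # placeholder: start the JetStream2 instance
        self.run_spam_check_workflow()  # placeholder: trigger and poll the external service
        if not (options["skip_changing_instance_state"] or options["skip_shelving_when_done"]):
            self.shelve_instance()      # placeholder: shelve the JetStream2 instance
```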

Environment & Secrets

The following environment variables must be set:

LLM_SPAM_CHECK_API_URL=http://<JetStream2 instance IP>:8001
LLM_SPAM_CHECK_JETSTREAM_SERVER_ID=<JetStream2 instance ID>
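For example, these could be picked up in Django settings roughly as follows (a sketch; the empty-string defaults are assumptions):

```python
# Hypothetical settings fragment reading the spam check configuration from the environment.
import os

LLM_SPAM_CHECK_API_URL = os.getenv("LLM_SPAM_CHECK_API_URL", "")
LLM_SPAM_CHECK_JETSTREAM_SERVER_ID = os.getenv("LLM_SPAM_CHECK_JETSTREAM_SERVER_ID", "")
```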

JetStream2 Credentials

Application credentials can be found at https://js2.jetstream-cloud.org/identity/application_credentials/ and are stored in the following secrets:

secrets/llm_spam_check_jetstream_os_application_credential_secret
secrets/llm_spam_check_jetstream_os_application_credential_id

X-API-Key header for the API

Access to the api/spam/update and api/spam/get-latest-batch routes is protected by X-API-Key header verification.
The key should be set in secrets/llm_spam_check_api_key
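A minimal sketch of what this verification could look like as a Django REST Framework permission class; the secret file path and class name here are illustrative assumptions, not necessarily the PR's implementation:

```python
# Hypothetical DRF permission class guarding the spam API routes via the
# X-API-Key header. The secret path is an assumed deployment detail.
from pathlib import Path

from rest_framework.permissions import BasePermission


def _load_api_key(path="/run/secrets/llm_spam_check_api_key"):
    return Path(path).read_text().strip()


class HasSpamCheckAPIKey(BasePermission):
    def has_permission(self, request, view):
        provided = request.headers.get("X-API-Key")
        return bool(provided) and provided == _load_api_key()
```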

ALLOWED_HOSTS

The IP of the JetStream2 instance must be added to Django's ALLOWED_HOSTS.
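A possible settings fragment for this, assuming ALLOWED_HOSTS is already defined earlier in settings.py and that the instance IP is taken from the host part of LLM_SPAM_CHECK_API_URL:

```python
# Hypothetical fragment: add the JetStream2 instance IP to ALLOWED_HOSTS.
import os
from urllib.parse import urlparse

jetstream_host = urlparse(os.getenv("LLM_SPAM_CHECK_API_URL", "")).hostname
if jetstream_host:
    ALLOWED_HOSTS.append(jetstream_host)
```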

…`api/spam/update`. A SpamModeration record with status `SCHEDULED_FOR_CHECK` is stored on every Job, Event, Codebase submission. A decoupled external service will query for these objects to check them for spam.
@asuworks asuworks force-pushed the 283-spam-detection-with-ollama-on-jetstream2 branch 3 times, most recently from b46ba40 to 20c2574 on November 21, 2024 05:19
@asuworks asuworks self-assigned this Nov 21, 2024
…PAM_LIKELY

- fix tests
- add asdf & direnv to .gitignore and .dockerignore
@asuworks asuworks force-pushed the 283-spam-detection-with-ollama-on-jetstream2 branch from 20c2574 to 5b1ce46 on November 21, 2024 20:59
asuworks (Contributor, Author) commented:

We would need to add the JetStream2 instance IP to ALLOWED_HOSTS.

@asuworks asuworks requested review from sgfost and alee November 21, 2024 21:42
sgfost (Contributor) commented Nov 22, 2024

Just missing the migration for SpamModeration.status, which I think should turn anything with "unreviewed" status into "spam_likely" status.

I'll read through it again, but it seemed all good on a first pass, aside from needing a way to kick off the process.
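For reference, the kind of data migration being described could look like the sketch below; the app label, migration dependency, and exact status strings are placeholders:

```python
# Hypothetical data migration converting existing "unreviewed" SpamModeration
# records to "spam_likely". App label and dependency are placeholders.
from django.db import migrations


def mark_unreviewed_as_spam_likely(apps, schema_editor):
    SpamModeration = apps.get_model("core", "SpamModeration")
    SpamModeration.objects.filter(status="unreviewed").update(status="spam_likely")


class Migration(migrations.Migration):

    dependencies = [
        ("core", "0001_initial"),  # placeholder for the previous migration
    ]

    operations = [
        migrations.RunPython(mark_unreviewed_as_spam_likely, migrations.RunPython.noop),
    ]
```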

asuworks (Contributor, Author) commented Nov 22, 2024

Regarding the way to start the process from the CoMSES side:

  1. start the instance with openstack-cli
  2. poll the service until HEALTHY
  3. keep polling the service until the workflow is COMPLETED
  4. shut down the instance with openstack-cli

something like this?
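A rough sketch of that flow, using the Python openstacksdk instead of openstack-cli and assuming illustrative /health, /check-spam, and /status endpoints on the external service (the endpoint paths, status values, and polling intervals are assumptions):

```python
# Hypothetical start/poll/shelve flow for the LLM spam check.
# Endpoint paths and status values are illustrative assumptions.
import os
import time

import openstack
import requests

API_URL = os.environ["LLM_SPAM_CHECK_API_URL"]
SERVER_ID = os.environ["LLM_SPAM_CHECK_JETSTREAM_SERVER_ID"]


def wait_for(condition, interval=30, attempts=120):
    """Poll `condition` until it returns True or the attempts run out."""
    for _ in range(attempts):
        try:
            if condition():
                return
        except requests.RequestException:
            pass
        time.sleep(interval)
    raise TimeoutError("condition not met in time")


def run_spam_check():
    conn = openstack.connect()               # reads the OS_* application credentials
    conn.compute.unshelve_server(SERVER_ID)  # 1. start (unshelve) the instance

    # 2. poll the service until it reports healthy
    wait_for(lambda: requests.get(f"{API_URL}/health", timeout=5).ok)

    # trigger the workflow, then 3. keep polling until it is COMPLETED
    requests.post(f"{API_URL}/check-spam", timeout=10)
    wait_for(
        lambda: requests.get(f"{API_URL}/status", timeout=5).json().get("state") == "COMPLETED"
    )

    conn.compute.shelve_server(SERVER_ID)    # 4. shelve the instance again
```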

…tream2 instance, triggers the LLM spam check workflow and shelves the instance when the workflow is done.
@asuworks asuworks force-pushed the 283-spam-detection-with-ollama-on-jetstream2 branch from 79a1368 to 84f051c on November 26, 2024 20:59
@asuworks asuworks force-pushed the 283-spam-detection-with-ollama-on-jetstream2 branch from 84f051c to 0cbae60 on November 26, 2024 23:06
…pamModeration object is automatically created for the associated MemberProfile
@asuworks asuworks changed the title add spam detection endpoints for external LLM service automate the spam detection process for Job, Event, Codebase and MemberProfile using an external LLM service Nov 28, 2024
@asuworks asuworks marked this pull request as ready for review November 29, 2024 18:35
@asuworks asuworks changed the title automate the spam detection process for Job, Event, Codebase and MemberProfile using an external LLM service automate spam detection for Job, Event, Codebase and MemberProfile using an external LLM service Nov 29, 2024
sgfost (Contributor) commented Dec 3, 2024

@asuworks I just remembered there was some additional cleanup I wanted to do eventually with the spam stuff. This might be a good place to get that done if you are up for it. comses/planning#249. Namely the second point (refactoring the serializer mixin to actually be just a mixin)
