
Error 500 "RPError" | OpenID Connect + SafeNet Trusted Access (STA) #1499

Open
avirgos opened this issue Sep 30, 2024 · 0 comments
Labels
support A request for help setting things up

Comments


avirgos commented Sep 30, 2024

Hello,

I would like to deploy OpenID Connect with SafeNet Trusted Access (STA).

Following this 3-minute video, I've completed all the steps, except for the OAuth.tools part, which I don't use:
https://www.youtube.com/watch?v=hSWXFSadpQQ

Here's my bash script, deploy.sh, which deploys the containers:

#!/bin/bash

# previous containers removed
sudo docker rm -f ollama
sudo docker rm -f mongodb
sudo docker rm -f chat-ui
sudo docker rm -f nginx

# previous networks removed
sudo docker network rm backend >/dev/null 2>&1
sudo docker network rm proxy >/dev/null 2>&1

# create networks
sudo docker network create backend
sudo docker network create proxy

# ollama
sudo docker run -d -p 11434:11434 -e HTTPS_PROXY="${HTTPS_PROXY}" -v /home/<my-user>/chat-ui/ollama:/root/.ollama --name ollama --network backend ollama-with-ca
sleep 5
sudo docker exec ollama taskset -c 0-40 ollama run llama3.1

# mongodb
sudo docker run -d -p 27017:27017 -v mongodb-data:/data/db --name mongodb --network backend mongo:latest

# chat-ui
sudo docker run -d -p 3000:3000 -e HTTPS_PROXY="${HTTPS_PROXY}" --mount type=bind,source="$(pwd)/.env.local",target=/app/.env.local -v chat-ui:/data --name chat-ui --network backend ghcr.io/huggingface/chat-ui-db
sudo docker network connect proxy chat-ui

# nginx
sudo docker run -d -p 80:80 -p 443:443 -v "$(pwd)/nginx:/etc/nginx/conf.d" -v "$(pwd)/ssl:/etc/ssl" --name nginx --network proxy nginx:latest
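One fragile spot in the script above, for what it's worth: the fixed `sleep 5` races against ollama's startup. A hedged alternative is to poll the API until it answers before running the model (a sketch; `wait_for_url` is my own name, and it assumes `curl` on the host and the published port 11434):

```shell
# Poll a URL until it answers, up to a number of tries (1s apart).
# Returns 0 as soon as the URL responds, 1 if it never does.
wait_for_url() {
  url=$1; tries=${2:-30}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -fsS "$url" >/dev/null 2>&1; then return 0; fi
    i=$((i + 1)); sleep 1
  done
  return 1
}

# Usage in deploy.sh, replacing the fixed sleep:
# wait_for_url "http://localhost:11434/api/tags" || echo "ollama not ready"
```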

Here's my nginx configuration:

server {
  listen 80 default_server;
  listen [::]:80 default_server;
  server_name <my-chat-ui>.fr;
  return 301 https://$host$request_uri;
}

server {
  listen 443 ssl;
  server_name <my-chat-ui>.fr; 
  ssl_certificate /etc/ssl/chat-ui.crt;
  ssl_certificate_key /etc/ssl/chat-ui.key;

  proxy_connect_timeout   60;
  proxy_send_timeout      60;
  proxy_read_timeout      60;
  send_timeout            60;
  client_max_body_size    2G;
  proxy_buffering off;
  client_header_buffer_size 8k;

  location / {
    proxy_pass http://chat-ui:3000;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;

    add_header 'Access-Control-Allow-Origin' 'https://<my-chat-ui>.fr' always;
  }
}
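As a side note, the HTTP server block is meant to preserve the original path and query string in the redirect (nginx exposes this as `$request_uri`). A hedged way to check the live redirect from the host (`expected_location` is a hypothetical helper of mine; the curl line assumes `<my-chat-ui>.fr` resolves to this machine):

```shell
# Build the Location header we expect the 301 to carry for a given
# host and request path.
expected_location() {
  printf 'https://%s%s' "$1" "$2"
}

# Live check (run on the host; compare against the Location header):
# curl -sI "http://<my-chat-ui>.fr/login" | awk 'tolower($1)=="location:" {print $2}'
# Should print the same URL as: expected_location "<my-chat-ui>.fr" "/login"
```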

Finally, here's my .env.local, using the Llama 3.1 8B model:

MONGODB_URL=mongodb://mongodb:27017
HF_TOKEN=hf_*****

OPENID_CONFIG=`{
  "PROVIDER_URL": "https://idp.eu.safenetid.com/auth/realms/<realm-ID>-STA/protocol/openid-connect/auth",
  "CLIENT_ID": "*****",
  "CLIENT_SECRET": "*****",
  "SCOPES": "openid profile"
}`
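For context: chat-ui's OpenID support is built on the openid-client library, which starts by fetching the issuer's discovery document, so PROVIDER_URL normally needs to be the realm's base (issuer) URL rather than the `.../protocol/openid-connect/auth` authorization endpoint. A hedged way to check what the IdP actually publishes (`discovery_url` is my own helper; `<realm-ID>` stays a placeholder):

```shell
# Build the OIDC discovery URL from an issuer (realm base) URL.
discovery_url() {
  # Strip any trailing slash, then append the well-known path.
  printf '%s/.well-known/openid-configuration' "${1%/}"
}

ISSUER="https://idp.eu.safenetid.com/auth/realms/<realm-ID>-STA"
discovery_url "$ISSUER"

# Fetch it to confirm the IdP answers with a JSON document containing
# "issuer", "authorization_endpoint", "token_endpoint", etc.:
# curl -s "$(discovery_url "$ISSUER")"
```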

MODELS=`[
  {
    "name": "Ollama | Llama3.1",
    "id": "llama3.1-8b",
    "description": "llama3.1-8b",
    "chatPromptTemplate": "<|begin_of_text|>{{#if @root.preprompt}}<|start_header_id|>system<|end_header_id|>\n\n{{@root.preprompt}}<|eot_id|>{{/if}}{{#each messages}}{{#ifUser}}<|start_header_id|>user<|end_header_id|>\n\n{{content}}<|eot_id|>{{/ifUser}}{{#ifAssistant}}<|start_header_id|>assistant<|end_header_id|>\n\n{{content}}<|eot_id|>{{/ifAssistant}}{{/each}}<|start_header_id|>assistant<|end_header_id|>\n\n",
    "parameters": {
      "temperature": 0.1,
      "top_p": 0.95,
      "repetition_penalty": 1.2,
      "top_k": 50,
      "truncate": 3072,
      "max_new_tokens": 1024,
      "stop": ["<|end_of_text|>", "<|eot_id|>"]
    },
    "endpoints": [
      {
        "type": "ollama",
        "url" : "http://ollama:11434",
        "ollamaName" : "llama3.1:latest"
      }
    ]
  }
]`

And I get this error when I press the "Login" button:

[screenshot: error page shown after pressing the Login button]

When I run sudo docker logs chat-ui, I see this line:

{"level":50,"time":1727703253975,"pid":30,"hostname":"fe9d8f548283","locals":{"sessionId":"3b700cd7b4efc2a2b47c0f13134904e01f01c3b7d6ff05c6726390e19ea5d431"},"url":"https://ia.chu-lyon.fr/login","params":{},"request":{},"message":"Internal Error","error":{"name":"RPError"},"errorId":"8d7d74e3-b12c-4c1e-9dc5-9847d5e61ea2","status":500}
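"RPError" is the error class thrown by the openid-client library (which chat-ui uses for OIDC) when the provider's responses don't match what the client expects, discovery failures included. To spot such entries quickly when scanning the container logs, here's a small sketch (`errors_from_logs` is my own name; it assumes python3 is available on the host):

```shell
# Scan chat-ui's JSON log lines and print the error name and errorId
# of every level-50 (error) entry; non-JSON lines are skipped.
errors_from_logs() {
  python3 -c '
import json, sys
for line in sys.stdin:
    try:
        rec = json.loads(line)
    except ValueError:
        continue
    if rec.get("level", 0) >= 50:
        err = rec.get("error", {})
        print(err.get("name", "?"), rec.get("errorId", "?"))
'
}

# Usage: sudo docker logs chat-ui 2>&1 | errors_from_logs
```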

Note that as soon as I add OPENID_CONFIG (probably with incorrect data), the application stops working completely and I can no longer send prompts or delete/edit existing conversations!

When I comment out OPENID_CONFIG, everything works properly again.

I don't really know what to put exactly, especially for PROVIDER_URL and SCOPES.

Can you help me resolve this issue?

Thanks in advance.

avirgos added the support label on Sep 30, 2024