Allow REQUESTS_CA_BUNDLE #385

Closed
sqlreport opened this issue Sep 12, 2023 · 2 comments
Labels
enhancement (New feature or request), @jupyter-ai/chatui

Comments

@sqlreport

Re: #321 (comment)

Problem

  • When using a different OpenAI base URL for the chat UI, I get a connection-failed or timeout error.
  • My OpenAI base URL starts with https.

Proposed Solution

  • Allow configuring the HTTPS connection with a custom CA certificate, e.g. by honoring the REQUESTS_CA_BUNDLE environment variable (a sketch is included under Additional context below).

Additional context
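
A rough sketch of what honoring REQUESTS_CA_BUNDLE could look like, assuming the pre-1.0 openai Python client shown in the traceback below; the helper name and the place where jupyter-ai would call it are only illustrative:

import os
import ssl

import aiohttp
import openai


async def configure_openai_ca_bundle() -> None:
    # Hypothetical helper: build an aiohttp session that trusts the CA bundle
    # named by REQUESTS_CA_BUNDLE and hand it to the async openai client.
    ca_bundle = os.environ.get("REQUESTS_CA_BUNDLE")
    if not ca_bundle:
        return
    ssl_context = ssl.create_default_context(cafile=ca_bundle)
    connector = aiohttp.TCPConnector(ssl=ssl_context)
    # The pre-1.0 openai client reuses whatever aiohttp session is set here
    # for all subsequent async requests, including those made through langchain.
    openai.aiosession.set(aiohttp.ClientSession(connector=connector))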

sqlreport added the enhancement (New feature or request) label on Sep 12, 2023
@sqlreport (Author)

Error message:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 668, in arequest_raw
    result = await session.request(**request_kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 536, in _request
    conn = await self._connector.connect(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 540, in connect
    proto = await self._create_connection(req, traces, timeout)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 899, in _create_connection
    _, proto = await self._create_proxy_connection(req, traces, timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1291, in _create_proxy_connection
    raise ClientHttpProxyError(
aiohttp.client_exceptions.ClientHttpProxyError: 404, message='Not Found', url=URL('https://my.proxy.address.here/v1/proxy/openai/v1')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/jupyter_ai/chat_handlers/base.py", line 38, in process_message
    await self._process_message(message)
  File "/usr/local/lib/python3.11/site-packages/jupyter_ai/chat_handlers/default.py", line 69, in _process_message
    response = await self.llm_chain.apredict(input=message.body, stop=["\nHuman:"])
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/chains/llm.py", line 272, in apredict
    return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/chains/base.py", line 355, in acall
    raise e
  File "/usr/local/lib/python3.11/site-packages/langchain/chains/base.py", line 349, in acall
    await self._acall(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.11/site-packages/langchain/chains/llm.py", line 237, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/chains/llm.py", line 115, in agenerate
    return await self.llm.agenerate_prompt(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/base.py", line 424, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/base.py", line 384, in agenerate
    raise exceptions[0]
  File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/base.py", line 485, in _agenerate_with_cache
    return await self._agenerate(
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/openai.py", line 425, in _agenerate
    response = await acompletion_with_retry(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/openai.py", line 92, in acompletion_with_retry
    return await _completion_with_retry(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/tenacity/__init__.py", line 325, in iter
    raise retry_exc.reraise()
          ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/tenacity/__init__.py", line 158, in reraise
    raise self.last_attempt.result()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/Cellar/[email protected]/3.11.4_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/Cellar/[email protected]/3.11.4_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/chat_models/openai.py", line 90, in _completion_with_retry
    return await llm.client.acreate(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 45, in acreate
    return await super().acreate(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 217, in acreate
    response, _, api_key = await requestor.arequest(
                           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 372, in arequest
    result = await self.arequest_raw(
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 685, in arequest_raw
    raise error.APIConnectionError("Error communicating with OpenAI") from e
openai.error.APIConnectionError: Error communicating with OpenAI

I have replaced my actual HTTPS address with https://my.proxy.address.here/v1/proxy/openai/v1 in the traceback above.
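
To check whether this is specific to jupyter-ai, a minimal reproduction with the same pre-1.0 openai client would look roughly like this (same redacted base URL as above; the model name is just an example, and REQUESTS_CA_BUNDLE points at our corporate CA bundle):

import asyncio
import os
import ssl

import aiohttp
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]
openai.api_base = "https://my.proxy.address.here/v1/proxy/openai/v1"


async def main() -> None:
    # Trust the CA bundle from REQUESTS_CA_BUNDLE for the HTTPS hop.
    ssl_context = ssl.create_default_context(
        cafile=os.environ.get("REQUESTS_CA_BUNDLE")
    )
    session = aiohttp.ClientSession(connector=aiohttp.TCPConnector(ssl=ssl_context))
    openai.aiosession.set(session)
    try:
        response = await openai.ChatCompletion.acreate(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "ping"}],
        )
        print(response["choices"][0]["message"]["content"])
    finally:
        await session.close()


asyncio.run(main())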

sqlreport (Author) commented Oct 31, 2023

@JasonWeill I am also getting:

aiohttp.client_exceptions.ClientProxyConnectionError: Cannot connect to host {my.host.https:443} ssl:default [None]

Is there anything I can try on my end?
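
For reference, a minimal reachability check I could run outside jupyter-ai, using aiohttp with the CA bundle from REQUESTS_CA_BUNDLE (the hostname is the same placeholder as above; HTTPS_PROXY may be unset if no proxy is involved):

import asyncio
import os
import ssl

import aiohttp


async def main() -> None:
    # Trust the corporate CA bundle named by REQUESTS_CA_BUNDLE.
    ssl_context = ssl.create_default_context(
        cafile=os.environ.get("REQUESTS_CA_BUNDLE")
    )
    async with aiohttp.ClientSession(
        connector=aiohttp.TCPConnector(ssl=ssl_context)
    ) as session:
        # Placeholder hostname; proxy is only used if HTTPS_PROXY is set.
        async with session.get(
            "https://my.host.https/",
            proxy=os.environ.get("HTTPS_PROXY"),
        ) as resp:
            print(resp.status)


asyncio.run(main())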

sqlreport closed this as not planned (won't fix, can't repro, duplicate, stale) on Jun 26, 2024