from langchain_google_vertexai import ChatVertexAI
from langchain_core.prompts import ChatPromptTemplate
import asyncio

model1 = ChatVertexAI(model_name="chat-bison@002")
model2 = ChatVertexAI(model_name="gemini-1.0-pro")

chain1 = ChatPromptTemplate.from_messages(["Why is the sky blue?"]) | model1
chain2 = ChatPromptTemplate.from_messages(["Why is the sky blue?"]) | model2

async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(chain1.ainvoke({}))
        tg.create_task(chain2.ainvoke({}))

asyncio.run(main())
Exception:
| Traceback (most recent call last):
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2536, in ainvoke
| input = await step.ainvoke(
| ^^^^^^^^^^^^^^^^^^^
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 179, in ainvoke
| llm_result = await self.agenerate_prompt(
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 570, in agenerate_prompt
| return await self.agenerate(
| ^^^^^^^^^^^^^^^^^^^^^
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 530, in agenerate
| raise exceptions[0]
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 715, in _agenerate_with_cache
| result = await self._agenerate(
| ^^^^^^^^^^^^^^^^^^^^^^
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 635, in _agenerate
| with telemetry.tool_context_manager(self._user_agent):
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/contextlib.py", line 144, in __exit__
| next(self.gen)
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/site-packages/google/cloud/aiplatform/telemetry.py", line 48, in tool_context_manager
| _pop_tool_name(tool_name)
| File "/Users/sufiyan/micromamba/envs/nlp-cloud-server/lib/python3.11/site-packages/google/cloud/aiplatform/telemetry.py", line 57, in _pop_tool_name
| raise RuntimeError(
| RuntimeError: Tool context error detected. This can occur due to parallelization.
+------------------------------------
The telemetry context managers recently introduced in #31 by @holtskinner cause issues with parallelisation.
Given that generation is an I/O-bound task, concurrency is a crucial aspect that cannot be overlooked.
If multiple models are awaited in parallel, they all appear to append to the same shared list, causing an exception when the context managers exit.
Here's a reproducible example:
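The failure mode can be illustrated with a minimal self-contained sketch. This is not the actual google-cloud-aiplatform code, just a hypothetical reconstruction of the pattern it uses: a module-level list serves as a stack for a context manager, so when two concurrent tasks interleave their append/pop operations, each task pops the other's entry on exit:

```python
import asyncio
import contextlib

# Hypothetical reconstruction of the bug: shared module-level state,
# not task-local, mimicking telemetry's tool-name stack.
_tool_names = []

@contextlib.contextmanager
def tool_context_manager(tool_name):
    _tool_names.append(tool_name)
    try:
        yield
    finally:
        popped = _tool_names.pop()
        if popped != tool_name:
            raise RuntimeError(
                "Tool context error detected. This can occur due to parallelization."
            )

async def fake_generate(tool_name):
    with tool_context_manager(tool_name):
        # The await is a suspension point, letting the other task run
        # while this one still holds its entry on the shared stack.
        await asyncio.sleep(0)

async def main():
    # Task "a" pushes, suspends; task "b" pushes, suspends.
    # Task "a" resumes first and pops "b" -- a mismatch, so both
    # tasks raise on exit.
    return await asyncio.gather(
        fake_generate("a"), fake_generate("b"), return_exceptions=True
    )

errs = asyncio.run(main())
```

Storing the stack in a `contextvars.ContextVar` instead of a module-level list would give each asyncio task its own copy and avoid the interleaving.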