!!! Exception during processing !!! Unexpected error accessing result["data"]["payload"].

Result ('details' expanded for readability):
{'type': 'error', 'message': 'Error processing task c6da63cb-e45c-4c14-ac6a-78d0f07fd6a3: Exceeded limit for instance LoraLoader count <= 0.', 'extra_info': None, 'data_status': 'COMPLETED', 'start_time': 1731462738.2187595, 'end_time': 1731462738.237253, 'task_id': 'c6da63cb-e45c-4c14-ac6a-78d0f07fd6a3', 'upload_to_s3': None}

details (server-side traceback):
Traceback (most recent call last):
  File "/workspace/comfybridge/comfybridge/bizyair/schedulers.py", line 287, in process_request
    result = await self._execute_task(
  File "/workspace/comfybridge/comfybridge/bizyair/schedulers.py", line 331, in _execute_task
    return await loop.run_in_executor(self.pool, executor.execute)
  File "/opt/conda/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/workspace/comfybridge/comfybridge/bizyair/schedulers.py", line 80, in execute
    return self.execute_func(*self.args, **self.kwargs)
  File "/workspace/comfybridge/comfybridge/plugins/comfy_pipeline/utils/cuda_memory_utils.py", line 22, in wrapper
    result = func(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/workspace/comfybridge/comfybridge/plugins/comfy_pipeline/server.py", line 111, in run
    user_params: ServerFormat = convert_to_server_format(
  File "/workspace/comfybridge/comfybridge/plugins/comfy_pipeline/transpiler/transpiler.py", line 80, in convert_to_server_format
    prompt, node_id_registry = allocator.generate_workflow_with_allocated_ids(
  File "/workspace/comfybridge/comfybridge/plugins/comfy_pipeline/transpiler/node_id_allocator.py", line 108, in generate_workflow_with_allocated_ids
    raise RuntimeError(
RuntimeError: Exceeded limit for instance LoraLoader count <= 0

Local traceback:
Traceback (most recent call last):
  File "D:\ComfyUI-aki-v1.2\custom_nodes\BizyAir\src\bizyair\commands\servers\prompt_server.py", line 43, in execute
    out = result["data"]["payload"]
KeyError: 'data'
Yes, SD3 doesn't support LoRA for now. We will let you know when it becomes available.
Support will land within this week.
Some other matters got in the way last week; it will most likely be supported this week.