(.venv) D:\ComfyUI\ComfyUI-to-Python-Extension>py comfyui_to_python.py
Traceback (most recent call last):
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\comfyui_to_python.py", line 17, in <module>
    from nodes import NODE_CLASS_MAPPINGS
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\comfy\sd.py", line 5, in <module>
    from comfy import model_management
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\comfy\model_management.py", line 120, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
                                  ^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\comfy\model_management.py", line 89, in get_torch_device
    return torch.device(torch.cuda.current_device())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\cuda\__init__.py", line 778, in current_device
    _lazy_init()
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\cuda\__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
I run ComfyUI with the --cpu arg, like main.py --cpu.
Is there a way to do this with py comfyui_to_python.py as well? Like py comfyui_to_python.py --cpu?
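A possible workaround, sketched under the assumption that ComfyUI parses its command-line flags from sys.argv when its argument-parsing module (comfy.cli_args) is first imported: append --cpu to sys.argv before comfyui_to_python.py imports nodes, so the flag is picked up exactly as if it had been passed on the command line. This is not a documented feature of the extension, just a way to smuggle the flag in at the top of the script.

```python
# Sketch of a workaround: inject ComfyUI's --cpu flag before any comfy
# module is imported. ComfyUI reads its flags from sys.argv at import
# time, so this must run before `from nodes import NODE_CLASS_MAPPINGS`.
import sys

def force_cpu_flag(argv):
    """Return argv with --cpu appended if it is not already present."""
    if "--cpu" not in argv:
        argv = argv + ["--cpu"]
    return argv

# Apply the flag to the real argument list before the comfy imports run.
sys.argv = force_cpu_flag(sys.argv)
print(sys.argv)
```

If the extension ignores injected argv in your version, the fallback is hiding the GPU entirely by setting the environment variable CUDA_VISIBLE_DEVICES to an empty string before launching Python, which makes torch.cuda report no devices.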