
Output is the same as the input file, why? #45

Open
nitinmukesh opened this issue Apr 30, 2024 · 0 comments
Comments

@nitinmukesh

(tokenflow) C:\tut\TokenFlow>python preprocess.py --data_path data/woman-running.mp4 --inversion_prompt "a silver sculpture of a woman running"
A matching Triton is not available, some optimizations will not be enabled
Traceback (most recent call last):
  File "C:\Users\nitin\miniconda3\envs\tokenflow\lib\site-packages\xformers\__init__.py", line 55, in _is_triton_available
    from xformers.triton.softmax import softmax as triton_softmax  # noqa
  File "C:\Users\nitin\miniconda3\envs\tokenflow\lib\site-packages\xformers\triton\softmax.py", line 11, in <module>
    import triton
ModuleNotFoundError: No module named 'triton'
C:\Users\nitin\miniconda3\envs\tokenflow\lib\site-packages\torchvision\io\video.py:161: UserWarning: The pts_unit 'pts' gives wrong results. Please use pts_unit 'sec'.
  warnings.warn("The pts_unit 'pts' gives wrong results. Please use pts_unit 'sec'.")
[INFO] loading stable diffusion...
C:\Users\nitin\miniconda3\envs\tokenflow\lib\site-packages\diffusers\models\attention_processor.py:1117: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\builder\windows\pytorch\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.)
  hidden_states = F.scaled_dot_product_attention(
[INFO] loaded stable diffusion!
100%|████████████████████████████████████████████████████████████████████████████████| 500/500 [18:02<00:00,  2.16s/it]
100%|████████████████████████████████████████████████████████████████████████████████| 500/500 [18:03<00:00,  2.17s/it]
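For what it's worth, neither message in this log looks fatal: the triton traceback just means xformers disables its Triton-backed optimizations, and the torchvision warning comes from loading the video with the default pts_unit='pts'. A minimal standalone sketch of the loading fix the warning suggests (this is not TokenFlow's own loader, which may pass other arguments):

from torchvision.io import read_video

# Load the same clip, but with timestamps in seconds as the warning recommends.
frames, _, info = read_video(
    "data/woman-running.mp4",  # path from the preprocess command above
    pts_unit="sec",            # avoids the "wrong results" warning
)
print(frames.shape, info.get("video_fps"))  # (T, H, W, C) uint8 frames + fps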

pip list


absl-py                 2.1.0
accelerate              0.29.3
av                      12.0.0
Brotli                  1.0.9
certifi                 2024.2.2
chardet                 4.0.0
charset-normalizer      2.0.4
colorama                0.4.6
diffusers               0.20.0
filelock                3.13.1
fsspec                  2024.3.1
ftfy                    6.2.0
gmpy2                   2.1.2
grpcio                  1.62.2
huggingface-hub         0.22.2
idna                    3.7
importlib_metadata      7.1.0
intel-openmp            2021.4.0
Jinja2                  3.1.3
kornia                  0.7.2
kornia_rs               0.1.3
Markdown                3.6
MarkupSafe              2.1.3
mkl                     2021.4.0
mkl-fft                 1.3.1
mkl-random              1.2.2
mkl-service             2.4.0
mpmath                  1.3.0
networkx                3.1
numpy                   1.24.3
opencv-python           4.9.0.80
packaging               24.0
pillow                  10.3.0
pip                     23.3.1
protobuf                5.26.1
psutil                  5.9.8
PySocks                 1.7.1
PyYAML                  6.0.1
regex                   2024.4.28
requests                2.31.0
safetensors             0.4.3
setuptools              68.2.2
six                     1.16.0
sympy                   1.12
tbb                     2021.12.0
tensorboard             2.16.2
tensorboard-data-server 0.7.2
tokenizers              0.19.1
torch                   2.3.0+cu121
torchvision             0.18.0
tqdm                    4.66.2
transformers            4.40.1
typing_extensions       4.11.0
urllib3                 2.1.0
wcwidth                 0.2.13
Werkzeug                3.0.2
wheel                   0.41.2
win-inet-pton           1.1.0
xformers                0.0.26.post1
zipp                    3.18.1
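Note that triton is absent from this list, which is consistent with the ModuleNotFoundError above; as far as I know Triton publishes no official Windows wheels, so xformers simply skips its Triton code paths. A quick sanity check (a sketch, not part of TokenFlow):

import importlib.util

# triton itself is missing in this Windows environment...
print("triton available:", importlib.util.find_spec("triton") is not None)

# ...but xformers still imports and falls back to non-Triton kernels.
import xformers
print("xformers", xformers.__version__)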

Output
https://github.com/omerbt/TokenFlow/assets/2102186/92f3cb7d-67f8-48a2-bccb-abe062384af8
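If this clip is the reconstruction written by preprocess.py, looking nearly identical to the input is expected: the preprocessing step only runs DDIM inversion and saves the inverted latents (plus a reconstruction of the source frames); the actual edit comes from a separate run script. A sketch of the follow-up command, assuming the default PnP config (script name and --config_path flag taken from the TokenFlow README; verify against your checkout):

(tokenflow) C:\tut\TokenFlow>python run_tokenflow_pnp.py --config_path configs/config_pnp.yaml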
