-
🐛 Bug
When I try to use torch.device('cuda', 0) on model load, TTS fails at random places in the text.

To Reproduce
Steps to reproduce the behavior:
The error message I'm getting each time is:
The complete script: https://github.com/S-trace/silero_tts_standalone/blob/master/tts.py

Expected behavior
TTS should not fail with CUDA.

Environment
Collecting environment information...
OS: Manjaro Linux (x86_64)
Python version: 3.10.10 (main, Mar 5 2023, 22:26:53) [GCC 12.2.1 20230201] (64-bit runtime)
CPU:
Versions of relevant libraries:

Additional context
I'm using the "xenia" voice.
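For context, the loading and synthesis path boils down to roughly the following. This is a sketch based on the public silero-models torch.hub entry point, not an exact excerpt from tts.py; the v3_1_ru package name and the long_text placeholder are assumptions.

```python
# Minimal sketch of the failing path: load silero TTS on the GPU and
# synthesize one long chunk of text in a single apply_tts() call.
import torch

device = torch.device('cuda', 0)

# Standard silero-models hub API; 'v3_1_ru' is assumed to be the package
# that contains the "xenia" voice.
model, _ = torch.hub.load(
    repo_or_dir='snakers4/silero-models',
    model='silero_tts',
    language='ru',
    speaker='v3_1_ru',
)
model.to(device)

# Placeholder standing in for the real input (> 1000 chars in practice).
long_text = 'Очень длинный текст для синтеза. ' * 50

# Fails at seemingly random positions when the chunk is long.
audio = model.apply_tts(
    text=long_text,
    speaker='xenia',
    sample_rate=48000,
)
```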
-
Does this happen if you limit the input text length to 512, 768, or 1024 chars?
-
It happens with a limit of 1024 chars or 941 chars, but not with 512 or 768 chars. With the 1024-char limit I got a wonderful pack of tracebacks:
-
You should just limit the input length.
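One way to follow that advice is to split the text into chunks below a safe limit at sentence boundaries and synthesize the chunks separately. This is a sketch, not part of silero's API; the 768-char limit (the largest size that did not crash in the tests above) and the sentence-splitting regex are assumptions.

```python
import re
import torch

MAX_CHARS = 768  # largest limit that did not crash in the tests above

def split_text(text: str, max_chars: int = MAX_CHARS) -> list[str]:
    """Greedily pack whole sentences into chunks no longer than max_chars."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    chunks, current = [], ''
    for sentence in sentences:
        candidate = f'{current} {sentence}'.strip()
        if len(candidate) <= max_chars or not current:
            current = candidate
        else:
            chunks.append(current)
            current = sentence
    if current:
        chunks.append(current)
    return chunks

def synthesize(model, text: str, speaker: str = 'xenia',
               sample_rate: int = 48000) -> torch.Tensor:
    """Synthesize each chunk separately and concatenate the audio."""
    parts = [
        model.apply_tts(text=chunk, speaker=speaker, sample_rate=sample_rate)
        for chunk in split_text(text)
    ]
    return torch.cat(parts)
```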