
Suspected bug #1098

Closed
skcccc opened this issue May 16, 2024 · 7 comments

Comments

@skcccc

skcccc commented May 16, 2024

In GPT_SoVITS/AR/data/dataset.py, shouldn't line 67 use os.path.dirname(phoneme_path)? It is currently written as os.path.basename(phoneme_path), which makes the check on line 227 always false, so bert_feature can never be loaded.

bug

https://github.com/RVC-Boss/GPT-SoVITS/blob/main/GPT_SoVITS/AR/data/dataset.py

@45xjh

45xjh commented May 16, 2024

Those directories only exist after you run the one-click preprocessing, i.e. the ones produced under prepare_datasets inside GPT_SoVITS.

@skcccc
Author

skcccc commented May 16, 2024

Those directories only exist after you run the one-click preprocessing, i.e. the ones produced under prepare_datasets inside GPT_SoVITS.

This isn't a problem with the one-click preprocessing directories. The issue is that writing os.path.basename(phoneme_path) produces self.path3 = '2-name2text.txt/3-bert', and therefore path_bert = 2-name2text.txt/3-bert/xxx.pt, so the check on line 227 is always False.

# dataset.py, lines 226-230
path_bert = "%s/%s.pt" % (self.path3, item_name)
if os.path.exists(path_bert) == True:
    bert_feature = torch.load(path_bert, map_location="cpu")
else:
    flag = 1
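To make the reported bug concrete, here is a minimal sketch of what basename vs. dirname produces for a phoneme path. The directory layout (`logs/exp1/`) is a hypothetical example, not taken from the repository:

```python
import os

# Hypothetical phoneme path written by preprocessing (directory is an assumption)
phoneme_path = "logs/exp1/2-name2text.txt"

# Buggy: basename drops the directory and keeps only the file name,
# so the resulting bert directory can never exist on disk
buggy_path3 = os.path.basename(phoneme_path) + "/3-bert"
print(buggy_path3)  # 2-name2text.txt/3-bert

# Fixed: dirname keeps the experiment directory that also holds 3-bert
fixed_path3 = os.path.dirname(phoneme_path) + "/3-bert"
print(fixed_path3)  # logs/exp1/3-bert
```

Because `2-name2text.txt/3-bert/xxx.pt` is never a real path, `os.path.exists(path_bert)` is always False, which matches the behavior described above.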

@XXXXRT666
Contributor

Looks fine to me.

@skcccc
Author

skcccc commented May 16, 2024

Looks fine to me.

Going by the code below, is setting bert_feature to None during fine-tuning an intentional part of the design? Wouldn't that make the BERT features produced by the one-click preprocessing useless? (I haven't finished reading the code yet.) Thanks!

path_bert = "%s/%s.pt" % (self.path3, item_name)
if os.path.exists(path_bert) == True:
    bert_feature = torch.load(path_bert, map_location="cpu")
else:
    flag = 1
if flag == 1:
    # bert_feature = torch.zeros_like(phoneme_ids, dtype=torch.float32)
    bert_feature = None
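The consequence of the wrong path is that the exists-check always fails and the loader silently falls back to bert_feature = None. A standalone sketch of that fallback, simplified from the snippet above (the function name is hypothetical):

```python
import os

def load_bert_feature(path3, item_name):
    # Mirrors the dataset.py logic: build the .pt path, return None if missing
    path_bert = "%s/%s.pt" % (path3, item_name)
    if os.path.exists(path_bert):
        import torch  # only needed when a real feature file exists
        return torch.load(path_bert, map_location="cpu")
    return None  # with the buggy path3, every item ends up here

# With the buggy path3 the file can never exist, so None is always returned:
print(load_bert_feature("2-name2text.txt/3-bert", "item001"))  # None
```

So fine-tuning would proceed, but every Chinese sample would train without its BERT feature, which is the silent failure the fix addresses.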

@XXXXRT666
Contributor

I'd suggest finishing the dataset-preparation code before reading this part.

@RVC-Boss
Owner

It will be False when preprocessing has failed.

@RVC-Boss
Owner

RVC-Boss commented Jun 6, 2024

I misunderstood the OP.
The BERT path was being resolved incorrectly; it has been fixed. This was a major bug. Anyone who fine-tuned a Chinese model through the webui before the fix is advised to retrain. (Small training runs are not much affected; larger ones are.)
The base model includes Chinese BERT features. Before the fix, Chinese fine-tuning used the wrong BERT path and thus trained without BERT, while Chinese inference used BERT. After the fix, Chinese fine-tuning correctly uses BERT.

4 participants