added more info about pretraineds
blaisewf committed Apr 18, 2024
1 parent 2be8494 commit 7b551a1
Showing 2 changed files with 26 additions and 5 deletions.
Binary file added assets/Pretrained.png
get-started/pretrained.md: 31 changes (26 additions & 5 deletions)
---
icon: sliders
order: E
---

# Pretrained

## What are pretraineds?

When it comes to training, you have two main options: building a model from the ground up or fine-tuning an existing one. Pretrained models are designed to streamline the training process, saving time and improving the overall quality of your results.

![](../assets/Pretrained.png)
_Simply put, the image illustrates that starting from a pretrained model saves you effort during subsequent training._

Beyond the original pretraineds, there are also custom pretraineds trained by community enthusiasts. Because they are built on longer, higher-quality datasets, they generally yield better results in the models trained from them.

## How to create pretraineds?

When creating a pretrained model, you have two primary options to consider.

You can either fine-tune an existing pretrained model, such as one of the originals, or build one entirely from scratch.

If you opt to build from scratch, the ideal approach is to gather a substantial amount of moderately clean data; it doesn't have to be perfectly pristine. Then fine-tune the resulting model with high-quality data.

An essential consideration is to construct datasets devoid of copyrighted material.

Alternatively, if you choose to fine-tune an existing pretrained model, the crux lies in the quality of the audio inputs. You can tailor it to a specific language, incorporate diverse speakers, and even integrate various accents; the customization possibilities are vast. However, it's crucial to strike a balance and avoid overtraining the pretrained model. The more effectively you fine-tune it now, the less training it will need later when it is used.

To build a model from scratch, run standard training with the Pretrained option disabled. To fine-tune an existing pretrained, run standard training with the desired pretrained model loaded.
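
For readers who prefer code, here is a minimal PyTorch sketch of that two-stage recipe. It is only an illustration of the workflow described above, not Applio's actual training code: the tiny network, the random tensors standing in for audio features, and the file names are placeholders.

```python
# Minimal sketch of the two-stage pretrained recipe (placeholder model/data).
import torch
from torch import nn, optim

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 64))

def train(model, dataset, epochs, lr=1e-4):
    """Very small generic training loop over (input, target) tensor pairs."""
    opt = optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x, y in dataset:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

# Stage 1: train from scratch (random init) on a large, moderately clean dataset.
large_moderate_data = [(torch.randn(8, 64), torch.randn(8, 64)) for _ in range(100)]
train(model, large_moderate_data, epochs=10)
torch.save(model.state_dict(), "base_from_scratch.pth")

# Stage 2: resume from that checkpoint and fine-tune on a smaller, high-quality
# dataset, with a lower learning rate and fewer epochs to avoid overtraining.
model.load_state_dict(torch.load("base_from_scratch.pth"))
small_clean_data = [(torch.randn(8, 64), torch.randn(8, 64)) for _ in range(10)]
train(model, small_clean_data, epochs=2, lr=1e-5)
torch.save(model.state_dict(), "pretrained_finetuned.pth")  # the finished pretrained
```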

## How to use pretraineds?

In the training tab, check the **'Custom Pretrained'** box, upload the files, and select them in the **Pretrained G/D Path** boxes.

![](../assets/load_pretrained.png)
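
Conceptually, selecting the G and D files just means the generator and discriminator start training from those checkpoints instead of from random weights. Below is a rough sketch of that idea: `NetG`/`NetD` are hypothetical placeholder classes and the file names are examples, not Applio's real architectures or checkpoint format.

```python
# Conceptual sketch: initialize generator and discriminator from pretrained
# checkpoints (placeholder classes and example paths, not Applio internals).
import torch
from torch import nn

class NetG(nn.Module):          # placeholder generator
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(64, 64)
    def forward(self, x):
        return self.net(x)

class NetD(nn.Module):          # placeholder discriminator
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(64, 1)
    def forward(self, x):
        return self.net(x)

net_g, net_d = NetG(), NetD()

# Paths chosen in the Pretrained G/D Path boxes (example names).
g_state = torch.load("custom_pretrained_G.pth", map_location="cpu")
d_state = torch.load("custom_pretrained_D.pth", map_location="cpu")

# strict=False tolerates minor key mismatches between checkpoint and model.
net_g.load_state_dict(g_state, strict=False)
net_d.load_state_dict(d_state, strict=False)
# Training then continues from these weights rather than from scratch.
```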

## Where to find pretraineds?

### Ov2 Super by SimplCup
