From a5b1a69aa96999835cd909981f53eaa662884fad Mon Sep 17 00:00:00 2001
From: Mutlu Simsek
Date: Sun, 21 Jul 2024 19:08:31 +0300
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 28df8fc..ede1a27 100644
--- a/README.md
+++ b/README.md
@@ -19,7 +19,7 @@ PerpetualBooster is a gradient boosting machine (GBM) algorithm which doesn't ne
 
 ## Benchmark
 
-Hyperparameter optimization usually takes 100 iterations with plain GBM algorithms. PerpetualBooster achieves the same accuracy in the single run. Thus, it achieves around 100x speed-up at the same accuracy with different `budget` levels and with different datasets. The speed-up might be slightly lower or significantly higher than 100x depending on the dataset.
+Hyperparameter optimization usually takes 100 iterations with plain GBM algorithms. PerpetualBooster achieves the same accuracy in a single run. Thus, it achieves around 100x speed-up at the same accuracy with different `budget` levels and with different datasets. The speed-up might be slightly lower or significantly higher than 100x depending on the dataset.
 
 The following table summarizes the results for the [California Housing](https://scikit-learn.org/stable/modules/generated/sklearn.datasets.fetch_california_housing.html) dataset (regression):