docs: update documentation Oct 2024 #7178
base: unstable
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@            Coverage Diff             @@
##           unstable    #7178    +/-   ##
=========================================
  Coverage     49.21%    49.21%
=========================================
  Files           598       598
  Lines         39726     39794    +68
  Branches       2092      2097     +5
=========================================
+ Hits          19550     19585    +35
- Misses        20136     20169    +33
  Partials         40        40
Performance Report: ✔️ no performance regression detected. Full benchmark results
Good work! Some initial comments; I haven't gone through the entire PR yet.
docs/pages/index.md (outdated)

> This documentation is open source, contribute at [Github Lodestar repository /docs](https://github.com/ChainSafe/lodestar/tree/unstable/docs).
> If you run the [execution client](https://ethereum.org/en/developers/docs/nodes-and-clients/#execution-clients) on the same host, you will need to check their requirements and add them to the above requirements.
Shall we also add a hardware expectation for running EL+CL like what Nimbus does here?
> Broadly, to run both an execution and a consensus client on the same machine, we recommend a 2 TB SSD and 16 GB RAM.
I opted not to, only because of the variation between the different EL clients. It will be easier, IMO, to just have users figure out their execution client's recommendations and add them to our recommendations.
> Broadly, to run both an execution and a consensus client on the same machine, we recommend a 2 TB SSD and 16 GB RAM.
I am not sure this is even a good recommendation anymore; I generally recommend people go for 4 TB nowadays, since it's unknown when we will ship the Verge, and the blob count increase adds a further storage burden.
16 GB is also a bit tight for Lodestar with most other ELs; it's not sufficient with Nethermind, for example, where my server is sitting at 17.2 GB.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Looks pretty good overall!
strict_fee_recipient_check: false
fee_recipient: "0xaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
builder:
  enabled: true
`enabled` is not part of the file config; this is configured based on `builder.selection` now, i.e. to disable the builder it has to be set to `executiononly`.
lodestar/packages/cli/src/util/proposerConfig.ts, lines 12 to 23 in 558ec2f:

type ProposerConfigFileSection = {
  graffiti?: string;
  strict_fee_recipient_check?: string;
  fee_recipient?: string;
  builder?: {
    // boolean are parse as string by the default schema readFile employs
    // for js-yaml
    gas_limit?: number;
    selection?: routes.validator.BuilderSelection;
    boost_factor?: bigint;
  };
};
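For illustration, here is a minimal sketch of how the YAML snippet above might look with the builder configured via `selection` instead of the removed `enabled` flag. The `default_config` wrapper and the quoted values are assumptions based on the type shown above; the exact keys should be verified against the current docs.

```yaml
# Hypothetical proposer configuration sketch (not the exact documented format):
# builder participation is controlled per proposer via `selection`,
# e.g. "executiononly" effectively disables the builder.
default_config:
  strict_fee_recipient_check: "false"
  fee_recipient: "0xaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
  builder:
    selection: "executiononly"
```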
> ### Enable Proposer Configuration
>
> After you have configured your proposer configuration YAML file, you can start Lodestar with an additional CLI flag option pointing to the file: `--proposerSettingsFile /path/to/proposer_config.yaml`.
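For reference, a minimal sketch of such an invocation for the validator client (the binary name and any other flags depend on your own setup):

```sh
# start the Lodestar validator client pointing at the proposer configuration file
./lodestar validator \
  --proposerSettingsFile /path/to/proposer_config.yaml
```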
Should the flag be called `--proposerConfigFile` instead? We could keep the old name as an alias.
We should probably document that it can be retrieved via the API, which was recently added in #7210.
Motivation
Our documentation has been undergoing incremental improvements to remove old content, update it for relevance, and add new content that hasn't been documented yet. This is one PR in a series of future PRs to improve the docs for a better user/developer experience.
Related to #5961
Description
This PR addresses the following in our documentation:
More changes that were not included here will come in future updates.