2018-11-02 Video of p2p requirement meeting #1
Here are my notes from the meetings; @djrtwo mentioned he was going to get them reviewed with EF's research team.

- Tens of thousands of 'standard' nodes (i.e. non-validating nodes).
- 300 validators per shard (a validator stays a month on a shard), 1024 shards.
- Sizing target: a standard node typically has 10 shards; same for a validator. Obviously, we will also have smaller and/or very large nodes.
- A shard typically has 10 peers; configurable, could go up to 100 or more.
- Current vision: each shard/peer has its own TCP connection to maximize independence. Could be revisited.
- Peers are worldwide; locality/proximity is not taken into consideration while choosing the peers.
- Impact:
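To put those figures in perspective, here is a minimal back-of-the-envelope sketch; the constants mirror the numbers in the notes above, and nothing here comes from a spec:

```python
# Rough sizing based on the figures from the meeting notes.
# All constants are illustrative assumptions, not spec values.

STANDARD_NODES = 20_000        # "tens of thousands" of non-validating nodes
SHARD_COUNT = 1024             # total number of shards
VALIDATORS_PER_SHARD = 300     # a validator stays roughly a month on a shard
SHARDS_PER_NODE = 10           # sizing target for a standard node or a validator
PEERS_PER_SHARD = 10           # typical, configurable; could go up to 100 or more

# With one TCP connection per (shard, peer) pair, a typical node maintains:
typical_connections = SHARDS_PER_NODE * PEERS_PER_SHARD   # 100
# and if peers-per-shard is pushed to 100:
upper_bound_connections = SHARDS_PER_NODE * 100            # 1000

print(typical_connections, upper_bound_connections)
```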
Thank you for providing the summary, @nkeywal! Some comments and clarifying questions:
Does that mean that a node with "standard" hardware that can handle ~1 Ethereum chain right now will need to be able to handle 10 shards in the future, or do we scale the hardware requirements by a factor of ~10? Because if we do the former, we effectively only scale by a factor of 100 instead of 1000 (at least if we ignore all non-sharding related improvements).
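To spell out the arithmetic behind the 100x-vs-1000x distinction (my own rough numbers, not from the meeting notes):

```python
# Illustrative arithmetic for the two hardware scenarios in the question above.
SHARD_COUNT = 1024
SHARDS_PER_NODE = 10

# (a) Same hardware as today: a node running 10 shards can only give each
#     shard ~1/10 of today's per-chain capacity, so total scaling is ~100x.
scale_same_hardware = SHARD_COUNT / SHARDS_PER_NODE   # ~102

# (b) Hardware requirements scaled up ~10x: each shard gets a full chain's
#     worth of capacity, so total scaling is ~1000x.
scale_scaled_hardware = SHARD_COUNT                    # 1024

print(scale_same_hardware, scale_scaled_hardware)
```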
I'd say those come on top, as the 20k nodes presumably don't include (pooled) miners at the moment either. But personally I think it would be great if the network worked even without any user nodes, with only validators (i.e. 300 nodes per shard). The reason being that in the beginning there might be shards with nothing going on at all.
First issue 🎉. Danny explained the current sharding spec and Nichola asked some questions.