Replying to @stilkov
For any block they win the right to settle in the proof-of-work hash lottery, which isn't very often.
Of course. Yet there are people and companies doing it. They're not doing it for charity.
That's something we can agree on – scaling is definitely an issue. Something will have to change, don't know what the best way is
alright, now we can talk arch :) The BC premise is that you can achieve globally converged, strongly consistent ledgers...
... which should light up your b/s caution lamp. The BC tradeoff trick is to play for time.
One party wins the PoW lottery and has enough time to tell everyone else about it before anyone else also wins. And everyone moves on to build on that block.
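A rough back-of-envelope sketch of that timing argument in Python, assuming Bitcoin's well-known ~10-minute target interval and a hypothetical 15-second network-wide propagation delay:

```python
import math

BLOCK_INTERVAL_S = 600   # Bitcoin's ~10-minute target block interval
PROPAGATION_S = 15       # hypothetical time to relay a block to the whole network

# Block discovery is well modelled as a Poisson process, so the chance that a
# competing block is found while the winner's block is still propagating is
# 1 - exp(-t / interval).
p_race = 1 - math.exp(-PROPAGATION_S / BLOCK_INTERVAL_S)
print(f"chance of a competing block during propagation: {p_race:.1%}")  # ~2.5%
```

As long as that probability stays small, the network almost always converges on a single chain tip before the next winner appears.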
that particular tradeoff model for reaching consensus is effectively throttled by design as a functional prerequisite
That is of course the whole point of PoW. But e.g. an increase in block size would immediately increase capacity
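A quick capacity estimate showing why: with a fixed ~10-minute interval, throughput scales linearly with block size. The 1 MB limit and 600 s interval are Bitcoin's parameters as of 2017; the 500-byte average transaction size is an assumed figure for illustration.

```python
BLOCK_SIZE_BYTES = 1_000_000   # 1 MB block size limit (Bitcoin, 2017)
BLOCK_INTERVAL_S = 600         # ~10-minute target block interval
AVG_TX_BYTES = 500             # assumed average transaction size

tps = BLOCK_SIZE_BYTES / AVG_TX_BYTES / BLOCK_INTERVAL_S
print(f"~{tps:.1f} tx/s")      # ~3.3 tx/s; doubling the block size doubles this
```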
leads to another tradeoff call. A bigger block means more bytes to crunch when hashing and potentially a longer wait to collect it.
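A sketch of both sides of that tradeoff, under assumed numbers: the hashing cost is counted in SHA-256 chunks (one compression round per 512-bit chunk after padding), and the relay time uses a hypothetical 10 Mbit/s per-hop bandwidth.

```python
import math

def sha256_chunks(n_bytes: int) -> int:
    # SHA-256 pads the message with 0x80, zero bytes, and an 8-byte length
    # field, then processes it in 64-byte (512-bit) chunks.
    return math.ceil((n_bytes + 9) / 64)

for size_mb in (1, 2, 8):
    n = size_mb * 1_000_000
    relay_s = n * 8 / 10_000_000   # naive per-hop relay time at 10 Mbit/s
    print(f"{size_mb} MB block: {sha256_chunks(n):,} hash chunks, "
          f"~{relay_s:.1f} s per hop to relay")
```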
Replying to @clemensv
My uneducated guess is that the SHA256 complexity increase due to increased input size isn't the bottleneck. I may be wrong.

Mar 18, 2017 · 10:37 AM UTC

Replying to @stilkov
SHA-256 is calculated over the input in 512-bit chunks, and the race is to make a block whose hash falls below the network's difficulty target
more input plainly means more compute work to find out whether this block (w/ the chosen nonce) is a winner.
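A toy sketch of that race in Python, as a minimal illustration rather than Bitcoin's actual block format: keep changing the nonce and re-hashing until the double-SHA-256 digest falls below a difficulty target.

```python
import hashlib

def mine(block_data: bytes, target: int) -> tuple[int, str]:
    """Try nonces until the double-SHA-256 digest is below the target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(block_data + nonce.to_bytes(8, "little")).digest()
        ).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

# Very easy target so the demo finishes in a moment; real difficulty targets
# are vastly smaller. This toy hashes the whole payload on every attempt for
# clarity; real Bitcoin mining repeatedly hashes only the fixed 80-byte
# header, which commits to the transactions via the merkle root.
nonce, digest = mine(b"toy block payload", target=1 << 244)
print(nonce, digest)
```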