top | item 47107868

infogulch | 9 days ago

That's an interesting stress test for I2P. They should try to fix that; the protocol should be resilient to such an event. Even if there are 10x more bad nodes than good nodes (assuming they were noncompliant I2P actors, based on that thread), the good nodes should still be able to find each other and keep working. To be fair, spam will always be a thorny problem in completely decentralized protocols.

embedding-shape | 9 days ago

> Even if there are 10x more bad nodes than good nodes [...] the good nodes should still be able to find each other

What network, distributed or decentralized, can survive such an event? Most protocols break down once some N% of the network consists of bad nodes; asking it to survive 1000%+ bad nodes is a different matter entirely, when the usual guarantee is something like "as long as at least half the nodes are good". Are there existing decentralized/distributed protocols that would survive a 1000% attack of bad nodes?
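To make the threshold point concrete, here is an illustrative sketch (not from the thread) of the classic Byzantine fault-tolerance bound, which requires n >= 3f + 1 nodes to tolerate f faulty ones, i.e. strictly fewer than one third of the nodes may be bad:

```python
def tolerates(total_nodes: int, bad_nodes: int) -> bool:
    """True if a classic BFT protocol can still reach agreement:
    the network needs total_nodes >= 3 * bad_nodes + 1."""
    return total_nodes >= 3 * bad_nodes + 1

# 100 good nodes swamped by 10x as many bad ones: 1100 total, 1000 bad.
print(tolerates(1100, 1000))  # -> False: far past the < 1/3 bound

# Even a simple majority of good nodes is not enough for BFT;
# bad nodes must stay strictly under a third.
print(tolerates(100, 33))     # -> True
print(tolerates(100, 34))     # -> False
```

Under this standard bound, a network where bad nodes outnumber good ones 10:1 is hopeless by design, which is the commenter's point.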

sandworm101 | 9 days ago

No. They should not try to survive such attacks. The best defense against a temporary attack is often to pull the plug. Better that than potentially exposing users. When there are 10x as many bad nodes as good, the basic protection of any anonymity network is likely compromised. Shut down, survive, and return once the attacker has moved on.

conradev | 9 days ago

This is why Tor is centralized, so that they can take action like cutting out malicious nodes if needed. It’s decentralized in the sense that anyone can participate by default.

martin-t | 9 days ago

Why would an attacker move on if it can maintain a successful DoS attack forever?

01HNNWZ0MV43FF | 9 days ago

Finding good nodes is a thorny problem for human friendship, too!

kkfx | 9 days ago

That's why the Web of Trust, or classic GnuPG key-signing parties, are a forgotten/ignored must-have. Anyone can change and go rogue of course, but it's statistically less likely.

seertaak | 9 days ago

Funny and excellent comment!