top | item 45762166

throwwgisgreat | 4 months ago

> processing billions of Kafka events per day

Except that the burden is on every client to coordinate so that no event is processed more than once, since Kafka is a brainless invention that just dumps data forever into a serial log.

williamdclt | 4 months ago

I'm not sure what you're talking about.

Do you mean different consumers within the same consumer group? There's no technology out there that will guarantee exactly-once delivery; it's simply impossible in a world where networks aren't magically 100% reliable. SQS, Redpanda, RabbitMQ, NATS... you name it, your client will always need idempotency.
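To make the point concrete: under at-least-once delivery, any broker may redeliver a message after a network hiccup or a consumer restart, so the consumer has to deduplicate on its side. A minimal sketch (the event IDs, handler, and in-memory set are all illustrative, not any broker's actual API; real systems would persist the seen-ID state):

```python
# Consumer-side idempotency sketch. At-least-once delivery means the
# same event can arrive more than once; we deduplicate by event ID.

processed_ids = set()  # illustrative only; production code needs durable storage

def handle_event(event_id, payload, sink):
    """Apply the event at most once from the application's point of view."""
    if event_id in processed_ids:
        return False        # duplicate delivery: ignore it
    sink.append(payload)    # the actual side effect
    processed_ids.add(event_id)
    return True

# Simulate a redelivery, as any at-least-once broker may produce:
sink = []
deliveries = [(1, "created"), (2, "updated"), (1, "created")]  # event 1 arrives twice
results = [handle_event(eid, payload, sink) for eid, payload in deliveries]
print(sink)     # ['created', 'updated'] - the duplicate had no effect
print(results)  # [True, True, False]
```

The same shape applies whether the dedup key is a message ID, an offset, or a business-level key; the broker's job is delivery, and idempotency is always the consumer's.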

mrkeen | 4 months ago

That is called a 'consumer group' which has been a part of Kafka for 15 years.

The author seems to be suggesting avoiding this solution and rolling your own instead.