top | item 39460198

tharakam | 2 years ago

I'm confused. Everything sounds very expensive to me.

The last table, which compares it with the other vendors, is surprising. Even Stitch Data (the cheapest) costs $1 to move only 240K records ($4,166.67 total for 1B records, so 1B / 4,166.67 ≈ 240K records per dollar). Is this real?

So their own solution costs $1 to process 13.6M records. That hardly sounds share-worthy.
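A quick back-of-the-envelope sketch of the per-dollar arithmetic in this comment (the $4,166.67 total for Stitch Data is the figure quoted above; any other cost plugged in is hypothetical):

```python
# Records-per-dollar check, assuming a fixed workload of 1B records
# and a vendor's total cost for moving all of them.
TOTAL_RECORDS = 1_000_000_000

def records_per_dollar(total_cost_usd: float) -> float:
    """How many records $1 buys at the given total cost for 1B records."""
    return TOTAL_RECORDS / total_cost_usd

# Stitch Data at the quoted $4,166.67 total:
print(f"{records_per_dollar(4_166.67):,.0f} records per $1")  # ~240,000
```

This reproduces the ~240K records-per-dollar figure from the comment.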

What am I missing here?

garciasn | 2 years ago

What I want to know is why the fuck it takes 8 days to load 700MM records—in 2024.

I couldn’t even continue reading the article because it must be from 2006.

saisrirampur | 2 years ago

700M records in 8 days (~1,024 rps) was chosen to mimic a real-world transactional (OLTP) workload. It doesn't define a limit on what throughput can be achieved.

api | 2 years ago

Welcome to 2024 and the generation of developers raised in the cloud native world who think this is normal.

A billion rows is nothing and having $100 appear in conjunction with that is absurd unless you are doing some kind of really heavy compute or AI model training on that data.

By 2030 we’ll have those costs well up over a thousand dollars and it’ll take five or six separate SaaS systems wired together to do this. Progress!

Aeolun | 2 years ago

To be fair, my machine that can process 1B rows an unlimited number of times still cost $1,000 to build, so if you only need a one-off, maybe paying the $100 is better?