top | item 36545259


comment_ran | 2 years ago

It really depends on what data you're working with. For someone whose dataset is hundreds of terabytes, the workflow for storing that data and then manipulating or retrieving it on request becomes very tricky, especially once you factor in backup time. Maybe you can take the L1/L2/L3 cache concept and apply it at a higher, bigger scale. Always figure out what matters most to you at each stage and make your choices accordingly. You definitely compromise something in order to operate on data at that scale. So, again, it's about figuring out what's most important to you.
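A minimal sketch of the "L1/L2/L3 cache, but at storage scale" idea the comment gestures at: a small fast hot tier, a larger warm tier, and an unbounded cold tier, with reads promoting items back toward the hot tier. The class, tier names, and capacities below are all illustrative assumptions, not something from the comment.

```python
from collections import OrderedDict

class TieredStore:
    """Toy tiered store: hot (like L1), warm (like L2), cold (like archive)."""

    def __init__(self, hot_capacity=2, warm_capacity=4):
        self.hot = OrderedDict()   # fastest, smallest tier
        self.warm = OrderedDict()  # slower, larger tier
        self.cold = {}             # slowest, effectively unbounded
        self.hot_capacity = hot_capacity
        self.warm_capacity = warm_capacity

    def put(self, key, value):
        # New or re-read data lands in the hot tier; overflow demotes downward.
        self.hot[key] = value
        self.hot.move_to_end(key)
        self._evict()

    def get(self, key):
        # A hit in a lower tier promotes the item back to hot,
        # trading eviction churn for faster repeat access.
        for tier in (self.hot, self.warm, self.cold):
            if key in tier:
                value = tier.pop(key)
                self.put(key, value)
                return value
        raise KeyError(key)

    def _evict(self):
        while len(self.hot) > self.hot_capacity:
            k, v = self.hot.popitem(last=False)   # oldest hot item -> warm
            self.warm[k] = v
        while len(self.warm) > self.warm_capacity:
            k, v = self.warm.popitem(last=False)  # oldest warm item -> cold
            self.cold[k] = v
```

The compromise the comment mentions shows up directly here: hot-tier capacity buys latency at the cost of space, and every eviction policy choice favors one access pattern over another.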
