top | item 17645513

jssmith | 7 years ago

I think this has been the case for the past 20 years, probably longer, though I'm optimistic that improvement is possible. E.g., perhaps serverless computing is inefficient today, but it could help improve hardware utilization.

mmt | 7 years ago

> perhaps serverless computing is inefficient today but it could help improve hardware utilization.

This was a big selling point of virtualization, originally. It was certainly true for environments that suffered from poor utilization due to, say, running one app per (often oversized and/or antiquated) physical server, as I believe was common for enterprise IT shops.

Whether this improvement could have been achieved by other technical means (at least in non-Windows environments) is debatable. It's also unclear what percentage of total hardware utilization enterprise IT accounted for back then, though I suspect it was much higher than it is today.

For other environments, where virtualization would replace simple Unix time-sharing, it stands to reason that hardware utilization had to go up, if only moderately.

Interestingly, enterprise IT practices remain so expensive today that moving to a cloud provider is an obvious cost savings for them.