top | item 46391981

c-linkage | 2 months ago

This seems like a tragedy of the commons -- GitHub is free after all, and it has all of these great properties, so why not? -- but this kind of decision making occurs whenever externalities are present.

My favorite hill to die on (externality) is user time. Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery, not user interaction time. Yet if I spent one hour making my app one second faster for my million users, I could save 277 user hours per year. But since user hours are an externality, such optimization never gets done.
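As a sanity check on that 277-hour figure, the aggregate works out like this (a back-of-the-envelope sketch, assuming each of the million users hits the one-second delay once per year):

```python
# Total user time saved by a 1-second speedup,
# assuming each user hits the slow path once per year.
users = 1_000_000
seconds_saved_per_use = 1
uses_per_user_per_year = 1

total_seconds = users * seconds_saved_per_use * uses_per_user_per_year
total_hours = total_seconds / 3600  # seconds -> hours

print(total_hours)  # roughly 277.8 user hours per year
```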

Externalities lead to users downloading extra gigabytes of data (wasted time) and waiting for software, all of which is waste that the developer isn't responsible for and doesn't care about.

Aurornis|2 months ago

> Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery and not user interaction time.

I don’t know what you mean by software houses, but every consumer-facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric.

This has been common wisdom for decades. I don’t know how many times I’ve heard the repeated quote about how Amazon loses $X million for every Y milliseconds of page loading time, as an example.

rovr138|2 months ago

There was a thread here earlier this month,

> Helldivers 2 devs slash install size from 154GB to 23GB

https://news.ycombinator.com/item?id=46134178

Section of the top comment says,

> It seems bizarre to me that they'd have accepted such a high cost (150GB+ installation size!) without entirely verifying that it was necessary!

and the reply to it has,

> They’re not the ones bearing the cost. Customers are.

dijit|2 months ago

I worked in e-commerce SaaS around 2011, and this was true then, but I find it less true these days.

Are you sure that you’re not the driving force behind those metrics; or that you’re not self-selecting for like-minded individuals?

I find it really difficult to convince myself that even large players (Discord) are measuring startup time. Every time I start the thing I’m greeted by a 25s wait and a `RAND()%9` number of updates that each take about 5-10s.

ponector|2 months ago

On the contrary, every consumer-facing product I've worked on had no performance metrics tracked. And for enterprise software it was even worse, as the end user is not the one who decides to buy and use the software.

>>what you mean by software houses

How about Microsoft? Start menu is a slow electron app.

ponector|1 month ago

>> I don’t know what you mean by software houses, but every consumer facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric

Maybe Google? Gmail app is 700+ MB

moregrist|2 months ago

> I don’t know how many times I’ve heard the repeated quote about how Amazon loses $X million for every Y milliseconds of page loading time, as an example.

This is true for sites that are trying to make sales. You can quantify how much a delay affects closing a sale.

For other apps, it’s less clear. During its high-growth years, MS Office had an abysmally long startup time.

Maybe this was due to MS having a locked-in base of enterprise users. But given that OpenOffice and LibreOffice effectively duplicated long startup times, I don’t think it’s just that.

You also see the Adobe suite (and also tools like GIMP) with some excruciatingly long startup times.

I think it’s very likely that startup times of office apps have very little impact on whether users will buy the software.

xp84|2 months ago

> every consumer facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric

Must be nice. In my career, all working on webapps, I've seen a few leaders popping in to ask us to fix a particularly egregious performance issue if the right customers complain, but aside from those finely-targeted and limited-attention-span drives to "improve performance" it seems the answer for the past decade or so is just to assume everyone is on at least a gigabit connection, stick fingers in ears, and just keep adding more node modules. If the developers' disks get full because node_modules got too big, buy a bigger SSD and keep going. (ok that last part is slight hyperbole but I also don't think frontend devs would be deterred from their ravenous appetite for libraries by a full disk).

j_w|2 months ago

Clearly Amazon doesn't care about that sentiment across the board. Plenty of their products are absurdly slow because of their poor engineering.

Yoric|2 months ago

Can confirm, at least for Firefox. When I worked on it, I spent literal years shaving seconds off startup and shutdown, and milliseconds off tab switching.

Everybody likes to hate Telemetry, and yes, it can be abused, but that's how Mozilla (and its competitors) manage to make users' lives more comfortable.

mindslight|2 months ago

> every consumer facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric

Are they evaluating the shape of that line with the same goal as the stonk score? Time spent by users is an "engagement" metric, right?

eviks|2 months ago

The issue here is not tracking, but developing. Like, how do you explain the fact that whole classes of software have gotten worse on those "key metrics"? (and that includes web-selling webpages)

croes|2 months ago

Then why do many software houses favor cloud software over on-premise?

It often has a noticeable delay responding to user input compared to local software.

venturecruelty|2 months ago

>I don’t know what you mean by software houses, but every consumer facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric.

Then respectfully, uh, why is basically all proprietary software slow as ass?

pjmlp|2 months ago

An exception that confirms the rule.

ekjhgkejhgk|2 months ago

I wouldn't call it a tragedy of the commons, because it's not a commons. It's owned by Microsoft. They're calculating that it's worth it for them, so I say take as much as you can.

Commons would be if it's owned by nobody and everyone benefits from its existence.

dahart|2 months ago

> so I say take as much as you can. Commons would be if it’s owned by nobody

This isn’t what “commons” means in the term ‘tragedy of the commons’, and the obvious end result of your suggestion to take as much as you can is to cause the loss of access.

Anything that is free to use is a commons, regardless of ownership, and when some people use too much, everyone loses access.

Finite digital resources like bandwidth and database sizes within companies are even listed as examples in the Wikipedia article on Tragedy of the Commons. https://en.wikipedia.org/wiki/Tragedy_of_the_commons

TeMPOraL|2 months ago

Still, because reality doesn't respect boundaries of human-made categories, and because people never define their categories exhaustively, we can safely assume that something almost-but-not-quite like a commons, is subject to an almost-but-not-quite tragedy of the commons.

jasonkester|2 months ago

It has the same effect though. A few bad actors using this “free” thing can end up driving the cost up enough that Microsoft will have to start charging for it.

The jerks get their free things for a while, then it goes away for everyone.

groundzeros2015|2 months ago

A public park suffers from tragedy of the commons even though it’s managed by the city.

drob518|2 months ago

Right. Microsoft could easily impose a transfer fee if over a certain amount that would allow “normal” OSS development of even popular software to happen without charge while imposing a cost to projects that try to use GitHub like a database.

TUSF|2 months ago

I wouldn't call it "tragedy of the commons" because the very idea was coined as a strawman. As far as I'm concerned, the entire concept is a fallacy, and people should stop perpetuating it.

rvba|2 months ago

I doubt anyone is calculating

Remember how GTA5 took 10 minutes to start and nobody cared? Lots of software is like this.

Some Blizzard games download a 137 MB file every time you run them and take a few minutes to start (and no, this is not due to my computer).

PunchyHamster|2 months ago

Well, till you choose to host something yourself and it becomes popular

ericyd|2 months ago

Tragedy of the Microsoft just doesn't sound as nice though

solatic|2 months ago

If you think too hard about this, you come back around to Alan Kay's quote about how people who are really serious about software should build their own hardware. Web applications, and in general loading pretty much anything over the network, are a horrible, no-good, really bad user experience, and they always will be. The only way to really respect the user is with native applications that are local-first, and if you take that really far, you build (at the very least) peripherals to make it even better.

The number of companies that have this much respect for the user is vanishingly small.

phkahler|2 months ago

>> The number of companies that have this much respect for the user is vanishingly small.

I think companies shifted to online apps because, #1, it solved the copy protection problem. FOSS apps are in no hurry to become centralized because they don't care about that issue.

Local apps and data are a huge benefit of FOSS and I think every app website should at least mention that.

"Local app. No ads. You own your data."

hombre_fatal|2 months ago

Software I don’t have to install at all “respects me” the most.

Native software being an optimum is mostly an engineer fantasy that comes from imagining what you can build.

In reality that means having to install software like Meta’s WhatsApp, Zoom, and other crap I’d rather run in a browser tab.

I want very little software running natively on my machine.

ghosty141|2 months ago

Yes because users don't appreciate this enough to pay for the time this takes.

zahlman|2 months ago

> Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery and not user interaction time. Yet if I spent one hour making my app one second faster for my million users, I can save 277 user hour per year. But since user hours are an externality, such optimization never gets done.

This is what people mean about speed being a feature. But "user time" depends on more than the program's performance. UI design is also very important.

bawolff|2 months ago

> Software houses optimize for feature delivery and not user interaction time. Yet if I spent one hour making my app one second faster for my million users, I can save 277 user hour per year. But since user hours are an externality, such optimization never gets done.

Google and Amazon are famous for optimizing this. It's not an externality to them, though; even tens of milliseconds can equal an extra sale.

That said, I don't think it's fair to add time up like that. Saving 1 second for 600 people is not the same as saving 10 minutes for one person. Time in small increments does not have the same value as time in large increments.

esafak|2 months ago

1. If you can price the cost of the externality, you can justify optimizing it.

2. Monopolies and situations with the principal/agent dilemma are less sensitive to such concerns.

ozim|2 months ago

About apps done by software houses, even though we should strive for doing good job and I agree with sentiment...

First argument: take at least two zeros off your estimate. Most applications will have maybe thousands of users; successful ones might run with tens of thousands. If you're lucky enough to work on an application with hundreds of thousands or millions of users, you work at a FAANG, not a typical "software house".

Second argument: most users use 10-20 apps in a typical workday, so your application is most likely irrelevant to them.

Third argument: most users would save much more time by learning to properly use the applications (or the computer) they rely on daily than from someone optimizing some function from 2s to 1s. Of course that's hard, because they juggle 10-20 apps daily plus who knows how many others. Still, I see people doing super silly stuff in tools like Excel, or not even knowing copy-paste, so this isn't even about command-line magic.

robmccoll|2 months ago

I don't think most software houses spend enough time even focusing on engineering time. CI pipelines that take tens of minutes to over an hour, compile times that exceed ten seconds when nothing has changed, startup times that are much more than a few seconds. Focus and fast iteration are super important to writing software and it seems like a lot of orgs just kinda shrug when these long waits creep into the development process.

3371|2 months ago

The user-hour analogy sounds weird though; 1s feels like 1s regardless of how many users you have. It's like the classic Asian teachers' logic of "if you come in 1 min late you are wasting N minutes for all of us in this class." It just does not stack like that.

BenjiWiebe|2 months ago

If the class takes N minutes and one person arrives 1 minute late, and the rest of the class is waiting for them, it does stack. Every one of those students lost a minute. Far worse than one student losing one minute.

DrewADesign|2 months ago

> Yet if I spent one hour making my app one second faster for my million users, I can save 277 user hour per year. But since user hours are an externality, such optimization never gets done.

Wait times don’t accumulate. Depending on the software, to each individual user, that one second will probably make very little difference. Developers often overestimate the effect of performance optimization on user experience because it’s the aspect of user experience optimization their expertise most readily addresses. The company, generally, will get a much better ROI from implementing well-designed features and having you squash bugs.

drbojingle|2 months ago

A well-designed feature IS considerate of time and attention. Why would I want a game at 20 fps when I could have it at 120? The smoothness of the experience increases my ability to use it optimally because I don't have to pay as much attention to it. I'd prefer my interactions with machines to be as smooth as driving a car down an empty, dry highway at midday.

Perhaps not everyone cares, but I've played enough Age of Empires 2 to know that there are plenty of people who have felt the value of shaving seconds off this and that to get compound gains over time. It's a concept plenty of folks will be familiar with.

pastor_williams|2 months ago

This was something that I heavily focused on for my feature area a year ago - new user sign up flow. But the decreased latency was really in pursuit of increased activation and conversion. At least the incentives aligned briefly.

inapis|2 months ago

>Yet if I spent one hour making my app one second faster for my million users, I can save 277 user hour per year. But since user hours are an externality, such optimization never gets done.

I have never been convinced by this argument. The aggregate number sounds fantastic but I don't believe that any meaningful work can be done by each user saving 1 second. That 1 second (and more) can simply be taken by me trying to stretch my body out.

OTOH, if the argument is to make software smaller, I can get behind that since it will simply lead to more efficient usage of existing resources and thus reduce the environmental impact.

But we live in a capitalist world and there needs to be external pressure for change to occur. The current RAM shortage, if it lasts, might be one of them. Otherwise, we're only day dreaming for a utopia.

adrianN|2 months ago

Time saved to increased productivity or happiness or whatever is not linear but a step function. Saving one second doesn’t help much, but there is a threshold (depending on the individual) where faster workflows lead to a better experience. It does make a difference whether a task takes a minute or half a second, at least for me.

jorvi|2 months ago

But there isn't just one company deciding externalizing cost on the rest of us is a great way to boost profit since it costs them very little. Especially for a monopoly like YouTube that can decide that eating up your battery is fine if it saves them a few cents in bandwidth costs.

Not all of those externalizing companies abuse your time, but whatever they abuse can be expressed in a dollar amount, and dollars can be converted to a median person's time via the median wage. Hell, free time is more valuable than whatever you produce during work.

Say all that boils down to companies collectively stealing 20 minutes of your time each day. That's 140 minutes each week, 7280 (!) minutes each year, which is 5.05 days, which makes it almost a year over the course of 70 years.

So yeah, don't do what you're doing and sweet-talk the fact that companies externalize costs (privatize the profits, socialize the losses). They're sucking your blood.
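For what it's worth, the arithmetic behind those numbers checks out (a rough sketch; the 20 minutes/day is the argument's assumed figure, and the yearly total counts 52 full weeks, i.e. 364 days):

```python
# Rough extrapolation of daily time lost to externalized costs.
minutes_per_day = 20                      # assumed figure from the argument
minutes_per_week = minutes_per_day * 7    # 140 minutes
minutes_per_year = minutes_per_week * 52  # 7280 minutes (52 full weeks)
days_per_year = minutes_per_year / (24 * 60)  # about 5.05 days
days_over_lifetime = days_per_year * 70       # about 354 days, nearly a year

print(minutes_per_week, minutes_per_year, days_per_year)
```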

Aerroon|2 months ago

One second is long enough that it can put a user off from using your app though. Take notifications on phones for example. I know several people who would benefit from a habitual use of phone notifications, but they never stick to using them because the process of opening (or switching over to) the notification app and navigating its UI to leave a notification takes too long. Instead they write a physical sticky note, because it has a faster "startup time".

WhyNotHugo|2 months ago

> I have never been convinced by this argument. The aggregate number sounds fantastic but I don't believe that any meaningful work can be done by each user saving 1 second. That 1 second (and more) can simply be taken by me trying to stretch my body out.

I’d see this differently from a user perspective. If the average operation took one second less, I’d spend a lot less time waiting for my computer. I’d also have fewer idle moments where my mind wanders while waiting for some operation to complete.

schubidubiduba|2 months ago

Just because one individual second is small, it still adds up.

Even if all you do with it is just stretching, there's a chance it will prevent you pulling a muscle. Or lower your stress and prevent a stroke. Or any number of other beneficial outcomes.

gritzko|2 months ago

Let’s make a thought experiment. Suppose that I have a data format and a store that resolves the issues in the post. It is like git meets JSON meets key-value. https://github.com/gritzko/go-rdx

What is the probability of it being used? About 0%, right? Because git is proven and GitHub is free. Engineering aspects are less important.

pdimitar|2 months ago

I am very interested by something like this but your README is not making it easy to like. Demonstrating with 2-3 sample apps using RDX might have gone a long way.

So how do I start using it if I, for example, want to use it like a decentralized `syncthing`? Can I? If not, what can I use it for?

I am not a mathematician. Most people landing on your repo are not mathematicians either.

We the techies _hate_ marketing with a passion but I as another programmer find myself intrigued by your idea... with zero idea how to even use it and apply it.

stkdump|2 months ago

Sorry, I am turned off by the CRDT in there. It immediately smells of overengineering to me. Not that I believe git is a better database. But why not just SQL?

vlovich123|2 months ago

I think it’s naive to think engineers or managers don’t realize this or don’t think in these ways.

https://www.folklore.org/Saving_Lives.html

pdimitar|2 months ago

Is it truly naive if most engineers' careers pass without them ever meeting even one such manager?

In 24 years of career I've met a grand total of _two_. Both got fired not even 6 months after I joined the company, too.

Who's naive here?

loloquwowndueo|2 months ago

Just a reminder that GitHub is not git.

The article mentions that most of these projects did use GitHub as a central repo out of convenience so there’s that but they could also have used self-hosted repos.

machinationu|2 months ago

Explain to me how you self-host a git repo that is accessed millions of times a day by CI jobs pulling packages.

justincormack|2 months ago

They probably would have experienced issues way sooner, as self-hosted tools don't scale nearly as well.

imiric|2 months ago

> GitHub is free after all, and it has all of these great properties, so why not?

The answer is in TFA:

> The underlying issue is that git inherits filesystem limitations, and filesystems make terrible databases.

JohnHaugeland|2 months ago

> This seems like a tragedy of the commons -- GitHub is free after all, and it has all of these great properties, so why not?

because it's bad at this job, and sqlite is also free

this isn't about "externalities"

gverrilla|2 months ago

Nothing surprising. Capital hates people, even though we sustain its kingdom.

brightball|2 months ago

User time is typically a mix of performance tuning and UX design isn’t it?

threatofrain|2 months ago

> Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery and not user interaction time.

Oh no no no. Consumer-facing companies will burn 30% of your internal team complexity budget on shipping the first "frame" of your app/website. Many people treat Next as synonymous with React, and Next's big deal was helping you do just this.

machinationu|2 months ago

[deleted]

benchloftbrunch|2 months ago

As long as you don't have any security compliance requirements and/or can afford the cost of self hosting your LLM, sure.

Anyone working in government, banking, or healthcare is still out of luck since the likes of Claude and GPT are (should be) off limits.

camgunz|2 months ago

I've never been more convinced LLMs are the vanguard of the grift economy now that green accounts are low effort astroturfing on HN.

massysett|2 months ago

> Externalities lead to users downloading extra gigabytes of data (wasted time) and waiting for software, all of which is waste that the developer isn't responsible for and doesn't care about.

This is perfectly sensible behavior when the developers are working for free, or when the developers are working on a project that earns their employer no revenue. This is the case for several of the projects at issue here: Nix, Homebrew, Cargo. It makes perfect sense to waste the user's time, as the user pays with nothing else, or to waste Github's bandwidth, since it's willing to give bandwidth away for free.

Where users pay for software with money, they may be more picky and not purchase software that indiscriminately wastes their time.

BobbyTables2|2 months ago

Microsoft would have long gone out of business if users cared about their time being wasted.

Windows 11 should not be more sluggish than Windows 7.