top | item 46867390


smackeyacky | 26 days ago

Paying for software was well established on big iron by the 1970s, before Microsoft was even founded.


fuzzfactor | 26 days ago

That was supposed to be enough.

It was well accepted, even at big companies like HP, that microcomputers that fit on a desk were intended for individual "owners" who would get the most out of the electronics by widely sharing programs, so that from the beginning each person in that particular hardware ecosystem could build on everybody else's work.

"Teams" weren't supposed to be necessary.

At least for the utility type of things there was the most widespread need for. Nobody dreamed of making an improvement to something like a file manager and not contributing it to a newsletter.

There's no faster way to move ahead when you're talking about technology.

Something truly novel would probably be expected to receive a patent; otherwise, code itself was not copyrighted the way it can be today. Most people considered code to be the part of the computer that you programmed yourself, and everybody needed as many sources as possible to learn from, or nobody was going to get as far as the electronics were capable of. Everybody was expected to freely run all kinds of things that other people wrote; otherwise, how were best practices supposed to arise?

This was before PCs, or even Apples; these were industrial HPs, not hobbyists using them at home.

These were microprocessor devices, built to be affordable like nothing else, the opposite of a mainframe, which had always been out of reach for almost all aspiring programmers.

The idea was for nothing about them to cost money for the user unless there was absolutely no other way. And then it needed to be attractively priced. Nothing else could be considered suitable.

Most of the code to do almost anything that almost anybody was doing was supposed to be easily available without actual business transactions. So the only programming you had to do yourself was mainly the specialized stuff for your own unique requirements beyond that. And that was supposed to keep getting easier, because everything else was moving forward for every user in unison. The sky was the limit if microprocessors could catch on and everybody could get the most out of what they had to offer. After a few decades at the rate it was going? Sheesh.

Like nothing else can compare to, that part was getting easier all the time, as everything built upon the continuous progress everyone with that type of hardware was making.

Until Gates came along and did this.

kens | 26 days ago

I'll mention that mainframes also had extensive libraries of shared software. The IBM user group SHARE (Society to Help Avoid Redundant Effort) started in 1955 to share software and had hundreds of programs in an IBM-maintained library. IBM also had the Contributed Program Library. Software ranged from math libraries to programming languages to statistics packages to nuclear reactor simulators.

References: https://www.si.edu/object/archives/sova-nmah-ac-0498 http://www.bitsavers.org/pdf/ibm/650/programLibrary/Addition... https://en.wikipedia.org/wiki/IBM_Type-III_Library https://web.archive.org/web/20110322030511/http://www.bitsav...

ThrowawayR2 | 26 days ago

The suggestion that nobody was considering selling for-profit software on microcomputers out of hacker camaraderie seems rather difficult to believe, to put it mildly. CP/M wasn't free either, IIRC, and it pre-dates Gates' BASIC.

[EDIT] Beyond that, if it hadn't been Gates, it would have been Steve Jobs. And if it hadn't been him either, it unquestionably would have been IBM, once they started taking the microcomputer market seriously and released the IBM PC.