top | item 45036233

a1371 | 6 months ago

I don't know if it's selection/survivor bias, but every time I watch a video about computers from the 60s and 70s, I am amazed how spot on they are with the trajectory of the technology.

Take this CAD demo from MIT back in 1963 showing features that I commonly use today: https://youtu.be/6orsmFndx_o

Then the 80s and 90s rolled in and computers entered the mainstream. Imagination got too wild with movies like Electric Dreams (1984).

Videos like this make me think that our predictions of AI superintelligence are probably pretty accurate. But just like this machine, the actual thing may look different.

kristopolous|6 months ago

That's Ivan Sutherland though. He's one of the living legends of computing.

His doctoral advisor was Claude Shannon, and his students include the founder of Adobe, the founder of SGI, and the creators of both Phong and Gouraud shading.

He also ran the pioneering firm Evans & Sutherland, a graphics research company starting in the 1960s. They produced things like https://en.wikipedia.org/wiki/Line_Drawing_System-1

He was a key person during the Utah school of computing's most influential years - when Newell's famous teapot came out, for instance.

Saying his predictions were right on is kinda like saying Jony Ive's predictions about what smartphones would look like were accurate.

pixelpoet|6 months ago

I always found it amusing that Phong was actually the guy's given name, but Vietnamese name ordering isn't the same as in the US, so everyone thought it was his surname and just rolled with it.

nxobject|6 months ago

NB: Sutherland co-directs a lab on asynchronous logic with a colleague at Portland State. As the site says: "Please visit us when you are in the neighborhood!" The lab takes summer students, too, although Portland State is broke, so don't expect compensation.

https://arc.cecs.pdx.edu

Tuna-Fish|6 months ago

Yeah. Ivan Sutherland did not predict the future. He decided what the future should be and made it happen.

JdeBP|6 months ago

It definitely is survivorship bias. Go and watch videos from the retrocomputing enthusiasts. There are loads of branches in computing history that are off-trajectory in retrospect, inasmuch as there can be said to be a trajectory at all.

Microdrives. The Jupiter Ace. Spindle controllers. The TMS9900 processor. Bubble memory. The Transputer. The LS-120. Mattel's Aquarius. …

And while we remember that we had flip-'phones because of communicators in 1960s Star Trek we forget that we do not have the mad user interfaces of Iron Man and that bloke in Minority Report, that the nipple-slapping communicators from later Star Trek did not catch on (quelle surprise!), that dining tables with 3-D displays are not an everyday thing, …

… and that no-one, despite it being easily achievable, has given us the commlock from Space 1999. (-:

* https://mastodonapp.uk/@JdeBP/114590229374309238

adrian_b|6 months ago

The Transputer failed as an implementation, but all modern server/workstation CPUs have followed the Transputer's model of organizing CPU interfaces, starting with some later models of the DEC Alpha, followed by the AMD Athlon and then by all the others.

Unlike the contemporaneous CPUs and many later CPUs (which used buses), the Transputer had 3 main interfaces: a memory interface connecting memory to the internal memory controller, a peripheral interface and a communication interface for other CPUs.

The same is true for the modern server/workstation CPUs, which have a DRAM memory interface, PCIe for peripherals and a proprietary communication interface for the inter-socket links.

By inheriting designers from the DEC Alpha, AMD adopted this interface organization early (initially using variants of HyperTransport for peripherals and for inter-CPU communication), while Intel, as always, was the last to adopt it. They were eventually forced to (in Nehalem, i.e. a decade after AMD), because their obsolete server CPU interfaces were hurting performance too much.

pcblues|6 months ago

The Jupiter Ace was unreal, but only from a computer science perspective. You had to know a lot to program in Forth, the fundamental language of that white but Spectrum-looking dish of a PC, in spite of a manual that read like HGTTG. Critically, it didn't reward you from the start of your programming journey like Logo or BASIC did, and it didn't have the games of the ZX Spectrum. I knew a person who tried to import and sell them in Australia. When I was young, he gave me one for free after the business had failed. RIP IM, and thanks for the unit!

https://80sheaven.com/jupiter-ace-computer/

Second Edition Manual: https://jupiter-ace.co.uk/downloads/JA-Manual-Second-Edition...

MomsAVoxell|6 months ago

>There are loads of branches in computing history that are off-trajectory in retrospect, inasmuch as there can be said to be a trajectory at all.

Vectrex. Jaz drives. MiniDisc. 8-track. CB Radio.

The more I notice, the less I feel there is a discussion to be had over this distinction.

The sci-fi predictions all came true - and many of them also came and went, which is to say that the achievement of turning speculation into reality becomes irrelevant the moment a replacement technology arrives.

Star Trek's communicators did catch on - among the content creation segment - but on the other hand, we also got the babelfish-like reality of AirPods.

I think each advance in the never-ending march of technology seems fantastic at first, but mundane and banal the moment another fantasy is realised.

undebuggable|6 months ago

That's one of the reasons why touchscreen smartphones dominated the market in less than a decade. They made the dream of "real-time videotelephony from a rectangle" come true, a dream which had been present in literature and culture for around a hundred years.

zahlman|6 months ago

I read and watched quite a bit of sci-fi (including from the golden age) as a kid in the early 90s and don't recall such a dream. What media exactly did I overlook?

Cthulhu_|6 months ago

And yet, while 90s (and earlier) TV talked breathlessly about video communication, it feels like it just "snuck in" to our daily lives when webcams and e.g. Skype became mainstream, and it never felt magical. Of course, the demos were tightly scripted and stilted.

smokel|6 months ago

One might also take on the more cynical perspective and be disappointed that we are still stuck with these early achievements.

FCOL most of us are now happy to have our AI overlords type out software on 80-column displays in plain ASCII, because that is what we standardized on with Fortran.

zahlman|6 months ago

(I've never seen "FCOL" before and had to look it up. For onlookers: "for crying out loud", apparently.)

We aren't stuck with the terminal and CLIs. We stick with them, because they actually do have value.

80 columns is a reasonable compromise length, once you've accepted monospace text, that works with human perception, visual scanning of text, etc. But many programmers nowadays don't feel beholden to this; they use any number of different IDEs, they have their linters set varying maximum line lengths according to their taste, they make code windows whatever number of pixels wide makes sense for their monitor (or other configuration details), and they set whatever comfortable font size with the corresponding implication for width in columns. (If anything, I'd guess programmers who actually get a significant amount of things done in terminal windows — like myself — are below average on AI-assisted-programming adoption.)

Meanwhile, the IDE could trivially display the code in any font installed on the system, but programmers choose monospace fonts given the option.
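(To make the "linters set varying maximum line lengths" point concrete: for a Python project this is typically a one-line setting. The following is purely an illustrative sketch — the tools and the value 100 are my choices, not anything the thread specifies:)

```toml
# pyproject.toml - hypothetical project config
[tool.black]
line-length = 100   # instead of the traditional 79/80

[tool.ruff]
line-length = 100   # keep the linter in agreement with the formatter
```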

As for "plain ASCII", there just isn't a use for other characters in the code most of the time. English is dominant in the programming world for a variety of historical reasons, both internal and external. Really, all of the choices you're talking about flow naturally from the choice to describe computer programs in plain text. And we haven't even confined ourselves to that; it just turns out that trying to do it in other ways is less efficient for people who already understand how to program.

kens|6 months ago

That's my "unpopular opinion" too. As I look at computer history, it amazes me how many things from the 1970s we still use. We are stuck at a local maximum due to the historical trajectory. Languages, terminal windows, editors, instruction sets, operating systems, CLIs, ...

NoSalt|6 months ago

Man, I LOVE Electric Dreams. It is one of my "guilty pleasure" movies. X-D