top | item 25330241

izuchukwu | 5 years ago

The article’s position comes down to “no fundamentally new way to program would make sense for today’s programmers to switch to it”, and gives examples like the platforms of the no-code movement.

From previous generational leaps, we’ve learned that the users post-leap don’t look like the pre-leap users at all. The iPod’s introduction brought about a generation of new digital music users that didn’t look like the Limewire generation, and the iPhone’s average user didn’t look like the average user of the BlackBerry before it.

Modern programming is at the core of HN, and of most of SV, sure. That said, we should still be the first to realize that a successful, fundamentally new way to program would target a new generation and idea of software maker, one that won’t look like the modern developer at all.


benjaminjosephw|5 years ago

Exactly. A paradigm shift implies new mental models and new metaphors for our abstractions that might not be valuable to people who think our current abstractions serve us well.

A great example of this is the fact that we still use the metaphor of files and folders for organizing our source code. The Unison language works directly with an AST that is modified from a scratch file[0]. For people committed to new models of distributed computing, that makes sense; for everyone else, it might be seen as an idea that messes with their current tooling and changes existing and familiar workflows.

I think the really big leaps forward are going to go well beyond this, and they will look like sacrilege to the old guard. New programmers don't care whether a programming language is Turing complete or whether the type system has certain properties; they care only about working software. Existing programmers, though, are dogmatic about these concepts. I think the next leap forward in programming is going to offend the sensibilities of current programmers. Having to break with orthodoxy to get a job done won't worry people who don't know much about programming tradition to begin with.

[0] - https://www.unisonweb.org/docs/tour#-to-the-unison-codebase-...
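To make the content-addressed idea concrete, here's a toy Python sketch (not Unison's actual mechanism, just an illustration of the principle): identify a definition by a hash of its parsed syntax tree, so that formatting and comments don't change its identity.

```python
import ast
import hashlib

def content_hash(source: str) -> str:
    """Hash a definition by the structure of its AST rather than its text,
    so formatting and comments don't change its identity."""
    tree = ast.parse(source)
    # ast.dump gives a canonical textual form of the tree; comments and
    # whitespace are already gone after parsing.
    canonical = ast.dump(tree)
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

a = content_hash("def inc(x):\n    return x + 1\n")
b = content_hash("def inc(x):  # add one\n    return x + 1\n")
print(a == b)  # same structure, same hash
```

Two textually different copies of the same function hash identically, which is the property that lets a codebase keep every version of a definition addressable at once.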

giantDinosaur|5 years ago

Perhaps. Or it'll be like civil engineering or something, where the fundamental principles really stay similar even as technology/theory dramatically improves.

_8ljf|5 years ago

“I think the next leap forward in programming is going to offend the sensibilities of current programmers.”

Honestly, programmers have been railing against progress ever since the first machine coders shook their canes at those ghastly upstart programming languages now tearing up their lawns.

Meanwhile, what often does pass for “progress” amounts to anything but:

https://fermatslibrary.com/s/the-emperors-old-clothes

--

“It’s a curious thing about our industry: not only do we not learn from our mistakes, we also don’t learn from our successes.” – Keith Braithwaite

xorcist|5 years ago

> we still use the metaphor of files and folders for organizing our source code

We don't use the metaphor for storing things. What we use is a hierarchical naming scheme. This makes sense for a number of use cases, and has been independently discovered multiple times through the short history of computing.

You may call the nodes files and folders. That is just a word, though, a metaphor for the underlying data structure, which is the physical reality. You could just as easily call it something else. And many people, whose first language is different from yours, probably do.

zupa-hu|5 years ago

Wow, thanks for sharing Unison, seems super interesting! I've been thinking about content addressed code compilation lately that could allow one to have all versions of a program within a single binary. Apparently there are other benefits to it. Can't wait to learn what they have discovered!

tabtab|5 years ago

Re: "A great example of this is the fact that we still use the metaphor of files and folders for organizing our source code."

I agree 100%! Trees are too limiting. I'm not sure we need entirely new languages to move away from files; we just need more experiments to see what works and what doesn't, and to add those features to existing languages and IDEs where possible. I don't like the idea of throwing EVERYTHING out unless it can't be reworked. (Files may still be an intermediate compile step, just not something developers normally have to be concerned with.)

I believe IDEs could integrate with an existing RDBMS, or with something like Dynamic Relational, which tries to stick to most RDBMS norms rather than throw them all out as NoSQL did, in order to leverage existing knowledge.

Your view of source code would then be controlled by querying (canned and custom): bring all of aspect A together, all of aspect B together, etc. YOU control the (virtual) grouping, not Bill Gates, Bezos, nor your shop's architect.

Most CRUD applications are event driven, and how the events are grouped for editing or team allocation should be dynamically determined and not hard-wired into the file system. Typical event search, grouping, and filter factors include but are not limited to:

   * Area (section, such as reference tables vs. data)
   * Entity or screen group
   * Action type: "list", "search", "edit", etc.
   * Stage: Query, first pass (form), failed validation, render, save, etc.
And "tags" could be used to mark domain-specific concerns. Modern CRUD is becoming a giant soup of event handlers, and we need powerful RDBMS-like features to manage this soup using multiple attributes, both those built into the stack and application-specific attributes/tags.
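A minimal sketch of that kind of virtual grouping in Python (every handler name and attribute here is invented for illustration):

```python
# Toy sketch: event handlers carry queryable attributes instead of
# living at fixed file-system paths.
handlers = [
    {"name": "customer_list",   "entity": "customer", "action": "list", "stage": "query"},
    {"name": "customer_edit",   "entity": "customer", "action": "edit", "stage": "form"},
    {"name": "order_edit_save", "entity": "order",    "action": "edit", "stage": "save"},
]

def query(**criteria):
    """Return handlers matching every given attribute: one 'virtual grouping'."""
    return [h for h in handlers
            if all(h.get(k) == v for k, v in criteria.items())]

# Group by aspect on demand rather than by directory:
print([h["name"] for h in query(action="edit")])
# -> ['customer_edit', 'order_edit_save']
```

The same handlers can be regrouped by entity, stage, or any ad hoc tag with a different query, which is the point: the grouping is computed, not hard-wired into a directory tree.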

goatlover|5 years ago

Then why hasn't this happened over the past 40 years? That's more than one generation of programmers, across a whole lot of change, from mainframes to PCs, the web, mobile devices, and cloud services, with thousands of programming languages and tools invented along the way. Yet mostly it's been incremental progress. PLs today aren't radically different from what they were in the 60s. It's something visionaries like Alan Kay have repeatedly complained about.

waheoo|5 years ago

> Unison language

There goes my Christmas break.

Thanks.

renox|5 years ago

I wonder why they created their own language over creating a tool over Go for example.

PaulDavisThe1st|5 years ago

> A great example of this is the fact that we still use the metaphor of files and folders for organizing our source code.

I think there's something akin to a category error here.

First, let's agree that we do want to organize our source code to some degree. There are chunks of source code (on whatever scale you prefer: libraries, objects, concepts, etc.) that are related to each other more than they are related to other chunks. The implementation of "an object", for example, consists of a set of chunks that are more closely related to each other than they are to any chunk from the implementation of a different object.

So we have some notion of conceptual proximity for source code.

Now combine that with just one thing: scrolling. Sure, sometimes when I'm working on code I want to just jump to the definition of something, and when I want to do that, I really don't care about the underlying organization of the bytes that make up the source code.

But scrolling is important too. Remove the ability to scroll through a group of conceptually proximal code chunks and I think you seriously damage the ability of a programmer to interact in fundamentally useful ways with the code.

So, we want the bytes that represent a group of conceptually proximal code chunks to be scrollable, at least as one option in a set of options about how we might navigate the source code.

Certainly, one could take an AST and "render" some part of it as a scrollable display.

But what's another name for "scrollable bytes"? Yes, you've guessed it: we call it a file.

Now, rendering some subset of the AST would make sense if there were many different ways of putting together a "scroll" (semantically, not implementation). But I would suggest that actually, there are not. I'd be delighted to hear that I'm wrong.

I think there's a solid case for programming tools making it completely trivial to jump around from point to point in the code base, driven by multiple different questions. Doing that well would tend to decouple the programmer's view of the source as "a bunch of files" from whatever the underlying reality is.

But ... I haven't even mentioned build systems yet. Given how computers actually work, the end result of a build is ... a set of files. Any build system's core function is to take some input and generate a set of files (possibly just one, possibly many more). There's no requirement that the input also be a set of files, but for many reasons, it is hellishly convenient that the basic metaphor of "file in / file out" used by so many steps in a build process tends to lead to the inputs to the build process also being files.
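For what it's worth, the "render a subset of the AST as a scrollable view" step is easy to prototype with Python's own ast module (Python 3.9+ for ast.unparse; a sketch, not a real tool):

```python
import ast

# A tiny "code base" held as an AST rather than as files.
source = """
def greet(name):
    return "hello, " + name

def farewell(name):
    return "goodbye, " + name
"""

tree = ast.parse(source)

def render_scroll(tree, names):
    """Render the chosen subset of the AST as plain scrollable text."""
    chunks = [node for node in tree.body
              if isinstance(node, ast.FunctionDef) and node.name in names]
    return "\n\n".join(ast.unparse(c) for c in chunks)

# A "scroll" containing only the conceptually proximal chunks we asked for.
view = render_scroll(tree, {"greet"})
print(view)
```

The interesting question raised above remains: whether there is more than one semantically useful way to assemble such a scroll.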

Forge36|5 years ago

I wonder how much of this comes from how tightly "programming" has been defined as/synonymous with "writing code".

I have two family members who brought up that much of their job was "custom formulas in Excel". They would not call themselves programmers, but they'd learned some basic programming for their job.

I wonder how much "Microsoft Flow Implementer" will become its own job focus with more and more people getting access to Teams.

KMag|5 years ago

Former Limewire developer here. I definitely had an iPod mini prior to Limewire's peak years, as counted by the number of monthly peers reachable via crawling the Gnutella network.

izuchukwu|5 years ago

That’s very interesting to note. I imagine that the popularity of the iPod led a lot of new people to Limewire before the iTunes Store and Spotify took off, pushing its true peak to be a lot later than most (including myself) might recall.

The “Don’t steal music” label on every new iPod might as well have been a Limewire ad.

echelon|5 years ago

What's your take on how everything works these days?

Do you miss p2p?

Do you think we could ever get back to it?

coldtea|5 years ago

I think a better criterion for "peak years" is peak momentum (user increase over time), vs. peak users.

Most technologies reach peak users when they stop growing or when their growth stops accelerating (and then they either stay somewhat stable, e.g. Microsoft, or, like many, start to decline, e.g. BlackBerry).

toyg|5 years ago

I agree with the overall point, but I think you're using the wrong example. Music consumption didn't really change with the iPod; it changed with abundant mobile data, which removed the need for locally stored files (and hence their management). You can argue that the introduction of iTunes changed the game, which it did a bit, but imho mobile data is what fundamentally altered the field. The move really was CD -> MP3 (filesharing/iTunes) -> streaming to mobile.

golergka|5 years ago

When I bought a 40gb music player in 2005, I stopped downloading songs and started downloading discographies of entire artists and labels. The change that came with streaming services wasn't the first one.

miki123211|5 years ago

I think the next paradigm shift in programming is the shift from local to cloud IDEs.

I see a lot of backlash against that idea these days, but it seems inevitable.

I don't think we can predict the full consequences of that, but one I see already is massively lowering friction. If the cloud knows how to run your code anyway, there's no reason why the fork button couldn't immediately spin up a dev environment. No Docker, no hunting for dependencies, just one click, and you have the thing running.

The next generation of programmers (mostly young teenagers at this point) is often using repl.it apparently, and building cool stuff with it. This is definitely promising for this approach, as the old generation will pass away eventually.

BariumBlue|5 years ago

I think I know of an example:

I work with some folks who use Brewlytics (https://brewlytics.com/). It's basically a way to use logical modeling to automate tasks, actions, and pull and push data for said automation. It's parallel to programming - these folks are using iterators, splitting and recombining fields, creating reusable parts out of smaller parts. They basically ARE programming, but almost none of them know anything more about programming than Hello World in Python.

I find the situation absolutely bizarre

hodgesrm|5 years ago

Excel and spreadsheets in general are one of the best examples of a generational leap that expands the programming market to new users.

username90|5 years ago

We don't call people working in Excel programmers, though; they don't even call themselves that. That's the thing: we create a ton of wonderful no/low-code tools, but then, since the programming is no longer the hard part, the jobs they create get different names and the role is no longer "programmer".

flohofwoe|5 years ago

IMHO programming language design is (or at least should be) guided by the underlying hardware. If the hardware dramatically changes, the way this new hardware is programmed will also need to change radically. But as long as the hardware doesn't radically change (and it hasn't, for the last 70 years or so), programming it won't (and shouldn't) radically change either. It's really quite simple (or naive, your pick) :)

vbezhenar|5 years ago

CPUs are progressing toward more cores. Today a program should utilize 100+ cores to fully saturate a modern server CPU (or 32 cores for a consumer CPU).

GPU computations are a thing for many years and their hardware drastically differs from conventional CPUs.

There are neural accelerators in the latest computers. I have no idea what they do, but may be they warrant new programming approaches as well.

scotty79|5 years ago

It is guided by hardware changes and changed over the last 70 years a lot.

GPUs spawned shader languages, and networking hardware spawned HTML and JavaScript.

andrewjl|5 years ago

> That said, we should still be the first to realize that a successful, fundamentally new way to program would target a new generation and idea of software maker, one that won’t look like the modern developer at all.

Channeling Gibson[1], do you see any potential successors already out there?

[1] “The future is already here – it's just not evenly distributed.”

izuchukwu|5 years ago

I’d keep an eye on the no-code movement.

Real leaps can be distinguished from hype by where the passion is coming from. The fact that the movement’s passion is coming from actual, paying users and not just no-code platform makers is key here.

It’s rapidly creating a new generation of software creators that could not create software before, and it’s improving very, very fast.

AnimalMuppet|5 years ago

I notice, though, that your examples are not from programming at all. Your examples are about users of devices. True, programmers use languages, but programming is far more complicated than using a music service.

Something like "no code" may make programming easier... until it doesn't. That is, you get to the point where either you can't do what you need to do, or where it would be easier to do it by just writing the code. If the "no code" approach lets you write significant parts of your program that way, it may still be a net win, but it's not the way we're going to do all of programming in the future.

EricE|5 years ago

"I notice, though, that your examples are not from programming at all. Your examples are about users of devices. "

Just to level-set: as a program manager, when I engage with programmers, it's not because I want to buy programmers.

I want the fruits of their labors.

Let me put it another way - programmers love to bemoan the way users abuse Excel. Users abuse Excel because it meets their needs best, given all other factors in their environments.

If no-code environments progress to where they can provide, at a minimum, the level of functionality Excel does for many tasks, then they will take off. No, it won't be "all of programming", but enough to be a paradigm shift?

You betcha.

izuchukwu|5 years ago

Generational leaps emerge in the same ways everywhere.

For any space, if you provide a large enough net win for a large enough number of people, you introduce a generational leap. Very often, those people are completely new to the space.

The measure here isn’t how many growing companies that started with no-code adopt code as they grow. The measure here is how many growing companies that started with no-code wouldn’t have been started otherwise.

Siira|5 years ago

This claim rests on flimsy metaphors alone. Of course, tautologically, each new wave of tech has some differences in demographics, old people are slow to learn new paradigms, and generations differ. BUT ultimately we have no idea if or when there will be a new wave, or how different its users will be. It might very well happen that the demographics don't change much, as the programming profession is already one of the most fragmented and eclectic, and attracts people whose primary virtue is manipulating logical abstractions.

grumple|5 years ago

I think you are taking the weakest possible extrapolation of the article's position and attacking that.

This article is about changing techs for an existing product. And the author is correct; tech changes are very costly for existing products. You have to weigh the cost of the rewrite. Swapping out your markdown parsing library is probably relatively low-cost. Swapping out your web framework is potentially years of work for no practical gain.

Most of us aren't working on new things. Day 2 of a company's existence, you already have legacy code and have to deal with things that were built before.

oblio|5 years ago

Science advances, one burial at a time :-)

higerordermap|5 years ago

I think no-code was not the right idea.

Removing barriers to entry comes with its own problems. Today we see that the horrible, error-prone Excel sheets created by non-programmers weren't a great idea at all. Similarly, many web developers don't understand performance, and we end up with bloated sites / Electron apps.

I think a lot of progress will be incremental. Seemingly "revolutionary" ideas like Light Table break on modestly real-world stuff. Functional programming is elegant and all, unless you hit a part of the problem that's fundamentally imperative or there's a performance problem. I think programming progress will be incremental, as the industry continues to mature.

carlmr|5 years ago

>Functional programming is elegant and all unless you hit a part of the problem that's fundamentally imperative or there's a performance problem.

Most functional languages allow you to do imperative stuff, so this is not an issue. They just usually provide an environment where the defaults guide you toward a functional style (immutable by default, option/result types instead of exceptions, easy partial application of functions and piping, etc.).

A prime example would be F#. You can program pretty much the same as in C# if you need to, but there are a lot of facilities for programming in a more functional style.
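The same default-functional / escape-to-imperative pattern can be sketched in Python (standing in here for F#, since the idea is language-agnostic):

```python
from functools import reduce

# Functional by default: a pipeline of pure transformations,
# no mutation anywhere.
def total_functional(xs):
    return reduce(lambda acc, x: acc + x,
                  (x * x for x in xs if x % 2 == 0), 0)

# The imperative escape hatch: same result with a mutable accumulator,
# sometimes clearer or faster for a hot loop.
def total_imperative(xs):
    acc = 0
    for x in xs:
        if x % 2 == 0:
            acc += x * x
    return acc

print(total_functional(range(10)) == total_imperative(range(10)))  # True
```

Both compute the same sum of squared evens; nothing forces you into one style, the language just makes the functional one the path of least resistance.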