item 37023382

multicast | 2 years ago

The need to adapt it for the average user is mentioned. The average person uses a PC (if at all; good luck with this on mobile) mostly for work (MS Office plus some ERP) and in some cases for private purposes (e.g. news, e-banking, mail, important administrative work). Go a bit deeper and maybe you find Reddit and video games. An average user would never want to link stuff around the web with a hundred arrows and multiple colors. He simply does not care.

The author and the old guy in the video he linked to behave almost cult-like, especially the old guy: he literally claims that this IS the best method ever for working with documents, that the WWW is a fork of his idea based on a version "dumbed down in the 70s at Brown University", that he does not understand why it has not already taken off, and that it is the most important feature for the human race. Really?

If people in spaces like journalism and academic research really saw potential in this, there would already be big programs out there.

Yes, it is good to have passion for something, and yes, it is good if someone with a real need for this is presented with a solution, but this will never go mainstream. In my opinion, not even in the segment of technically skilled people like engineers.

This is the typical invention that fits the "I know it is the best thing, I love it and almost pressure people to use it, but it has not taken off in the slightest for decades" case.


patterns | 2 years ago

The old guy in the video is Ted Nelson, the man who coined the term hypertext, made significant contributions to computer science, inspired two generations of researchers and continues to inspire as his works are being rediscovered.

There have been "big programs" but when the web came, fundamental hypertext research and development on other systems came to a grinding halt. Ted Nelson, and many other researchers, predicted many of the problems that we now face with the Web, notably broken links, copyright and payment as well as usability/user interface issues.

I don't know what an average user is, but what a user typically does or wants to do with a computer is somewhat (pre)determined by its design. Computer systems have, for better or worse, a strong influence on what we consider practical, what we think we need and even what we consider possible. (Programming languages have a similar effect.)

One of the key points of Ted Nelson's research is that much of the writing process is re-arranging, or recombining, individual pieces (text, images, ...) into a bigger whole. In some sense, hypertext provides support for fine-grained modularized writing. It provides mechanisms and structures for combination and recombination. But this requires a "common" hypertext structure that can be easily and conveniently viewed, manipulated and "shared" between applications. Because this form of editing is so fundamental, it should be part of an operating system and an easily accessible "affordance".
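To make the recombination idea concrete, here is a minimal sketch (all names hypothetical, not any real system's API) of composing documents by reference into a shared pool of pieces, so rearranging or reusing content means reordering references rather than copying text:

```python
# Hypothetical sketch of reference-based ("transclusion-style") composition:
# a document is an ordered list of references into a shared pool of pieces.

pieces = {
    "intro":  "Hypertext lets us recombine pieces of writing.",
    "detail": "Each piece lives once and can appear in many documents.",
    "note":   "Editing a piece updates every document that includes it.",
}

def render(doc, pool):
    """Resolve a document's references against the shared piece pool."""
    return "\n".join(pool[ref] for ref in doc)

full = ["intro", "detail", "note"]   # one arrangement of the pieces
summary = ["intro", "note"]          # a shorter view, made without copying

print(render(full, pieces))

# Editing the shared piece is reflected in every arrangement that uses it.
pieces["note"] = "A piece edited once changes everywhere it is included."
print(render(summary, pieces))
```

Under this model, "making a shorter version of a page with some details left out" is just another list of references, which is the kind of fine-grained manipulation the Web's page-level model does not offer.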

The Web is not designed for fine-grained editing and rearranging/recombining of content; it started as a compromise to get work done at CERN. For example, following a link is very easy and almost instantaneous, but creating a link is a whole different story, let alone making a collection of related web pages tied to specific inquiries, or even making a shorter version of a page with some details left out or augmented. Hypertext goes far deeper than this.

Although a bit dated, I recommend reading Ted Nelson's seminal ACM publication, in which he touches on many issues concerning writing: how we can manage different versions and combinations of a text body (or a series of documents), what the problems are and how they can be technically addressed.

[1] "Complex information processing: a file structure for the complex, the changing and the indeterminate" https://dl.acm.org/doi/pdf/10.1145/800197.806036

majormajor | 2 years ago

> One of the key points of Ted Nelson's research is that much of the writing process is re-arranging, or recombining, individual pieces (text, images, ...) into a bigger whole. In some sense, hypertext provides support for fine-grained modularized writing. It provides mechanisms and structures for combination and recombination. But this requires a "common" hypertext structure that can be easily and conveniently viewed, manipulated and "shared" between applications. Because this form of editing is so fundamental, it should be part of an operating system and an easily accessible "affordance".

Here's where I'm stuck:

Hypertext - whether on the web or just on a local machine - can't solve the UX problem of this on its own, though. People can re-arrange contents in a hypertext doc, recombine pieces of it... but mostly through the same cut-and-paste way they'd do it in Microsoft Word 95.

The web adds an abstraction of "cut and paste just the link or tag that points to an external resource to embed, instead of making a fresh copy of the whole thing", but all that does is add in those new problems of stale links, etc.

So compared to a single-player Word doc, or even an "always copy by value" shared-Google-Doc world that reduces the problem of dead external embeds, what does hypertext give me to make rearranging things easier? Collapsible tags? But in a GUI editor the ability to select and move individual nodes can be implemented regardless of the backend file format anyway.

TLDR: I haven't seen a compelling-to-me-in-2023 demo of this system doing things that Google Docs can't do today while avoiding link-rot problems and the like, so I'm not convinced the issue is on the document-format side rather than the user-tools/interface side.