item 23567744

The Return of the 90s Web

786 points | mxbck | 5 years ago | mxb.dev

338 comments

[+] waltbosz|5 years ago|reply
What I miss most from the early days of the Internet is the content. It was all created with love.

My theory is that the high barrier to entry of online publishing kept all but the most determined people from creating content. As a result, the little content that was out there was usually good.

With today's monetized blogs, it is often content for content's sake. People don't try, or they write about topics they're not really interested in, just to have a new post. Or the writing is simply bad.

Maybe today's problem isn't the blogs, but the SEO that puts the crap blogs at the top of the search results. Or maybe I'm misremembering and the old content was crap too, or maybe my standards are higher than they were in my teenage years.

[+] kickscondor|5 years ago|reply
People are still creating great stuff along these lines - you just won't find it through Google or Facebook or most of Reddit. Complex, interesting hypertext creations and web sites are still everywhere. But try typing "interesting hypertext" into Google or Facebook and see where it gets you. You can't search for something that's off the beaten track.

This is where directories come back in. Check some of these out:

* https://marijnflorence.neocities.org/linkroll/

* https://neonaut.neocities.org/directory/

* https://webring.xxiivv.com/ (which led me to this gem: https://dreamwiki.sixey.es/)

Competing with Google in search has become an insurmountable task. Personal directories attack from the opposite direction (human curation, no algorithm) in a way that actually puts Google far behind. It's kind of exciting and unexpected.

[+] mtgp1000|5 years ago|reply
I think the reduced barrier to entry you're describing has lowered the standard of all popular media.

It used to be expensive to publish anything - especially the further back in time you go. So the classics, for example, typically represent particularly bright writers: before the printing press, getting something published and widely disseminated was simply unlikely to happen.

But today anyone can create an account on YouTube or stream on Twitch, and it doesn't matter whether the content has any particular quality or veracity, so long as the common man sees what he wants to see.

I think there's a major secondary effect, in that now that we are surrounded by low quality media, the average person's ability to recognize merit in general is lessened.

[+] krapp|5 years ago|reply
>My theory is that the high barrier to entry of online publishing kept all but the most determined people from creating content. As a result, the little content that was out there was usually good.

The barrier wasn't that high. Making a site on Geocities, Tripod or Angelfire wasn't that difficult. Writing 90's style HTML wasn't exactly writing a kernel in C, and most of those services had WYSIWYG editors and templates anyway. Few of the people publishing to the web in the 90s were programmers, so the technical knowledge required was minimal.

And plenty of people are publishing high quality content on the modern web, even on blogs and centralized platforms. I follow writers, scientists and game developers on Twitter, watch a lot of good content on Youtube, read a lot of interesting conversations on Reddit. The fact that people publishing content nowadays don't have to write an entire website from scratch has little to do with their personal passions (or lack thereof), whether they're interesting or (and ye gods how I've come to hate this) "quirky." That's like saying writers can't write anything worth reading unless they also understand mechanical typesetting.

As far as the old content goes, of course most of it was crap. Sturgeon's Law applies to every creative medium. Most blogs were uninteresting; many personal sites were just boring pages full of links, or stuff no one but the author and maybe their few friends cared about. Between the old and new web, a bias for the past (as HN tends to have) leads people to remember only the best of the former and compare it against the worst of the latter.

[+] pwdisswordfish2|5 years ago|reply
We are asked to believe that advertising is required for there to be any content on the Web.

However, comments like this seem to be proof that this is not true.

I have personal memories of what the Web was like in 1993, but with so many people today feeding off the advertiser bosom, what are the chances anyone will listen? No one wants to hear about what the network was like before it was sold off as ad space. Young people are told "there is no other way. We must have a funding model." Even this article rambles about "the problem of monetization". No ads and "poof", the Web will disappear. Yeah, right. More like the BS jobs will go away. This network started off non-commercial.

There was plenty of high quality content on the 90's Web. Even more on the 90's Internet. That is because there were plenty of high quality people using it, well before there was a single ad. Academics, military, etc. It all faded into obscurity so fast. Overshadowed by crap. The Web has become the intellectual equivalent of a billboard or newspaper. The gateway to today's Web is through an online advertising services company. They will do whatever they have to do in order to protect their gatekeeper position.

[+] duxup|5 years ago|reply
I remember visiting big corporate websites and there was always a little corner for the 'webmaster' often with a photo of the server the site was running on... or a cat... or something like that.

Geocities was a beautiful mess as ... it was just folks trying to figure out HTML and post silly stuff, but it was genuine.

[+] netcan|5 years ago|reply
My favourite spot was a little later, early & mid 2000s. The barriers had been lowered, but publishing still required some motivation. There was a ton of content to discover, a lot of discovery channels and "information wants to be free" still felt like a prevailing wind.

I think part of the reason was, as you say, lower standards. We were being exposed to content that didn't have an outlet before that. The music was new, black, polished chrome... to borrow a Jim Morrison line.

A bigger part is discovery though. Blogrolls & link pages were a thing. One good blog usually led you to 3 or 4 others.

These days, most content is pushed, often by recommendation engines. Social media content is dominated by quick reaction posts, encouraged by "optimization."

The medium is the message. In 98, the medium was html pages FTPed to some shoddy shared host to be read by geeks with PCs. In 2003, it was blog posts. In 2020 it's facebook & twitter.

[+] nicbou|5 years ago|reply
I think we have far more quality content than ever before. YouTube is a goldmine of high quality content, and so are the other publishing platforms.

The signal to noise ratio might have gotten worse, and discovery might be flawed, but the absolute quantity of quality content has never been higher.

[+] julianeon|5 years ago|reply
I had just started college and I remember going to the computer lab and clicking around for hours at a time, at night. Just going from blog to blog, reading interesting stuff. You didn't have to have a particular goal in mind - one blog would lead to another interesting blog would lead to another one, endlessly. They would all be engagingly written, to a high standard of quality.

Like you, I know things have changed, but I still can't imagine I could do that today, going from blog to blog, without running low on material within ~60 minutes.

EDIT: I see the webring links here now, I may try them.

[+] moksly|5 years ago|reply
People are still creating massive amounts of cool content, it’s just really hard to find. I play blood bowl, I care a lot about the statistics, and I’ve searched a lot on the topic over the past few years.

The best result I could find concerning some official data from FUMBBL (a place you can play blood bowl) was a blog entry from 2013.[1] My circle of friends and the different leagues I play in have been using that as reference for years. We’ve searched and searched to find the data source to no avail.

The other day I was randomly site: searching for something completely unrelated and found a source for live FUMBBL data[2]. You'd think it would be the first result related to blood bowl statistics on any search engine, as it's really the best damn source I've ever seen, but it's not.

I know you were probably referring to something a little more interest based. Well, I once sat next to a retired biology professor at a wedding, and it turned out he ran an interest site detailing all the plants specific to the Danish island of Bornholm. I don't care much about plants, but it was exactly a 90s-style page. Unfortunately I didn't save the link (I don't care much about plants), and I haven't been able to find it since, despite searching for his name.

So I think it’s still there, it’s just not easy to find it.

[1] http://ziggyny.blogspot.com/2013/04/fumbbl-high-tv-blackbox-...

[2] http://fumbbldata.azurewebsites.net/stats.html

[+] vb6sp6|5 years ago|reply
> My theory is that the high barrier to entry

I have the same feelings about social media. It used to be that you only had to listen to your stupid uncle at Thanksgiving. Now he constantly spews his garbage on Facebook.

[+] fit2rule|5 years ago|reply
>monetized blogs

I believe that we have the 'web' we have today because big decisions were made about how little control the end-user (i.e., the consumer) should have over content made by producers, and because the #1 priority for all technology involved in the web has been to separate producer from consumer as stringently as possible.

If we had the ability to safely and easily share a file that we create on our own local computer, using our own local computer, with any other computer in the world - we would have a nice balancing act of user-created content and world-wide consumption.

Instead, we have walled gardens, and the very first part of the wall is the operating system running on the user's computer - it is being twisted and contorted in ways that make it absolutely impossible for the average user (i.e. the computer's owner) to easily share information.

Instead, we have web clients and servers, and endless, endless 'services' that are all solving the same thing for their customers: organising documents in a way people can read them. And all the other things.

And it's all so commercial, because there is a huge gate in the way, and it is the OS vendors. They are intentionally stratifying the market by making the barrier to entry - i.e. one's own computing device - untenable for the purpose.

Imagine a universe where the OS vendors didn't just give up to the web hackers in the early days, and instead of building advertising platforms, pushed their OSes to allow anyone, anywhere, to serve their documents to other people, easily, directly from their own systems. I.e. we didn't have a client/server age, but leaped immediately to peer-to-peer, because in this alternative universe there were managers at places like Microsoft who could keep the old guard and the young punks from battling each other .. which is how we got this mess, incidentally.

There really isn't any reason why we all have to meet at a single website and give our content away. We could each be sharing our own data directly from our own devices, if the OS were being designed in a way to allow it. We have the ability to make this happen - it has been intentionally thwarted in order to create crops.

Give me a way to turn my own computer, whether it is a laptop or a server or my phone, into a safe and easy to use communications platform, and we'll get that content, created with love, back again.

It's sort of happening, with things like IPFS, but you do have to go looking for the good stuff .. just like the good ol' days ..

[+] inimino|5 years ago|reply
My only correction is that there was a lot of content out there! We didn't call it that, of course, because we're people and not corporations, so we just called it articles, blogs, rants and musings. A lot of it is still out there and a lot more is on the wayback machine!
[+] MaxBarraclough|5 years ago|reply
> content for content's sake

I think there's some truth to this. Some junior developers make it a goal to be seen as a respected blogger, so they feel the need to write something, even if they have nothing to say.

[+] izietto|5 years ago|reply
My theory is that the more companies you throw in the less humanity you can find
[+] foolmeonce|5 years ago|reply
I think the problem is that hiring across the entire "intellectual worker" market is a market for lemons, which specifically ruins anything of quality.

Some blogs were great (i.e. created to solve the problem of having too much interest in what one is up to to answer even internal questions individually) and signaled a few great minds who could be hired at a discount. Those engineers' blogs also signaled pretty good co-workers in stage 1; by stage ~3, managers tell their underperforming direct reports to blog whatever they understand about what their group is doing, in the hope that they either improve or, better yet, become a burden somewhere else.

[+] rawoke083600|5 years ago|reply
I think you are right about the "high barrier" being a filter for good work, yes!

Also, I miss the wild and wonderful designs and color schemes of the 90s :) Long before Bootstrap or "material design"!

[+] TheMightyLlama|5 years ago|reply
The whole idea of SEO to get clicks for your advertising-based revenue model feels "bad" to me. Controversial content gets created because it puts the most eyeballs on the page. The side effect is that we veer towards a broken society the moment we go down that route.

I had an idea of a search engine that allowed you to permanently remove domains or pages with certain keywords as a paid service.

[+] tikiman163|5 years ago|reply
I think you have good points, but I would add that the high barrier to entry also prevented copycats. A simple messenger was a major accomplishment, so when something worked there weren't 10,000 copies of it by the end of the week; nearly all content you found was actually the original, not some repost or copy-paste job.
[+] eloisant|5 years ago|reply
I feel like you can find better quality today. Sure you have to dig a little more.

But at the time it was so magical compared to pre-web where the only content you could find was professionally published magazines or books, and suddenly you had all this niche content about stuff that wasn't worth publishing, all in a few clicks.

[+] JohnBooty|5 years ago|reply
I can't wait for server-side rendering to take its place in the sun again.

There are many use cases for which a client-side framework like React is essential.

But I feel the vast majority of use cases on the web would be better off with server-side rendering.

And...

There are issues of ethics here.

You are kidding yourself to an extent when you say that you are building a "client-side web app." It is essentially an application targeted at Google's application platform, Chromium. Sure, React (or whatever) runs on FF and Safari too. For now. Maybe not always. They are already second-class citizens on the web. They will probably be second-class citizens of your client-side app unless your team has the resources to devote equal time and resources to non-Chromium browsers. Unless you work in a large shop, you probably don't.

Server-side rendering is not always the right choice, but I also do see it as a hedge against Google's, well, hegemony.

[+] purerandomness|5 years ago|reply
I recently watched the "Helvetica" documentary that was posted here a few days ago [0], where they briefly mention "Grunge Typography" [1], a seemingly dead-end branch of typography that, for some strange reason, became pretty popular for a short period of time.

After some years, however, consensus formed amongst designers that what they had created was a pile of illegible garbage, and that there was no way forward but to dismiss that branch completely, go back to the roots, and evolve from a few steps back.

I feel the same kind of consensus is slowly forming around ideas like SPAs, client-side rendering and things like CSS-in-JS.

We saw the same happen with NoSQL and many other ideas before that.

We recently deployed an entire SaaS using only server-side rendering and htmx [2] to give it an SPA-like feel and immediate interactivity where needed. It was a pleasure to develop, it's snappy, and we could actually rely on the browser to do the heavy lifting for things like history and middle-click without breaking stuff. I highly recommend it and see myself using this approach in many upcoming projects.

[0] https://www.hustwit.com/helvetica/

[1] https://www.theawl.com/2012/08/the-rise-and-fall-of-grunge-t...

[2] https://htmx.org/ (formerly "Intercooler")

[+] simias|5 years ago|reply
I'm not a web developer but my girlfriend needed a website to show her photography work so I decided to make it for her.

It's the simplest thing in the world, basically just three columns with photo thumbnails and the only javascript is some simple image viewer to display the images full screen when you click the thumbnails.

It's really, really basic, but I was impressed with the feedback I received: many people commented on how slick and fast it was. I went looking at professional photographers' websites and, indeed, what a huge mess most of them are. Incredibly heavy frameworks for very basic functionality, splash screens to hide the loading times, etc... It's the Electron-app syndrome: it's simpler to do it that way, so who cares if it's like 5 orders of magnitude less efficient than it should be? Just download more RAM and bandwidth.

Mine is a bunch of m4 macros used to preprocess static HTML files, plus a shell script that generates the thumbnails with ImageMagick. I wonder if I could launch the next fad in the webdev community. What, you still use React? That's so 2019. Try m4 instead, it's web-scale!

[+] pgm8705|5 years ago|reply
I'm glad this is the case. I've been a Rails developer for close to 10 years now, but 3 or 4 years back I got sucked into the React world. I bought right in and my company quickly adopted the "React on Rails" pattern. Looking back, it was one of the worst professional decisions I've made in my career. Now we're back to server side rendering and StimulusJS on the front-end when needed. Productivity is way up, and developer happiness is way up. With new tools like https://docs.stimulusreflex.com and https://cableready.stimulusreflex.com I'm very excited about what can be accomplished with minimal JS.

(Note: I still think React is an awesome library! I'm sure there are devs that are super productive with it too. It just wasn't the best fit for me and my company)

[+] wlll|5 years ago|reply
A company I contract to for backend and server stuff made the jump from static HTML to client-side rendering with React. They did it because the consulting company they went to recommended it "because it was the future", and, I am sure, in no small part because that was what the consulting company specialised in.

It was the worst decision they have ever made. The site they ended up with was incredibly slow, and given the relatively few pages on the site, you never really make up for that initial load with time saved later.

It's also incredibly hard to write well, requires a special third party service to show anything in Google and is incredibly hard to manage.

They don't realise this of course, and are now attempting to solve the management and initial load issues by splitting the app up into three distinct apps. It won't help.

[+] kingdomcome50|5 years ago|reply
I think the problem here is less about choosing to utilize a client-side rendering implementation and more about choosing to adopt the “SPA” paradigm - where everything is smashed together in a single application bundle.

React + React DOM is something like 35kb gzipped. That's not nothing (don't forget caching), and pushing the initial render to the client (though not strictly necessary) does incur a bit of a penalty, but I think the benefits outweigh the drawbacks in many more use cases than people give it credit for.

The real problem is two-fold: The first, as I stated above, is wrapping the entire application in a client-side implementation. As many people are pointing out this is often unnecessary. You don’t need to go full “SPA” in order to benefit from the vdom.

The second (related) reason is when developers just start adding 3rd party dependencies without considering their impact (or if they are necessary). React is a library, and for the features you get it’s really not that big. If that’s all you are using to add that extra sparkle to some of your pages I firmly believe you are getting the absolute most “bang for your buck”.

[+] kickscondor|5 years ago|reply
I really like the turbolinks approach - you simply write HTML and then include the script in your head tags. However, I'm still hooked on Markdown. So I am still prerendering HTML - and then doing the routing with Hyperapp. (See https://href.cool/Tapes/Africa for an example - you get a prerendered static HTML page, but it uses JavaScript from there to render the other pages.)
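For what it's worth, the fetch-and-swap idea behind that approach can be sketched in a few lines of plain JavaScript - this is an illustration of the technique, not how turbolinks or Hyperapp actually implement it:

```javascript
// A rough sketch of the turbolinks idea: intercept same-origin link
// clicks, fetch the target page, and swap its body in without a full
// reload. shouldIntercept holds the policy so it's easy to follow.
function shouldIntercept(href, currentOrigin) {
  try {
    const url = new URL(href, currentOrigin);
    // Only take over plain same-origin http(s) navigations.
    return url.origin === currentOrigin && /^https?:$/.test(url.protocol);
  } catch {
    return false; // unparsable href: let the browser deal with it
  }
}

// Browser wiring (skipped outside a DOM environment):
if (typeof document !== 'undefined') {
  document.addEventListener('click', async (event) => {
    const link = event.target.closest('a[href]');
    if (!link || !shouldIntercept(link.href, location.origin)) return;
    event.preventDefault();
    const html = await (await fetch(link.href)).text();
    const next = new DOMParser().parseFromString(html, 'text/html');
    document.body.replaceWith(next.body);  // swap the page content
    history.pushState({}, '', link.href);  // keep the address bar honest
  });
}
```

The real turbolinks also handles caching, scroll restoration and the back button (popstate), which this sketch omits.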

The ultimate approach is Beaker Browser though. You can actually just write your whole site in Markdown (/index.md, /posts/batman-review.md, /posts/covid-resources.md) and then write a nice wrapper for them at /.ui/ui.html. This means you can edit posts with the built-in editor - and people can 'view source' to see your original Markdown! It's like going beyond the 90s on an alternate timeline.

(A sample of this is this wiki: hyper://1c6d8c9e2bca71b63f5219d668b0886e4ee2814a818ad1ea179632f419ed29c4/. Hit the 'Editor' button to see the Markdown source.)

[+] dredmorbius|5 years ago|reply
Schopenhauer's 19th century essay "On Authorship" has been a personal fave since discovering it last year:

Writing for money and reservation of copyright are, at bottom, the ruin of literature. No one writes anything that is worth writing, unless he writes entirely for the sake of his subject. What an inestimable boon it would be, if in every branch of literature there were only a few books, but those excellent! This can never happen, as long as money is to be made by writing. It seems as though the money lay under a curse; for every author degenerates as soon as he begins to put pen to paper in any way for the sake of gain. The best works of the greatest men all come from the time when they had to write for nothing or for very little....

https://www.gutenberg.org/files/10714/10714-h/10714-h.htm#li...

Brain Pickings articulates my reasons well, though really, just read the source:

https://www.brainpickings.org/2014/01/13/schopenhauer-on-aut...

[+] secondcoming|5 years ago|reply
My iPad is 10 years old and there are websites that bring it to its knees, especially mobile.twitter.com links. I don't click them anymore, it's too frustrating. Maybe web devs should be given low-end machines to work on so they can experience what their non-desktop users experience. The whole 'mobile web' distinction really shouldn't need to exist, my iPad isn't a mobile phone from 2005.
[+] stickfigure|5 years ago|reply
This seems to be one developer's wishful thinking, without any evidence presented to back up the assertion. Pointing out "hey, here's a couple websites that do server side rendering" does not a trend make.

We're ripping out webflow, if anecdata counts for anything (it doesn't). Webflow occupies the barren middle ground of "too complicated for marketing people, too simple for technical people". I find it much easier to write html than to figure out how to get their UI to make the html I want.

[+] petepete|5 years ago|reply
GOV.UK is a good example of a mainstream site that's built in the traditional manner. It's actually a collection of hundreds (thousands) of separate services, the vast majority of which are rendered on the server and use JS only where necessary.

As there's no advertising or images on most pages they tend to be incredibly fast too.

[+] djohnston|5 years ago|reply
I just hit the same wall with bubble.. it definitely has some impressive attributes but sometimes I just need to see the code. I really hate clicking around looking for descendent nodes in a UI vs just looking at html template
[+] yagodragon|5 years ago|reply
I really hope personal blogging becomes popular again! Speaking of which, I still haven't found a really good alternative to the "horrible" WordPress for blogging. It has:

- Integrated API, RSS

- Tons of plugins

- Accessibility, translations

- Easy and powerful editor (Gutenberg)

- Comments sections and forms w/ complete ownership and moderation

- Easy data imports from multiple platforms.

- Users and roles

- 100% open source w/ GPL. You own your data

- Extremely easy and cheap to host and move around.

I love modern tooling and git-based workflows for all my projects, but my "static" 11ty/Gatsby.js blog doesn't provide all these features out of the box. Instead of writing, you end up reimplementing basic CMS features.

[+] pjmlp|5 years ago|reply
It never went away. For those of us old-fashioned devs on Java and .NET stacks, it has been our bread and butter for the last 20 years of web stacks: SSR with some JavaScript on top.

I guess what is happening is the newer generations re-discovering that it actually makes sense to generate static content once, instead of redoing it on every client device across the globe.

[+] sbussard|5 years ago|reply
Social networks are failing us and we want independence and community. The web used to be that, then it turned into a gated gossip community.
[+] badsectoracula|5 years ago|reply
> Frontpage and Dreamweaver were big in the 90s because of their “What You See Is What You Get” interface. People could set up a website without any coding skills, just by dragging boxes and typing text in them.
>
> Of course they soon found that there was still source code underneath, you just didn’t see it. And most of the time, that source code was a big heap of auto-generated garbage - it ultimately failed to keep up with the requirements of the modern web.

If you do not see the source code, it doesn't matter whether it is garbage or whether it follows any "modern web requirements" - all that matters is whether it does what you expect it to do. Besides, it is a bit hypocritical nowadays to complain about the code underneath a WYSIWYG tool when many web developers use transpilers that target CSS and JavaScript, and pretty much all sites rely on dynamically generated and altered HTML, so the final output makes no more sense than something Frontpage or Dreamweaver would generate.

Sadly, the closest thing i could find nowadays to a WYSIWYG site editor is Publii[0]. It suffers greatly from the 'developer has a huge screen so they assume everyone has a huge screen' syndrome, and i really dislike pretty much all of the themes available for it (everything is too oversized). And it is an Electron app, because of course it is, despite not needing to be one (it doesn't offer full WYSIWYG functionality, only for the article editor, which isn't any more advanced than Windows 95's WordPad, and it relies on an external browser to show you the final site). But it does the job (i tried it for a new attempt at a dev blog of mine[1]), even if i dislike how oversized everything is.

[0] https://getpublii.com/

[1] http://runtimeterror.com/devlog/

[+] bradgessler|5 years ago|reply
I’d love to use a search engine that simply didn’t index websites with moderate or excessive amounts of JavaScript, images, and video.

You wouldn’t need AMP because it would load quickly, ads would be minimal, and the text content would probably be forced to be higher quality because it would have to stand on its own.

Does such a thing exist?

[+] _xymm|5 years ago|reply
It'd be so easy for Google (or others) to add that capability in their search, using the pagespeed insights they're now using to rank. Like, "squirrel -amp cumulative_layout_shift:0 total_blocking_time:0"
[+] hedora|5 years ago|reply
Even better. Blacklist all sites with ads. I’m guessing google will never implement that.
[+] codr7|5 years ago|reply
I've been playing around with this for several years now; building more or less elaborate frameworks for server side rendering and dividing the interface into separate pages.

I blame Seaside [0] for corrupting me. Never used it to build anything, but once the idea of building the user interface on the server was in my head there was no way back.

Though I have to admit I still find JSON really convenient for submissions, compared to using form fields for everything, as it allows massaging the data on the way.
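A minimal sketch of the kind of massaging that's awkward with flat form fields - the dotted field names and coercion rules here are made up for illustration:

```javascript
// Turn flat form entries into a nested JSON payload before submitting.
// Hypothetical convention: names like "address.city" become nested
// objects, and all-digit strings are coerced to numbers - neither of
// which a plain form-encoded submission can express.
function massageFields(entries) {
  const payload = {};
  for (const [name, raw] of entries) {
    const value = /^\d+$/.test(raw) ? Number(raw) : raw;
    const keys = name.split('.');
    let node = payload;
    for (let i = 0; i < keys.length - 1; i++) {
      node = node[keys[i]] ??= {}; // create intermediate objects as needed
    }
    node[keys[keys.length - 1]] = value;
  }
  return payload;
}

// In a browser this would be fed from a FormData object, e.g.:
//   const body = JSON.stringify(massageFields(new FormData(form).entries()));
//   fetch('/submit', { method: 'POST',
//                      headers: { 'Content-Type': 'application/json' }, body });
```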

Besides that I've found the approach to be a total success. Pages load instantly, bookmarks and back buttons work as expected and most of the application stays on the server.

[0] http://www.seaside.st/

[+] Animats|5 years ago|reply
Webflow is touted as "the new Dreamweaver". Of course, it's "software as a service", about 3x as expensive as basic web hosting.
[+] INTPenis|5 years ago|reply
Another point to this, I equate the modern fediverse with all the old message boards.

Message boards are still around but it used to be an integral part of web culture. They essentially took over from dial up BBS.

But now community boards have moved to cloud services like Discord. The self-hosted boards are still around in the shape of federated ActivityPub instances.

It makes a lot more sense than hosting an isolated island of PunBB or vBulletin.

I just hope more communities host their own small localized ActivityPub instance, using AP relays to create a vibrant fediverse.

[+] yedava|5 years ago|reply
For a lot of internal corporate web applications, server-side rendering is what makes the most sense. These applications are always used from browsers on a company-provided laptop. You don't need to worry about multiple frontends or "web scale".

Back in the day, internal apps used to be shitty to use and had slow service-layer code. But once the data got to the view layer, at least the pages rendered fast. Now, with the proliferation of SPAs, we have shitty user experience, slow backends and slow UI renders.

[+] usrusr|5 years ago|reply
Preloading on button-down - a nice detail optimization. It's possible you'll have to abort that request, but that will be the rare exception. It's my favorite thing I learned today.
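A rough sketch of that trick - the helper split and names are mine, not from any particular library:

```javascript
// Preload on button-down: the gap between mousedown and the click
// event firing is a free head start on fetching the next page. The
// AbortController covers the rare case where the request must be
// cancelled, e.g. the pointer leaves the link before mouseup.
function makePreloader(fetchFn) {
  let controller = null;
  return {
    start(url) {
      controller = new AbortController();
      return fetchFn(url, { signal: controller.signal });
    },
    abort() {
      if (controller) controller.abort();
      controller = null;
    },
  };
}

// Browser wiring (skipped outside a DOM environment):
if (typeof document !== 'undefined') {
  const preloader = makePreloader((url, opts) => fetch(url, opts));
  document.addEventListener('mousedown', (event) => {
    const link = event.target.closest('a[href]');
    if (link) preloader.start(link.href).catch(() => {}); // warm the HTTP cache
  });
  // Wire preloader.abort() to whatever counts as an abandoned click,
  // e.g. the pointer leaving the link before mouseup.
}
```

Note the warmed response only helps on the real navigation if the page is served with cacheable headers; otherwise the head start is wasted.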