Danny O'Brien, who runs one of the oldest surviving blogs, Oblomovka, coined the term "hinternet" sometime in 2007. That was when the internet was still being run by the technological elite, for themselves, but normal people had also joined. The idea of the hinternet was that there were essentially two internets: one the sophisticated technology and value add, the other the internet of Viagra pills and popup banners. We, the technological elite, would rarely venture into the hinternet, like going into a bad neighborhood, whereas normal people had no such mechanism for discernment, so their experience of the internet was distinctly different and inferior.
Now most of the internet is hinternet, and we're all forced more and more to rely on it. Banking systems, mortgage platforms, car payments, and utility payments are generally designed mobile first, desktop later; they employ various dark techniques for "verifying a real user", which break on open platforms, forcing you to access them from iPads and other such locked-down devices, or not at all. If the hinternet used to be the dark, shady streets where hucksters peddled you knockoff watches, now it is the dystopian landscape of vertical information integration, run, behind the scenes, by para-governmental institutions. You can't log in to the IRS without using ID.me, a digital wallet and identity management platform that sells you things.
There are attempts to cultivate little gardens of sophistication, but they are of mixed success. On a personal level there's a strong disincentive to participate in the hinternet beyond the mandatory, carefully navigating poorly designed and poorly conceived systems just long enough to achieve an objective. One has to log in to the IRS, but one doesn't really need to read that popup-ridden, upsell-blocked, mobile-centric news article.
From this perspective "mobile-first web design" is a symptom removed from its greater context.
This "hinternet" is a cool concept, but there's something missing from
its account.
On one side we have the cultured elites of academia, the military and
government - as rightful founders.
On the other, the unwashed masses, immigrants of the Eternal
September. Eventually this hoi polloi of hucksters, chancers and
grifters became naturalised as the businesses and bankers in the new
world.
The dotcom era is a colonisation story and the elites are the
aboriginal natives driven off their own land. It sure fits a "woke"
narrative.
But what's missing from this fairy-tale is the actual real people.
The truth is, dotcom, Web2.0 and the empire building between 1997 and
about 2010 was still a marginal affair, where existing money and power
moved into the internet, along with a handful of rugged
"entrepreneurs" (as we like to call ourselves around here).
The 99% remained spectators caught between the Scylla and Charibdes,
and now they are corralled into ranches, all lovingly watched over...
The potential for a "people's internet" still remains, but we have not
solved many (indeed any) of the classical problems of freeloaders,
tragedy of the commons.... and at this point I think "Web 3.0 and
blockchain web" is dead (?)
A good start to moving things forward to an internet that is once
again public, high-quality and large might be looking more closely at
the history/narrative of the internet and who the real stakeholders
are.
I'm struggling to understand from your explanation what this "hinternet" is.
The hard-to-use internet of the 90s, centered around IRC and Usenet? The seedy parts of the 90s internet with illegal content hosted on free hosts? Because you say today's internet is like that, and I don't see any comparison at all however I look at it.
How is the 90s Internet (forums, IRC, Viagra spam and goatse) anything like the modern sterilized version full of dark patterns in the hands of a dozen megacorps?
It’s basically what computer illiterates (a group that is growing, not declining, because of mobile use) endure on Windows desktop computers. At some point they’ve clicked the wrong link or managed to install a toolbar in their browser, and slowly it’s been infested with random dark patterns. I see it fairly frequently when relatives call me about computer issues.
You’re lucky if it works with an iPad, and if it doesn’t make you sign your life away to Verizon, or to T-Mobile the ur-Carrier (sic).
I always get voted down when I use words like “phonish” or “phonishness”, but I feel that smartphones made life worse, not better, and made people serve computers rather than the other way around. Here’s to the next platform.
This is an interesting thing. Honestly, I can't remember the last time I saw an ad or an easy phishing attempt. Obviously they're in your junk mail, but that's literally the extent of my exposure; everything else is curated and "good", and probably aligns with your concept of the privileged netizens/areas, even though I'm poor as hell. Knowledge-wise I suppose I'm rich, so there's that. But there also seems to be a tragedy-of-the-commons type situation that depends on all the tech-illiterates being the meat shields for advertising and for paying for things like YouTube.
This is the difference between building a website as an experiment for an untested audience and running a modern business. The simple reason most websites are optimized for mobile is because most users are on mobile. And the reason there are ads and email acquisition forms is because they are worth something to businesses. Text-only, ad-free, non-responsive text content isn't worth much.
> forcing you to access them from iPads and other such locked down devices, or not at all
Why does HN have to turn literally everything into an opportunity to bash Apple?
Apparently most of HN is not aware that multiple Android manufacturers implemented hardware destruction features when a device detects it has been rooted? That Google sold devices which were nearly impossible to root, jailbreak, or install another OS on, to protect the interests of a carrier (Verizon)?
Over the years it's become clear to me that frontend is probably the hardest part of the stack. People think they can just shit out some Bootstrap React app and it's perfect, but being able to write complex UIs that work on any browser, any device, with all assistive technologies, and in all languages, is extremely hard. You need someone with a deep knowledge of HTML, CSS, and the supporting web APIs. A good frontend engineer is incredibly rare, even at big tech. What's even rarer is a UX designer who also thinks about these things; they're worth a million bucks.
The problem sadly isn't even mobile-first vs. desktop-first, but rather designers who still haven't figured out that the web is dynamic content that should be allowed to flow according to the size and shape of the user's display device. It's not and never has been a static medium like paper. It's not limited to a specific size and shape like paper, and should not be treated as if it were. Web "designers" should not be trying to force the content into any specific size or pixel resolution; there are just too many different screen resolutions and width-vs-height layouts to ever cover them all appropriately without accepting that the content must be able to flow accordingly. It also severely harms accessibility for folks with vision issues who might scale up their fonts to compensate, if doing so causes the content to break in horrible ways that make it unreadable.
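To make that concrete, here is a minimal sketch of a layout that flows with the viewport instead of assuming a fixed size; the selectors and values are illustrative, not taken from any particular site:

```css
/* Fluid layout: no fixed pixel widths, content reflows with the viewport */
article {
  max-width: 65ch;      /* caps line length for readability, not a pixel size */
  margin-inline: auto;
  padding: 1rem;
}

/* Relative units track the user's font-size preference instead of fighting it */
html { font-size: 100%; }
h1   { font-size: 2rem; }
p    { line-height: 1.5; }

/* A grid that wraps instead of assuming a specific screen shape */
.cards {
  display: grid;
  grid-template-columns: repeat(auto-fill, minmax(15rem, 1fr));
  gap: 1rem;
}
```

Nothing here knows or cares whether the screen is 300px or 3000px wide, which is the point.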
>It's not and never has been a static medium like paper. It's not limited to a specific size and shape like paper, and should not be treated as if it were.
I often get "print first" page layouts, created from dynamic data that can have varying amounts of content. These pages also have to work in mobile and desktop browsers and look good on all of them. I don't find it to be that difficult. Sure it takes a little longer, but it's what the job requires. Media queries make it all possible, as well as a little bit of javascript.
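For what it's worth, a rough sketch of that approach, serving the same markup to print, mobile and desktop with media queries (class names invented for illustration):

```css
/* Base (mobile) styles: a single column */
.report { display: block; }

/* Desktop: side-by-side columns once there is room */
@media (min-width: 48rem) {
  .report { display: grid; grid-template-columns: 1fr 1fr; gap: 2rem; }
}

/* Print: drop navigation chrome, avoid splitting sections across pages */
@media print {
  nav, .sidebar { display: none; }
  .report section { break-inside: avoid; }
}
```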
I work with agencies coming from print and oh boy, don’t get me started.
At the same time it’s a freaking mess even for someone who’s been doing this for a long time.
My go-to strategy now is to address concerns as much as possible by being involved in the design phase, as I have a design background, then implement and ship, and then always run a “tell me everything that’s broken in obscure viewports only 1% of people use” pass, fixing those with CSS patches and viewport-specific media queries during the review phase or even live.
Designing good UI for such a varied platform as "the web frontend" is incredibly challenging. They might be at 300px wide, they might be at 1920px wide, they might have a mouse, or they might be using all touch. Or both. Or neither.
The designers probably know that, but it costs money to design for 2 platforms. So they pick the 2 biggest platforms for non-hackers... Android and iOS...
I get why they are used on mobile. If you only have room for content, it makes sense to tuck actions away into a hamburger menu, except for a small number that you assign to tiny little hieroglyphs. Fine. However, if you have space, this is a terrible way to use it. At best it adds steps; at worst it invites experimentation and disaster to figure out what the hieroglyphs do (which wouldn't be so bad if undo worked, but we've apparently decided undo is fine to break too). Like the Apple HIGs used to say, on desktop you should want to get the most common actions out of menus and onto labeled buttons, so that users can answer "what can I do?" without playing hide and seek. Undo should be baked in from the very start (it's hard to retrofit) to reduce the consequences of experimentation.
Unfortunately, mobile design has taken over so completely that even on apps which will be used almost entirely on desktop, even on apps with an internal advocate for desktop design, UI designers go for the hamburgers and hieroglyphs and broken undo, because it's standard these days. Sigh.
Oh, and modals are back with a vengeance, but I need to stop here or my blood pressure is going to get unhealthy.
I don't think there's any excuse for hieroglyphs. Even a low-end five-year-old phone like the Moto E4 has a 1280x720 display; there are plenty of pixels available to label the icons. Hieroglyphs are "we hate our users and want them to know it"-first design.
Hamburger menus could also frequently be done away with when you look at how few options they hold. Gmail's app has one, for example, when it could fit the icons across the screen as a bar. And it's hard to argue that real estate was important, since they put in a bottom bar for chat, video, and Spaces, whatever that is.
I built the nav bar at the top of my website[0] to be scrollable if the content doesn’t fit horizontally. I’m slightly concerned about users not realizing there are more options to scroll over, but I prefer it to a hamburger menu that has to open and cover the content since you can see every option and read the corresponding word. No need for any of that when visiting on a desktop however.
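As a sketch of the technique (the selectors are illustrative, not copied from the site above), a horizontally scrollable nav bar needs little more than:

```css
nav ul {
  display: flex;
  overflow-x: auto;      /* scroll horizontally when items don't fit */
  white-space: nowrap;   /* keep each label on one line */
  scrollbar-width: thin; /* keep a visible hint that there's more (Firefox) */
  gap: 1rem;
  list-style: none;
  padding: 0;
}
```

One common way to address the discoverability worry is a subtle fade at the clipped edge, e.g. with a `mask-image` gradient, hinting that more items sit off-screen.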
I agree hieroglyphs make discovery difficult. But I believe designers like them because they allow for a consistent design across different languages; accommodating German localisation, for example, can be difficult.
This study doesn't make any practical sense. These pages weren't designed to convey the maximal amount of information in the least amount of space; they were designed to sell a product. It's impossible to claim whether these designs have a negative impact due to content dispersion unless you measure them against the purpose they were designed for.
They explicitly studied ecommerce/product pages here. The relevant metrics are which page had a higher perceived product value? Which page had a higher conversion ratio? Which page resulted in a higher NPS? Which page created a more positive brand affinity?
You don't sell portable speakers using specs, you sell it with aspirational images of it being used on a beach. Of course expanding an accordion of product details then asking "On a scale from 1-7, How well do you feel you understood the offering communicated on the page?" results in a higher survey score. If you said the more dense page converted better, then I would be surprised.
It's like designing a study on the negative impact of hard F1 race car seats, adding a bunch of foam, testing which is more comfortable, then proclaiming one is better than the other because it was rated more comfortable, when the only metric they were designed for is lap time.
Thank you for saying this, was wondering this all along while reading the article.
They compared the information density of, what is essentially a marketing flyer or a billboard and then stated that it doesn’t convey everything: well that’s the point, an ad is meant to invoke the desire for a product in its viewer, instead of being a spec sheet.
Many websites will completely alter their UI based on resizing your window. Sometimes I want to set a browser to take up half of a monitor so that I can put something else next to it. This often fails on modern websites because the UI becomes unusable after the window is resized
This is the "responsive" part of "responsive design". When done well, it's great, but you're right that many websites are way too aggressive about it, as if they only really tested two size classes.
Yes! I hate websites that put useful content in a thin column in the center.
What's worse, formerly normal websites start degrading. Typically this happens when a new manager is hired and decides to "reimagine" the product with a "mobile-first" vision. The recent Patreon redesign is a good example.
And of course, in some cases normal websites go away entirely, and are replaced with crapps: Venmo, Amazon Alexa, Chamberlain, etc.
The thing that I truly find awkward is no nice touch equivalent to hover.
It's such a useful piece of the UX to have thrown away in the move to mobile first.
Other than that I probably fall in the "it's not that hard" camp. Of all the problems you have to solve, getting it to look reasonable on a few different screen sizes is pretty far down the list in terms of time and complexity.
As a web designer myself, I kind of disagree. Simple content per screen just works better for the example they gave; it's easier to visually parse. Their condensed version has a lot of multi-column layouts, which I really dislike.
High content density works for desktop applications, but not for what’s basically a brochure website.
I’m less concerned with brochure and ecommerce sites. But mobile first design for productivity sites and tool interfaces drive me crazy. Who normally accesses these tools from their phones?
What you really want is two apps with completely different UXs, and it's not worth the effort more times than not. No one is going to Herman Miller's website, for example, and not buying a chair because they don't like the desktop web app experience.
They might pass though if they can't get the site to work well on their phone.
I hate the “hero” UX concept and how everyone uses it in a useless landing page. Scrolling and clicking through endless marketing BS to figure out what you do and why it’s relevant to me. It’s the business equivalent of restaurant sites that don’t put their hours, address, and phone right in your face.
This article has completely misunderstood the term. "Mobile first" is the technical practice of organizing the CSS to render the mobile viewport first and then the rest of the breakpoints.
Mobile-first design has the advantage over desktop-first design that it renders the mobile design directly, so the cellphone experience is much faster, with less rearranging and flickering.
Mobile-first doesn't have the corresponding disadvantage in the desktop browser, since desktops tend to have landline internet and much faster CPUs and memory.
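In that technical sense, "mobile first" simply means the unqualified rules describe the smallest viewport, and larger layouts are layered on with `min-width` breakpoints. Roughly (the class name and breakpoint values here are illustrative):

```css
/* Base rules: the mobile layout, applied first on every device */
.layout { display: block; }

/* Tablet and up: add a sidebar column */
@media (min-width: 48em) {
  .layout { display: grid; grid-template-columns: 1fr 3fr; }
}

/* Wide desktop: add a second rail */
@media (min-width: 75em) {
  .layout { grid-template-columns: 1fr 3fr 1fr; }
}
```

A desktop-first stylesheet inverts this with `max-width` queries, which is why small screens there have to undo desktop rules before painting.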
It has finally been said! I've been crying out for the return of information-rich websites, but people were just following trends blindly :(
I’ve been a web developer since before “mobile first”, and since before mobile was a serious web target at all. Granted, when it did become a serious target, we had different terms as well: adaptive and responsive design come to mind. “Mobile first” as a concept was not then—and never has been, for anyone who takes those other concepts seriously—“mobile at the expense of all else”.
What I mean is that quite a lot of these examples and others people frequently cite when complaining about “mobile first” are not inherent to “mobile first” per se. To my mind, they’re an incomplete application of the principle. And the principle became prominent when the inverse problem was more universal: designs (or simply their implementation) targeted desktop first, and added mobile affordances as an afterthought.
“Mobile first” shouldn’t mean that other web experiences aren’t just as important a consideration. Philosophically, it comes from the perspective that a broadly usable and accessible web experience accounts for the most stringent constraints and works out from there.
I distinctly recall solving problems like those discussed in the article well over a decade ago. It was a lot of work. It requires a lot of care and attention to detail. That doesn’t excuse skipping any of it! And it really should be more achievable as the standards have evolved. But it does require dedication to addressing a large matrix of users’ needs and usage conditions.
The irritating thing is that it is neither mobile first nor desktop first. Most websites I come across are designed and built for desktop but using a mobile aesthetic.
I've decided, whenever anyone weighs in on information/interaction design, and says "consume...content", what they are saying is, "Hey, we've been stabbing everyone in the face all wrong, here's a better way to stab everyone in the face."
Vanguard has been guilty of this as well. Its recent site redesign has been all over the place, sometimes leaning heavily toward folks using its phone app (not that its phone app is very user friendly either; for example, checking what orders one has placed by account is impossible). It used to be that the website had a table view that showed almost everything I needed to know about my holdings. Now it takes a few clicks to find that info, and even then the layout is so sparse (mobile optimized) that it's hard to read.
Maybe I'm getting really old and just like to complain about this because I'm not very used to these phone-oriented UIs.
ParetoOptimal | 2 years ago
Mastodon comes to mind, whose openness allows me to browse it from emacs with mastodon.el.
Removing full or even useful content from both RSS and notification emails comes to mind as well.
dannyobrien | 2 years ago
(It was strangely broken in my WP database, and I also think my website was down when OP wrote this. Sorry!)
gipp | 2 years ago
Wow, I just assumed it was some kind of auth flow the government runs themselves, and never did any research.
[0] https://www.winstoncooke.com/
dutchCourage | 2 years ago
Actually, websites looked like that in the late 2000s, before responsive design became ubiquitous.
e.g. Apple https://www.versionmuseum.com/history-of/apple-website
notfed | 2 years ago
I guess that works for some audiences. For me, this can be a strong signal of low quality bs.
extraduder_ire | 2 years ago
Actual hover detection would be possible, but I imagine that UX would suck unless you were using a stylus.
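Hover-capable pointers can in fact be detected in CSS alone today, via the `hover` and `pointer` media features, so hover affordances can be kept on desktop without leaking onto touch screens (the selectors here are illustrative):

```css
/* Only attach hover styles on devices that can actually hover */
@media (hover: hover) and (pointer: fine) {
  .card .actions { opacity: 0; }
  .card:hover .actions { opacity: 1; }
}

/* Touch fallback: keep the actions visible instead of hiding them behind hover */
@media (hover: none) {
  .card .actions { opacity: 1; }
}
```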
hypertexthero | 2 years ago
Industry got obsessed with JS frameworks made for giganto-orgs as a “solution” for everything.
Eyes became fixated on the claustrophobic mobile-first screens.
Went back to doing primarily graphic design. Less money, much happier.