As someone who's just written about the "small web" [1], this warms my heart. The lite version is probably a little too bare-bones for most people's tastes, but it sure is tiny -- great for people on very slow or flaky connections. Some numbers:
HTML homepage transfers 34KB (94KB uncompressed) over 6 requests. HTML search results page transfers 133KB (248KB uncompressed) over 31 requests.
Lite homepage transfers 13KB (11KB uncompressed, ha) over 4 requests. Lite search results page transfers 21KB (43KB uncompressed) over 5 requests.
In all cases, all requests are to *.duckduckgo.com, which was very good to see from a privacy perspective. Nice work, DDG!
Note that this enables any bang searches as well, though you'll need to single quote these to avoid attempted history expansion.
lite is also my default w3m search bookmark entry.
So:
ddg '!w foo' # Wikipedia article on foo
ddg '!dict foo' # Dictionary search on foo
ddg '!etym foo' # Etymology Dictionary search on foo
Note that if the endpoint itself relies on JS for local search, the bang won't be successful (though DDG will do its bit). Reddit, and HN/Algolia, I'm looking at you.
I happened to be running some quick lookups a few days ago while a friend was watching, and they 1) wondered how I was doing that and 2) asked if they could have a similar feature. The power of the shell.
These are great, and I’m glad to be able to use them without direct contact with “Big Tech” and the tracking bloat that entails. But as I understand it, DuckDuckGo is more or less a glorified (albeit relatively glorious and pleasant) proxy to Big Tech’s Bing. Well, more, not less, because it does add some great conveniences, like the IIUC formerly open source Instant Answers¹ and !bangs, plus some fraction of results from its own crawler and likely some from Yandex. But still, the largest fraction of its core service seems dependent on Bing. Not that that’s its fault, or that there are attractive alternatives.
1. My site is also smol, but it's brand new and probably doesn't warrant a mention here. Respect for the smol web, though!
2. > Because most people only view one or two articles on my site, I include my CSS inline. With HTTP/2, this doesn’t make much difference, but Lighthouse showed around 200ms with inline CSS, 300ms with external CSS.
With respect, this seems to be the frontend consensus and it... just doesn't make sense to me, unless you expect the majority of your traffic to never hit a cache header on that external CSS. That 100ms perf hit (which is probably a warning sign that something's wrong with your assets or server config anyway) should be a one-time affair, and not repeating that payload over and over surely makes up for it in the sub-200ms responses after.
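To make that concrete: the "one-time affair" argument assumes the external stylesheet comes back with a long-lived cache header, something like this (an illustration, not taken from the article):

```
Cache-Control: public, max-age=31536000, immutable
```

With a content-fingerprinted filename (e.g. a hypothetical style.abc123.css), a year-long max-age is safe, and repeat views skip the stylesheet request entirely.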
The lite version has a broken image at the bottom. Other than that, it's very reminiscent of the AltaVista home page back in the day. Kudos to DDG for having such a simple non-JavaScript version. These days, the non-JavaScript web page is on the rise. All web pages should have a lightweight non-JavaScript version.
"HTML" version shows no images, just text and favicons. The total size of visible text content shown on the page is about 4KB (just did a count of characters on the page for "steve jobs" query). DDG shows those 4KB using 248KB of data.
1. That is not _that_ small. It's only small compared to the rest of the web, but not in absolute numbers, or in comparison to useful content shown.
2. The content-to-data ratio comes to 1:62: for each character shown on screen, you are transferring 62 bytes of stuff. For context, the Wikipedia page for Steve Jobs, which also includes multiple images, is 1.08MB with 128KB of content, for a content-to-data ratio of about 1:9. So DuckDuckGo could in theory do ~7x better than this.
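A quick check of that arithmetic, using the comment's own figures (treating KB as 1024 bytes; since these are ratios, the unit convention barely matters):

```shell
# Verify the quoted ratios: DDG 1:62, Wikipedia ~1:9, ~7x headroom.
awk 'BEGIN {
    ddg  = 248 / 4               # bytes transferred per byte of visible text (DDG)
    wiki = (1.08 * 1024) / 128   # same ratio for the Wikipedia Steve Jobs page
    printf "DDG 1:%d  Wikipedia ~1:%d  headroom ~%dx\n", ddg, wiki + 0.5, ddg / wiki + 0.5
}'
```

The numbers line up with the comment: 248/4 is exactly 62, Wikipedia's ratio is about 8.6, and 62/8.6 is a bit over 7.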
Up until not long ago, Google search result pages worked well without JS, and if you sent a suitably old UA header, you'd get the "original" version without mangled URLs or any of the other bloat. Then in 2019 or so, in what I consider to be an extremely hostile move, they started redirecting (using meta tags) to a horrible dumbed-down mobile-ish version and using styles-removed-by-JS to "hide" the actual content of the "full" version (which has the "modern" mangled URLs) if you managed to reach it anyway, so all you'd see was a blank page. It's almost like they were deliberately trying to make something that only worked in the latest version of Chrome or whatever few other "modern" browsers exist today, and decided to force non-JS users to a worse experience than they had had for the past decade or more. Nothing a filtering proxy can't fix (and I brought back true URLs at the same time), but I was absolutely incensed when that happened.
I don't think a search engine should ever require JS to be usable. The existence of search engines[1] predates the existence of JS, and of course the HTML mechanism of form submission and displaying a list of links predates both of those.
Looking at the network inspector, "lite" version search results load in about 900ms-1.4s for non-cached queries. That is fast only compared to the rest of the "big web", maybe, but not _that_ fast in absolute terms (for context, Google's full search results usually load in about that much time). The bottleneck for DuckDuckGo is probably the Bing API (usually a 600-700ms response), which there is no way around.
Just tested the pure HTML version and was pleased to find out that it does not force a minimum browser width upon the user. I usually set my desktop (1080p) to use a 3-column layout with i3 and am always bothered by the fact that I cannot escape from horizontal scrolling on every search I run on DDG.
I needed to add the HTML version as a new search engine option in the browser options to have it running as default, but hey, it works! Horizontal scrolling is now a thing of the past.
> not force a minimum browser width … 1080p … 3-column layout … cannot escape from horizontal scrolling
I have my second monitor in 1920×1080 portrait and sometimes that is an irritation at 1080px wide long before getting to the 640px your columns have. To pick one example, in Google search pages you lose over half the right-hand column without scrolling (though at least the main results are fully visible). A lot of layouts seem to assume having around 1280 pixels to play with (less a little for scroll bars).
I've found zooming out a little works fine usually, and modern browsers remember the setting, but that isn't perfect. If I unzoom on the main display it does so for tabs open on that site on the second too (to use the Google example again if I open maps or an image search on the main screen I probably want it at 100%) and some sites seem to actively resist being zoomed.
I compared all 3 by searching "venera" and prefer the JS one (sorry?), but I could see this being very useful for its niche audience. I worry more about tracking than about JS in browsers, since it runs super fast on my devices.
This is awesome... I was always pushing for non-JS versions of products and was always met with the dumbest reasons... 'the only reason someone wouldn't run JS is if they're hacking'
To be fair, disabling JS seems to be an extremely fringe thing to do and using resources to accommodate it should be a hard sell. I've only ever actually heard of people doing it here on HN.
Same, though looks like maybe it doesn't actually return the same results as the full site, which is really weird. All 3 front-end clients should show identical results.
The HTML version recently started returning 403 Forbidden for Firefox's context-menu search. It appears to be caused by Firefox adding an 'Origin: null' header to the request. Hope they can solve this bug one day.
You can achieve it without CSS, actually (in Chrome). Just put <meta name="color-scheme" content="light dark"> in the <head>. It will also switch the vanilla scrollbar to the dark theme.
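For illustration, a minimal page using that tag (a sketch, not from the comment) looks like:

```html
<!doctype html>
<html>
<head>
  <!-- Opt in to both schemes; the browser renders per the OS preference -->
  <meta name="color-scheme" content="light dark">
  <title>color-scheme demo</title>
</head>
<body>
  <p>Default text, background, form controls and scrollbars follow the system theme.</p>
</body>
</html>
```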
I don't get why these would be preferable, unless you are an HN elitist who hates JavaScript purely for its own sake.
Normal DuckDuckGo loads just as fast for me as the HTML variant, and the HTML variant doesn't detect my location (I had to change it manually) and doesn't have image search, maps, news, etc.
I don't understand why anyone would use these versions over the normal DDG experience. It's not like normal DDG is ridiculous amounts of JavaScript anyway, and the normal DDG experience is pretty much just as fast. I can't really notice any difference anyway.
The words I'd pick are 'careless' or 'sloppy' or 'laughable' use of javascript. The speed of carefully-crafted JS in today's browsers is amazing. If the mule collapses hauling that 20-ton-wagon of borax, there's the culprit.
benhoyt | 5 years ago
[1] https://benhoyt.com/writings/the-small-web-is-beautiful/
dredmorbius | 5 years ago
As a bash (or zsh) function:
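The function body itself didn't survive in this copy of the thread; a minimal sketch of what such a wrapper might look like (an assumption, not the author's original; `ddg_url` is a hypothetical helper name) is:

```shell
# Hypothetical sketch: build a DuckDuckGo Lite URL and open it in w3m.
ddg_url() {
    # Join the arguments and turn spaces into '+' for the query string.
    # Naive: characters like '&' are not escaped in this sketch.
    printf 'https://lite.duckduckgo.com/lite/?q=%s' "$(printf '%s' "$*" | tr ' ' '+')"
}

ddg() {
    w3m "$(ddg_url "$@")"
}
```

Called as `ddg '!w foo'`; the single quotes keep interactive shells from history-expanding the `!`, as noted above.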
boogies | 5 years ago
1: (Edit) https://duckduckhack.com/
nerdponx | 5 years ago
eyelidlessness | 5 years ago
divbzero | 5 years ago
Noticeably faster than the default JavaScript DDG.
deepstack | 5 years ago
freediver | 5 years ago
Sorry, as an engineer, not impressed at all!
userbinator | 5 years ago
[1] https://en.wikipedia.org/wiki/JumpStation
andrewshadura | 5 years ago
aerovistae | 5 years ago
freediver | 5 years ago
If you want to see a fast web search engine, try https://rightdao.com
I am not affiliated with them in any way, but their search results come back in 70ms (network latency included) and that is darn impressive!
radicalriddler | 5 years ago
jeduardo | 5 years ago
dspillett | 5 years ago
dellcybpwr | 5 years ago
Turning on javascript returns sites from a USA POV. I'm in the USA.
new_guy | 5 years ago
OminousWeapons | 5 years ago
dmix | 5 years ago
But TIL about DDG's awesome help pages.
mikem170 | 5 years ago
I browse with javascript disabled, except for sites/domains that I exempt like banks and stuff.
WrtCdEvrydy | 5 years ago
standardUser | 5 years ago
newsbinator | 5 years ago
People should be able to get all the info they need via curl in the terminal.
HTML and CSS are too easy to exploit for tracking users.
powersnail | 5 years ago
I can't find this page on JS DDG, but it's the first result on HTML DDG.
Query: carving a violin bridge
Desired URL: https://trianglestrings.com/carving-a-violin-bridge/
bnj | 5 years ago
vmsp | 5 years ago
Nextgrid | 5 years ago
chadlavi | 5 years ago
xxyzz | 5 years ago
tjbiddle | 5 years ago
dumdum4356 | 5 years ago
xxyzz | 5 years ago
CivBase | 5 years ago
https://gist.github.com/CivBase/818f7f4f56050c9769c4b783c08c...
EDIT: I replaced the raw code with a link to a GitHub gist to reduce the obnoxious size of my post.
encryptluks2 | 5 years ago
CodeHz | 5 years ago
abhayhegde | 5 years ago
wh33zle | 5 years ago
Anyone have an idea how I can configure this search engine in Firefox Mobile?
As far as I can see, there is no way I can teach FF Mobile to make a POST request instead.
howeyc | 5 years ago
https://html.duckduckgo.com/html/?q=searchstring
ecmascript | 5 years ago
cyberlab | 5 years ago
And why does it have to be 'heavy'? Surely adding a few small embellishments with JS is okay, but the JS needn't be heavy at all.
8bitsrule | 5 years ago
1vuio0pswjnm7 | 5 years ago