ec429's comments
ec429 | 14 years ago | on: Elements of a Clean Web Design
I'm not advocating regression to non-design; merely the automation of design, and its implementation as close to the user as possible. It may disturb the more artistically inclined to learn that their profession is in the process of being obsoleted by technology, but there it is, and I hope they won't be Luddite about it.
"we like to read a number of words on each line" (etc.) - who are "we"? I certainly don't; a line break forces you to re-acquire the text stream, producing regressive eye movement. Do you have data for your assertions? My understanding of the science was that the optima are narrow columnar formats and maximal width.
ec429 | 14 years ago | on: Elements of a Clean Web Design
Perhaps you are confusing it with kerning, which does make text easier to read, and which absolutely should at all times be automated (no exceptions, not even for 'display' text).
Also, I quoted "designers" in the strict sense of scare quotes: to indicate that the word's meaning was different from (my understanding of) its usual definition. Most people who design things are not visual artists; properly construed, "designer" is almost synonymous with the modern meaning of "engineer". However, on the web at least, its meaning has been blurred.
ec429 | 14 years ago | on: Elements of a Clean Web Design
It's very much a UNIXy attitude, and ties in to ideas like "mechanism, not policy". If stylistic fashion changes, much simpler for the browsers to update their presentation mechanisms than for every website to redesign their CSS. Policy should be pushed as close to the user as possible - but the flipside is that the user shouldn't have to decide anything you can reliably deduce automatically; applying this with the web designer considered as the user of HTML, CSS etc is left as an exercise for the reader.
ec429 | 14 years ago | on: Elements of a Clean Web Design
CSS to change colours and relative proportions is, I suppose, a concession that had to be made to 'designers', but when you, as the page author, are having to concern yourself with typography, then something has gone horribly wrong. Layout, spacing, leading - these are all the /browser's/ job; at most the page should give a few hints ("this section needs to be clear" -> use more leading/spacing), since such hints are meaningful to other renderings than the graphical browser. Did these designers ever stop to think about blind people relying on text-to-speech? It's obvious how that should render <em> - with emphasis - but what does it do with "letter-spacing: 0.1em;"?
As for the advice about using a grid, that should definitely have been accompanied with the caveat that your design should still flow to the browser's width. Fixed-width web pages are _evil_, where by evil I actually mean _stupid_.
ec429 | 14 years ago | on: Python 2.8 Un-release Schedule
Perl 5 is right out.
ec429 | 14 years ago | on: Tell HN: Steal my ideas
You're asking for the Semantic Web.
No, really: if everyone's data were available as data, so you didn't have to scrape human-readable representations, this would be approximately trivial (at least, it'd be trivial for Google, because they already have this kind of raw computational horsepower, and experience with embarrassingly parallel programming).
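As a minimal sketch of the point about scraping versus real data (the HTML page, the JSON record, and the "price" field here are all invented for illustration):

```python
import json
import re

# The same hypothetical fact published two ways: a human-readable
# page, and the underlying data as data.
html = '<html><body>The price is <b>$4.99</b> today!</body></html>'
data = '{"product": "widget", "price": 4.99, "currency": "USD"}'

# Scraping: brittle pattern-matching against presentation markup,
# which breaks as soon as the page's wording or markup changes.
m = re.search(r'\$(\d+\.\d+)', html)
scraped_price = float(m.group(1))

# Semantic-web style: no extraction step at all.
record = json.loads(data)
price = record["price"]
```

The second path is trivially machine-consumable; the first has to be re-engineered for every site, which is exactly the work the Semantic Web is meant to eliminate.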
ec429 | 14 years ago | on: Uncle Sam: If It Ends in .Com, It’s .Seizable
Wired suggests that some want to move this stewardship to the UN. This is not a solution.
A still more decentralised approach is needed; if the Internet is to truly "treat censorship as damage, and route around it", no core Internet protocol can afford to have any kind of "root server", "switch board" or "root certification authority".
There are plenty of good hackers here, perhaps more importantly there are people here who can build communities; rather than trying to invent the "next facebook", I hope you will divert some of your attention to making a pervasively distributed Internet both (a) technically feasible and (b) desired by the Man In The Street.
ec429 | 14 years ago | on: Vortex radio waves could boost wireless capacity “infinitely”
And you certainly wouldn't be able to use it at HF - imagine what the ionosphere will do to your carefully constructed wave!
In general, using more parameters of the wave reduces your resilience to noise; the usual approach of extracting only amplitude, frequency and (perhaps) phase amounts to an averaging operation that smooths out a lot of interference. Conceptually, this is like how QPSK needs a higher SNR than BPSK does - you're using more parameters, so you're reducing the 'distance' between things you want to distinguish, so you're increasing the chance that a given amount of noise will produce errors.
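The 'distance' argument can be made concrete by comparing constellation geometries at equal symbol energy - a rough sketch, treating each symbol as a point in the I/Q plane normalised to unit energy:

```python
import math

def min_distance(points):
    """Smallest Euclidean distance between any two constellation points."""
    return min(
        math.dist(p, q)
        for i, p in enumerate(points)
        for q in points[i + 1:]
    )

# Both constellations normalised to unit symbol energy.
bpsk = [(1.0, 0.0), (-1.0, 0.0)]
s = 1 / math.sqrt(2)
qpsk = [(s, s), (s, -s), (-s, s), (-s, -s)]

d_bpsk = min_distance(bpsk)  # 2.0
d_qpsk = min_distance(qpsk)  # sqrt(2), about 1.414
```

QPSK's nearest neighbours are a factor of sqrt(2) closer than BPSK's, so the same noise amplitude is that much more likely to push a received symbol over a decision boundary.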
ec429 | 14 years ago | on: If you want reproducible science, the software needs to be open source
ec429 | 14 years ago | on: Splash screens == sloth
In other words, use UNIX.
This isn't difficult to understand, guys.
ec429 | 14 years ago | on: Why LightSquared failed: It was science, not politics
Also, there is an upper bound on filter sharpness given by the latency constraints (GPS needs accurate timing, and a brick-wall filter has infinite latency), but that bound is probably not even being approached here; I don't have hard data on GPS signal latency requirements, nor a simple formula relating rejection to latency.
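The sharpness/latency trade-off can at least be sketched for the linear-phase FIR case, using the standard Hamming-window rule of thumb (the 3.3 factor is that rule of thumb, not a hard limit; this is an illustrative approximation, not GPS-specific data):

```python
import math

def hamming_lowpass_taps(transition_bw):
    """Approximate FIR length for a Hamming-windowed sinc lowpass.

    transition_bw is the transition width as a fraction of the
    sample rate; narrower transitions need proportionally more taps.
    """
    n = math.ceil(3.3 / transition_bw)
    return n if n % 2 else n + 1  # odd length, for integer group delay

def group_delay_samples(n_taps):
    # A linear-phase FIR of N taps delays the signal by (N-1)/2 samples.
    return (n_taps - 1) / 2

# Halving the transition width doubles the tap count, and hence the delay.
for bw in (0.1, 0.01, 0.001):
    taps = hamming_lowpass_taps(bw)
    print(bw, taps, group_delay_samples(taps))
```

So making the filter ten times sharper costs roughly ten times the latency - which is why a timing-sensitive receiver can't just filter its way out of adjacent-band interference.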
ec429 | 14 years ago | on: Infinity is not a number
Or, to put it another way, Infinity is a hyperreal number.
ec429 | 14 years ago | on: Introducing the HUD. Say hello to the future of the menu
Moving to a different distro (and perhaps WM) is a good idea. I recently moved to Debian with Xfce and am enjoying the lack of gnomes.
ec429 | 14 years ago | on: Feds, Please Return My Personal Files Stored at MegaUpload
ec429 | 14 years ago | on: Magdalen Oxford gets rejection letter from student
However, I would add that she does have a point about Oxford not being up to scratch (she should have applied to Cambridge instead ;)
ec429 | 14 years ago | on: SOPA lives—and MPAA calls protests an "abuse of power"
If the US were to start messing with, say, DNS, it seems fairly obvious that they couldn't restrict the effects to their own country (especially since the Internet is canonically /not/ organised around national boundaries). So, they'd be breaking not only their own internet but everyone else's too - and they simply do not have that right, morally speaking.
If bills like SOPA/PIPA pass, I intend to write to my MP about the importance of establishing a separate infrastructure that co-operates with, but is not dominated by, the existing system. The US has too much control over things like name authorities and SSL root CAs. ICANN is a US corporation. If the US wanted to break the BGP routing table, it wields enough power to do it (heck, AS7007 did it by /accident/).
It is becoming increasingly clear that the US cannot be trusted with stewardship of the global Internet; a still more decentralised approach is needed.
(Maybe, if they break it entirely, we can build a new one with all the lessons we've learned over the past few decades about how to build peer-to-peer decentralised internetworking. Plus, y'know, we could use IPv6 from the start.)
ec429 | 14 years ago | on: UK Schools ICT to be replaced by computer science programme
If, for instance, uppercase text requires wider spacing, the browser — or better still, the font renderer — should work that out; the designer should not have to explicitly tell it things like that.
Since, in any case, most manual typography is already algorithmic (if such-and-such condition, the spacing needs to be increased), and since the 'judgement calls' about the aesthetics of type are already encoded into the font by the foundry, manual typography outside the foundry is /obsolete/: machines can do it better, because they can use optimisation algorithms to find the best trade-offs (compare, for instance, TeX's line-breaking algorithm to the common manual approach of "first-fit, but backtrack if the solution is too poor").
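That contrast can be shown directly. Below is a toy sketch (monospaced widths, badness = squared leftover space, and - unlike real TeX - the last line is penalised too) comparing greedy first-fit breaking with a TeX-style dynamic programme that minimises total badness over all break points:

```python
def badness(words, i, j, width):
    """Squared leftover space if words[i:j] form one line (inf if too long)."""
    length = sum(len(w) for w in words[i:j]) + (j - i - 1)  # words + spaces
    return float('inf') if length > width else (width - length) ** 2

def first_fit(words, width):
    """Greedy: pack each line as full as possible, never look back."""
    lines, line = [], []
    for w in words:
        if line and sum(len(x) for x in line) + len(line) + len(w) > width:
            lines.append(line)
            line = []
        line.append(w)
    lines.append(line)
    return lines

def optimal_fit(words, width):
    """TeX-style: dynamic programme minimising total badness over all breaks."""
    n = len(words)
    best = [0.0] + [float('inf')] * n  # best[j] = min cost to set words[:j]
    back = [0] * (n + 1)
    for j in range(1, n + 1):
        for i in range(j):
            cost = best[i] + badness(words, i, j, width)
            if cost < best[j]:
                best[j], back[j] = cost, i
    lines, j = [], n
    while j > 0:
        i = back[j]
        lines.append(words[i:j])
        j = i
    return list(reversed(lines))

words = "aaa bb cc ddddd".split()
greedy = first_fit(words, 6)   # [[aaa, bb], [cc], [ddddd]] - total badness 17
optimal = optimal_fit(words, 6)  # [[aaa], [bb, cc], [ddddd]] - total badness 11
```

First-fit greedily fills the first line and leaves a ragged second line; the optimiser accepts a slightly looser first line to make the whole paragraph better - precisely the kind of global trade-off a human doing manual first-fit-with-backtracking approximates badly.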