ec429's comments

ec429 | 14 years ago | on: Elements of a Clean Web Design

Non-emphatic letter-spacing changes are susceptible to automation, thus should be automated.

If, for instance, uppercase text requires wider spacing, the browser — or better still, the font renderer — should work that out; the designer should not have to explicitly tell it things like that.

Since, in any case, most manual typography is already algorithmic (if such-and-such condition holds, the spacing needs to be increased), and since the 'judgement calls' about the aesthetics of type are already encoded into the font by the foundry, manual typography outside the foundry is /obsolete/. Machines can do it better, because they can use optimisation algorithms to find the best trade-offs; compare, for instance, TeX's line-breaking algorithm to the common manual approach of "first-fit, but backtrack if the solution is too poor".
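To make that comparison concrete, here's a minimal sketch (my own toy, not TeX's actual box/glue/penalty machinery): greedy first-fit versus a dynamic program that minimises total squared trailing whitespace, a stand-in for TeX's "badness" measure. The optimiser will happily make an earlier line worse to avoid a terrible later one; the greedy breaker can't.

```python
def first_fit(words, width):
    """Greedy: pack each line until the next word no longer fits."""
    lines, line = [], []
    for w in words:
        if line and len(' '.join(line + [w])) > width:
            lines.append(' '.join(line))
            line = []
        line.append(w)
    if line:
        lines.append(' '.join(line))
    return lines

def optimal_fit(words, width):
    """DP: minimise total squared trailing whitespace (last line free).
    Assumes every word fits within `width` on its own."""
    n = len(words)
    INF = float('inf')
    best = [0.0] + [INF] * n   # best[i] = cost of breaking words[:i]
    back = [0] * (n + 1)       # back[j] = start of the line ending at j
    for i in range(n):
        if best[i] == INF:
            continue
        length = -1
        for j in range(i, n):
            length += len(words[j]) + 1   # word plus preceding space
            if length > width:
                break
            slack = 0 if j == n - 1 else (width - length) ** 2
            if best[i] + slack < best[j + 1]:
                best[j + 1] = best[i] + slack
                back[j + 1] = i
    # Walk the back-pointers to recover the chosen breaks.
    lines, j = [], n
    while j > 0:
        i = back[j]
        lines.append(' '.join(words[i:j]))
        j = i
    return lines[::-1]
```

On "aaa bb cc ddddd" at width 6, first-fit produces ["aaa bb", "cc", "ddddd"] (a very ragged middle line), while the optimiser produces ["aaa", "bb cc", "ddddd"], trading a little slack on the first line for a much better second one.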

ec429 | 14 years ago | on: Elements of a Clean Web Design

That's not gopher. Gopher doesn't have semantic markup. In fact, gopher doesn't define a document format at all (unless you count the format of the menus); it's a transport protocol.

I'm not advocating regression to non-design; merely the automation of design, and its implementation as close to the user as possible. It may disturb the more artistically inclined to learn that their profession is in the process of being obsoleted by technology, but there it is and I hope they won't be Ludditic about it.

"we like to read a number of words on each line" (etc.) - who are "we"? I certainly don't; every line break forces you to re-acquire the text stream, producing regressive eye movements. Do you have data for your assertions? My understanding of the science was that the optima lie at the extremes - narrow columnar formats or maximal width - not somewhere in between.

ec429 | 14 years ago | on: Elements of a Clean Web Design

I disagree: increased letter spacing is a form of emphasis.

Perhaps you are confusing it with kerning, which does make text easier to read, and which absolutely should at all times be automated (no exceptions, not even for 'display' text).

Also, I quoted "designers" in the strict sense of scare quotes: to indicate that the word's meaning was different from (my understanding of) its usual definition. Most people who design things are not visual artists; properly construed, "designer" is almost synonymous with the modern meaning of "engineer". However, on the web at least, its meaning has been blurred.

ec429 | 14 years ago | on: Elements of a Clean Web Design

Well, I'm a bit autistic myself, so it's no surprise. But in fact I do care about typography (in particular, I obsessed over bad kerning /before/ xkcd mentioned it); I just think that typography /is not the job/ of the creator of the page. Essentially everything we know about typesetting running text has been successfully automated for years (heck, TeX could automagically typeset mathematical copy 34 /years/ ago), so _push that decision as close to the user as possible_.

It's very much a UNIXy attitude, and ties in to ideas like "mechanism, not policy". If stylistic fashion changes, much simpler for the browsers to update their presentation mechanisms than for every website to redesign their CSS. Policy should be pushed as close to the user as possible - but the flipside is that the user shouldn't have to decide anything you can reliably deduce automatically; applying this with the web designer considered as the user of HTML, CSS etc is left as an exercise for the reader.

ec429 | 14 years ago | on: Elements of a Clean Web Design

Hmm, I disagree. In my opinion, 'clean' web design means the now sadly old-fashioned approach of: have some text, marked up to indicate emphasis, headings etc., then let the damn browser decide what it should look like.

CSS to change colours and relative proportions is, I suppose, a concession that had to be made to 'designers', but when you, as the page author, are having to concern yourself with typography, then something has gone horribly wrong. Layout, spacing, leading - these are all the /browser's/ job; at most the page should give a few hints ("this section needs to be clear" -> use more leading/spacing), since such hints are meaningful to renderings other than the graphical browser.

Did these designers ever stop to think about blind people relying on text-to-speech? It's obvious how that should render <em> - with emphasis - but what does it do with "letter-spacing: 0.1em;"?

As for the advice about using a grid, that should definitely have been accompanied with the caveat that your design should still flow to the browser's width. Fixed-width web pages are _evil_, where by evil I actually mean _stupid_.

ec429 | 14 years ago | on: Python 2.8 Un-release Schedule

Three shall be the version of the Python, and the version of the Python shall be three. Version four shalt thou not use, nor shalt thou use version two, excepting that thou then upgrade to three.

Perl 5 is right out.

ec429 | 14 years ago | on: Tell HN: Steal my ideas

"For that matter, you'd think there would be a better service out there that would research anything like that."

You're asking for the Semantic Web.

No, really: if everyone's data were available as data, so you didn't have to scrape human-readable representations, this would be approximately trivial (at least, it'd be trivial for Google, because they already have this kind of raw computational horsepower, and experience with embarrassingly parallel programming).

ec429 | 14 years ago | on: Uncle Sam: If It Ends in .Com, It’s .Seizable

I've said it before, and I'll say it again: _the US cannot be trusted with stewardship of the global Internet_.

Wired suggests that some want to move this stewardship to the UN. This is not a solution.

A still more decentralised approach is needed; if the Internet is to truly "treat censorship as damage, and route around it", no core Internet protocol can afford to have any kind of "root server", "switch board" or "root certification authority".

There are plenty of good hackers here, perhaps more importantly there are people here who can build communities; rather than trying to invent the "next facebook", I hope you will divert some of your attention to making a pervasively distributed Internet both (a) technically feasible and (b) desired by the Man In The Street.

ec429 | 14 years ago | on: Vortex radio waves could boost wireless capacity “infinitely”

I strongly suspect that this will prove to be limited by the RF path; it relies (as far as I can tell) on coherent properties of the wavefront. This means it will break as soon as the signal passes through heterogeneous material (such as a building), or as soon as reflections produce multi-path interference.

And you certainly wouldn't be able to use it at HF - imagine what the ionosphere will do to your carefully constructed wave!

In general, using more parameters of the wave reduces your resilience to noise; the usual approach of extracting only amplitude, frequency and (perhaps) phase is a summation operator that smooths out a lot of interference. Conceptually, this is like how QPSK needs a higher SNR than BPSK does - you're using more parameters, so you're reducing the 'distance' between things you want to distinguish, so you're increasing the chance that a given amount of noise will produce errors.

ec429 | 14 years ago | on: Splash screens == sloth

Solution: don't use monolithic applications with bloated GUIs. Instead, use small, simple tools, driven from the command line where possible. If the problem domain is naturally graphical, have a lightweight graphical frontend driving the small simple tools through shellouts or a plugin or library interface.

In other words, use UNIX.

This isn't difficult to understand, guys.

ec429 | 14 years ago | on: Why LightSquared failed: It was science, not politics

According to the article, the interference is 86dB above the GPS signal (a factor of roughly 400 million). Wikipedia gives figures of 60-80dB. The LightSquared band runs from 50MHz to 16MHz below the GPS L1 band, which as a percentage of 1.5GHz is from about 1% to 3%. That is not very far away, and 86dB is a lot. I don't think it's reasonable to expect any equipment to have that kind of selectivity; after all, high-Q filters are not only expensive, they are often also heavy and bulky.
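For anyone checking the arithmetic: decibels are a logarithmic power ratio, so converting back to a linear factor is just a one-liner, and 86dB comes out at roughly 4x10^8, i.e. about 400 million times the GPS signal power.

```python
def db_to_ratio(db):
    """Convert a power ratio in decibels to a linear ratio."""
    return 10 ** (db / 10)

print(db_to_ratio(86))   # about 3.98e8
print(db_to_ratio(3))    # about 2, the familiar "3dB is a doubling"
```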

Also, there is an upper bound on filter sharpness imposed by latency constraints (GPS needs accurate timing, and a brick-wall filter has infinite latency), but that bound is probably not even being approached here; I don't have hard data on GPS signal latency requirements, nor a simple formula relating rejection to latency.

ec429 | 14 years ago | on: Introducing the HUD. Say hello to the future of the menu

I was thinking the same, and found it particularly amusing that "design" types are only now finally realising the virtues of the command line. What'll the next Ubuntu UI revolution be? No X11, just a 'screen' session in a tty? (I'd actually use that, were it not for the fact that (a) too much of the Web is overly graphical to be browsed effectively in text mode and (b) screen sucks)

Moving to a different distro (and perhaps WM) is a good idea. I recently moved to Debian with Xfce and am enjoying the lack of gnomes.

ec429 | 14 years ago | on: Magdalen Oxford gets rejection letter from student

I quite agree (and it's sickening how 'elitism' has become an insult).

However, I would add that she does have a point about Oxford not being up to scratch (she should have applied to Cambridge instead ;)

ec429 | 14 years ago | on: SOPA lives—and MPAA calls protests an "abuse of power"

I think the /real/ abuse of power here isn't the RIAA/MPAA at all... it's the US (or rather, govt and certain corporations thereof) thinking that because key Internet infrastructure is located on their territory, they have a right to screw with it.

If the US were to start messing with, say, DNS, it seems fairly obvious that they couldn't restrict the effects to their own country (especially since the Internet is canonically /not/ organised around national boundaries). So, they'd be breaking not only their own internet but everyone else's too - and they simply do not have that right, morally speaking.

If bills like SOPA/PIPA pass, I intend to write to my MP about the importance of establishing a separate infrastructure that co-operates with, but is not dominated by, the existing system. The US has too much control over things like name authorities and SSL root CAs. ICANN is a US corporation. If the US wanted to break the BGP routing table, they wield enough power to do it (heck, AS7007 did it by /accident/).

It is becoming increasingly clear that the US cannot be trusted with stewardship of the global Internet; a still more decentralised approach is needed.

(Maybe, if they break it entirely, we can build a new one with all the lessons we've learned over the past few decades about how to build peer-to-peer decentralised internetworking. Plus, y'know, we could use IPv6 from the start)
