takeoutweight's comments

takeoutweight | 11 years ago | on: Types Are The Truth

Types give you the power of abstraction -- they allow you to define new concepts that "don't go wrong" and let you speak of things other than just bytes-shuttled-on-a-Turing-tape. Rigorously justifying the ignorance of the underlying bytes (i.e. being parametric over representation) is an advantage, not a downside.

But maybe I am misunderstanding your comment? Perhaps you were just equivocating over what "really there" means? If that's the case, is multiplication not "really there" because it can be implemented in terms of bit shifting and addition? We don't imagine multiplication as somehow putting these restrictions on what we can do with our bit shift operators -- the multiplication is the "real" thing we care about and the binary operations are incidental.
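To make the multiplication example concrete, here's a sketch (in Python, function name my own) of multiplication implemented purely in terms of bit shifts and addition -- the "incidental" machinery beneath the abstraction we actually care about:

```python
# Shift-and-add multiplication: the bit-level implementation hiding
# beneath the multiplication abstraction.

def shift_add_mul(a: int, b: int) -> int:
    """Multiply two non-negative integers using only shifts and addition."""
    result = 0
    while b > 0:
        if b & 1:            # if the lowest bit of b is set,
            result += a      # add the current shifted copy of a
        a <<= 1              # double a (shift left)
        b >>= 1              # halve b (shift right)
    return result

print(shift_add_mul(13, 11))  # → 143, same as 13 * 11
```

Nobody using `*` thinks of themselves as constrained by the shift operators underneath; the abstraction is the "real" thing.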

In any case, see the famous parable that opens Reynolds' "Types, Abstraction, and Parametric Polymorphism" for a colourful example of ignoring the underlying data representation:

http://www.cse.chalmers.se/edu/year/2010/course/DAT140_Types...

takeoutweight | 11 years ago | on: A Year of Functional Programming

I suppose I can only speak for myself, but as a functional programmer I would never ask you to give up mutation completely.

I do think it's nice that most FP languages give us pure functions and persistent data structures that are easy to use and relatively performant, so we don't have to go out of our way to provide a pure (by construction) abstraction when we want to.

The key idea is to be honest when you are asking for mutation. Haskell would likely be far on the "left wing" (or right wing?) on your political spectrum but you can do mutation whenever you please as long as you're honest and mark it in the types with something like IO, State, or ST (Clojure does something similar in spirit with 'transient'). There is even unsafePerformIO if you are absolutely sure that you are wrapping an impure computation in a way that you know looks pure from the outside.
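A rough Python analogue (my own sketch, not Haskell's actual ST machinery) of the idea behind ST and unsafePerformIO: mutation confined inside a function so that, from the caller's perspective, the function is observably pure:

```python
# Local mutation that never leaks: the function uses an in-place
# accumulator internally, but same input always yields same output and
# no caller-visible state changes. This is the spirit of Haskell's ST.

def running_totals(xs: list[int]) -> list[int]:
    """Pure from the outside: no observable side effects."""
    acc = 0
    out = []            # local mutable state, never escapes
    for x in xs:
        acc += x        # in-place mutation, invisible to the caller
        out.append(acc)
    return out

print(running_totals([1, 2, 3, 4]))  # [1, 3, 6, 10]
```

Haskell's type system lets you prove this containment statically; in Python you just have to be disciplined about it.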

Purity isn't promoted because it's a virtue in itself. It's that pure expressions generally come with nice commutativity and idempotency laws, which give you the ability to refactor code while remaining confident you won't change its meaning or alter the behaviour of distant subsystems. It's worth preserving those properties when you can.
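A tiny illustration (names my own) of the refactoring point: de-duplicating a repeated pure expression is always safe, but the same "refactoring" of an effectful expression changes the program's observable behaviour:

```python
# Common-subexpression elimination is safe for pure code, unsafe for
# effectful code -- the effect count is part of the program's meaning.

def pure_square(x):
    return x * x

calls = 0
def effectful_square(x):
    global calls
    calls += 1          # hidden side effect: an observable call count
    return x * x

# Safe: f(x) + f(x) can be rewritten as 2 * f(x) when f is pure.
assert pure_square(3) + pure_square(3) == 2 * pure_square(3)

# The effectful version returns the same number, but rewriting
# "e + e" to "2 * e" would halve the call count, changing behaviour:
effectful_square(3) + effectful_square(3)
print(calls)  # → 2; the refactored form would leave it at 1
```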

takeoutweight | 12 years ago | on: Compile Clojure to Objective-C

The best place to start is actually the scheme iOS repl that comes packaged with the Gambit source. Take a look at the misc/build-gambit-iOS script (mentioned on slides 39-40 of my linked talk slides). It's a bit of a hairy build process (and clojure-scheme only adds more hair into the mix) but once you have the repl working it's easier to see where to slide in the .scm/.m files that clojure-scheme and Gambit generate. Some kind of CMake script would probably ease the pain of setting up all this Xcode stuff.

takeoutweight | 12 years ago | on: How to Choose Colours Procedurally

Brewer palettes (http://mkweb.bcgsc.ca/brewer/) are hand-picked colour ranges that were originally meant for cartography but are useful for general data viz, where it is important to maintain perceptual regularity while displaying quantitative data.

There are Brewer palettes geared for sequential data, where you further need to express proportionality along a scale, and for "diverging" data, which has a natural zero point with two extremes and needs a spectrum. In all these cases it's important that one category doesn't appear "heavier" than another, and that subjective notions like "about twice as intense" reflect the underlying data.
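For contrast, here's what the naive alternative looks like: a straight RGB interpolation between two endpoint colours (the hex values are my own illustrative picks, not actual ColorBrewer output). The whole point of hand-picked Brewer palettes is that simple lerps like this are *not* perceptually uniform -- the midpoints can look muddier or "heavier" than the maths suggests:

```python
# Naive linear interpolation in RGB space between two anchor colours.
# Perceptually non-uniform; shown only as the baseline Brewer improves on.

def hex_to_rgb(h: str) -> tuple:
    h = h.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def lerp_palette(start: str, end: str, steps: int) -> list:
    a, b = hex_to_rgb(start), hex_to_rgb(end)
    return [tuple(round(a[c] + (b[c] - a[c]) * t / (steps - 1)) for c in range(3))
            for t in range(steps)]

# A 5-step blue-to-red ramp; the midpoint lands on purple (128, 0, 128):
print(lerp_palette("#0000ff", "#ff0000", 5))
```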

takeoutweight | 12 years ago | on: An Animation About Quantum Computers

Re 1. Periodicity: The quantum Fourier transform is an operation that reflects information about a quantum state into the phases of the amplitudes. Quantum phase information is exactly where quantum computing differs from classical probabilistic computing, so it makes sense that this technique might show up in places where quantum computing beats classical. For example: Shor's factorization of integers makes direct use of the quantum Fourier transform. I mention periodicity only as an example of the sort of problem where the Fourier transform might be useful. This is an alternative intuition for what quantum computers are "good at," offered to combat the notion that "quantum computing is good for parallel problems."

Re 2. "covering all paths": I don't have a problem interpreting quantum superposition as inhabiting all possible states. However, classical probabilistic computing can also be interpreted this way: an un-flipped coin is both heads and tails until the flip happens. But probabilistic computing doesn't give us faster-than-classical speedup, so it's not just the "existing in all states at once" that buys us the speedup: it's the unique kind of math we can do on these states because our amplitudes are unitary-complex, not positive-real.
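The smallest example I know of the amplitudes-vs-probabilities distinction: apply a Hadamard gate twice and the minus sign in the amplitudes cancels the |1> component, returning the state to |0> with certainty -- something two classical fair-coin flips can never do. A sketch (my own toy simulation, no quantum library):

```python
# Destructive interference with signed amplitudes: H applied twice is
# the identity, because the |1> amplitudes cancel. Classical
# probabilities are non-negative and can never cancel like this.

import math

def hadamard(state):
    """Apply H = 1/sqrt(2) * [[1, 1], [1, -1]] to a 2-amplitude state."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = (1.0, 0.0)        # |0>
once = hadamard(state)    # equal superposition: both amplitudes ~0.707
twice = hadamard(once)    # the minus sign cancels the |1> amplitude

print(twice)              # ≈ (1.0, 0.0): back to |0> with certainty
```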

You understand this distinction, so I perhaps shouldn't have adopted the tone I did (sorry!). The leap from "superposition involves a complex-valued combination of several states" to "tries all answers at once, so it's very fast" is very common in popular science articles, and it's the kind of misconception that I think the "quantum computers are good at parallel" intuition encourages. Just because a quantum state is a superposition doesn't mean we get to observe and evaluate all those states for free and pick out the one we like. We can sometimes arrange things so that the quantum phases interact non-classically, leveraging the structure of some problems to reveal information that isn't available to classical algorithms.

Periodicity information, via the quantum Fourier transform, is an example of one way to arrange things to extract information that would be more expensive to calculate classically.

takeoutweight | 12 years ago | on: An Animation About Quantum Computers

You can know the probability that the algorithm will output the correct answer. If the probability of a correct response is > 50%, all you have to do is run the algorithm multiple times to reach, say, 99.99% certainty that the most-commonly reported answer is correct.
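The amplification can be computed exactly with the binomial distribution (my own sketch; the function name is made up): if one run is right with probability p > 0.5, the chance that the majority of n independent runs is right climbs rapidly with n.

```python
# Majority-vote amplification of a randomized algorithm. With per-run
# success probability p, P(majority of n runs correct) is a binomial
# tail sum, and it approaches 1 exponentially fast (a Chernoff bound).

from math import comb

def majority_correct(p: float, n: int) -> float:
    """Probability that more than half of n independent runs are correct
    (use odd n so there is no tie)."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 11, 101):
    print(n, round(majority_correct(0.6, n), 4))
```

Even a modest 60%-correct algorithm repeated 101 times gives a majority answer that is correct well over 97% of the time.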

takeoutweight | 12 years ago | on: An Animation About Quantum Computers

This is a VERY common misconception, and is not a good intuition for how quantum computing is different than classical.

A better intuition, in my opinion, is computing with "un-flipped" coins, except with funny complex-valued probabilities of heads vs. tails instead of real values between 0.0 and 1.0. Because the probabilities are complex-valued, they interact in ways that can seem a bit counter-intuitive (which makes sense -- how many quantities do we deal with day-to-day that are described with complex numbers?). These interactions can provide some algorithmic speedup on some nicely structured problems.

It's not always obvious what these problems are, and it doesn't relate to whether the problem is highly parallel. My intuition for "quantum-friendly" is that if a problem involves some kind of periodicity, or Fourier-like frequency analysis, then perhaps you'll see a quantum speedup. But it's important to remember the coins don't somehow "try every flip possible" to find the answer you're looking for -- if they did, you'd be able to solve NP problems in constant time.
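Here's a purely classical sketch of the period-finding idea at the heart of Shor's algorithm (my own toy code, stdlib only): take the periodic sequence f(x) = a^x mod N, Fourier-transform it, and read the period off the frequency peaks. The quantum version performs the Fourier step exponentially faster; this just shows *where* the periodicity becomes visible.

```python
# Classical period finding via a discrete Fourier transform.
# The sequence 2^x mod 15 repeats with period 4, so its DFT has its
# strongest non-DC peak at k = n / 4.

import cmath

def dft(seq):
    n = len(seq)
    return [sum(seq[x] * cmath.exp(-2j * cmath.pi * k * x / n) for x in range(n))
            for k in range(n)]

a, N, n = 2, 15, 16
seq = [pow(a, x, N) for x in range(n)]   # 1, 2, 4, 8, 1, 2, 4, 8, ...
mags = [abs(c) for c in dft(seq)]

# Skip the DC term k = 0 and the mirror half of the spectrum:
peak = max(range(1, n // 2 + 1), key=lambda k: mags[k])
print(n // peak)  # → 4, the period of 2^x mod 15
```

Knowing that period classically factors 15 (it gives 2^2 ± 1 as candidates); a quantum computer gets the same peak structure out of an exponentially larger search space.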

takeoutweight | 12 years ago | on: Running Lua in a browser via Emscripten

By "efficient" I assume you mean the object code is compact, filesize-wise? It's fairly typical that byte-code or otherwise high-level object code is more compact than machine code. For example, Java .class files will be smaller than their equivalent native-compiled object code.
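You can see the same effect inside Python itself (a quick illustration, not a Lua measurement): a one-line function compiles to a handful of bytes of bytecode, because each high-level instruction does a lot of work, while the equivalent native code (prologue, register shuffling, the operation, epilogue) is typically larger.

```python
# High-level bytecode is compact: each instruction stands in for many
# machine instructions. Disassemble a trivial function and measure its
# raw bytecode size.

import dis

def add(a, b):
    return a + b

dis.dis(add)                       # a few high-level instructions
print(len(add.__code__.co_code))   # bytecode size in bytes: tiny
```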