thetwiceler's comments
thetwiceler | 12 years ago | on: Isla: a programming language for young children
Each "instruction" was represented on a computer as a "LEGO block". For example, there was a conditional block. You'd place it under a block that would return an output (like a "sense" command), and then you'd place two blocks below it for the True/False conditions. So basically you ended up building a tree of the logical structure of the program.
I think this style is the BEST way for children to learn programming. In fact, I think a block-like/tree-like structure is often a great way to think about program structure (even for advanced programs; Simulink is really great!).
Now, once you get comfortable with the constructs, actually dragging-and-dropping blocks often gets too slow. But for children, and for building simple programs, I think it's great.
I'm afraid that text-based languages are just unnecessarily difficult for children - only a small fraction of what you can type actually represents a valid program. The blocks allow programming without having to deal with these syntactic barriers. After all, when I program in a language like Haskell, 90% of my programming errors cause the program to fail to compile. These sorts of errors (syntax errors or type errors) can be prevented as the program is built with the block interface.
And while I'm extolling Mindstorms, another great thing about creating these programs was that you weren't just flipping bits on a computer; it was pretty satisfying to see a robot "in the real world" follow your commands.
thetwiceler | 12 years ago | on: Human-Powered Helicopter Wins the $250,000 Sikorsky Prize
On a similar note, for those of you who may not be aware, there has also been a human-powered airplane: http://en.wikipedia.org/wiki/Gossamer_Albatross
I am totally amazed by the Gossamer Albatross. It was made in 1979. And here's the most awesome part - a cyclist flew it across the English Channel!
thetwiceler | 12 years ago | on: Anatomy of a pseudorandom number generator – visualising Cryptocat's buggy PRNG
Suppose you are using the PRNG to make a stream cipher. Basically, your random key is a seed for the PRNG. You then generate lots of pseudo-random characters from that seed and XOR them, character by character, with your message to encrypt it.
Now, the fact that XOR is linear (it's just addition mod 2) means that when you XOR a probability distribution with a constant (i.e., a point-mass distribution), you get a shifted distribution. Let's say your PRNG disproportionately outputs "0" at each position. Then the distribution of each character of the ciphertext will be concentrated at p XOR 0 = p, where p is the corresponding character of the plaintext!
So by the law of large numbers, if we see the same message encrypted many times, we can determine with high probability exactly what each character of the plaintext is, and completely break encryption!
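Here's a minimal sketch of that attack (my own toy model, not Cryptocat's actual code): a keystream generator biased toward the byte 0, the same message encrypted many times, and the plaintext recovered by taking the most common ciphertext byte at each position.

```python
import random
from collections import Counter

def biased_keystream(n, rng, bias=0.5):
    # Hypothetical flawed PRNG: emits byte 0 far more often than any other byte.
    return bytes(0 if rng.random() < bias else rng.randrange(256)
                 for _ in range(n))

def encrypt(msg, rng):
    # Stream cipher: XOR the message byte-by-byte with the keystream.
    ks = biased_keystream(len(msg), rng)
    return bytes(m ^ k for m, k in zip(msg, ks))

msg = b"attack at dawn"
rng = random.Random(42)
ciphertexts = [encrypt(msg, rng) for _ in range(2000)]

# Keystream byte 0 is the most likely, so the most common ciphertext byte
# at each position is p XOR 0 = p: the plaintext byte itself.
recovered = bytes(Counter(c[i] for c in ciphertexts).most_common(1)[0][0]
                  for i in range(len(msg)))
print(recovered)
```

With a bias this strong, 2000 encryptions are far more than enough for the per-position mode to settle on the plaintext byte.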
thetwiceler | 12 years ago | on: Google Correlate - Draw
http://www.google.com/trends/correlate/search?e=id:UfmYZwmin...
In case you're wondering what's popular in winter: it's lots of diseases!
thetwiceler | 12 years ago | on: Normal vs. Fat-tailed Distributions
But the variance of your probability distribution of where you'll be at time t is linear in t. So say your variance is v(t) = t. Then at t = 1 there is about a 32% chance that you'll be outside the range (-1, 1) (one standard deviation). As you can see, as t increases, you can expect to drift further and further.
So while the expectation of x(t) may be 0 for all time, the expectation of |x(t)| scales like sqrt(t) (the standard deviation of the distribution).
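A quick simulation sketch of this (my own, with arbitrary sample sizes): a simple +/-1 random walk, whose variance at time t is exactly t, so the typical displacement |x(t)| grows like sqrt(t).

```python
import random

def walk(t, rng):
    # Sum of t independent +/-1 steps; each step has variance 1,
    # so the walk's variance at time t is t.
    return sum(rng.choice((-1, 1)) for _ in range(t))

rng = random.Random(0)
results = {}
for t in (100, 400):
    samples = [walk(t, rng) for _ in range(3000)]
    mean = sum(samples) / len(samples)
    results[t] = sum((x - mean) ** 2 for x in samples) / len(samples)
print(results)  # sample variance comes out close to t for each t
```

Quadrupling t quadruples the variance, i.e. only doubles the standard deviation — the sqrt(t) scaling.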
thetwiceler | 12 years ago | on: Papers and essays that every programmer should be aware of
Fortunately Haskell does this too with Data.Ratio!
thetwiceler | 12 years ago | on: Economics Is A Lost Field
However, we can't predict the weather, nor can we predict earthquakes. And as you correctly point out, the reason for this is chaos. It's not that we don't understand the underlying processes that govern these dynamics. Rather, the uncertainties in our initial conditions grow exponentially with time (due to chaos) and make long-term predictions unreliable.
thetwiceler | 12 years ago | on: Economics Is A Lost Field
A major recent criticism is of some economic models which have assumptions to the effect of "returns on this investment will be normally distributed." The criticism is that evidence seems to suggest that returns are not actually normally distributed; rather they have "fat tails," meaning that extreme events are more likely than with a normal distribution. And this error causes economic models to discount the role of these unlikely extreme events.
The Black-Scholes model effectively models stock prices as Brownian motion. This boils down to a single assumption: stock prices are the buildup of lots of very small INDEPENDENT random events (with FINITE variance) that occur over short time periods; this leads straight to Brownian motion. Brownian motion is indeed continuous; however, it is nowhere differentiable with probability 1. Because of the assumptions of finite variance and independence, the CLT tells us that returns will be normally distributed. So here, people pretty much agree that it's these assumptions - independence especially - that cause these models to predict poorly.
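The CLT step can be sketched numerically (my own illustration, with arbitrary parameters): build a "return" out of many small independent, finite-variance increments and check that it looks normal — in particular, that its excess kurtosis (the standard fat-tail measure) is near zero.

```python
import math
import random

rng = random.Random(1)

def aggregated_return(n=500):
    # n small independent uniform increments; Var(U(-1,1)) = 1/3,
    # so dividing by sqrt(n/3) rescales the sum to unit variance.
    return sum(rng.uniform(-1, 1) for _ in range(n)) / math.sqrt(n / 3)

samples = [aggregated_return() for _ in range(5000)]
m = sum(samples) / len(samples)
var = sum((x - m) ** 2 for x in samples) / len(samples)
# Excess kurtosis: 0 for a normal distribution, positive for fat tails.
excess_kurtosis = (sum((x - m) ** 4 for x in samples) / len(samples)) / var**2 - 3
print(round(excess_kurtosis, 2))  # near 0: thin, normal-like tails
```

Drop either assumption (independence, or finite variance of the increments) and this convergence to a normal distribution is exactly what fails — which is the fat-tails criticism.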
This is totally different from chaos theory. First of all, chaotic systems are not necessarily complex. Here's a chaotic map with a "nice little formula":
f : [0,1) -> [0,1)
f(x) = 2*x (mod 1)
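A few lines make the chaos visible (my own sketch, with an arbitrary starting point and gap): iterate two nearby initial conditions and watch their distance double each step — the map consumes about one bit of the initial condition per iteration.

```python
def f(x):
    # The doubling map on [0,1): shifts the binary expansion of x left
    # by one digit, exposing one new bit of the initial condition per step.
    return (2 * x) % 1.0

x, y = 0.3, 0.3 + 1e-10   # two starting points differing by 1e-10
steps = 0
while abs(x - y) < 0.1:
    x, y = f(x), f(y)
    steps += 1
print(steps)  # around 30, since 1e-10 doubled 30 times is roughly 0.1
```

After those ~30 doublings the two trajectories are macroscopically different, even though the starting points agreed to ten decimal places.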
While this is chaotic, we can understand much about this map. For example, each application produces an entropy of 1 bit (see entropy of dynamical systems...). So this is a map that "produces randomness" in a way. This is philosophically nice, because probability theory itself doesn't at all explain how randomness comes about.
thetwiceler | 12 years ago | on: The State of the Tau
Like if we have a function f(x,y), we often write the "total" derivative of f as df = (∂f/∂x) dx + (∂f/∂y) dy. And the rules for manipulating a differential like this are quite strange (e.g., the dx on top cannot cancel the dx on the bottom).
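As a concrete instance (my own example, not from the thread):

```latex
f(x,y) = x^2 y
\quad\Rightarrow\quad
df = \frac{\partial f}{\partial x}\,dx + \frac{\partial f}{\partial y}\,dy
   = 2xy\,dx + x^2\,dy
```

The dx and dy here are not small numbers you can cancel; making them rigorous is exactly what the machinery of differential forms is for.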
There's a language of differential forms that makes this notation a bit more rigorous, but I still think the notation is very misleading and confusing, especially for those who begin to learn calculus.
thetwiceler | 12 years ago | on: The State of the Tau
I like the explanation page for tau a lot, and I think that page is much more important than actually using tau in real mathematical works. It's a great explanation of how 2*pi is the more natural quantity, and how it comes into play in different functions. But it's just much easier to deal with factors of 2 than to explain alternate notation.
In fact, I think a lot of making mathematical works easy to understand is using commonly accepted notation...
thetwiceler | 12 years ago | on: 8th Grade Twins take Astrobiology and Einstein Courses on Coursera
In fact, I'm kind of hoping that a community emerges where people can get together and study any educational material themselves. Coursera still lacks upper-level classes; it would be great to use the aggregating power of the internet to find a group of people to study advanced material together.
thetwiceler | 12 years ago | on: The State of the Tau
The thing is, tau is really my go-to variable when I need a second constant to compare with t. It is already used as a time constant, and as a dummy integration variable that substitutes for t... The manifesto has a pre-made counter-argument to this, but I'm not sure it's convincing.
We like to write some variation of e^(-tau) a lot, and we also like to write e^(2 pi i) a lot. And often we combine these (rotation and exponential decay) to get e^(-tau + 2 pi i). That would ruin tau notation: it would become e^(-tau' + tau i)...
thetwiceler | 12 years ago | on: Intuition and Logic in Mathematics (1905)
But as Poincare mentions, we can also see that Euclid DID have much "intuition" in his works. These days we do not consider Euclid's Elements a rigorous work of logic, mainly because his definitions are not rigorous definitions; he says a point is "that which has no part." Hilbert remedied Euclidean geometry around the turn of the 20th century, with a work that leaves some objects undefined (points, lines, planes) and precisely defines the rest. He needs about twenty axioms, as opposed to Euclid's five postulates.
It's interesting to see where Euclid's logic breaks down. Look at his very first proof - the construction of an equilateral triangle. He constructs two circles and looks at their intersection. How do we know we can build the circles to intersect? He draws us a picture and it seems obvious :).
But in Hilbert's Euclidean geometry, we need what I think of as a really nasty axiom in order to ensure the circles intersect: "Axiom of completeness. To a system of points, straight lines, and planes, it is impossible to add other elements in such a manner that the system thus generalized shall form a new geometry obeying all of the five groups of axioms. In other words, the elements of geometry form a system which is not susceptible of extension, if we regard the five groups of axioms as valid."
thetwiceler | 12 years ago | on: Straight-edge and Compass Construction Kit
Ahh! Well you can't say I didn't try. Must have accidentally hit one of the wrong points somewhere in the construction...
thetwiceler | 12 years ago | on: Has the US become the type of nation from which you have to seek asylum?
Now, I'd argue the leak Snowden made was much more acceptable than if he had, say, revealed the identities of CIA spies to foreign countries and gotten them killed.
Nevertheless, we are a nation guided by the rule of law, and it would be unreasonable to expect that Snowden wouldn't be prosecuted. Just because many people consider what he did a good thing does not mean it wasn't illegal.
thetwiceler | 13 years ago | on: Bitcoin exchange rate reached $100 USD per BTC
I set up camp on the Golden Gate Bridge. I ask people if they'd be willing to throw their lunch into the water (use their computer's time/energy/money to mine Bitcoin), with no possibility of retrieving it, and in return I offer them a certificate stating that they've thrown their lunch away (a Bitcoin).
Now, I see no reason why anyone would expect these certificates to have any value. But let's suppose they do, and people can trade their certificates, say, for a lunch (or perhaps half a lunch). NOW, suppose I have been secretly stashing away those lunches (really, it doesn't need to be in secret - you don't care what happens to your lunch after you throw it away). Then there is now objectively more value in the world than there was before I set up shop: there is the same number of lunches in the world, but there are also certificates that have some positive value. We've made a free lunch!
This is a contradiction. If we could create value with this silly game, we could easily make as much value as we want, and we would have solved all the world's problems.
But this should be obvious in the first place! Why on earth would you expect to be rewarded with something of value (a Bitcoin) for doing something fundamentally useless to society (mining Bitcoin)? Even though mining Bitcoin comes at a cost to you, this doesn't matter - it just means you're throwing your lunch away.
Now why does the US Dollar have value? Instead of throwing away our lunches for a certificate, we basically asked the government to store our lunches in Fort Knox. Today, lots of gold and other items of value are held there. Why are they held there? There is an implicit assumption that these items of value support the US Dollar. The government knows it could not start "eating those lunches" that it's got in its reserve! It would not yield society a free lunch - it would crash the US Dollar.