> In mathematics and theoretical computer science, we read research papers primarily to find research questions to work on, or find techniques we can use to prove new theorems.
This is why figuring out an elegant, concise, and powerful set of mathematical models which apply to multiple domains, and then devoting effort to simplifying, organizing, and explaining those ideas in an accessible way is so important.
Incentives for researchers mostly reward pushing and prodding at the boundaries of a field, but in my opinion mathematical ideas are of only marginal value in themselves; more important is the way they help us understand and interact with the physical universe. For that, building communities, developing effective languages and notations, codifying our understanding, and making it accessible both to newcomers and to outsiders is the most important task for a field, and perhaps for our society generally.
Just like with software projects or companies, the most “success” comes from helping a range of other people solve their problems and extend their abilities, not from making technically beautiful art projects for their own sake (not that there’s anything inherently wrong with those).
Perhaps more generally, while theorem proving has overwhelmingly dominated pure mathematics and related fields for the past 80–100 years, and has been an important tool since Euclid, theorem proving is only one way of approaching the world, and in my opinion is a mere tool, not an end in itself, just as simulation is a tool, or drawing pictures is a tool, or statistical analysis is a tool.
I like this bit from Feynman: https://www.youtube.com/watch?v=YaUlqXRPMmY
But sometimes the connection between "beautiful art projects" and "practical tools" is totally unexpected. We often invest time in projects that seem like nothing more than "beautiful art", and then much later stumble upon something practical.
I think a great example of this is cryptography. Its foundations come from number theory (prime numbers, modular arithmetic, elliptic curves), but number theory, before the advent of computing, was possibly the most useless kind of mathematical 'art' that could have existed. I imagine it was the mathematical equivalent of frolicking in the fields.
Mathematicians explored Fermat's little theorem starting in 1640, but they didn't do it because they knew it'd be useful several hundred years later in RSA. They did it simply because math is worth exploring in itself.
Even if you don't subscribe to the idea that we should pursue math for math's sake, history shows us that it's very difficult to know what parts of math will be useful to humanity, especially hundreds of years later. Since people work best on what they find interesting, mathematicians should continue exploring the topics that most interest them, because we really can't say with any certainty what will prove useful (or even essential) to future generations.
> This is why figuring out an elegant, concise, and powerful set of mathematical models which apply to multiple domains, and then devoting effort to simplifying, organizing, and explaining those ideas in an accessible way is so important.
This reminds me of the von Neumann quote about the importance of mathematics having an 'empirical source':
—
I think that it is a relatively good approximation to truth—which is much too complicated to allow anything but approximations—that mathematical ideas originate in empirics, although the genealogy is sometimes long and obscure. But, once they are so conceived, the subject begins to live a peculiar life of its own and is better compared to a creative one, governed by almost entirely aesthetical motivations, than to anything else and, in particular, to an empirical science. There is, however, a further point which, I believe, needs stressing. As a mathematical discipline travels far from its empirical source, or still more, if it is a second and third generation only indirectly inspired by ideas coming from "reality," it is beset with very grave dangers. It becomes more and more purely aestheticizing, more and more purely l'art pour l'art. This need not be bad, if the field is surrounded by correlated subjects, which still have closer empirical connections, or if the discipline is under the influence of men with an exceptionally well-developed taste. But there is a grave danger that the subject will develop along the line of least resistance, that the stream, so far from its source, will separate into a multitude of insignificant branches, and that the discipline will become a disorganized mass of details and complexities. In other words, at a great distance from its empirical source, or after much "abstract" inbreeding, a mathematical subject is in danger of degeneration. At the inception the style is usually classical; when it shows signs of becoming baroque, then the danger signal is up. It would be easy to give examples, to trace specific evolutions into the baroque and the very high baroque, but this, again, would be too technical.
In any event, whenever this stage is reached, the only remedy seems to me to be the rejuvenating return to the source: the re-injection of more or less directly empirical ideas. I am convinced that this was a necessary condition to conserve the freshness and the vitality of the subject and that this will remain equally true in the future.
The real shame is that academia funds a lot of research, but not a lot of scientific "scholarship". We should have academics dedicated to high-quality exposition of advanced material. You don't see much of that beyond the never-ending rewrites of basic calc and algebra textbooks.
This is a very good and thought-provoking essay for a short blog post, and I have already shared it in a Facebook community heavily populated by professional mathematicians (where the moderator, with a Ph.D. in math from Berkeley, has given it a thumbs up). Thanks for sharing.
I really like the overall point of the post that mathematics once known can be forgotten or neglected, and mathematics written up for mathematics journals can be difficult to understand. Professor John Stillwell writes, in the preface to his book Numbers and Geometry (New York: Springer-Verlag, 1998):
"What should every aspiring mathematician know? The answer for most of the 20th century has been: calculus. . . . Mathematics today is . . . much more than calculus; and the calculus now taught is, sadly, much less than it used to be. Little by little, calculus has been deprived of the algebra, geometry, and logic it needs to sustain it, until many institutions have had to put it on high-tech life-support systems. A subject struggling to survive is hardly a good introduction to the vigor of real mathematics.
". . . . In the current situation, we need to revive not only calculus, but also algebra, geometry, and the whole idea that mathematics is a rigorous, cumulative discipline in which each mathematician stands on the shoulders of giants.
"The best way to teach real mathematics, I believe, is to start deeper down, with the elementary ideas of number and space. Everyone concedes that these are fundamental, but they have been scandalously neglected, perhaps in the naive belief that anyone learning calculus has outgrown them. In fact, arithmetic, algebra, and geometry can never be outgrown, and the most rewarding path to higher mathematics sustains their development alongside the 'advanced' branches such as calculus. Also, by maintaining ties between these disciplines, it is possible to present a more unified view of mathematics, yet at the same time to include more spice and variety."
Stillwell demonstrates what he means about the interconnectedness and depth of "elementary" topics in the rest of his book, which is a delight to read and full of thought-provoking problems.
http://www.amazon.com/gp/product/0387982892/
I think this is an interesting perspective to think about. I certainly remember a long time when I felt this way—that calculus was the entire reason for learning mathematics and the rest was a little silly or outdated. It took a long time for me to catch on to why the simple, silly stuff is where all of the fun is.
Today, calculus feels boring and dead to me. Obviously useful, but a mere tool instead of something greater. I spend my time thinking about things like topology where I work really hard to think about what it means for things to be close to one another and nothing more.
A younger me would not have understood. Which is a little scary.
> Everyone concedes that these are fundamental, but they have been scandalously neglected, perhaps in the naive belief that anyone learning calculus has outgrown them. In fact, arithmetic, algebra, and geometry can never be outgrown, and the most rewarding path to higher mathematics sustains their development alongside the 'advanced' branches such as calculus.
I doubt "outgrowing certain branches of math" is the reason math as taught to non-math majors is the castrated mess it is. It's probably the market forces that reject real analysis, abstract algebra, or anything at that level or higher. It's the same reason "some programming language du jour > fundamentals of CS, IRL".
>"The best way to teach real mathematics, I believe, is to start deeper down, with the elementary ideas of number and space. Everyone concedes that these are fundamental, but they have been scandalously neglected, perhaps in the naive belief that anyone learning calculus has outgrown them. In fact, arithmetic, algebra, and geometry can never be outgrown, and the most rewarding path to higher mathematics sustains their development alongside the 'advanced' branches such as calculus. Also, by maintaining ties between these disciplines, it is possible to present a more unified view of mathematics, yet at the same time to include more spice and variety."
While I do agree, we have to remember why most math classes actually exist: to teach calculus to physicists and engineers, and, as my stepfather's undergraduate advisor once said, "to keep the children from running in the halls".
(For the mathematician's extremely self-centered view of "children" as "anyone who has yet to ace two semesters of real analysis".)
I've been starting into real analysis myself via Pugh's textbook[1] after not taking a serious math class since multivariable calculus, and found that, once I get past the applied stuff, I really like the approach of building up calculus from its foundations in real numbers (taken as Dedekind cuts), limits (Cauchy-convergent sequences), the set-theoretic construction of functions, and the construction of topological and metric spaces "from scratch". But I can tell that I like it because, deep down, I have the mind of a theoretical computer scientist (which is what I like to be when I'm not writing firmware), which is a kind of mathematician. I appreciate that someone has to teach the applied classes to the people who aren't going to kvetch about "how can I trust that works!?" and who demand to just get their math over with as quickly as possible.
[1] -- http://www.amazon.com/Mathematical-Analysis-Undergraduate-Te...
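As a small illustration of the "reals as limits of rational Cauchy sequences" viewpoint (my own sketch, not from Pugh's book): Newton's iteration for √2 stays entirely inside the rationals, yet its terms crowd arbitrarily close together. That is exactly the Cauchy property that exposes a "gap" in ℚ where the real number √2 has to live.

```python
from fractions import Fraction

# Newton's iteration x_{k+1} = (x_k + 2/x_k) / 2 converges to sqrt(2),
# but every term is an exact rational number.
x = Fraction(3, 2)
terms = [x]
for _ in range(4):
    x = (x + 2 / x) / 2
    terms.append(x)

# Consecutive terms get closer and closer (the Cauchy property) ...
gaps = [abs(b - a) for a, b in zip(terms, terms[1:])]
assert all(later < earlier for earlier, later in zip(gaps, gaps[1:]))

# ... yet no rational term ever squares exactly to 2.
assert all(t * t != 2 for t in terms)
assert abs(terms[-1] ** 2 - 2) < Fraction(1, 10**20)
```

The sequence converges quadratically, so after only four steps the square of the last term is within 10⁻²⁰ of 2, while remaining a ratio of two integers.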
But the proofs survive because they are proofs; if they don't communicate the proof of the result then they have failed and should not be accepted by journals. At the extreme end, machine-checkable proofs are in standard, documented formats; an alien reading them in ten thousand years should still be able to understand what's going on, at least if they understand the notation and the axioms.
And code does what code does. Given some binary executable, an alien reading it far in the future should be able to understand what's going on, at least if they understand the architecture. :P
I've written a few machine-checked proofs, and there are really two ways that I've seen: either writing the proof for the next human to read, or writing just enough that the checker accepts it. The latter makes free use of tactics like `crush`, which brute-force solutions out of the current assumptions, exploring the search space automatically. That's really convenient, but it can make reading the proof very un-enlightening.
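(`crush` comes from Coq's CPDT library; for flavor, here is the same contrast sketched in Lean 4, assuming a recent toolchain where the `omega` tactic is available. The first proof is the "checker accepts it" style; the second spells out each step for a human reader.)

```lean
-- Automation style: one opaque tactic call, quick to write, nothing to read.
theorem flip₁ (a b c : Nat) : a + b + c = c + b + a := by omega

-- Human-readable style: every step is named, at the cost of length.
theorem flip₂ (a b c : Nat) : a + b + c = c + b + a :=
  calc a + b + c = a + (b + c) := Nat.add_assoc a b c
    _ = (b + c) + a := Nat.add_comm a (b + c)
    _ = (c + b) + a := by rw [Nat.add_comm b c]
```

Both are accepted by the checker; only the second tells the reader *why* the statement holds.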
Ludwik Fleck's "Genesis and Development of a Scientific Fact" goes very much in the line of "[science] only exists in a living community of mathematicians that spreads understanding and breathes life into ideas both old and new." (Written pre-WW2, it served as an inspiration for Kuhn.) Its most eye-opening example is the history of [the concept/knowledge/science/... of] syphilis, from ancient to modern times.
PDF (of print from 1979): http://www.evolocus.com/Textbooks/Fleck1979.pdf
Integrate concise and effective explanations into the relevant Wikipedia articles and you at least give future generations a good head start on understanding these things.
Most of the Wikipedia articles on technical subjects, and especially on mathematical topics, are terrible as introductory exposition. They are jargony, highly technical, and self-referential. They usually contain much that is irrelevant, and they almost never properly explain the context for an idea.
The main problem is that Wikipedia articles are tiny and atomic, so it’s difficult to synthesize and organize ideas into a coherent story. The culture of Wikipedia frowns on the kind of exposition found in textbooks or lectures. And perhaps most importantly, no one is responsible for either individual articles or sets of related articles in a field. Working within those confines is not the best way to spend your time if the goal is to give future generations a leg up, in my opinion.
If you want to learn about mathematics, even a mediocre textbook is nearly always better than the relevant Wikipedia pages. The Wikipedia pages are then useful later, as a reference, for people who already understand their content.
Wikipedia is a horrible way to learn math. At most it works as a way to get initial pointers to the literature. In most specialized fields, don't expect Wikipedia to provide any understanding of mathematics beyond a summary of formulas, without much explanation of what they are.
Of course. Most modern mathematicians aren't fluent with half the material in (the ~100 year-old text) Whittaker and Watson "A Course of Modern Analysis". This was standard material even 60 years ago. You can get a PhD in mathematics today without once seeing an elliptic function, because computers are good enough at numerically solving the problems they were once used to solve symbolically.
How many people know how to multiply two numbers expressed in Roman numeral format without reference to an algorism (not a typo!) or other methods based on Hindu-Arabic numerals?
How many people are fast at computing fifth roots without recourse to computational tools such as Hindu-Arabic numerals?
Are those things math or arithmetic?
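(For anyone curious, the classic stage trick for fifth roots leans on the fact that x⁵ always ends in the same digit as x. A sketch in Python, with a helper name of my own invention:)

```python
def fifth_root_trick(n: int) -> int:
    """Recover the two-digit x with x**5 == n, the way mental calculators do.

    Works because x**5 ≡ x (mod 10), so n reveals x's last digit directly,
    and the rough magnitude of n pins down the tens digit.
    """
    last = n % 10
    tens = max(t for t in range(1, 10) if (10 * t) ** 5 <= n)
    return 10 * tens + last

assert fifth_root_trick(69 ** 5) == 69
```

A performer only needs to memorize the nine values 10⁵, 20⁵, ..., 90⁵ to place the tens digit by ear.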
On the other hand, most modern mathematicians know stuff that would blow Whittaker and Watson's socks off.
Much mathematics gets obsoleted. For example, there was a lot of incredibly difficult mathematics for finding areas under curves, all of which was completely obsoleted by the discovery of the fundamental theorem of calculus. Nothing of value was lost.
Individual pieces of mathematics come and go from general awareness, but the overall trend is definitely one of increasing, not decreasing, understanding.
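(A concrete sense of what got obsoleted, sketched in Python with made-up names: the "hard way" grinds out the area as a Riemann sum, while the fundamental theorem reduces it to evaluating an antiderivative at the endpoints.)

```python
def riemann_area(f, a, b, n=100_000):
    """Approximate the area under f on [a, b] with n midpoint rectangles."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# Area under x^2 on [0, 1]: exhaustion-style summation vs. the FTC,
# which just evaluates the antiderivative x^3/3 at the endpoints.
numeric = riemann_area(lambda x: x * x, 0.0, 1.0)
analytic = 1.0 ** 3 / 3 - 0.0 ** 3 / 3
assert abs(numeric - analytic) < 1e-9
```

The summation takes a hundred thousand evaluations to get nine digits; the antiderivative gets the exact answer in one step, which is the whole point.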
Whatever the case for mathematics, this is certainly true of computing. Well-understood ideas are continually being reinvented, frequently badly. New programming languages and frameworks spring up like mushrooms, and everyone wants to jump on board the next big thing.
Nowhere is it more true that those who don't know the past are condemned to repeat it.
I read the SA article the blog refers to, and I couldn't decide whether that particular colossal theory on symmetry was an isolated incident or whether "deterioration" is really happening to many disciplines/theories of math. It is certainly an obvious fact that things become popular, are eventually forgotten, and are then sometimes brought back. There are also different levels of understanding: breadth vs. depth. I recall at one point there was concern about the opposite, that is, too much depth and not enough breadth (the above theory is a depth problem, as many mathematicians know of the theorem, just not the exact proof).
I still think the unpublished-results problem, i.e. "publication bias", is a bigger issue, which I suppose is in a similar vein. Supposedly Google was working on that.
The theorem the SA article is talking about - CFSG, the Classification of Finite Simple Groups - is somewhat special in that respect. Lots of things in math fall out of fashion and get forgotten, often whole subfields. CFSG is different because the theorem itself is so basic and important that it's not likely to be forgotten in any foreseeable future. But its proof is so long and complicated that it's not even clear that there's one person who understands all of it, and the heap of details is not organised well enough for someone to just study it from books/articles without the help of the people who lived through proving it back in the '70s.
Suppose there just isn't enough interest in the younger generation of mathematicians to study the proof, even if the old guard are able to organize it better before they retire. Then we may reach a situation in which CFSG will still be used as a proved theorem and not a conjecture - because it's so powerful and important in many fields of math - but its proof will be lost to collective memory. I'm not sure, but I think that state of affairs might be without precedent.
(Here's a quote from Gian-Carlo Rota's _Indiscrete Thoughts_ on forgotten and rediscovered math:
"The history of mathematics is replete with injustice. There is a tendency to exhibit toward the past a forgetful, oversimplifying, hero-worshiping attitude that we have come to identify with mass behavior. Great advances in science are pinned on a few extraordinary white-maned individuals. [...]
One consequence of this sociological law is that whenever a forgotten branch of mathematics comes back into fashion after a period of neglect only the main outlines of the theory are remembered, those you would find in the works of the Great Men. The bulk of the theory is likely to be rediscovered from scratch by smart young mathematicians who have realized that their future careers depend on publishing research papers rather than on rummaging through dusty old journals.
In all mathematics, it would be hard to find a more blatant instance of this regrettable state of affairs than the theory of symmetric functions. Each generation rediscovers them and presents them in the latest jargon. Today it is K-theory, yesterday it was categories and functors, and the day before, group representations. Behind these and several other attractive theories stands one immutable source: the ordinary, crude definition of the symmetric functions and the identities they satisfy.")
Not sure why the downvotes; it was an interesting course: a lot of tricks to show isomorphisms / reasons why certain groups couldn't exist. It just went up to order 1000, which was a lot of cases (30 lectures + exercises worth).
http://arxiv.org/abs/math/9404236
Your comment could very well be its abstract.
Will Myron Aub give us the feeling of power back?
http://downlode.org/Etext/power.html by Isaac Asimov on just this topic.
Joy.... "Like having your brains smashed out by a slice of lemon wrapped around a large gold brick."