This seems to echo what Richard Hamming says in "You and Your Research". He recounts how his fellow scientists were working on unimportant problems in their field. The judgement of unimportance came not from Hamming but from the scientists themselves, when he asked them, "What's the most important problem in your field right now?" His follow-up question was, "How come you're not working on it?" He says that didn't earn him many friends.
By "important", I had also mistakenly thought that he meant "grandiose". But in fact, he later defines "important problems" as "problems that you have a reasonable angle of attack to solve".
Some problems just aren't ripe yet, until all the pieces needed to solve them come together. That makes "Why now?" a very good question to answer when contemplating which problem is important.
When someone writes about how Silicon Valley is working on worthless problems, like photo-sharing apps, the Yo app, or Twitter (or, back when Groupon was on the rise, the glut of daily-deal sites), it makes me think of this. I wonder if they take into account that some problems they consider worthwhile simply aren't ripe to be solved yet.
I think it's useful to distinguish between kinds of problems. The problem of understanding friction between two polished surfaces is a different category of problem from the one solved by Yo or Groupon. The payoff of the former is the satisfaction of curiosity and, arguably, an irreversible advancement of human knowledge. The payoff of the latter is, at best, a small improvement in utility for a small percentage of humans over a relatively short span of time.
The dirty little secret of the valley is that (most) people don't care about this distinction. There are plenty of people made rich by AOL buyouts and MySpace acquisitions - and they are just as rich even though their creations didn't last. There is a tacit acceptance of the truth that technology is fashion, not science.
Why is this distinction important to your point? Because "ripe to be solved" takes on a very different character in the fashion-technology sense and the science-technology sense. Indeed, I'm not sure it can even be applied to fashion technology like "Yo", since the elements of success are entirely self-referential, as with all fashionable things. Or do you think "Yo" was really just waiting for TCP/IP, Objective-C, the iPhone, etc. before it could be "realized"?
It's why I started working on the D programming language. I figured I only had so many productive years left, and what was I going to spend them on? I wanted to do something that mattered.
This note might deserve a little more emphasis. It's routine in history that problems are solved only once the tools become available, which is supported by a stunning list of coinciding discoveries:
http://en.wikipedia.org/wiki/List_of_multiple_discoveries
Fermat's Last Theorem was an open problem for more than 350 years, but it was solved (1993) within two decades of the introduction of the Frey curve (1975).
What this stresses is the importance of finding your own problem, that is, choosing your own path instead of waiting to be assigned a problem and then becoming unhappy. That is what Feynman calls out as a 'mistake' on his part: not letting, or even not DEMANDING, that his student choose his own problem.
Ultimately, it's the advice to change your frame of reference from 'the sum of human knowledge' to 'what problem can I solve today, immediately?' And to ask yourself that question first and find an answer.
That's not only smart but about the wisest thing I've ever heard someone say about work.
There is also the component of being sad and not knowing why, while you work on what you think is interesting and important.
I remember reading how Feynman was unhappy with his work and then chose a seemingly useless but fun and interesting problem to solve - the physics of plate wobbling (If I remember correctly, the relation between the wobble rate and spin rate of a plate that has been tossed into the air).
His colleagues were a bit confused as to why he would do this, but he had fun and his love for physics was rekindled.
Some years later, the mathematics he derived would be used when the first satellites were launched and wobbled as they spun, the wobbling not being desirable. Not bad for useless and fun work!
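For reference, the small-angle result the commenter is recalling can be sketched from rigid-body dynamics. This is the textbook free-precession calculation, assuming a thin uniform disk tossed with a small tilt, not Feynman's own derivation:

```latex
% Thin uniform disk: perpendicular-axis theorem gives I_3 = 2 I_1.
% Torque-free precession of the symmetry axis about the fixed angular
% momentum vector, at tilt angle \theta with spin rate \omega_3:
\dot{\phi} \;=\; \frac{I_3\,\omega_3}{I_1 \cos\theta}
        \;\approx\; 2\,\omega_3 \qquad (\theta \ll 1)
```

In other words, for a nearly flat toss the wobble rate is roughly twice the spin rate.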
http://johnen.shinshu-u.ac.jp/~mano/profile_e.html
The letter is from an excellent collection of Feynman letters, "Perfectly Reasonable Deviations from the Beaten Track". There's a thoughtful review of the entire collection (by Freeman Dyson) here:
http://www.nybooks.com/articles/archives/2005/oct/20/wise-ma...
I asked somebody who can find out what Dr. Mano did with the advice: my Dad, who still communicates with some of Tomonaga's students. But he said he would have to ask around.
"It seems that the influence of your teacher has been to give you a false idea of what are worthwhile problems. The worthwhile problems are the ones you can really solve or help solve, the ones you can really contribute something to. A problem is grand in science if it lies before us unsolved and we see some way for us to make some headway into it."
That is very good advice. You could replace "teacher" with "corporate manager" and "science" with "society", and it becomes relevant to everyone working at a normal corporate job or a start-up.
NW (the interviewer): They say the most creative and challenging part of research is finding the right question to ask. Do you agree with that?
LA (Leonard Adleman): I wouldn't characterize it as the most challenging thing, but it's of critical importance. Sometimes it's not hard to find the "right question". For example, the mathematics literature is full of very important unanswered questions. In this case, the real issue is: Has that question's time come? Have we reached a point where developments in the appropriate area of science give us some chance of breaking the problem? For example, I worked on a famous centuries-old math problem called Fermat's Last Theorem. I was not 'strong' enough to solve it, but I find some solace in the fact that my intuition that its 'time had come' was right. The problem was finally solved two years ago by Andrew Wiles of Princeton. It was one of the major events in the history of mathematics.
The other side is to generate new questions. That's a funny process. The way I seek to generate new questions is to start to look at whole new fields, like biology, immunology or physics. Since I come from a different field, mathematics, I bring an unusual point-of-view that sometimes allows me to generate questions different from the classical questions in those areas. Like the question of DNA computing.
For the young scientist, this question of choosing the right question to spend your valuable, limited intellectual resources on is critical. I often sit for months and do no productive work that anybody can see, because I don't feel I have a good enough question to work on. Rather than take on some lesser question, I would prefer to read a mystery novel. The point is, sometimes it's important to lie fallow for a time, waiting for the 'right question' to appear, rather than to engage in uninspiring work and miss the important opportunity when it comes.
But in the end, the real challenge of science is to make progress - to succeed, to contribute knowledge.
NW: Of course, in an academic setting, there's that drive to publish or perish...
LA: Yes, that's a problem, because you have to feed your family. But I always tell my students and junior faculty that they are better off following their inspiration and their hearts in what research they do, that they should always try to take on the most interesting and important problems, that they should not waste their time on little problems just to make another line on a vitae.
My philosophy is that it's important, in a curious way, for scientists to be courageous. Not physically courageous, but courageous in an intellectual way. I believe that by working on extremely hard problems, by being courageous, you may succeed. But even if you fail, you fail gloriously. And you will have learned immense amounts, you will have extended the envelope of what you can do. As a byproduct of failing on a great problem, I have always found that I could solve some lesser but still interesting problems - which then fill your vitae.
Beyond the great advice, I'm struck by the supreme kindness and humanity on display in this letter--it is itself great advice on how to treat others, all the more poignant coming from a man with many other wonderful opportunities competing for his time and attention.
> A problem is grand in science if it lies before us unsolved and we see some way for us to make some headway into it. I would advise you to take even simpler, or as you say, humbler, problems until you find some you can really solve easily, no matter how trivial.
Something that I think most novice programmers should take to heart, and something I wish I had known earlier...when you start out, you want to build something big and new, like a video game, or hell, a Rails site that you think will be the next Facebook clone. Not only is it beyond your ability as a novice, it may not even be a "problem" worth solving, because you don't yet know what's worth solving until you become a bit better at programming. I stopped programming for a while when I couldn't come close to reaching what I thought were my goals...it's been much easier to do it day-to-day by focusing on the small steps...and after a while, the big task doesn't seem so hard after all.
Meanwhile, programming has a pretty distinct advantage...even if you spend your time mastering seemingly benign and trivial things, such as being better at parsing, function design, or just automation of what you've done before, you're not only learning, but making yourself more productive at the same time...something that's not nothing as you actually begin your grand plan.
Speaking of Feynman and computing and seemingly banal tasks...I've seen only scarce detail about his supervising the "computers" at Los Alamos:
http://longnow.org/essays/richard-feynman-connection-machine...
> Richard's interest in computing went back to his days at Los Alamos, where he supervised the "computers," that is, the people who operated the mechanical calculators. There he was instrumental in setting up some of the first plug-programmable tabulating machines for physical simulation. His interest in the field was heightened in the late 1970's when his son, Carl, began studying computers at MIT.
It's not something he's famous for, but I wouldn't be surprised if such a task was critical to the success of the researchers...I've gone through both his memoirs and hadn't seen much mention of it though. Anyone else have more details?
Feynman did some interesting work on computers - he used his fine physics-trained calculus skills to analyze a chip and message passing, a discrete system:
http://longnow.org/essays/richard-feynman-connection-machine...
> By the end of that summer of 1983, Richard had completed his analysis of the behavior of the router, and much to our surprise and amusement, he presented his answer in the form of a set of partial differential equations. To a physicist this may seem natural, but to a computer designer, treating a set of boolean circuits as a continuous, differentiable system is a bit strange. Feynman's router equations were in terms of variables representing continuous quantities such as "the average number of 1 bits in a message address." I was much more accustomed to seeing analysis in terms of inductive proof and case analysis than taking the derivative of "the number of 1's" with respect to time. Our discrete analysis said we needed seven buffers per chip; Feynman's equations suggested that we only needed five. We decided to play it safe and ignore Feynman.
> The decision to ignore Feynman's analysis was made in September, but by next spring we were up against a wall. The chips that we had designed were slightly too big to manufacture and the only way to solve the problem was to cut the number of buffers per chip back to five. Since Feynman's equations claimed we could do this safely, his unconventional methods of analysis started looking better and better to us. We decided to go ahead and make the chips with the smaller number of buffers.
> Fortunately, he was right. When we put together the chips the machine worked. The first program run on the machine in April of 1985 was Conway's game of Life.
He goes into detail about the topic in the chapter "Los Alamos from Below" in Surely You're Joking, Mr. Feynman! The rest of the book is a fantastic read as well. He talks about many interesting experiments he did (even ones involving ants), pranks he pulled with safes, how to pick up women, etc.
Well, I personally agree, but I don't think everything in this matter is as objective as Feynman makes it seem, judging by the tone of his letter. I can definitely imagine the opposite opinions: "you need to listen to more experienced teachers to know which problems to solve and which not to", or "you need to think big", and so on...
"No problem is too small or too trivial if we can really do something about it."
This part tells us to keep solving whatever comes our way. In today's world, most successful start-ups don't succeed with their first idea but with an iteration of it.
"Dark pictures, thrones, the stones that pilgrims kiss,
poems that take a thousand years to die
but ape the immortality of this
red label on a little butterfly."
-- Vladimir Nabokov
In Surely You're Joking, Mr. Feynman! there is a whole chapter dedicated to his work at an electroplating company.