(I wrote this piece on Richard Feynman over
two decades ago. Since then, the Internet has ‘changed everything,’ from the way
we interact with one another to the way we shop to the way some of us live in
our smartphones and much more. What has not changed is our fascination with
Feynman. Who was this ‘best brain since Einstein’ who took pleasure in acting
the buffoon but was as fluent in deciphering nature’s codes as he was in
picking locks? Books and articles on Feynman show up almost every year. The
latest (as of this writing) is "The Quantum Labyrinth" by Paul
Halpern, published in October 2017. It most likely won’t be the last, because
from the perspective of publishers, Feynman sells! I have expanded the original
article a bit in the hope that a new set of readers will get some pleasure out
of reading it. We Feynman fans are determined never to fade away! – Hasan
Zillur Rahim, San Jose, California, December 2017.)
Bill Gates respects and admires "individuals who achieve something inspirational or who possess extraordinary character." Of these, one name comes up more often than others: the late great Nobel prize-winning physicist Richard Feynman.
Gates had planned on meeting with Feynman
in 1988 but didn't get a chance. Feynman died of cancer in February of that
year. "It’s an opportunity I’m sorry I missed," wrote Gates in
the New York Times in 1995. "His book, Surely
You’re Joking, Mr. Feynman, is a favorite of mine."
Feynman was a hero because, as Gates put
it, "he was incredibly inspirational. He was an independent thinker and
gifted teacher who pushed himself to understand new things. I have enjoyed
everything I’ve read about him and by him. I admired him deeply…"
In 1964, Feynman gave a series of lectures
at Cornell University, the Messenger Lectures, under the title "The Character of Physical Law."
Topics ranged from symmetry, probability and uncertainty in physical laws to
techniques by which physicists seek new laws.
The lectures were recognized for
their extraordinary quality. "I have videotapes of physics lectures
Feynman gave at Cornell decades ago," said Gates. "They are the best
lectures I’ve seen on any subject. He shared his enthusiasm and clarity
energetically and persuasively."
During an interview with CIO magazine in
September 1997, Gates was asked: "Who would you invite to a dinner
party?" Feynman was on the list, along with Einstein and Leonardo da
Vinci. As recently as June 2017, Gates said in an interview with TIME magazine that "One
person I'm sorry I never got to meet is the physicist Richard Feynman. He had a
brilliant mind and was a phenomenal teacher."
On July 14, 2009, Microsoft Research, in
collaboration with Gates, launched a Web site called Project Tuva that makes The Messenger Lectures
freely available to the public for the first time. Gates purchased the rights
to the seven lectures in the series to help kids get excited about physics and
science. “I think someone who can make science interesting is magical. And the
person who did that better than anybody was Richard Feynman. He took the
mystery of science, the importance of science, the strangeness of science, and
made it fun and interesting and approachable,” said Gates.
At the time of his death, Feynman had
become everyone’s favorite physicist, thanks to the popularity of Surely
You’re Joking, Mr. Feynman and What Do You Care
What Other People Think? With these books, transcribed by his friend and
drumming partner Ralph Leighton from taped conversations over several years,
Feynman captured the public imagination as no other physicist had before him,
with the possible exceptions of Albert Einstein and Enrico Fermi.
The Feynman Lectures on Physics, a set of lectures Feynman gave to undergraduates at
Caltech in 1962-63, is now a classic. (For a description of how the lectures
came about, see the definitive article by Feynman's colleague Matthew Sands in Physics
Today, April 2005.) The lectures are now available online for free on
the Caltech website.
(Feynman fans are all over the world! In the mid ‘60s,
I was a student in the physics department at Dhaka University, Bangladesh, when
I came across the Lectures at a book
fair on campus. I was instantly hooked. After a week or so, I gathered enough
courage to write a letter to Feynman expressing my admiration for his unique
perspective on physics. I also lamented that in our country (Pakistan at the
time), there were no world-class physicists due to lack of quality scientific teaching
and training. I never expected a reply but to my surprise, Feynman responded,
his generous spirit evident in what he wrote, a quality not shared by his
envious and narrow-minded Caltech colleague Murray Gell-Mann. “You do your
country an injustice,” Feynman wrote. “Abdus Salam is one of the finest
theoretical physicists I know. Every time I talk with him, I learn something
new.” Note: Abdus Salam won the Nobel Prize for physics in 1979.)
Feynman’s fame grew when he was appointed
to the Rogers commission in 1986 to investigate the Challenger shuttle
explosion. His dramatic demonstration on TV of the loss of resiliency in O-rings
at freezing temperatures as a principal cause of the Challenger accident made
him a national celebrity.
In applauding his performance, the physicist Freeman
Dyson said: "The public saw with their own eyes how science is done, how a
great scientist thinks with his hands, how nature gives a clear answer when a
scientist asks a clear question."
Since his passing in 1988, Feynman lore
has continued to grow. Several books have been published,
including Genius: The Life and Science of Richard Feynman (James
Gleick, 1992), Most of the Good Stuff: Memories of Richard
Feynman (American Institute of Physics, 1993), No Ordinary
Genius (Christopher Sykes, 1994), The Beat of a Different Drum (Jagdish
Mehra, 1994), Feynman’s Lost Lecture: The Motion of Planets Around the
Sun (W.W. Norton, 1996), The Meaning of It All (Helix
Books, 1998), The Pleasure of Finding Things Out (Perseus
Books, 1999), Feynman's Rainbow (Leonard Mlodinow, 2003), Perfectly
Reasonable Deviations from the Beaten Track: The Letters of Richard
Feynman (Edited by Michelle Feynman, Basic Books, 2005), Quantum
Man (Lawrence M. Krauss, 2011), and The Quantum
Labyrinth (Paul Halpern, Basic Books, 2017).
Reminiscences by colleagues also appear
from time to time in physics journals, such as "Capturing the wisdom of
Feynman" by Matthew Sands (Physics Today, April 2005) and "Memories
of Feynman" by Theodore A. Welton (Physics Today, February 2007). A
fascinating article on how Feynman approached the subject of piano tuning
("Stiff-string theory: Richard Feynman on piano tuning" by John C.
Bryner) appeared in the December 2009 issue of Physics Today. Two articles in
the May 11, 2017, issue of Physics Today, “A Look Inside Feynman’s calculus
notebook” by Melinda Baldwin and “The doctoral students of Richard Feynman” by
T.S. Van Kortryk shed new light on the physicist’s early development and his
later years.
Feynman even made it onto billboards!
When Apple Computer began its "Think Different" series of ads
featuring great scientists, artists, humanitarians and the like (the brainchild of
the late Steve Jobs), the company chose Einstein and Gandhi among its first
examples of the uncommon rewards awaiting those who dared to follow the beat of
a different drum. "Can Feynman be far behind?" I wondered.
In November '98, I was driving in San
Francisco's Mission District when I suddenly saw that familiar face with the
knowing grin inviting commuters to ponder the mysteries of ... what? The
photograph showed Feynman wearing the corporate T-shirt of Thinking Machines, a
Boston-based company where he had briefly worked as a consultant in 1983. The
shirt bore a schematic representation of the Connection Machine - a cube of cubes -
that he helped design for Thinking Machines. (It's the same photograph on the
cover of What Do You Care What Other People Think?)
Then, in April '99, Feynman "came" to Silicon Valley where I live.
Anyone driving along Highway 101 in the South Bay could "see" Feynman
teaching quantum mechanics at the California Institute of Technology, in front
of a blackboard on which he had written matrices and differential equations.
According to Caltech archives, the photograph was taken on May 2, 1963, during
his "Lectures on Physics" period. (Feynman has many fans in Silicon
Valley, so there was disappointment when Apple "replaced" him with an
image of an iMac five months later.)
Gates may have been most pleased, however,
with the publication of “The Feynman Lectures on Computation” (edited
by Anthony J. G. Hey and Robin W. Allen, 1996) and “Feynman and Computation” (edited by A. J. G. Hey, 1998).
The first is a collection of lectures
Feynman gave at Caltech from 1983 to 1986 as part of an interdisciplinary
course called "Potentialities and Limitations of Computing Machines."
The second contains contributions by distinguished computer scientists and
physicists who were guest lecturers in Feynman's interdisciplinary course. It
also contains reprints of Feynman's prescient articles on the physics of
computing: "There's Plenty of Room at the Bottom" (1959!) and
"Simulating Physics with Computers" (1982). Anyone reading these two
books will agree that Feynman's insight and ingenuity make his lectures on
computation almost as timeless as his physics lectures.
Feynman even has his own 37-cent
first-class stamp! The US Postal Service has honored four American scientists -
physicists Richard Feynman and Josiah Willard Gibbs, mathematician John von
Neumann and geneticist Barbara McClintock. The stamps were issued on May 4,
2005.
The Feynman stamp shows the physicist in his 30s, framed by the unmistakable Feynman diagrams. The stamp came about thanks to the decade-long lobbying effort by Ralph Leighton, who organized a celebration on May 11, 2005, at the post office in Far Rockaway, the New York City neighborhood where Feynman grew up. On the same day, the street in Far Rockaway where Feynman lived - two blocks from the post office - was renamed in his honor, from Cornaga Ave to "Richard Feynman Way." The date was appropriately chosen: May 11 is Feynman's birthday.
Gates never met Feynman but it is
fascinating to imagine a meeting between the two. Here is the whiz kid
transformed into a wide-eyed pupil, marveling at the master’s facility with
ideas and insights, wondering about the source of that magical genius that was
uniquely Feynman’s. What does Feynman think of the current state of computing?
How does he envision its future? Are any architectural breakthroughs in
software imminent? Where is the limit and why?
An autumn afternoon. Gates is at the
Feynman house in Altadena, Southern California. Feynman introduces Gates to his
menagerie - one horse, two dogs, one cat, and five rabbits. Gates smiles as
Feynman addresses each animal by name and inquires of its health.
Afterwards, they settle down in the
book-lined living room to talk.
Gates: When did you first take an interest
in computers?
Feynman: My interest in computers really
grew with my interest in physics, which is to say, very early. I recall reading
in high school about mathematical machines, tide predictors, area measuring
devices, and all kinds of wonderful things about computing in the Encyclopedia Britannica.
Gates: How have computers helped you in
physics?
Feynman: When I try to solve a difficult
problem, I ask myself: What can I compute that will explain the
properties displayed by a physical system under certain conditions? It’s an approach
that has served me well. I am interested in useful results, not abstractions,
whether it’s in understanding how light interacts with matter or why helium
behaves so strangely at low temperatures. I get useful results by coming up
with numbers that can be experimentally verified. Computers play an important
role in this verification process.
Gates: But isn’t your kind of computing
different from the computing most of us are used to?
Feynman: It’s true the kind of computing I
am interested in is based mostly on physical insights and mathematics -
mathematics always in the service of physics - but computers are a big help.
They can perform millions of important calculations a scientist may never have
the time for. Sometimes computers can even suggest ideas one hasn’t thought of
before. I find this exciting. For anyone curious about how nature works at the
deepest level, a computer can be a valuable tool. I certainly find it so.
Gates: Back in 1942, when you and other
scientists were working on the Manhattan Project, digital computers were
several years away. What was computing like then?
Feynman: We had these Marchant and Monroe
computers - hand calculators with numbers, really - that were good for adding,
multiplying, dividing, and so on. They were about a foot across and several
inches high, with all kinds of levers on them that you pushed to get results.
Unfortunately, they broke down often. Metal parts wore thin and came out of
alignment because of the pounding they took, and had to be sent back to the
factory for repair. We just couldn’t afford the downtime - this was a wartime
effort after all - and so some of us began to tinker. We would take the covers
off and try to figure out ways to fix them. Pretty soon we got good at it and
kept things going.
Then the calculations became complicated,
way beyond the capacity of the Marchants. We had to get IBM machines - multipliers,
tabulators, verifiers, keypunches, sorters, collators and what have you. They were
the best machines at the time. We managed to assemble them ourselves and came
up with results that turned out to be very important.
Gates: I am curious about some of those
calculations …
Feynman: The biggest challenge was to
figure out how much energy would be released from various designs of the bomb.
Then we had to narrow it down to how much energy would be released from
specific designs that would be used in the actual bomb, and how much fissile
material would be needed in each case. This was very complicated, nonlinear
equations and all, and we had so little time! The experimentalists couldn’t
help; they needed our results to carry on their work. We
computed by simulation, using a primitive form of what you would now call
parallel processing. But we rose to the challenge. Our calculations told us
what we could and couldn’t do. Lots of important results, very accurate.
Gates: Fifty years later, how do you view
your wartime efforts?
Feynman becomes pensive. "At Los
Alamos, we were doing what we had to. We started for a good reason. We worked
hard. It was exciting. We discovered, invented, and pushed the limits of
science and technology to create a bomb to help us win the war. We won the war
but afterwards many of us weren’t so sure about the bomb itself. We had second
thoughts. The bomb took on a life all its own. I remember sitting in a
restaurant in New York shortly after returning from Los Alamos to teach at
Cornell, wondering: what if New York City were to become another Hiroshima!
Everything around me would be smashed. What was the point of life, of all this
creativity? It didn’t make any sense at all. It took me a while to shake off
this feeling. Eventually I got busy with physics and moved to Caltech, but
that’s another story."
Gates: What about your experience with
computers, though?
Feynman brightens. "There are no second
thoughts about that," he replies. "The main idea I came
away with from Los Alamos was that even simple, primitive machines could be
used to calculate important results. And it keeps getting better!"
Gates: I just finished reading Feynman
Lectures on Computation. I am intrigued by a remark you made at the
end of one of the chapters: "In 2050, or before, we may have computers
that we can’t even see!"
Feynman: What I was investigating in those
lectures was the answer to a fundamental question: What is it that we can and cannot do with computers today,
and why? One issue was, how small could you make a computer? Was there any
physical limitation to its size due to laws of physics? That led me to
investigate the characteristics of a computer operating according to the laws
of quantum mechanics. If we want to make extremely small computers, no more
than the size of a few atoms, we would have to use the laws of quantum
mechanics, not classical mechanics. So I began to analyze what you would call
quantum computers. People thought that the uncertainty principle would be a
limitation: that is, you wouldn’t be able to make a computer as small as you
wanted because of the way time and energy, for instance, were related. I found
to my surprise that quantum mechanics didn’t impose any limitations on the smallness
of a computer, over and above those due to statistical and classical mechanics.
Gates: So you don’t have to worry about
any unavoidable limitation arising from natural laws when you are trying to
build the smallest possible computer?
Feynman: Exactly. Nature is quantum, not
classical, so if you are trying to simulate nature, your simulation had better be
built on quantum laws. And if there’s no restriction there, you’re on solid ground! Of
course, you have to consider the second law of thermodynamics, reversible
computing and so on.
Gates: How would you write software for
such a computer?
Feynman: That’s for you to
figure out! I did my part!
Laughter.
Gates: I noticed in your lectures that you
derived Shannon’s Theorem in three different ways, using concepts from
statistics, geometry and physics. How come?
Feynman: I am an explorer. I like to find
things out for myself. That’s how I understand anything. What I
cannot create, I do not understand. If I can derive a theorem or a result
independently, even when I know it has been discovered before, it means I
understand it. That’s the only way I know how to learn. In the case of
Shannon’s theorem, each method I used to derive it taught me something new.
"Besides," Feynman lowers his
voice conspiratorially as Gates instinctively draws nearer, "what one fool
can do, so can another!"
More laughter.
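(An aside from the author: the result in question is Shannon’s theorem on the capacity of a noisy communication channel. In its familiar Shannon-Hartley form, the maximum rate C at which information can be transmitted reliably over a channel of bandwidth W, with signal power S and noise power N, is

$$ C = W \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits per second}. $$

That a single bound like this can be reached through statistics, geometry, or physics is precisely the point of Feynman’s three derivations.)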
Gates: The topics of your lectures -
coding and communication theory, Shannon’s theorem, quantum computers and such
- are fundamental research topics. We have a growing research department at
Microsoft but our focus is somewhat different. We are primarily interested in
such things as: How do you increase people’s creativity through software? What
will make computers easier to use, more responsive to the needs of the user,
more natural? Can computers extend human cognition by assimilating
speech and linguistics? These are the issues that interest us. Do you have any
interest in these aspects of computing?
Feynman: Of course I do! Any tool that can
make computers easier to use and more natural, as you say, is
important. My own work in physics reflects this. I invented something called
Feynman diagrams that allowed me to make complicated quantum calculations in
one evening that used to take physicists six months! It was my moment of
triumph, to realize that I had succeeded in working out something worthwhile.
So if you are inventing tools and products
that simplify computing and at the same time open up complex problems for intelligent
analysis, you are doing right by me.
Gates: Is there anything we should look
out for?
Feynman: You need to make sure that the
tools you create do not become more complex than the problems they are designed
to solve.
Gates: One thing that concerns me very
much is trying to anticipate the nature of computing ten or twenty years from
now. We want to be as intelligent about it as we can, so that we can begin
laying the foundation for it now, if that’s possible. Personally, I am looking
for some good ideas to take us there …
Feynman: Well, it seems to me you need a
new model for writing software, considering how important
software has become in everything we do. I think that the next generation of
software ought to be modeled after natural objects defined by natural laws. If
you can model software objects after objects of nature, I think you will have
moved on to something new and significant.
Gates: How so?
Feynman: For one thing, you can be sure
that the software will do what it is meant to. If you push this idea further,
the same software should be able to transform itself appropriately if the
boundary conditions were to change. No matter how well-thought out a computer
program is, there’s always some unforeseen error in it. It’s not the fault of
the designer or the developer; it’s the model on which the
program is written. A large software system seems to me to be like an elaborate
sandcastle one builds at the shore. Suddenly a big wave hits and it’s gone!
Gates: So we have to change the
foundation?
Feynman: I think so. You need to bind
software development to criteria higher than standards and protocols. The data
computers manipulate should be governed by rules, much as laws of nature govern the
way objects behave in the real world.
Gates: But doesn’t that mean people who
write software have to be physicists as well?
Feynman: Not really. If you are talking
about physical laws,
the fundamental laws of nature are simple. That’s where their power and beauty
come from. You go through all these complicated calculations and what comes out
in the end is unbelievably simple. That’s what I talked about in those videos
you have of mine. Besides, it can’t hurt to know a little physics. It’s a part
of our culture!
Gates: That’s true. Perhaps the new model could help
in the area of testing too. Software testing has become critical. We are always
balancing product release deadlines against testing!
Feynman: Exactly. How do you test software against all
possible failures? I don’t think you can with current methods, unless you have
some sort of
self-correcting software that cleans itself as millions of users use it simultaneously. If software objects can be modeled after natural objects, testing becomes more straightforward. You have more confidence in the result. If the test fails, you may end up discovering something new and unexpected. That’s how it is in physics. There’s no fooling natural laws!
Gates: I remember the last sentence in your personal
report on the Challenger accident: "For a successful technology, reality
must take precedence over public relations, for nature cannot be fooled."
Feynman (obviously pleased): You get the idea!
Gates: Any example of modeling software after natural
objects?
Feynman: Some years ago, a bunch of guys were trying
to build superfast computers using parallel processors. The company was called
"Thinking Machines." My son Carl, who is interested in such things,
joined them. Since the kids running the company didn’t know any better, they
ended up hiring me too!
Anyway, the problem was to design a router that
delivered messages from one processor to another. There were a million
processors in that machine and it wasn’t practical to connect every pair of
them. We chose a model where each processor needed to talk directly to only a
few of its neighbors. The problem came down to figuring out the minimum number
of buffers to hold messages for the router to operate efficiently.
I analyzed the problem by treating the router circuit
diagrams as if they were objects of nature, the kind of stuff I do all the time
in physics, and came up with a set of partial differential equations. The
equations said five buffers per chip would do. Others predicted seven but in
the end, it turned out to be five, which made building these computers easier.
My equations apparently saved the day!
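(A note from the author: Feynman's actual analysis used partial differential equations, which I won't reproduce. But the flavor of the buffer question can be conveyed with a toy simulation. The sketch below is purely my own illustration - the ring topology, traffic pattern, and numbers are invented for the example and have nothing to do with the real Connection Machine router.)

```python
# Author's toy sketch only -- not Feynman's differential-equation analysis and
# not the real Connection Machine router. Processors sit on a ring; each step,
# every node tries to forward the message at the head of its queue to its
# right-hand neighbour, but only if that neighbour has a free buffer slot.
# We count how often messages stall for different buffer sizes.
import random
from collections import deque

def simulate(num_nodes=64, buffers=5, messages=150, steps=20000, seed=1):
    random.seed(seed)
    queues = [deque() for _ in range(num_nodes)]
    for _ in range(messages):
        dst = random.randrange(num_nodes)
        # place each message at a source node that still has a free buffer slot
        src = random.choice([n for n in range(num_nodes) if len(queues[n]) < buffers])
        queues[src].append(dst)
    delivered = stalls = 0
    for _ in range(steps):
        moves = []
        for node in range(num_nodes):
            if not queues[node]:
                continue
            dst = queues[node][0]
            if dst == node:
                queues[node].popleft()       # message has reached its destination
                delivered += 1
            else:
                moves.append((node, (node + 1) % num_nodes))
        for node, nxt in moves:
            if len(queues[nxt]) < buffers:   # neighbour has room: forward one hop
                queues[nxt].append(queues[node].popleft())
            else:
                stalls += 1                  # buffers full: the message waits a step
        if delivered == messages:
            break
    return delivered, stalls

for b in (3, 5, 7):
    d, s = simulate(buffers=b)
    print(f"buffers={b}: delivered {d}/150 messages, {s} stalls")
```

Running it shows, qualitatively, the trade-off Feynman was quantifying exactly: too few buffers and messages stall constantly, while past some point extra buffers buy very little.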
Gates: We use certain types of object models for
writing software. Come to think of it, you can say these objects are quantized
software.
Feynman: Yes! Small, self-contained software that
solves just one specific problem that makes life easier for people can be
called quantized software. See, you are already using concepts from physics,
only you don't know it!
Gates: What you are suggesting, then, is that future
software should include ideas and concepts from physics as well as from
computer science.
Feynman: Yes, but I think ideas should also come from
biology. In fact, I think the intersection of biology and computer science
could prove even more fruitful for developing software than physics.
Gates: It’s already happening. Bioinformatics is an
emerging field that brings together ideas from biotechnology and computers. You
studied biology for a while, didn’t you?
Feynman: Yes. I actually worked in the field during my
sabbatical year at Caltech. This was after Watson and Crick’s discovery of the
DNA double helix. My big moment came when Watson himself invited me to give a seminar
on my work at Harvard. I am convinced biology has a lot to offer to computer
science, especially in writing software.
Gates: In what specific ways?
Feynman: Well, insights can come from understanding
how living organisms function, from their adaptive, fault-tolerant,
error-handling traits. They can come from studying how human genes are laid out,
how the 'book of life' actually unfolds … There are thousands of possibilities,
really.
Gates: That’s exciting! If I were not in computers, I
would most certainly be working in biotechnology. I think we are only scratching
the surface here. Have you been following the Genome project?
Feynman: Yes, and I think what scientists have
accomplished is remarkable. It's a historic milestone to have mapped the
genetic blueprint for human life and made it available to researchers!
Gates: The way we treat diseases and prevent them will
be revolutionized!
Feynman: I certainly hope so. We can end suffering, at
least to some extent, only if we know what disease really is. And when that
happens, perhaps we can start talking more about health and less about disease.
Gates: One of my goals in life is to help eradicate
diseases like smallpox, malaria, cholera and polio from the world. It's
terrible that so many people still die from these diseases in our time. The
success of the Genome project should certainly help.
Feynman: That will make it all worthwhile, won't it?
In many ways, the Genome project reminds me of the Manhattan Project. I feel
the same sense of excitement, the same anticipation. I wish I could start over
again!
Gates: I am sure there's much you can contribute
still.
Feynman: Is there any database that can store the
Genome and sift through its data quickly?
Gates (momentarily taken aback): We are working on it.
It is extremely important to create a database system that can meet these types
of challenges.
Feynman: That should be a milestone for Microsoft. If
biological data managers find widespread use, database research will pick up
speed too. I want to see this applied to neuroscience as well. If we can track
and analyze the activities of billions of neurons simultaneously, we will have
made inroads into the working of the brain, perhaps our ultimate frontier.
They ponder the implications of the coming revolution
in genetics and neuroscience. Both see beneficial possibilities but recognize
that there are important moral and ethical issues to consider too.
Gates: It seems to me a really good software engineer
should be able to derive inspiration from different disciplines.
Feynman: Yes! I made the point in my Nobel lecture
that a good physicist might find it useful to have a wide range of physical
viewpoints and mathematical expressions available to him. If everyone follows
the current fashion in expressing and thinking about the generally understood
areas, then understanding the open problems is limited. It’s possible that the
truth lies in the fashionable direction. But if it is in another direction,
who will find it?
I would make the same point to the new generation of
software engineers. Don’t limit yourself to what you know or what already
exists. Be an explorer, not a tourist. Look across disciplines. Dare to follow
the beat of a different drum. Your inspiration may come from the dance of molecules
on a wave in the sea, the complexity of a beehive or an ant colony, the march
of stars across the heavens, the nature of memory and language, the symmetry of
a snowflake, and so on. There’s no end to it! After all, nature’s imagination
is richer than ours, so why not use what we know of it to our advantage?
Gates: It all goes back to childhood curiosity,
doesn’t it?
Feynman: Right! And it’s a tragedy we can’t hold onto
some of that curiosity as we grow older!
Gates: Who are your scientific heroes?
Feynman (after a pause): There are three, really. Sadi
Carnot, James Clerk Maxwell, and Paul Dirac.
He explains: "Carnot obtained a general principle
of nature from the nuts and bolts of the thermal efficiency of steam engines.
In one stroke, Maxwell unified electric, magnetic, and optical phenomena. And
Dirac, a hero of mine ever since I read his book The Principles of
Quantum Mechanics, discovered the relativistic equation for the
electron."
Gates: Can you explain your “sum over histories” idea in
simple terms? I can’t quite grasp it. Does it have anything to do with the
nature of time?
Feynman: Yes, it does. We are all familiar with
cyclical time like the seasons: spring, summer, fall, winter, then spring again
to continue the cycle. You know, “To everything there is a season.” Then there
is linear time, time that we experience as always moving forward. We call it
the arrow of time. It’s irreversible,
like the irreversible disorder in a thermodynamic system that we call entropy.
But there’s another way of looking at time: as a combination of all possible
alternatives. In the classical picture, we witness only the arrow of time but
at the quantum level, this arrow has other arrows tangled up with it that must
also be considered. When light travels, it travels along the path that
takes the least amount of time. That’s what we see. But nature also allows for
less probable paths at the quantum level. They are like ghosts that we don’t
see, yet they influence how the final path appears to us.
Gates: Are you saying that time can fork into multiple
branches at the quantum level?
Feynman: Yes, you can say that. I believe in the specific
over the general, so I will give you this example: Any interaction between elementary
particles happens in all conceivable ways, not just in one, as we think in
classical physics, in the collision of two billiard balls, for example. So, to
find the final state or path at the quantum level, we must take into
consideration the effect of all possible alternatives. One way to look at this
is to allow for the possibility that each instant of time can split into many branches,
some of which reach into the future and some even into the past. And these
branches can get entangled in all sorts of crazy ways, they can merge and twist
and speed up and slow down and form knots and wrinkles. Time at the quantum
scale is nothing less than a labyrinth! It’s like you are reading a book, then
you put it down, pick up another and start reading at, say, page 50, then you
put that down after reading 5 pages and pick up another and start reading it in
reverse, the last page first. It’s like there is a fork in the road every few
steps. So it is with time at the microscopic level.
Gates: Someone described “sum over histories” as the idea
that “reality proceeds by an awareness of all the possibilities before you
arrive at actuality.”
Feynman: I like that!
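(One more aside for readers who want the formula behind the words: in Feynman’s path-integral formulation, the amplitude K to go from point A to point B is obtained by adding a contribution from every conceivable path x(t) connecting them, each weighted by a phase set by its action S:

$$ K(B, A) \;=\; \sum_{\text{all paths } x(t)} e^{\,i S[x(t)]/\hbar}, $$

with the probability given by the squared magnitude of this sum. Paths far from the classical one largely cancel through interference, which is why, at everyday scales, light appears to follow only the single path of least time.)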
Gates: What do you consider your most interesting
discovery since winning the Nobel Prize?
Feynman: I liked my work on the theory of liquid helium.
Another was working out the laws of the weak interaction with Gell-Mann, a result
found independently by another pair of scientists at about the same time. I also worked out
something called the theory of partons to explain some of the properties of
protons. Right now I am standing back. I am playing around with some ideas in
my mind and I don’t have a clue where I'll come out. I guess that’s what makes
it exciting, not knowing what strange territory one will end up in!
Gates: Any disappointments?
Feynman: Certainly. I’ve spent years trying to solve
some difficult problems without success. The theory of turbulence is one. In
fact, it is still unsolved. Another was my inability to understand
superconductivity, on which I worked for a couple of years. I should have grasped
that one after my success with liquid helium, but I didn’t.
Gates: Do the disappointments linger?
Feynman: Not at all! Even where I failed, I worked
very hard and had a terrific time. People only hear of successes but not of
failures. The important thing is to decide which problems are important and
give them your best shot. If you succeed, fine. If not, you almost always end
up learning something new!
Gates: You mentioned turbulence. Is that a part of the
study of complexity?
Feynman: Yes. There are many phenomena we do not
understand yet, from the flight of a swarm of bees to the self-organizing
properties of neural networks in the human brain. How is it that a few basic
rules can lead to such extraordinarily complex behavior? This is the fundamental
question any theory of complexity must answer. And it’s proving to be a real
challenge!
Gates: Does that reflect a failure of conventional
methods for understanding complex systems?
Feynman: Probably, or else we would have solved these
problems long ago! It’s the same idea in software: How do you make sure a
software system consisting of 20 or 30 million lines of code works coherently
and correctly? You must maintain a certain amount of skepticism about accepted
ideas and keep an open mind about ideas that appear flaky. It also means any
serious study of complexity will require us to explore the fundamental
relationship of physics and biology to computation.
Gates: Meaning?
Feynman: Meaning whether there is a computational model
of the universe and of biological systems that we need to consider and
understand. The field is wide open and it’s undoubtedly going to require
interdisciplinary research.
Gates: During the Manhattan Project you came in
contact with some of the most powerful minds of the twentieth century - Enrico
Fermi, Niels Bohr, Hans Bethe, John von Neumann, Stanislaw Ulam, Robert
Oppenheimer and others. Do you think there will be such a gathering of
"monster minds" again?
Feynman: That was a different situation at Los Alamos.
The project came about because we had to win a war. Besides, the science was
also good, very good. We were discovering and inventing as we went along. I don’t
know that such a situation will reappear. Times have changed. It’s more
structured now, more what you call "market-driven." But it really
doesn’t matter, because there are many great minds about, brilliant people
working in your field who are pushing the limits of technology, worrying about
how people interact with computers, how software is written, and so on.
This was the moment Gates was waiting for, the real reason
he had come to Altadena. Everything else was a prelude to this moment.
Gates: Will you consult for Microsoft?
Feynman’s response is instantaneous and unequivocal.
"That’s the wackiest idea I ever heard!"
Gates is relieved. He has read enough about Feynman to
know that the response really means Feynman is interested.
Gates smiles and suggests that Feynman's son Carl can
perhaps join him too.
Feynman’s face glows. "Ain’t a bad idea at
all," he says in his best Brooklyn accent.
Gates doesn’t want to leave any loose ends behind.
"How about next month?" he asks. "We will give you a tour of the
Microsoft campus and you can meet the people working on new ideas. I think you
will like what they are doing."
But Feynman stops him. "Not possible," he
says. "I am going on a trip to," he pauses dramatically and then says
with a flourish, "Tannu Tuva!"
Gates is vaguely familiar with the planned trip to
this place deep in Central Asia, outside of Outer Mongolia, with a capital
named Kyzyl. One reason Feynman is interested in going to Tuva is because, in
his own words, "any place that’s got a capital named K-Y-Z-Y-L has just
got to be interesting!"
For a moment Gates thinks of asking Feynman if he can
come along too. Then he sighs. Too many commitments! With a shock he realizes
he too is a prisoner of schedules and deadlines, just like other worker bees!
"After you return from Tuva, then?" he asks,
almost plaintively.
Feynman looks out the window. The day has about an
hour of sunlight left. Shadows are lengthening, and a small wind is stirring
the sycamore leaves.
"Why not?" he says, grinning, and extends
his hand. Gates gratefully shakes it.
"There’s
a revolution coming in software and I think you will enjoy being there when it
arrives," says Gates.
"I’m
sure I will."
Gates presses on. "Information is our most
important asset. We plan to create great stuff out of it."
"No," replies Feynman softly,
"imagination is."
-----------
On the flight to Seattle, Gates is restless. He picks up an airline magazine and is immediately repulsed by its banal content. He wishes he could run up and down the aisle to work off the intellectual energy that has gripped him, but a mechanical problem has grounded his private jet and put him on a commercial flight, so that's not a realistic option.
He reclines as far back as the seat allows and closes
his eyes. "Must quickly put a team of good people together," he
thinks. "Feynman is coming to Microsoft. Seek inspiration from across
disciplines. Learn to tap into nature’s imagination. It’s time to take the
company to another dimension."
(c)
Hasan Zillur Rahim