Where Are the Edisons? They Are an Endangered Species, Warns M.I.T.'s Paul Gray
What is happening to the spirit of innovation in America?
The increasing complexity of the systems we work with makes innovation ever more difficult. It requires larger investments in laboratories, equipment and people, and more sophistication in all of them. Not that inventing a practical light bulb looked simple to Edison when he did it in 1879, but it was physically a lot less complex than, say, what Edwin Land faced when he invented instant photography in 1947. And that, in turn, seems simple in comparison with some of the challenges facing us today.
So there will be no more individual inventors like Edison?
Well, in the last decade or two it's become harder for an inventor to bring a new idea into the marketplace. It's not just a matter of the light bulb turning on over somebody's head, as in the cartoons. The innovator has to think about the problems of marketing, sales, controlling the manufacturing process and, not least, meeting the demands of government regulatory agencies.
Then who is replacing the old-fashioned inventor?
Small companies like Alza Corp., a pharmaceutical company in California, and Florida's LaserColor Laboratories. The large corporations have the means to innovate, but they develop an investment in the present—a mindset which values stability and resists the introduction of radically different ideas. Take the transistor, or semiconductor, as an example. None of the companies that made vacuum tubes 30 years ago is significant in semiconductors today. The ability to invent and the ability to capitalize on invention are often two radically different things.
How can innovation be encouraged?
Changing the tax laws is one way to get a substantial positive effect. Anyone who puts capital into a venture enterprise recognizes, first of all, that the odds of success are only one or two in 100. So if the capital gains tax is as high as it was up to 1978, he's not going to act, because there's no potential for gain commensurate with the risk. For almost 10 years inventors were unable to get venture capital, and that showed up in the rate of creation of new enterprises. Already, though, the lower capital gains tax has encouraged more enterprise. The tax should be lowered still more for new enterprises.
Are government regulations strangling the incentive of would-be inventors?
I think there is an accountant's perspective in government. Some of the federal funding agencies have focused almost entirely on the accounting details of university research, not on its intellectual objectives. We are much more involved than we were 10 or 20 years ago in accounting for and explaining what we do. And government has become a major regulator of scholarships and loans to students. Just in the distribution of aid to MIT students (about half qualify for some sort of assistance), there's a straitjacket of reports, forms and procedures.
What is wrong with accountability?
Some of it is necessary, like the regulations on the use of human subjects in experimentation. But the government tends to push the solution a little beyond the problem. The proliferation of regulations on the use of animals in experimentation, for instance, is astonishing. We at MIT are in midstream in what looks like a $15 million program to upgrade care facilities for everything from mice to baboons. And then there's Affirmative Action. I'm for it, but it could be done with less bureaucracy and paperwork.
But isn't Washington generous?
The fact is, federal support of research and development has declined in constant dollars for nearly a decade. Granted, President Carter authorized budget increases for research and development that he intended to be large enough to produce real growth. However, inflation has been so much higher than anyone expected that the decline has continued.
But why has research money been steadily dwindling over the past 10 years?
The best way to explain that is to offer a historical perspective. In World War II, public involvement in research and development and university funding went way up. Science and technology were harnessed in the war effort and made major contributions to scientific understanding. Then in 1957, when Sputnik went up, everyone got scared the Russians would get to the moon first. So science and technology got another boost. It became highly glamorous. There were new programs in chemistry, physics and biology in high schools. Vast numbers of people were moving into these areas. At the height of the space program, scientists were put on a pedestal and regarded as omnipotent, because they could manipulate the physical world in an almost godlike way. That was unrealistic. Scientists can have feet of clay too.
What burst the bubble?
In the politically turbulent late '60s, the public got disenchanted with higher education because it looked like the kids were just going off to college to raise hell. This was coupled with a public skepticism about institutions that had its roots in Vietnam and Watergate. And part of what people saw as evil in the Vietnam war was the high technology we applied to the killing: napalm, defoliants, B-52 bombers and such. So the pendulum swung the other way.
Was that shift in public opinion justified?
To a degree. The last decade has brought a largely constructive recognition that scientists and technologists are like everybody else: they have blind spots and limitations. But it doesn't make sense to tar all science and technology with a broad brush because of Three Mile Island.
What lessons should be learned from Three Mile Island?
We ought to learn what it tells us about reactor safety, and especially about the training of reactor operators. We simply cannot, however, abandon nuclear energy. We won't make it to the year 2000 without using the nuclear generating stations already in place and building more. That raises the question of nuclear waste disposal. We don't dispose of nuclear waste; we store it. But it's a problem amenable to technological solution, and we ought to move ahead in a sensitive way.
Is nuclear the only answer?
No, solar will become economically feasible with more research. In the short run, we should find new ways to exploit our coal resources, which are the world's largest. If you include shale and tar sands, there are thousands of years of energy available. There are serious environmental problems in getting at this resource and in burning it or making synthetic fuels from it. Industry could begin right now to apply technology to those problems.
In terms of innovation, is the U.S.S.R. gaining on us?
Basic science there is, in many respects, very good. In certain areas, such as fusion research, they've been at the forefront. But that's not true of their technology. Why is the U.S.S.R. so interested in buying large-scale and medium-size computers? Because they can't make their own. They can hand-tailor a few for military installations, but they can't produce modern, fast digital processors the way we can. The Soviets will not be our competitors in any high-technology world market in the foreseeable future. The same is true of China, in spades. I visited Peking last summer, and their computer, chemical and engineering sciences are 20 years or more behind. They'll have a tough time catching up.
Who will be our competitors?
A few Western European countries, principally Germany. Also Japan. And maybe in the near future some other Far East nations. Taiwan, for instance, has developed at a tremendous clip. So has South Korea.
Are you optimistic about the future of science in the U.S.?
Someone once said that the difference between the optimist and the pessimist is that the pessimist understands things better. I'm not sure I agree. I guess I am an optimist. A large part of the answer lies in training the scientists and engineers of the 1980s and 1990s to deal with novelty and uncertainty. At MIT, we're in the right place at the right point in history to make a difference. I hope we can.