Engines of Tomorrow: How the World's Best Companies Are Using Their Research Labs to Win the Future

Ebook · 769 pages · 18 hours

About this ebook

The U.S. economy is the envy of the world, and the key to its success is technological innovation. In this fascinating and in-depth account reported from three continents, Robert Buderi turns the spotlight on corporate research and the management of innovation that is helping drive the economy's robust growth. Here are firsthand communiqués from inside the labs of a reborn IBM, resurgent GE and Lucent, research upstarts Intel and Microsoft, and other leading American firms -- as well as top European and Japanese competitors.
It was only a few years ago that competitiveness experts -- U.S. well-wishers and naysayers alike -- concluded that America had lost its business and technological edge. The nation's companies, they asserted, couldn't match the development and manufacturing efficiency of overseas rivals. Yet now the nation is humming along, riding an unparalleled wave of innovation.
Buderi tells us this turnaround has come on many fronts -- in marketing, sales, manufacturing, and the creation of start-up companies. But Engines of Tomorrow deals with a central element that has gone largely unexamined: corporate research. It's the research process that provides the technologies that spur growth. Research is behind the renaissance of IBM, the stunning growth of Lucent, and much of the steamrolling American recovery.
Focusing on the fast-moving communications-computer-electronics sector, Buderi profiles some of the world's leading thinkers on innovation, talks with top inventors, and describes the exciting technologies coming down the pike -- from information appliances to electronic security and quantum computing. In the process, he examines the vital strategic issues in which central labs play a determining role, including:
  • How IBM's eight labs around the world figure in Lou Gerstner's plans to achieve consistent double-digit growth -- and to join GE as a $100 billion concern.
  • Why Xerox's famed Palo Alto Research Center is vying to resuscitate its company's lagging fortunes by sending anthropologists into the field to study the hidden ways people really work.
  • What Hewlett-Packard will do without its original instrument business, recently spun off as Agilent Technologies. The business was central to HP Labs' MC² philosophy of merging research expertise in measurement, computation, and communication -- and its departure removed a lot that was unique about HP.
  • How the November 1999 federal court finding that Microsoft operates a monopoly hinders the Seattle giant's acquisition plans and makes it increasingly vital for nine-year-old Microsoft Research to lead the way in innovating from within. Could this be the next great lab for the twenty-first century?

With authority and undaunted optimism about the underlying vitality of the research process, Buderi discusses these issues and reveals the future of some of the world's best and most powerful companies.
Language: English
Release date: July 14, 2000
ISBN: 9780743212489
Author

Robert Buderi

Robert Buderi, a Fellow in MIT's Center for International Studies, is the author of two acclaimed books, Engines of Tomorrow, about corporate innovation, and The Invention That Changed the World, about a secret lab at MIT in World War II. He lives in Cambridge, Massachusetts.

    Book preview

    Engines Of Tomorrow - Robert Buderi

    ALSO BY ROBERT BUDERI

    The Invention That Changed the World: How a Small

    Group of Radar Pioneers Won the Second World War

    and Launched a Technological Revolution

    ENGINES OF TOMORROW

    How the World’s Best Companies

    Are Using Their Research Labs

    to Win the Future

    ROBERT BUDERI

    Copyright © 2000 by Robert Buderi

    All rights reserved,

    including the right of reproduction

    in whole or in part in any form.

    First Touchstone Edition 2001

    TOUCHSTONE and colophon are

    registered trademarks of Simon & Schuster, Inc.

    Manufactured in the United States of America

    10 9 8 7 6 5 4 3 2 1

    The Library of Congress has cataloged the

    Simon & Schuster edition as follows:

    Buderi, Robert.

    Engines of tomorrow: how the world’s best companies are using their research labs to win the future / Robert Buderi.

    p. cm.

    Includes bibliographical references and index.

    1. Research, Industrial. 2. Technological

    innovations. I. Title.

    T175 .B83 2000

    607’.2—dc21 99-059910

    ISBN 0-684-83900-8

    0-7432-0022-5 (Pbk)

    A portion of Chapter 5 appeared in modified form as “The Virus Wars,” The Atlantic Monthly, April 1999. © 1999 Robert Buderi, as first published in The Atlantic Monthly.

    ACKNOWLEDGMENTS

    John Armstrong patiently provided his perspective and sound and gracious advice from the planning stages through the final copy.

    I am (again) greatly indebted to the Alfred P. Sloan Foundation. Art Singer proved critical in getting things going, and Doron Weber helped carry them to fruition. Frank Mayadas talked candidly about his IBM days. And Ralph Gomory, giving generously of his time and rich experience, really carried the day.

    On the other end of the spectrum, thanks for administering the Sloan grant go to Herb Bernstein and ISIS, The Institute for Science and Interdisciplinary Studies. This is an outstanding (and thankfully not greedy) program devoted to public understanding of science, based at Hampshire College in Amherst, Massachusetts.

    It is not easy to live as a writer. Special thanks to John Benditt, editor-in-chief of Technology Review, and Richard Brandt and Galen Gruman at Upside, all of whom afforded me the opportunity to develop and adapt aspects of this book in feature articles for their publications. In the case of Upside, portions also appeared in my monthly Lab Watch column. Thanks as well to my hands-on (and even hands-off) editors at these publications: Rebecca Zacks and Herb Brody at TR and Chuck Lenatti and Deborah Branscum at Upside. They all did much to make things better. Similar, much-appreciated assistance came from Mike Wolff at Research Technology Management and Jack Beatty and colleagues at The Atlantic Monthly. Last but not least on this front, John Carey gave willingly of his time to provide valuable additional advice about key sections of the manuscript.

    All through the long course of researching and writing this book, a host of people shared their experiences, provided information, and helped me understand it. Warm thanks to all who gave interviews—and to the almost-always underappreciated and incredible support staffs around the world who made it all happen. Several researchers went above and beyond the call. Tony Tyson not only helped me comprehend his work, he led me to a better understanding of what fires innovation. Peter Price helped organize a strong and thoughtful contingent from the early days of IBM Research, including Alan Fowler and the late Rolf Landauer. In England, Jeremy Gunawardena arranged a memorable dinner discussion.

    Friendly librarians, archivists, and researchers proved indispensable to combing through a wealth of data, especially about the history of corporate research. Special thanks go out to Lynn Labun at the Mellon Institute Library; Dr. Margarete Busch at Bayer archives; George Wise, formerly of GE; Joan A’Hearn from the Schenectady Museum Archives; Helmuth Trischler at the Deutsches Museum; and Diane Currens D’Almeida at M.I.T. Alex de Ravel helped with the early research. At different times, Vivien Marx, Manuela Ventu, Pino Oppo, and Ruth Buderi pitched in with German translations.

    Nor can the social front be ignored. In Germany, Hartmut Runge introduced me to the Bavarian Alps. Aston Bridgeman and Bob Neff showed me the real Japan. In England, Alun and Rie Anderson proved especially generous hosts—from gardening to margarita-making. In California, it was Ottavia Bassetti; Charlie Fager and Sandy Geller; and Trip and Lisa Hawkins, who got more than they bargained for in a guest with the flu.

    Thanks as well to my agent, Rafe Sagalyn, editor Alice Mayhew, and associate editor Roger Labrie.

    And, as always, to my family: Kacey, Robbie, and Nancy.

    TO DICK THOMPSON

    The author acknowledges with great gratitude the

    support of the Alfred P. Sloan Foundation.

    CONTENTS

    Introduction: Change

    ONE       A Matter of Death and Life

    TWO      The Invention of Invention

    THREE    Houses of Magic

    FOUR     Out of the Plush-lined Rut

    FIVE       IBM: Taking the Asylum

    SIX         House of Siemens

    SEVEN    NEC: Balancing East and West

    EIGHT    The Pioneers: General Electric and Bell Labs

    NINE      Children of the Sixties: Xerox and Hewlett-Packard

    TEN       The New Pioneers: Intel and Microsoft

    Conclusion: The Innovation Marathon

    Appendix

    Notes

    Interviews

    Bibliography

    Index

    ENGINES OF TOMORROW

    INTRODUCTION

    CHANGE

    The value and even the mark of true science consists, in my opinion, in the useful inventions which can be derived from it.

    —G. W. LEIBNIZ

    Science today is everybody’s business.

    —I. BERNARD COHEN

    THE PROLIFIC rags-to-riches inventor Charles F. Kettering, backbone of the mighty General Motors research machine that gave the world four-wheel brakes and the refrigerant Freon, liked to call a company’s research house its “change-making department.” In a 1929 speech to members of the United States Chamber of Commerce, he put it this way: “I am not pleading with you to make changes. I am telling you you have got to make them—not because I say so, but because old Father Time will take care of you if you don’t change. Advancing waves of other people’s progress sweep over the unchanging man and wash him out. Consequently, you need to organize a department of systematic change-making.”

    Corporate change-making, in the form of a company’s central research operation, is the focus of this book. As a pivotal force behind a plethora of key industries, research can be crucial not only to winning in the marketplace but to national economic viability. Yet serious discussion of the subject—examining the nature and evolution of successful research, and who still manages to pull it off—has been missing from virtually every popular management treatise in vogue today. This failure has become all the more glaring amidst the sweeping cutbacks in research spending that marked the early and mid-1990s. The upheaval has led to the widespread perception—among Congress, the President, leading policymakers, academics, and the press—that companies have almost universally, and shortsightedly, pulled back from far-ranging and potentially revolutionary studies to concentrate on incremental improvements to existing products.

    This book challenges that perception by turning the spotlight on corporate research—laying out its history, identifying the real issues, and delving inside the labs of some of the most notable practitioners to illuminate the various approaches to managing innovation in today’s quicksilver global economy. The focus is on the rapidly blurring electronics-computer-telecommunications industries. It can be argued that especially with the rise of the Internet, which is bringing together voice, data, telephony, computers, and a variety of office technologies, these fields impact our daily lives more profoundly, or at least far more visibly, than other industries. But more than that, these sectors have undergone the most upheaval in recent years, in the process becoming the focal point of the vigorous public debate about research and development policy and the future of competitiveness. The hard choices companies have made—about spending, targeting resources, and managing innovation across geographic and cultural boundaries—hold crucial lessons for what lies ahead in a fast-changing world.

    Industrial research first appeared in the 1870s and quickly took its place as a hallmark of the industrial age. Companies learned that, when successful, systematic research could provide a huge edge. It helped fight fires, protect core business lines in a myriad of small and often hidden ways, and even create powerful new industries—like the radio, wireless communications, and television empires that grew out of early twentieth-century electrical investigations at General Electric and telephony studies at American Telephone & Telegraph.

    Taken as a whole, these innovations helped forge a strong sense, not just from Charles Kettering, but from many other scions of industry and government, that corporate research could play a vital role in competitiveness. But despite a widely accepted view that research was important to corporate well-being and vigor, it wasn’t until the 1950s and early 1960s that things really took off. That was when young Massachusetts Institute of Technology assistant professor Robert Solow showed mathematically—the basis for his 1987 Nobel Prize—that in the long run growth in gross national product per worker is not due so much to capital investment, the traditional pillar of classic economic analyses. Instead, such growth is in very large part the result of technological progress. In a nutshell, advanced economies grow largely on the strength of what one can call New Knowledge—in most cases the high-technology fruits of research and development.
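
    For readers who want to see the arithmetic behind Solow’s claim, the standard growth-accounting decomposition runs as follows; the notation here is the textbook convention, not anything drawn from Buderi’s text. Write aggregate output Y as produced from capital K and labor L at technology level A:

    \[
        Y = A\,K^{\alpha}L^{1-\alpha}
        \qquad\Longrightarrow\qquad
        \frac{\dot{Y}}{Y} = \frac{\dot{A}}{A} + \alpha\,\frac{\dot{K}}{K} + (1-\alpha)\,\frac{\dot{L}}{L}
    \]

    The “Solow residual” \(\dot{A}/A\) is whatever output growth remains after the measured growth of capital and labor is accounted for. In his 1957 study, Solow attributed roughly seven-eighths of the 1909-1949 growth in U.S. output per worker to this technical-change term rather than to capital deepening.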

    Although it was not high in the minds of research leaders, Solow’s work debuted amidst the Cold War and the resulting surge in government and industrial R&D spending that helped shift inventive activity away from straightforward mechanical engineering to an affair deeply rooted in science—primarily physics and chemistry. With that transformation came a dramatic increase in the growth of corporate labs, to the point that the Washington-based Industrial Research Institute now counts close to 15,000 corporate labs in the United States alone. These organizations employ an estimated 750,000 scientists and engineers—some 70 percent of the total number of these professionals—and spend roughly $150 billion annually on research and development. All told, industry funds about 65 percent of the R&D conducted in the country, and performs closer to three-quarters. And this is only part of the story. In rough terms, the U.S. spending is matched by the rest of the world combined—meaning the saga of corporate research today is truly a worldwide tale.

    The research story goes far beyond corporations, of course. The drive for successful innovation encompasses government labs, universities, and even still the basement inventor—with the outcome dependent on a staggering array of factors from grade school education to state and federal tax credits, national R&D spending policies, and patent laws. However, companies reside on the leading edge of this ongoing, global struggle. They are the prime venue where New Knowledge is converted into Useful Products, and where success and failure can be most plainly gauged in terms of patents, market share, sales, stock prices, and the like. By focusing on this front, I hope to provide deep insights into invention and innovation today—from the trials and tribulations of management to what’s coming down the pike, and who has the edge in key technology-driven industries that will shape the way we live in the twenty-first century. To do this, I’ve gone inside the labs of some of the biggest names in innovation on three continents: IBM, Siemens, NEC, General Electric, Lucent Technologies, Xerox, Hewlett-Packard, Intel, and Microsoft.

    Of course, there is absolutely no assurance that the companies, or even the nations, whose labs generate key inventions are the ones who will reap the economic rewards. On the back of key discoveries, Britain served as the initial center of the chemical industry; but it was the Germans who rose to control global markets for the half century leading up to the First World War. Similarly, the United States became a leading economic power long before it achieved its present position of research and scientific dominance. The annals of industrial research history are rife with examples of usurpers riding to glory on the waves of other people’s innovations.

    Marketing, sales, service, packaging, distribution, manufacturing, and development, the close sibling of research, all remain vital cogs in the wheel of corporate fortune. Which is most important? “Well, it’s like the case of the drunken juggler,” notes John A. Armstrong, a former IBM vice president for science and technology. “Let’s say the juggler has two tennis balls and a flaming sword, and a club,” he supposes. “Now you say, which one of those things is most important? Well the thing that’s most important is the next one he’s got to catch. And success is keeping them all up in the air.”

    The point is that a host of factors can outstrip the benefits of the strongest research program, as indicated by recent woes at Microsoft, Xerox, and Lucent, whose labs are profiled here. Yet if wielded effectively, research can sharpen the vision of an otherwise all-too-hazy future, raising the chances for long-term success. This book focuses on the management philosophies, funding paradigms, incentives, and all the rest employed by top labs to do just that. There is no single formula; there exist many paths to maintaining vital research organizations. Which one is most successful depends on such factors as the economy, the industry, and the firm’s role in it—as well as corporate culture.

    Indeed, fitting into that culture—not just of the research organization but of the entire company—and helping scientists and engineers see themselves as part of a larger whole, turns out to be one of the keys to successful industrial research. Some company research arms, like Hewlett-Packard’s, possess a strong sense of history and a deep connection to the rest of the organization—an enviable situation made easier because the corporation itself is only about sixty years old, and by the fact that the founders and all the chief executives until Carly Fiorina took over in mid-1999 had been engineers. An outfit such as Bell Labs, by contrast, nurtures an equally strong research culture but became increasingly disconnected from AT&T, partly as a result of the sweeping success of its scientific investigations that transformed a corporate laboratory into a university-like situation. Combined with years of uncertainty over AT&T’s future, the result was a kind of slow death that was only averted by research managers taking drastic action. Even then, it wasn’t until the Bell Labs core spun off into a separate company, Lucent Technologies, that it rekindled the old spirit of innovation and reclaimed its place as one of the world’s great industrial laboratories, a position it maintains despite Lucent’s recent troubles.

    In many ways, the book is an extension of my work as BusinessWeek’s technology editor, where I oversaw coverage of corporate labs worldwide and planned and wrote major stories on innovation and research strategy, including those involving the now-defunct R&D Scoreboard, which tracked the research and development expenditures of individual firms in a variety of industries. But it is also greatly influenced by my research for a previous book, The Invention That Changed the World, which told the story of World War II radar development at M.I.T.’s top-secret Radiation Laboratory.

    In a sense, the Rad Lab was the first Manhattan project. The nation’s top physicists were recruited to the enterprise, mixing with engineers, mathematicians, and even biologists and astronomers, to develop radar systems that had a dramatic effect on the course of the war. Even more than that, though, the radar work played a critical role in the evolution of postwar science and technology. Besides having a direct impact on specific landmark creations and discoveries—including the transistor, nuclear magnetic resonance, wireless communications, microwave ovens, radio astronomy, and the maser and laser—the radar effort helped spark a whole new attitude about the management of research as a fast-paced, collaborative, and cross-disciplinary affair. Seeing this crucial aspect of the evolution of corporate research greatly increased my insight into what went on in corporate labs throughout the 1990s. Indeed, a lack of historical and evolutionary focus has far too often caused corporate research to be viewed simplistically. Many of the common perceptions are old, or wrong—and myths and easy stereotypes abound about the innovation process, as well as the very role and importance of research in modern corporations.

    One of the most fundamental of these misconceptions is the treatment of research and development as if they were one thing—namely, an enterprise called R&D. This is an easy trap to fall into, because spending on these two variables is always tracked together, and the goal of nearly every corporation is to fuse them better, so that products flow more quickly out of the lab into the marketplace. Nevertheless, the two are vastly different—in both substance and management approach.

    Development is the stage where ideas or prototypes are taken and engineered into real-life products, ready to be affordably mass-produced and able to meet reliability standards and fit into the existing infrastructure. It is far and away the bulk of the R&D equation—typically somewhere around 90 percent in the industries examined here. It is a sprawling, sweeping mass of a subject almost impossible to get one’s arms around in a comprehensible way.

    Research, by contrast, is where ideas are investigated, refined, and shaped into the beginnings of a new product, system, or process. Though it is an extremely small part of R&D, I’ve homed in on it because in the companies with a long-standing reputation for innovation—those that continually blaze trails and create new industries—invariably it is the research side that lights the way into the future. Although they often work on small-scale improvements to specific products, central research arms also concern themselves with the farther-out problems that often apply to different businesses and product lines. They garner a plurality of patents—usually the more important ones—and win the occasional Nobel Prize. Even more than that, central research is the focal point of most debates surrounding R&D policy. So, perhaps, there is more about it that needs to be illuminated.

    An additional widespread misperception lies in the idea that we progress linearly from science to technology, from research breakthrough to developing products and then the market. If this were true, then managing research might be far more straightforward than it really is. In actuality, science and technology constantly feed into each other on all sorts of levels. The semiconductors at the heart of modern computers trace their roots to advances in solid-state physics that gave birth to the transistor. Yet these breakthroughs owed a large debt to attempts to analyze and control the silicon and germanium crystals used as radar detectors during World War II. More recently, semiconductors have evolved technically—becoming ever smaller, faster, and more powerful. But now these devices are becoming so small that researchers envision building chips on the atomic scale—spurring places like Hewlett-Packard, IBM, and NEC to devote more resources to studies of basic quantum physics. Staying atop this interplay between science and technology, finding novel ways to channel the fruits of one into the other, and motivating people to do it, form key parts of the research challenge.

    Still another common pitfall is that in considering and debating research issues we tend to tackle the subject in sweeping terms, as if the same rules and conditions apply to every industry. But the uses of science and technology actually vary widely. In chemicals, one basic patent—say for nylon or Kevlar—can spawn an industry. But more usual is the situation found in the telecommunications industry. Breakthroughs do occur, but advances typically hinge on evolutionary change, so that it takes a multitude of piecemeal improvements to add up to a revolution. From the research and development standpoint, then, invention must be laid on top of invention—oftentimes other people’s inventions. And with companies today operating research labs on several continents, managing this give-and-take across geographic and cultural chasms can be a daunting challenge.

    Not to beat a dead horse, but a final dangerous practice comes in reading too much into the numbers. The general feeling seems to be that more research is better research. Therefore, when corporate R&D budgets declined sharply in the early 1990s (they have risen dramatically since 1994), it was perceived as a crisis. However, in some regards this was simply a spending readjustment after the boom times of the 1960s and 1970s—when it was easier to increase funding across the board—and therefore better reflective of reality. It also illustrates a refocusing of some resources. In the computer industry, for example, as hardware lines mature software has emerged as the driving force in sales and profits. AT&T recognized the trend back in 1990, when it slashed research and development spending 8 percent. Part of those cuts stemmed from shifting its focus from the infrastructure-heavy physical sciences to more streamlined software research. But while the move made sense, the pundits didn’t much care. The cuts were viewed almost as a national tragedy, even though in AT&T’s view it was doing what it was supposed to do: adapting to reflect the real world.

    Even if Bell Labs slashed its budget, so what? Merely spending money is no guarantee of success—and can even be a sign of poor R&D planning. IBM’s research and development outlays were once greater than the sum of the R&D budgets of its next dozen or so largest competitors—including DEC, Hewlett-Packard, Hitachi, and Fujitsu—yet Big Blue was far from a sure bet in the marketplace. Big budgets can mean the research pie is spread too thin, or locked on the wrong target, so slashing them actually can be a positive sign that a company is tightening its focus.

    To provide a more accurate portrait of the research endeavor, and to unearth the secrets of standout management, I have profiled labs in Europe, Japan, and the United States. Along the way, I have visited with everyone from top policymakers to individual scientists, from long-time veterans to fresh hires. And I have looked at projects running the gamut from creating a better electric range to fashioning transistors from individual atoms. In this way, I hope to illustrate a range of actions and strategies relevant to the debate, while also bringing readers inside the labs and showcasing the often exciting technologies making their way, ever faster, toward the market, the office, and the living room.

    The nine companies profiled have been selected on the basis of extensive background reading, site visits, and scores of interviews as being among the world’s best in the computer-telecommunications-electronics sectors. Although Siemens and NEC mark the only corporations examined that are based outside the United States, nearly all the companies operate labs on multiple continents. All told, I visited more than two dozen facilities in Asia, Europe, and the U.S., including several run by firms that were not profiled but still contributed greatly to my overall understanding of how research operates.

    Often, I try to illustrate management philosophies through descriptions of individual projects. I look mainly at success stories. However, the research organization that does not have a significant portion of failed projects is not pushing the envelope enough. Therefore, I have also included examples of some seeming failures—including IBM’s well-over-$100-million effort to develop a revolutionary class of computers based on Josephson junction principles and Microsoft’s Talisman technology for rendering high-quality PC graphics—as well as a fresh look at the lessons learned from Xerox’s legendary fumble of the pioneering personal computing advances from its famous Palo Alto Research Center (PARC), several of which provide insight into the firm’s current problems.

    Because central research arms are so big—often more than a thousand people—I have focused on projects and organizational issues that stand out as unique or especially evocative of the management philosophy. General Electric’s research arm, for example, is unusual for leading one aspect of the company’s Six Sigma quality initiative: a program called Design for Six Sigma. The idea is that achieving the highest quality products cannot be guaranteed only through traditional manufacturing initiatives, but must be built into products from the get-go—and that often means starting in the research lab. At Lucent Technologies, the focus is on Bell Labs’ famous Physical Research Laboratory, home to its farthest-out projects—everything from new kinds of lasers to mapping the dark matter of the universe. The Xerox profile concentrates largely on the role of anthropology and the social sciences at PARC.

    The book is divided into three main sections. The first sets the table, surveying the global situation and then laying out the basically optimistic theme—that industrial innovation continues with more vigor than is typically suspected. To more fully develop this picture, I have tried to place modern corporate research in its rarely understood historical context—a vital framework for understanding change and distinguishing fundamental shifts from periodic swings of the pendulum. This task involves tracing the rise of industrial research, from its origins in the German dye industry of the late 1800s through the establishment of GE’s pioneering facility in 1900 and the Bell Telephone Laboratories on New Year’s Day 1925. It chronicles the tremendous influence of World War II on today’s labs and management style, and shows how the first two postwar decades—a time of unprecedented American hegemony—warped the common view of what corporate research should be, a view that ended abruptly with the resurgence of European firms and the rise of the Japanese. The section concludes with a detailed examination of the reasons behind what former HP Labs director Joel Birnbaum once termed the “research bloodbath” that hit labs in the late 1980s and early 1990s, with an eye on newly evolving strategies for harnessing and encouraging research in the twenty-first century.

    The second part consists of detailed looks inside the research operations of three electronics and computer giants. One is IBM, arguably the world’s most powerful corporate research house when it comes to the physical sciences. Of Big Blue’s eight research arms, I have visited four—the main Thomas J. Watson Research Center in Yorktown Heights, New York, and three satellites—in Zurich, Tokyo, and San Jose. And I have discussed the work at the other four, in Delhi, Tel Aviv, Beijing, and Austin, Texas.

    The remaining two firms are staunch, longtime competitors from different continents—Siemens in Germany, the largest research spender outside the United States, and NEC in Japan, which has shown the most dramatic gains of all Japanese firms in its patent portfolio during the 1990s while also swimming against the tide by pursuing an exciting array of basic studies—both at home and at its unique lab in Princeton, New Jersey. Like IBM, both Siemens and NEC maintain facilities and laboratories around the world—and I have visited the key sites and spoken to all the top research leaders.

    The book’s final section consists of three additional chapters, each containing shorter profiles of two companies paired together for their historical connection, intense competition, or some combination of these factors.

    The Pioneers chapter looks at General Electric and Bell Labs. Their venerable laboratories helped define the institution of corporate research early in the century and competed heavily for decades in the infant days of radio and telephony, but have since drifted onto separate paths as their companies evolved apart. Both have endured major trauma and make-overs in recent years, but have emerged as leading centers of innovation. The next chapter, Children of the Sixties, examines the central research arms of Xerox and Hewlett-Packard, two much newer organizations that arose during the heyday of science in the 1960s. These two enterprises followed a path diametrically opposed to that taken by GE and Bell Labs: they did not compete for more than two decades but have recently become fierce antagonists in copiers, facsimile machines, laser printers, and general office technology. Finally, The New Pioneers offers the first detailed looks inside the upstart research labs of Intel and Microsoft. These modern-day giants followed the path of the original Pioneers by waiting until achieving dominance in their industries before creating long-range research ventures: both opened labs in the 1990s that are seeking to shape the future of personal computing.

    I am not, nor have I ever been while researching and writing this book, an investor in any of the firms I have profiled. I have not invested in or been associated with any of their direct competitors—nor have I ever served as an employee or consultant to any of them. In July 1999, I participated in a Bell Labs panel for its Global Science Scholars Summit, for which I received an honorarium. Otherwise, the extent of my financial indebtedness to them runs to a few meals and one tee-shirt. However, I am greatly beholden to all these companies for their open cooperation, as well as their willingness to discuss complicated issues and patiently answer all my follow-up questions in an effort to cultivate a deeper understanding of managing innovation in modern times.

    Although I have covered business issues and technology for the better part of two decades, I have never been a particular fan of corporate management or practices. I have always approached the companies I cover with a somewhat wary, even adversarial, eye—one that recognizes they are striving to put their best foot forward and are not about to tell an outsider complete details of their future projects and plans. All that aside, however, I learned a lot. Writing a book is far different from banging out a magazine or newspaper article. Because of my project’s long time frame—nearly three years all told—I was able to visit top managers time and time again and really listen to and test what they were saying, luxuries virtually impossible with normal news deadlines and a job that requires constant flitting between subjects. The same held true when dealing with the researchers. In some cases, I followed their work for two years, checking things periodically to stay on top of developments.

    The overall picture is one of optimism—not in the sense that research operations are wonderful entities that don’t make mistakes, or even that those profiled here will stay atop their fields. Rather, I’m optimistic in the broader sense—that no matter the state of any individual firm, the enterprise as a whole is advancing. Progress is not always pretty, or even steady. But as a rule, the best corporations learn from their mistakes and strive continuously to improve the innovation machine, whipping inventions into shape faster and tailoring them to better meet customer needs—while still maintaining the balance between so-called incremental improvements in existing products and more pathbreaking work that can create new lines of business, even entire industries.

    This last issue, concerning the balance between short-term and longer-range studies, basic or strategic research and applied projects, lies at the heart of much of the debate over the direction of industrial research today. It comes up repeatedly throughout the book. Basic research has always been a small part of corporate labs, despite all the hullabaloo. Nevertheless, my finding that the enterprise remains fundamentally healthy flies in the face of conventional perception.

    This is not to deny that corporate labs have undergone a major transformation in recent years. Arno Penzias, who led Bell Labs through its darkest days in the early 1990s, says the heyday of basic industrial research—at least in the physical sciences—actually came in the 1960s and 1970s, around the time U.S. cars lost their fins. Since then the field has undergone a series of ups and downs that include the dramatic cutbacks of the late 1980s and early 1990s, followed by a slow and cautious rebirth in recent years. It’s hardly like the old days, though. Research arms have become a lot more focused on corporate needs and goals. As with all other aspects of business today, they are judged increasingly by productivity and quality measures—and must be accountable for their actions.

    All this is necessary and good. But while facing this reality, the best companies—and all those profiled in this book—remain keenly aware that without a program of bold, longer-term research they run the risk of falling dramatically behind and never catching up. As Charles Kettering said when talking about his change-making department: “The Lord has given a fellow the right to choose the kind of troubles he will have. He can have either those that go with being a pioneer or those that go with being a trailer.”

    This book is mainly about the pioneers—old and new—who chose the troubles that go with managing innovation on a variety of time frames, including those far beyond the horizon. They provide the framework and engines of tomorrow.

    CHAPTER ONE

    A MATTER OF DEATH AND LIFE

    Research is a high-hat word that scares a lot of people. It needn’t. It is rather simple. Essentially, it is nothing but a state of mind—a friendly, welcoming attitude toward change….

    —CHARLES KETTERING

    THE RESURRECTION of Bell Labs research was conceived on a warm August night, in a flooding of Arno Penzias’s soul almost as swift and total as an epiphany. The year was 1989. The lab’s Nobel Prize–winning research vice president was vacationing at Tanglewood, home to a series of summer music concerts in the Berkshires town of Lenox, Massachusetts. But he wasn’t relaxing. A few months earlier, AT&T president Robert Allen had outlined plans for breaking the company into a series of business units designed to help it compete better in fast-moving markets—and Penzias had spent much of the day on the phone, preparing for his upcoming annual budget presentation and fretting over how to keep his scientists comfortable while also proving to senior management that research would pull its weight under the new structure. That evening, Penzias tried to relax by attending a Boston Symphony Orchestra performance. Only instead of enjoying the music, he couldn’t shake the feeling that something had gone terribly wrong with his great research organization.

    “My gut was starting to grab,” Penzias remembers. “I was listening to Beethoven’s Third and my stomach was churning and I didn’t know what to do.” Suddenly, it hit him: the core research side of Bell Labs—about a tenth of the overall operation—had in many ways become outdated. Since its 1925 founding, the great enterprise had enjoyed a long and glorious ride highlighted by thousands of patents and seven Nobel Prize winners, including himself. But AT&T was no longer a regulated monopoly, able to run research at a leisurely, academic-like pace—and instead of worrying about writing scientific papers or building the world’s smallest laser, it needed to focus far better on business objectives.

    Penzias had resisted accepting this reality for several years. But everything changed for him that summer evening, in light of Allen’s recently announced decentralization plans and surrounded by Beethoven’s powerful music. “I just couldn’t do it anymore,” Penzias relates. “Somewhere along the line the business units were going to need help, and we were going to have to do something about it…. And I realized I had more power to make that happen than I had thought.”

    Within a few weeks, Penzias had begun plotting a dramatic overhaul of AT&T’s research activities. It took months of careful and exhaustive study. Seeking to identify redundant programs and inefficient practices, he launched a massive review of all endeavors—calling in a management consultant for strategic guidance. Then, on July 20, 1990, he presented his make-over in Bell Labs’ big auditorium, stunning those assembled by vowing to reverse the trend toward university-like research that had come to characterize the place. Penzias declared that he was eliminating entire laboratories and that the organization would shift many resources to software studies—curtailing much of the physical sciences explorations that had made Bell Labs famous. “AT&T’s senior management continues to invest more than one million dollars every working day in our work, and they give us the freedom to use it as we think best,” he explained to his researchers. “That gives me a deep sense of responsibility to use our resources and our freedom wisely.”

    The changes put in effect that day shook Bell Labs to its soul. Soon, Penzias’s policies were attracting headlines, inciting cries of outrage from national policy leaders that the legendary enterprise was abandoning the kind of fundamental scientific investigations that had given rise to the transistor and other great advances.

    Even as the Bell Labs research head carried out his plans, a similar story was unfolding at IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York. There, at the helm of another of the world’s great corporate research organizations, James McGroddy was spearheading his own brutal self-examination with that same bad feeling in his gut: a growing awareness that despite its own sparkling history of five Nobel Prize winners, research was not making much difference to the company.

    Ever since 1990, much like his counterpart Penzias, McGroddy had been working to couple activities more tightly to IBM’s various businesses—concentrating more on providing customers with solutions rather than just technology. By late 1992, with the coming year’s budget under fire, he had also begun to streamline support services, eliminate redundant or seemingly dead-end programs, and find outside sources of funding by bringing in government contracts or launching spinoff companies. Prophetically, with the company in dire straits, McGroddy felt certain chairman John Akers would soon resign. Since any replacement likely would know little of the research operation, he prepared a complete documentation of the value the division created for and provided to IBM.

    The sixteen-page document was only a few months old when new chairman and chief executive officer Louis Gerstner, Jr., arrived on the scene. The former head of American Express and RJR Nabisco liked to move fast. On April 1, 1993, the very day his appointment was announced, he called senior executives, including McGroddy, to the big board room at IBM’s Armonk headquarters. The new boss asked his execs for a ten- or twelve-page description of their segment of the company to bring him up to speed on Big Blue’s place in the world—a first step in determining the hard measures needed to put the lethargic giant back on the fast track. In particular, he wanted to know the details of each business, the economic model it followed, its customers and competition, strengths and weaknesses. The reports were to be submitted within a few weeks—and after Gerstner had time to study the documents, he would schedule a meeting with each executive to discuss things in more depth.

    In essence, Gerstner sought the very kind of report McGroddy had recently wrapped up on his own—a well-written statement of organizational practices and goals, coupled with a frank assessment of strengths and weaknesses, and an action plan for the future. McGroddy went back to the Watson center and slashed four pages from the report—turning the document over to Gerstner a few days later. Research was handily the first group to complete its assignment—and the Watson lab was Gerstner’s first IBM visit outside headquarters. The chairman liked what he saw. At the end of the five-hour visit, he told McGroddy that he had long served on the AT&T board, where he had witnessed the troubles associated with a great research lab that made little impact on customers or the bottom line. “I want to thank you for not putting that problem on my plate here at IBM.”

    Over the next two years, driven partly by McGroddy’s vision, Gerstner eliminated one-third of IBM’s total R&D budget, moving from about $5.1 billion a year to a tad under $3.4 billion. It didn’t matter that inside IBM the battle had been won—and that the new chief executive began featuring research as a key to the company’s revitalization. Just as with AT&T before it, many leading academics and policy wonks concluded Big Blue was abandoning science, and with it the future. You could hear the screams from Harvard to Washington, D.C.

    Jump forward almost a decade, to the last years of the twentieth century. Arno Penzias and Jim McGroddy are both retired, though active in a multitude of research and development issues. Picture Penzias, still a Bell Labs consultant, just arrived at lab headquarters in Murray Hill, New Jersey, from his home in San Francisco. The cathedral-like lobby harbors a firsthand compendium of the electronics age. Display cases contain the original transistor and a model of Alexander Graham Bell’s pioneering telephone. A Telstar 1 satellite hangs from the ceiling, while a series of plaques describe the pioneering work of the lab’s now eleven Nobel laureates—including Penzias’s 1965 discovery, with colleague Robert Wilson, of the cosmic background radiation presumably left over from the Big Bang.

    But the former research director pays them no heed. Still fit in his late sixties from swimming and running, a laptop computer slung over one shoulder, he strides purposefully past the guard counter into the facility itself—where it’s a whole new day. Whereas many once considered Bell Labs a national asset, today’s operation belongs solidly to its stockholders. Fundamental investigations have been scaled back, especially in Penzias’s beloved physics. Arun Netravali, his chosen successor, later named head of all Bell Labs, emphasizes connection to business goals and customers. Projects lean to the shorter term, with many managers attuned far more to products than scientific accomplishments. And it’s working. Under the tutelage of rising star Lucent Technologies—one of three companies formed by AT&T’s 1996 self-inflicted breakup—the lab has unleashed a cavalcade of innovations that, Penzias asserts, “by my count have added tens of billions of dollars to Lucent.” He gets good feelings from Beethoven these days.

    Jim McGroddy seems just as content. While serving on a variety of small company boards and government or professional committees, he devotes much of his time to a nonprofit business—Advanced Networking and Services—that promotes high school science education. At the research chief’s retirement party in late 1996, chairman Lou Gerstner told how he had considered breaking up the Research division and parceling its resources and personnel out to individual business units so that those assets could be brought directly to bear on Big Blue’s product needs. But McGroddy’s tremendous first impression convinced him to stay the course. Within a few years, the research ship had turned around almost completely. Three new labs opened. Meanwhile, despite what was widely perceived as savaging its R&D budget in 1993 and 1994, IBM began winning more U.S. patents than any group in the world—a streak that in 1999 reached its seventh consecutive year. “People bitch about input measures,” states McGroddy. “But where’s the missing output?”

    Corporate research is dead. Long live corporate research. From the hallowed, warren-like corridors of Bell Labs to the graceful stone-and-glass crescent of the Thomas J. Watson Research Center, from Xerox’s famed Palo Alto Research Center (PARC) cascading down the sunny California hills to the glimmering high-tech sheen of NEC’s basic research lab in Tsukuba or the fish-stocked ponds of Siemens’s sprawling Erlangen facility, come stories of trauma and renewal, death and rebirth.

    The dark days lingered for years—as the research bloodbath in many ways spurred by the big early-1990s upheavals at Bell Labs and IBM rolled around the world. In 1992, after decades of growth that generally far outshot inflation, industrial research spending dropped in real terms. It continued to spiral down the next two years, an unprecedented period of decline that affected nearly every major research lab in computers, telecommunications, and… name-your-industry.

    Those sweeping cutbacks have drawn impassioned cries of lament—and outrage. The biggest concern is nothing less than the fate of national economies. With research costs soaring, and global competition intensifying, various experts have warned repeatedly that corporations are putting the brakes on science. The gravest danger, the argument goes, is that companies have focused so much on the D side of R&D that they are forsaking the more fundamental R work that creates the breakthroughs needed to spawn new industries. The New York Times summed up these sentiments with a front-page article late in 1996: “Basic Research Is Losing Out as Companies Stress Results.” The Times warned the resulting shortfall of new technology could one day shackle the economy.

    But the various forces at work have been widely misunderstood. The axed laboratories and slashed budgets that characterized much of the 1990s obscured a vitally needed realignment—and the revival of corporate research has been almost universally missed. Rid of many of their vestigial ways and bad habits, the best labs today have moved research into another dimension—shifting their orientation beyond the old standards of merely inventing things to the more ambitious problem of innovation, artfully described by Xerox PARC director John Seely Brown as “invention implemented.”

    This drive for innovation takes a different face at every company and evolved at varying times and rates for each. But everywhere the aim is to vanquish the old linear model that says ideas progress from research to development to manufacturing to market in favor of a much more dynamic enterprise that includes constant interactions along this entire chain. Innovation requires researchers to simultaneously seize responsibility for escorting creations through corporate and marketplace barriers, listen to advice from inside and outside the company, retool their work based on that feedback, and do anything else necessary to get products to the customer.

    It’s true that in the face of new realities, including higher costs and stiffer competition, company research arms have had to scale back some longer-range projects. A hard line was especially needed in the ferociously competitive and tumultuous computer, telecommunications, and electronics industries, where technologies have converged at lightspeed to bring about what economist Raymond Vernon calls “the spectacular shrinkage of space.”

    But while the pressures of rapid change have forced research to be far more tightly coupled to the here and now—or at least the sooner rather than the later—the change is actually positive. In Europe, Japan, and around the United States, companies are evaluating projects with greater care, finding more effective ways of conducting research, and bringing innovations to market faster, spurring economic growth in the process. Tom Anthony, who with more than 160 patents ranks as the third-most-prolific inventor in General Electric’s glorious research history, puts it this way: “In the old era, if you did something the chance of a business using it was zilch, so you couldn’t get that satisfaction. Now, if something works, there’s a really good chance it will be used.”

    Even more, the best labs today have regained their equilibrium and begun to see potential opportunity and wealth in all the mayhem. As colleagues complain about brutal and often unfair competition from the far corners of the globe, these leaders point to untapped resources, collaborators, and ultimately new markets. And while some pundits scream about a shortsighted focus on incremental improvements to existing products that shortchanges the kind of path-breaking investigations that gave the world the transistor, the new guard stresses the payoffs from making hard choices that fit the times.

    But basic research—a better term in the case of corporations is pioneering or strategic studies—is far from dead. In fact, as companies recover from the financially brutal early and mid-1990s—industrial research spending in the United States has actually climbed almost 5 percent annually since 1996—one can even find clear signs of a resurgence in more fundamental pursuits. After big cuts early in the decade, IBM is once again cautiously increasing its long-term research. Hewlett-Packard has nearly tripled research spending in recent years while ramping up basic or pioneering investigations into atomic structure and chaotic systems. Microsoft plans to boldly expand its fledgling research organization to six hundred staffers by mid-2000 as it explores such far-out software and computer science issues as advanced interactivity and decision systems. Meanwhile, in 1995 Intel launched a research arm devoted to long-term studies in such areas as computer architectures and user interfaces.

    What’s different today is that such far-out prospecting exists in an era of increased attention to corporate relevance in which projects are chosen more carefully to bear on areas likely to benefit the firm. Xerox’s John Seely Brown likes to talk of a bold but grounded approach to research. “You get steeped in the real problems of a corporation and then go to the root and reframe when necessary,” he explains. To pull off such a feat, researchers must combine fundamental studies with routine interaction with customers and counterparts from all around the company in order to develop products that not only work, but are needed in the marketplace. The way things used to be, Brown notes, “We were these elite scientists sitting in this building inventing the future. Already, talking about ‘inventing the future’ smacks exactly of the ontological problem. We don’t invent the future. We can help enact the future—but we must work with others in making that happen.”

    And what a future is in the works. The world unfolding today is at once fully wired—and wireless. It’s an era of super-smart cars that anticipate when a driver’s about to turn or change lanes and make sure it’s safe—and of calm computing, when network computers and web servers will be quietly and invisibly embedded in appliances and walls, allowing clocks to reset themselves after a power failure, or paint to detect intruders. It’s a world of atomic-scale transistors, spurred to life by that universal high-tech mantra: smaller, better, faster, cheaper. But it’s also a time when the transistor has taken another path: bigger, worse, slower—only way cheaper. Under this vision, low-tech plastic transistors that can be mass printed without clean rooms, etching, lithography, and all the rest are used to smarten up new generations of toys, luggage tags, and appliances—or even teamed with sensors to flash up-to-the-minute freshness dates for drinks and vitamins that take into account storage conditions.

    Above all, the world being created today is friendlier. Computers turn on as users sit down, talk to their owners, even recognize basic moods and do things like hold calls after detecting a look of annoyance when the phone rings. It’s a world of easier scrolling and easier searching through the Internet, based on intuitive combinations of words and pictures. Web surfers have been freed from the computer terminal by systems that enable people to listen to the Internet and check e-mail over their cell phones or car radios. Then, too, there’s the vision of personal area networks, in which people carry a specialized smart card that transmits a digital aura and securely conveys their bank account, driver’s license, access codes, and so forth without the need to swipe cards through readers. Imagine: you walk onto a train or plane and are billed automatically, or rent cars and check into hotels without ever stopping in a line, slowing down only to scan a computer screen to find your parking space or room number.

    By all indications that’s just a start of the fun that may be in store, for change is coming fast—and, like some benevolent rust, good research today never sleeps. Those at the best labs not only accept the fact, they like it that way. At General Electric’s research center overlooking the Mohawk River outside Schenectady, materials scientist Minyoung Lee says tough competitors drive him to greater accomplishments. “It’s no fun when you compete with a dummy,” the Korea native asserts. “Then you got no work to do.”

    The days of one-shift isolation, where scientists ponder problems all day—and maybe into the night—in a university-like lab and worry mainly about beating scientific competitors to publication, are fast dying out. In select areas like programming the new era is moving closer to three-shift innovation. Computer whizzes attack a problem throughout the day, hand it off to counterparts a few time zones away as their stint in the lab begins, only to have it relayed eight hours later to a third shift in yet another part of the globe before returning to the roost to begin the next morning. “You have the program developed as the Earth turns,” marvels Bell Labs president Arun Netravali, “and that’s a very exciting concept.” Exults Robert Spinrad, vice president of technology strategy for Xerox: “I visualize an image of the globe turning in space, and we begin at the interface between morning and day as people dash off to work on the same problem. It’s fabulous what’s going on.”

    If this portrait of surging vitality and renewed spirit flies in the face of headlines bemoaning the effects of all the early and mid-1990s funding cutbacks and reconfigurations kicked off by research leaders like Arno Penzias and Jim McGroddy, it’s because too many people don’t understand the evolution that has taken place and continue to mourn the old regime.

    Corporate labs evolved not to produce scientific breakthroughs, but to bridge the gap between science and technology and create useful products. The first industrial research houses arose in the German dye industry of the 1870s, when manufacturers grew wary of their dependence on buying patent rights from independent chemists and realized the advantages of establishing their own laboratories. The practice soon spread beyond Europe. In late 1900, with Thomas Edison no longer a presence in company affairs and his key patents expiring, General Electric decided that its own lab could be critical to maintaining leadership in the electric lighting arena.

    On the heels of the GE move, DuPont, Eastman Kodak, and others opened research arms. World War I saw a host of firms going full-out to fulfill military contracts and dramatically increase the scope of corporate research—largely to reduce U.S. dependence on imported dyes, chemicals, drugs, metals, and other products. The push to better control the future through research continued during the interwar years. New Year’s Day 1925 saw the creation of the Bell Telephone Laboratories on the western border of Greenwich Village. By then more than five hundred American corporations ran their own research shops.

    Occasionally these labs engendered fundamental scientific breakthroughs. In 1932, for instance, GE’s Irving Langmuir won the Nobel Prize for his contributions to surface chemistry. But science was never the real aim of industrial labs. Rather, writes historian of technology George Basalla in The Evolution of Technology, the point was to employ research scientists to advance industrial goals.

    World War II, though, fueled another era in research. The success of the atomic bomb effort at Los Alamos and the even bigger project to develop microwave radar at the M.I.T. Radiation Laboratory set the tone for an upbeat style of research: fast-paced and involving large interdisciplinary teams of engineers, biologists, mathematicians, and physicists—both experimentalists and theorists. The war was all about applying their skills to turn out new and better weapons and technologies—and in the process scientists became national heroes, holders of the keys to the future. As the Cold War heated up, warming with the first Soviet nuclear explosion in August 1949 and then sizzling with the 1957 Sputnik launch, the United States witnessed an unprecedented federal openness toward funding basic science in universities, government labs, and industry, especially when it came to farther-out projects with possible military applications.

    Such factors helped spur phenomenal growth in the numbers of industrial labs—and with the great expansion came a far more
