International Business Machines turns 100 today without much fanfare. But its much younger competitors owe a lot to Big Blue.
After all, where would Groupon be without the supermarket bar code? Or Google without the mainframe computer?
“They were kind of like a cornerstone of that whole enterprise that has become the heart of the computer industry in the U.S.,” says Bob Djurdjevic, a former IBM employee and president of Annex Research.
IBM dates to June 16, 1911, when three companies that made scales, time clocks and other machines merged to form the Computing Tabulating Recording Co. The modern-day name followed in 1924.
With a plant in Endicott, N.Y., the new business also made cheese slicers and — significantly for its future — machines that read data stored on punch cards. By the 1930s, IBM’s cards were keeping track of 26 million Americans for the newly launched Social Security program.
The force behind IBM’s early growth was Thomas Watson, a demanding boss with exacting standards for everything from office wear (white shirts, ties) to creativity (his slogan: “Think”).
Watson, and later his son, Thomas Watson Jr., guided IBM into the computer age. Its machines were used to calculate everything from banking transactions to space shots. As the company swelled after World War II, IBM threw its considerable resources at research to maintain its dominance in the market for mainframes, the hulking computers that power whole offices.
“When we did semiconductors, we had thousands and thousands of people,” says Donald Seraphim, who worked at IBM from 1957 until 1986 and was named a fellow, the company’s highest honor for technical achievement. “They just know how to put the force behind the entrepreneurial things.”
By the late ’60s, IBM was consistently the only high-tech company in the Fortune 500’s top 10.
It introduced the magnetic hard drive in 1956 and the floppy disk in 1971. In the 1960s, IBM developed the first bar code, paving the way for automated supermarket checkouts. It introduced a high-speed transaction-processing system that made ATM withdrawals possible, and it created the magnetic stripe technology used on credit cards.
For much of the 20th century, IBM was the model of a dominant, paternalistic corporation. It was among the first to give workers paid holidays and life insurance.
But by the 1980s, Big Blue found itself adrift in a changing technology environment.
IBM had slipped with the rise of cheap microprocessors and rapid changes in the industry. IBM introduced its influential personal computer in 1981 but, in an infamous blunder, passed on buying the rights to the software that ran it, made by a startup called Microsoft.
IBM helped make the PC a mainstream product, but it quickly found itself outmatched in a market it helped create. It relied on Intel for chips and Microsoft for software, leaving it vulnerable when the PC industry took off and rivals began using the same technology.
With its legacy and very survival at stake, the company was forced to embark on a wrenching restructuring.
One of its achievements turned out to be re-engineering itself during the upheavals of the 1990s. Viewed as too bureaucratic to compete in fast-changing times, IBM tapped an outsider as CEO in 1993 to help with a turnaround.
Louis Gerstner, a former executive with American Express and RJR Nabisco, had little knowledge of technology or IBM culture. In his first meeting with top IBM executives, he was the only one in the room with a blue shirt.
Gerstner resisted pressure to break up the company and instead focused on services, such as data storage and technical support. The change in strategy was risky for a company that helped create the PC industry, yet IBM rose to become the world’s biggest technology services provider.
With around $100 billion in annual revenue today, IBM is ranked 18th in the Fortune 500. It’s three times the size of Google and almost twice as big as Apple. Its market capitalization of around $200 billion beats Google and allowed IBM last month to briefly surpass its old nemesis, Microsoft.