The 1980s

US History

The Economic Transformation of the 1980s: Back to the Future

The changing character of the economy at the century’s end exemplified the sometimes dazzling rapidity of movement in modern American society.

 

As early as 1956, "white-collar" workers had outnumbered "blue-collar" workers, marking the passage from an industrial to a "postindustrial" era.

 

In the following decades, employment in the older manufacturing industries increased only modestly and by the 1970s was even decreasing in depressed "smokestack" industries like steel.

 

The fastest-growing employment opportunities were in the service sector—notably information processing, medical care, communications, teaching, merchandising, and finance.

 

Growing especially lustily was government, which, despite Reagan-era cutbacks, employed about one in seven working Americans in the 1990s.

 

Contrary to popular belief, federal payrolls grew only modestly in the post-World War II years, whereas the number of state and local government employees more than tripled.

 

White-collar workers constituted some 80 percent of the U.S. work force in the 1990s.

 

They proved far less inclined to join labor unions than their blue-collar cousins, and by 1990 only about 16 percent of workers were unionized, down from a high point of nearly 35 percent in the 1950s.

 

Organized labor withered along with the smokestack industries in which it had previously flourished.

 

Some observers concluded that the trade-union movement had played out its historic role of empowering workers and ensuring economic justice in the industrial age, and that it would gradually disappear altogether in the new postindustrial era.

 

America’s economic well-being in the new century would depend as never before on harnessing scientific knowledge.

 

High-technology industries like aerospace, biological engineering, and especially electronics defined the business frontier.

 

From its crude beginnings with Germany’s "V-2" rockets at the end of World War II, rocketry had advanced to place astronauts on the moon in the 1960s, produce a reusable space shuttle in the 1980s, and inspire talk of a manned Mars landing by century’s end.

 

But the breakthroughs had also equipped the United States and the Soviet Union with bristling arsenals of intercontinental nuclear weapons.

 

When scientists unlocked the secrets of molecular genetic structure in the 1950s, the road lay open to breeding new strains of high-yield, bug- and weather-resistant crops; to curing hereditary diseases; and also, unfortunately, to unleashing genetic mutations that might threaten the fragile ecological balance of the wondrous biosphere in which humankind was delicately suspended.

 

As technical mastery of biological and medical techniques advanced, unprecedented ethical questions emerged.

 

Should the human gene pool itself be "engineered"?

 

What principles should govern the allocation of human organs for lifesaving transplants, or of scarce dialysis machines, or artificial hearts?

 

Was it wise in the first place to spend money on such costly devices rather than devoting society’s resources to improved sanitation, maternal and infant care, and nutritional and health education?

 

Who was the rightful parent of a child born to a "surrogate mother" or conceived by artificial insemination?

 

The invention of the transistor in 1948 touched off a revolution in electronics, and especially in computers.

 

This revolution was less ethically vexing than the upheaval in biology, but it had profound social and economic consequences.

 

The first electronic computers assembled in the 1940s were massive machines with hundreds of miles of wiring and thousands of fickle vacuum tubes.

 

Transistors and, later, printed circuits on silicon wafers made possible dramatic miniaturization and phenomenal computational speed.

 

By the 1990s an inexpensive pocket calculator contained more computing power than the room-size early models.

 

Computers utterly transformed age-old business practices like billing and inventory control and opened new frontiers in areas like airline scheduling, high-speed printing, telecommunications, and space navigation—and complex military weapons systems like Ronald Reagan’s Strategic Defense Initiative (SDI) as well.

United States Steel Corporation, formed in 1901, had been the flagship company of America’s early twentieth-century industrial revolution, which emphasized heavy industry and the building of the nation’s economic infrastructure.

 

Companies that could efficiently produce the basic building blocks of an industrial civilization reaped the greatest rewards.

 

In turn, General Motors was the leading firm in the shift to the economy of mass consumerism that first appeared in the 1920s and flourished in the 1950s.

 

In this phase of economic development, success depended on the high-volume production of inexpensive, standardized consumer products.

 

In the postindustrial economic order emerging by the 1970s, the awesome rise of International Business Machines (IBM) symbolized yet another shift.

 

The computer’s capacity to store, manipulate, and communicate vast quantities of data heralded the birth of the "information age."

 

In the century’s last decade, even mighty IBM showed signs of obsolescence.

 

The electronics revolution accelerated fantastically in the information age, spawning hundreds of relatively small firms like Apple, Microsoft, Intel, and Sun Microsystems.

 

These nimble newcomers consistently outmaneuvered and outcompeted the lumbering, bureaucracy-bound IBM in the furiously innovating information and telecommunications marketplaces.

 

Their successes suggested that the business landscape of the future would not be dominated by the kind of giant corporations that had first appeared in the nineteenth century, but by a myriad of lean, entrepreneurial companies able to adapt quickly to the dizzying pace of technological change.

 

The nuclear family, once prized as the foundation of society and the nursery of the Republic, suffered heavy blows in postwar America.

 

Divorce rates doubled in the decade after 1965, and by the 1990s one out of every two marriages ended in divorce.

 

Seven times more children were affected by divorce than at the turn of the century, and kids who commuted between separated parents were becoming commonplace.

 

The 1950s image of a family with two parents, only one of whom worked, now provided a virtually useless way to think about America.

 

Traditional families were not only falling apart at an alarming rate but were also increasingly slow to form in the first place.

 

The proportion of adults living alone tripled in the three decades after 1950, and by 1990 nearly one-third of women aged twenty-five to twenty-nine had never married.

 

In 1960, 5 percent of all births were to unmarried women, but by 1990 one out of six white babies, one out of three Hispanic babies, and an astounding two out of three African-American babies were born to single mothers.

 

Every fourth child in America was growing up in a household that lacked two parents.

 

Some critics claimed that this collapse of the traditional family was a deeper cause of poverty than any shortcomings in the economic or the political system.

Child-rearing, the family's foremost function, was being increasingly assigned to "parent-substitutes" at day-care centers or schools, or to television, the modern age's "electronic babysitter."

 

Estimates were that the average child by age sixteen had watched up to fifteen thousand hours of TV, more time than was spent in the classroom.

 

Born and raised without the family support enjoyed by their forebears, Americans were also increasingly likely to be lonely in their later years.

 

Most elderly people in the 1990s depended on government Social Security payments, not on their loved ones, for their daily bread.

 

The great majority of them drew their last breaths not in their own homes but in hospitals and nursing facilities.

 

From youth to old age, the role of the family was dwindling.

 

Old age was increasingly likely to be a lengthier experience for Americans, who were living longer than ever before.

 

A person born at the dawn of the century could expect to survive less than fifty years, but a white male born in the 1990s could anticipate a life span of more than seventy-two years.

 

His white female counterpart would probably outlive him by seven years. (The figures were slightly lower for nonwhites, reflecting differences in living standards, especially diet and health care.)

 

The census of 1950 recorded that women for the first time made up a majority of Americans, thanks largely to greater female longevity.

 

Miraculous medical advances lengthened and strengthened lives.

 

Noteworthy were the development of antibiotics after 1940 and Dr. Jonas Salk's discovery in 1953 of a vaccine against a dreaded crippler, polio.

 

Longer lives spelled more older people.

 

One American in nine was over sixty-five years of age in the 1990s,

 

and projections were that one of every six people would be in the "sunset years" by 2030, as the median age rose toward forty.

 

This aging of the population raised a host of political, social, and economic questions.

 

Elderly people formed a potent electoral bloc that successfully lobbied for government favors.

 

In 1977 the "wrinkled radicals" scored a major victory when the California legislature abolished mandatory retirement at age sixty-five, and the following year the federal Congress passed similar legislation.

 

These triumphs for senior citizens symbolized the fading of the "youth culture" that had colored the American scene in the first three post-World War II decades.

 

Medical gains also brought fiscal strains, especially in the Social Security system, established in 1935 to provide income for retired workers.

 

When Social Security began, most of the labor force continued to work after age sixty-five.

 

By century's end only 20 percent did, and a majority of the elderly population relied exclusively on Social Security checks for their living expenses.

 

Benefits had risen so high, and the ratio of active workers to retirees had dropped so low, that drastic adjustments were necessary.

 

The problem was intensified in the 1970s, when a compassionate Congress dramatically increased retirement benefits at a time when productivity growth was stalled.

 

Without greater productivity, larger payments to retirees could only mean smaller paychecks for workers.

 

Three-quarters of all employees in the 1990s paid higher Social Security taxes than income taxes.

 

(An individual paid a maximum of $5,238 in Social Security taxes in 1992, matched by an identical employer contribution.)

 

Thanks to the political clout of the elderly, the share of GNP spent on health care for people over sixty-five almost doubled in the twenty years after the enactment of Medicare legislation in 1965.

 

This growth in medical payments for the old far outstripped the growth of educational expenditures for the young.

 

A war between the generations loomed, as the ratio of workers to retirees fell to about two to one by the end of the century.

 

Extending the working lifetime of still-active oldsters was one way of averting the crisis.