Growth of U.S. Population

The territories of the United States spread across many geographic regions and climates. The land stretches from the tropics to the edges of the Arctic. These varied terrains have attracted, challenged, and supported many different groups of people. America’s relatively low population density and its relatively high standard of living, along with opportunities for free expression, continue to fuel immigration to the United States. The nation remains a magnet for immigrants, despite the fact that substantial disparities exist in wealth and in access to resources between recent immigrants and more established Americans.

Growth Through Immigration

Colonizers and conquerors, wanderers and settlers have long been attracted to America’s abundant resources. Since 1820, when national record keeping began, more than 65 million people have come to the United States; 660,000 immigrants arrived in 1998 alone. The vast majority of Americans trace their ancestry to one or more of these immigrant groups. The various ethnic and racial origins of the residents and immigrants remain important sources of personal identity. Of the 224 million people reporting their ancestry in the 1990 census, only 13 million, or 6 percent, identified themselves as Americans only. The rest chose one or more broad racial or linguistic groupings (such as African American or Hispanic) or national heritages (German, English, Irish, and Italian were most common) to define their origins. Most Americans possess varied national, ethnic, and racial identities that reflect both the origins of their ancestors and their own affiliation to the United States.

Until the late 19th century, immigration to the United States was unrestricted, and immigrants came freely from all parts of the world. However, the areas of the world contributing the largest share of immigrants have shifted during the course of America’s history. In the 1790s the largest numbers of immigrants came from Great Britain, Ireland, western and central Africa, and the Caribbean. A hundred years later, most immigrants came from southern, eastern, and central Europe. In 1996 they were most likely to come from Mexico, the Philippines, India, Vietnam, and China—indicating a recent increase in Asian immigration. Not all immigrants stay in the United States. Although 46 million immigrants arrived in the United States from 1901 to 1999, nearly a third later returned to their homelands. In earlier years, a similar proportion of migrants returned.

The 1990 census indicated that nearly 20 million inhabitants had been born outside the United States, about 8 percent of the total population. Eight million, or 40 percent, of those born overseas became naturalized citizens. Early in the 20th century it took immigrants three generations to switch from their native language to English. At the end of the 20th century, the shift to English was taking only two generations. This is not only because of the daily exposure to English-language movies, television, and newspapers, but because entry-level jobs in service industries require more communication skills than did the factory jobs that immigrants took a century or more ago.

Growth Through Immigration - Ancient Immigrants and Early Cultures

The earliest arrivals of humans into the territory that is now the United States are poorly documented, but archaeological work provides an idea of when human settlement began in the Americas. Most anthropologists believe that small groups of hunters and foragers began migrating from northeastern Asia to northwestern North America at least 15,000 years ago. These ancient migrants crossed to North America during the most recent of the ice ages, when glaciers had frozen much of the world's water on land. At that time, sea levels were lower than they are today, and a natural land bridge, called Beringia, linked present-day Siberia and Alaska. The earliest archaeological sites in North America, dated at more than 11,000 years old, indicate that humans quickly spread south and east across the continent. Separate waves of peoples migrated to the Americas over thousands of years. The last of these occurred around 4,000 years ago when the Inuit and Aleut peoples arrived in what is now Alaska from northeastern Asia. Other migrations include the Hawaiian people, who arrived from the island of Raiatea, near Tahiti in Polynesia, in the 7th century AD. More migrations to Hawaii from the same region occurred through the 13th century. For more information on the peopling of the Americas, see Migration to the Americas.

By the 15th century thousands of separate groups lived in North America, speaking hundreds of mutually unintelligible languages, along with many more related languages and dialects. The cultures were as varied as the languages, ranging from agricultural, mound-building cultures in the Southeast and in the Mississippi and Ohio river valleys to the cliff dwellers in the Southwest, and from the complex fishing societies in the Northwest to the foragers of the northern Great Lakes. These various groups were neither static nor homogeneous populations. They only seemed alike to the later-arriving Europeans, who mistakenly labeled all these groups as “Indians.” In fact, recent histories of native America show that towns and cultures emerged, prospered, and sometimes fell because of changes in climate, technology, or available resources. Warfare, diplomacy, and trade also affected native cultures and settlements. The peoples of America have always exhibited social, political, and economic diversity, and American history did not begin with European settlement.

The arrival of Europeans and Africans starting in the late 16th century brought irreversible changes. As the European population grew, conflicts developed between Europeans and Native Americans over the control of the land. From the early 17th century to the late 19th century, war, disease, and the confiscation of land, resources, and livelihoods took a severe toll on all native populations. In what are now the lower 48 states, a native population that ranged from 1.5 million to 8 million or more prior to European conquest was reduced to 243,000 by 1900. On the Hawaiian Islands, the native Hawaiian people numbered 300,000 when Europeans arrived in 1778 and only 135,000 by 1820. In Alaska, 20,000 Aleutian natives existed before contact with Europeans in the 18th century and only 1,400 by 1848.

Entire peoples and ways of life disappeared, and straggling survivors formed new nations or tribal groups with the remnants of other groups, moved to new territories, and adopted various social, economic, and military strategies for survival. Some migrated west, ahead of the advance of European migration. Some went to Canada, where westward settlement by European Canadians occurred somewhat later and where government relations with native peoples were somewhat less harsh. The overall decline of native populations masks periods of recovery and the continued resistance of native peoples, but the dominant trend was one of a steep decline in numbers. This trend was not reversed until the second half of the 20th century—by the 2000 census, 2.5 million Native Americans, including Inuits and Aleuts, lived in the United States.

Growth Through Immigration - European and African Immigration in the Colonies

Europeans and Africans added new layers of complexity to the lands that Europeans named the New World. European military technology, commercial wealth, and immunity to diseases such as smallpox, influenza, and tuberculosis generally gave Europeans an advantage over the original inhabitants. Yet the Native Americans knew the land and were skilled negotiators, eloquent orators, and fierce fighters. Wresting control of the land from the indigenous peoples took the newcomers some 300 years to accomplish.

Colonists established a variety of outposts for their European empires. By the 17th and 18th centuries, the French had settlements around the Great Lakes and the upper Mississippi River, and at New Orleans. The Spanish established settlements in Florida, the Southwest, and California. The British entrenched themselves in New England and the South, while the Russians settled on the West Coast, and the Swedes and the Dutch on the East Coast. This short list fails to capture the ethnic complexity of early European settlement in what is now the United States. The various settlements included Scots, Welsh, Irish, Germans, Finns, Greeks, and Italians, as well as Maya, Aztec, and African slaves.

European settlements, both in the North and the South, depended on the skills and labor of indentured European servants and, particularly after 1700, of enslaved Africans. The majority of the early European immigrants were not free: 60 percent in the 17th century and 51 percent in the 18th century arrived as indentured servants or prisoners. However, these Europeans could hope to achieve freedom at the end of their servitude. Africans were treated differently; neither they nor their children could realistically hope to attain freedom. A few Africans arriving in the New World were free men sailing the Atlantic as part of an economic network connecting Europe, Africa, and the Americas. The vast majority, however, were enslaved, purchased in various parts of Africa to work on European plantations, farms, and homesteads. Most Africans came from coastal West Africa and the Niger River region. Smaller numbers came from central, southern, and eastern Africa. Twenty-one percent of the population on the eve of the American Revolution (1775-1783) was of African descent, almost all working as slaves.

Ethnically and linguistically the African migration was as diverse as the European; culturally it was more so. Most Africans caught in the slave trade were skilled farmers, weavers, and metallurgists; smaller numbers were herders, hunters, foragers, or city dwellers. Some had been enslaved in their homelands and some were African royalty. They included Muslims, Christians, and others who worshiped one god, as well as those who worshiped multiple deities, such as animists and ancestor worshipers. These involuntary immigrants faced a hard life in the New World. Their labor and skills were exploited, their specific national origins were forgotten, and their cultural traditions were partially suppressed. Yet Africans in America constructed flexible family networks that allowed their population to grow and expand in spite of enslavement. The family protected its members from some of the harshest features of enslavement and preserved elements of religious belief, vocabulary, poetic tradition, family values, craft and artistic practice, and other aspects of African heritage.

European American populations generally thrived as they expanded their control over the continent. The predominantly British Protestant settlements on the East Coast grew rapidly during the colonial period because of the immigration of women and men, nearly all of whom married and had many children. Colonial American women, free and enslaved, gave birth every two years on average, pushing the natural increase (the surplus of births over deaths) of Britain’s American colonies to twice that of the Old World. In addition, Britain absorbed the smaller Dutch and Swedish colonies on the East Coast before the end of the 17th century. The more isolated French, Russian, and Spanish Roman Catholic settlements to the west remained relatively small, in part because few women resided at these military posts, missionary compounds, and trading stations. Their geographic isolation inhibited immigration, keeping growth rates low and populations small.

Growth Through Immigration - Diversity and Assimilation in American Society

The American victory in the Revolutionary War united 13 of the English-speaking settlements into the largest and most powerful political unit in the territory, even though those first 13 states hugging the eastern coast seem small compared with the country’s eventual size. As a result of the Revolution, approximately 71,500 people out of a populace of some 2.5 million fled the new United States. Some were Loyalists—political or economic refugees whose loyalties to Great Britain remained strong; others were blacks seeking refuge from slavery. Immigration and the commercial slave trade after the war quickly restored the population to its former level. The Revolution also opened up the area west of the Appalachian mountains to settlement, as fur traders and farmers were no longer confined by British settlement restrictions. Pioneering citizens, immigrants, and slaves moved west, displacing Native Americans who had hoped to preserve their cultures undisturbed by the expanding United States.

The 17th and 18th centuries saw a growing importation of Africans into North America. After 1808 U.S. law forbade the importation of slaves from abroad, although some smuggling of slaves continued. Few people from Africa chose to come to the United States voluntarily, where the free African population was small, its members treated as second-class citizens and confined largely to the northern states. Large numbers of Europeans migrated to the United States in the early national period, drawn by the promise of freedom, cheap land in the West, and jobs in the first factories of the emerging industrial age. The influx of Europeans, the end of the slave trade, and the ongoing wars removing Native Americans meant that some of the racial diversity of the population was diminishing. By the early decades of the 19th century, a greater proportion of Americans were of western European and Protestant heritage than at the time of the Revolution.

Over the course of the 19th century, the United States gradually absorbed the French colonists in the upper Midwest and in New Orleans, Louisiana; the Spanish and Russian colonists in the South, West, and Northwest; and the territories of the Hawaiian people and other indigenous groups. Sometimes these territories were added by diplomacy, sometimes by brute force. European visitors were surprised at the diversity in nationalities and in religious and secular beliefs in early America, as well as the number of intermarriages between people of differing European heritages. There were also cross-racial births, sometimes voluntary and sometimes by force, but rarely within legal marriages. The population continued to grow through migration as well, driven in part by English, Irish, and German settlers who came in large numbers around 1848 to escape political repression and food shortages in Europe.

By 1860, 86 percent of Americans counted in the census were white (72 percent native-born white) and 14 percent black. (Most Native Americans were not included in census figures until the late 19th century.) Although the country had become more uniform, it was not homogeneous enough for some citizens. They sought at various times between the Revolution and the American Civil War (1861-1865) to delay the naturalization of foreign immigrants, to send African Americans to Liberia or elsewhere, or to discriminate against Roman Catholics. But the German and Irish immigrants of the midcentury gradually won acceptance, and free African Americans insisted on an American identity, pushing for an end to slavery and for full citizenship.

The insecure status of even free African Americans in the middle decades of the 19th century caused thousands of blacks to emigrate from the United States to Canada, especially after the Fugitive Slave Law was passed in 1850. This law required that slaves who escaped to free states be returned to their masters. Within a year, 10,000 black Americans fled to safety in Canada. By 1861, on the eve of the Civil War, 50,000 African Americans resided in Canada.

The American Civil War briefly interrupted European immigration. At the end of the war some slaveowners moved to Brazil and other places where slavery was still legal. With slavery abolished in the United States and former slaves’ status as American citizens constitutionally guaranteed, 30,000 African Americans returned from Canada to rejoin family and friends. The constitutional promises of the post-Civil War era were soon discarded. Racism in both the North and the South confined African Americans to second-class citizenship in which political and civil rights were ignored. Discrimination by race was declared constitutional in 1896 in Plessy v. Ferguson.

The immigrant population changed dramatically after the Civil War. The majority of white immigrants had traditionally come from western Europe, but during the second half of the 19th century, many immigrants came from central, southern, eastern, and northern Europe. This influx brought in larger numbers of Roman Catholics. And for the first time there were substantial communities of Orthodox Christians and Jews. On the West Coast, Chinese and Japanese immigrants, mostly men, arrived to work in agriculture and on the railroads.

From 1880 to 1914, peak years of immigration, more than 22 million people migrated to the United States. As with earlier arrivals, some immigrants returned home after a few years. Some maintained separate ethnic and religious identities in urban neighborhoods as well as in the smaller towns of the West, while others blended into American society through marriage, education, language, employment, politics, and sometimes religion.

Growth Through Immigration - Restrictions on Immigration

Late-19th-century immigrants, with their different ways and seemingly strange religions, made American voters anxious enough to enact the first laws restricting immigration. Social Darwinism, with its beliefs that national characteristics or ethnic traditions were inherited, led Americans to view immigrants from nonindustrialized nations as not only economically backward but biologically inferior. It gave more-established, native-born Americans a supposedly scientific excuse for blocking immigration. Convicts and prostitutes were barred in 1875. Then paupers, the so-called mentally defective, and all Chinese immigrants were excluded in 1882. Contract workers, who were often Italian or Chinese, were also banned in the 1880s. Japanese immigration was stopped in 1907.

By 1910 African Americans made up only 11 percent of the population, and Native Americans constituted only 0.3 percent, their smallest proportions ever. For Native Americans, the population decline was due in part to the military defeat of the last of the independent nations and in part to their impoverishment on reservations. Segregation, lynching campaigns, and poverty slowed the growth of the African American population. Even though more than three-quarters of Americans were native-born whites in 1910, many citizens still felt insecure. The settlement house movement, whose most prominent advocate was social reformer Jane Addams, sought to speed the Americanization of foreign-born urban residents through education and social services. This was an insufficient response for some American citizens, and additional restrictions were placed on immigration. After 1917 only literate individuals were admitted.

The Russian Revolution of 1917 convinced many Americans that all foreigners were Bolsheviks, anarchists, or criminals. Fearing the importation of radical political ideas, labor unrest, and attempts at subversion, many Americans retreated into isolationism, the idea that America should separate itself from the rest of the world. In 1921 and 1924 Congress mandated a quota system for immigration, which soon became based on European ethnicities present in the United States in 1890, before many eastern Europeans had arrived. This granted 80 percent of the 150,000 annual visas to immigrants from western Europe, leaving only 30,000 visas for immigrants from other countries.

The Great Depression of the 1930s only sharpened feelings against foreigners in America. With anti-immigrant feelings running high and with jobs being scarce, more people emigrated from the United States than arrived during the 1930s, the first period of negative migration since the Revolution. The emigrants included an estimated 500,000 Mexican Americans, many of them U.S. citizens or legal immigrants, who were forced out of the country on the grounds they were taking jobs from supposedly real Americans, that is, those of western European descent. This decade also saw the lowest population growth rate in the history of the United States.

Not only did old-stock Americans fear eastern and southern Europeans, Hispanics, and Asians, but anti-Semitism was also commonplace in the early 20th century. This was especially true after the turn of the century, when immigration produced a substantial eastern European Jewish presence in the cities. After World War I (1914-1918), the children of these immigrants sought admission to high schools and colleges and entered skilled and professional occupations; many Christians responded with fear. Quotas enforced during the 1920s limited immigration from countries with large numbers of Jewish emigrants. Colleges, professional schools, and businesses barred Jews entirely or admitted only a few during this period. Through the first half of the 20th century, towns and individual householders barred Jews from buying real estate by including restrictive covenants in property deeds, a practice known as “gentleman’s clauses.” Although 102,000 Jewish refugees escaping Nazi Germany were admitted into the United States before World War II (1939-1945), many more were refused entrance. As a consequence of this policy, some died in German labor and death camps.

Growth Through Immigration - Immigration in 20th-Century America

After the war, revelations about the full extent of Nazi racism in Europe led to reevaluations of American immigration policy and to special legislative exemptions to the quota system. More than 93,000 Jews immigrated to the United States from 1946 to 1949. War brides, displaced persons, refugees, orphans, and other people caught in postwar political changes or in the later conflicts of the Cold War were also granted permission to enter the country. At first these were Russians, Czechs, and Belorussians, but later they included Cubans, Vietnamese, Cambodians, Hmong, Iranians, and others. The number of immigrants was relatively small, and Americans thought of themselves as relatively homogeneous in the 1950s and 1960s, a feeling bolstered by the all-white images dominating the nation’s television screens. In 1960, 83 percent of Americans were native-born whites.

The civil rights movement, which peaked from 1955 to 1965, renewed concerns about racism and issued a clear call to fulfill constitutional guarantees of human equality. Racial prejudice, anti-Semitism, anti-Catholic sentiment, and other forms of discrimination became less acceptable, as did the image of the true American as white, northern European, and Protestant. This change in attitude helped bring an end to national quotas for immigrants. In 1965 family members of those already living in the United States were given priority in immigrating without regard to national origin, as were highly skilled individuals, but migration from Asia was placed under a separate quota system that applied only to the Far East. By 1978 this provision was lifted, and all immigrants were treated equally.

Because of changes in U.S. immigration law and in economic and political conditions worldwide, the number of immigrants to America resurged in the last quarter of the 20th century. Immigrants from the Pacific Rim, including Filipinos, Chinese, and Koreans, as well as immigrants from American dependencies in the South Pacific, arrived on the West Coast and settled throughout the United States. Mexicans, Guatemalans, Costa Ricans, Caribbean peoples, and South Americans sought asylum and opportunity in the United States, particularly in the Southeast and Southwest and in large cities. Indians, Pakistanis, Arabs, Iranians, and others sought an outlet for their skills. These new flows of immigrants added Buddhism, Islam, and Hinduism to the mix of religious beliefs in America. Hispanic Americans became the fastest-growing segment of the population by the end of the 20th century. The effect of immigration was not felt uniformly throughout the United States. Immigrants tended to congregate in the more densely populated areas of the United States: California, Florida, Texas, and the Northeast.

Although most immigrants entered the country legally, some did not. According to official estimates, approximately 5 million illegal immigrants resided in the United States in 1996, most from Mexico, El Salvador, Guatemala, and Canada. Concern over immigration, particularly illegal immigration, increased during the 1980s and 1990s. In the last decades of the 20th century, immigration laws were amended to restrict the flow of all immigrants, to deport illegal aliens, and to deny benefits to those already living in the country legally. This wave of antiforeign sentiment was based on fears of tax increases for schooling immigrant children, for social services, and for health care, although illegal immigrants who work (albeit without legal status) pay wage and sales taxes that help support education and social services. Some citizens were also concerned about increased competition for jobs, even though immigrants frequently fill positions that American citizens do not want.

Other Americans, however, welcomed these new additions to American culture. Some employers depended on immigrants to harvest the nation’s crops, sew garments, or wash dishes in restaurants, jobs that many U.S. citizens found unattractive. Doctors and health professionals recruited from overseas were often hired to staff small-town hospitals in places where American professionals felt socially isolated. Businesses and universities welcomed foreign-born engineers and computer programmers because relatively few American students pursued these fields of study. A lottery system for entrance visas was designed to maintain the diversity of the immigrant pool by selecting a limited number of migrants by chance rather than by national origin.

According to the 2000 census, 70.9 percent of Americans were non-Hispanic whites, and the populations of blacks, Hispanics (who may be of any race), Native Americans, and Asian and Pacific Islanders were increasing. The Native American and African American populations grew, reversing 19th-century declines in their share of the total population. Migration from the Caribbean and smaller flows from various parts of Africa created the first substantial influx of free people of African descent in the nation’s history.

Growth Through Immigration - Racism

These broad categories only hint at the full ethnic and racial diversity of the American population; conversely, the use of separate categories masks the many characteristics Americans share. The United States has been described as a melting pot where ethnic and racial groups shed their specific traits and join with other Americans to create a new identity. The nation has also been described as a salad bowl where people of different backgrounds mingle at work and school, in civic responsibilities, and as consumers, but where cultural traits remain distinct.

In the 18th century American statesman Benjamin Franklin feared that Germans could never be assimilated because of their foreign ways. In the middle of the 19th century many thought that Irish Catholics would subvert the American way of life. At the end of the 19th century the Chinese, Japanese, Jews, Italians, and others were mistrusted. Yet these groups eventually became part of mainstream America. At the end of the 20th century, many people regarded newer Asian immigrants, Spanish-speaking peoples, and Muslims as permanently alien presences. If the past is a guide, these groups too will meld into the general American citizenry.

The main exceptions to full acceptance remain Native Americans and African Americans. Native Americans have a dual status based both on the existence of reservations and vibrant tribal traditions, and on the prejudices of many non-Native Americans. African Americans bear the brunt of the oldest and most deeply rooted of American prejudices.

Initial contacts between Africans and Europeans often began with misunderstanding. Africans at first thought white-skinned people were ghosts looking for people to eat, since white was the color of death in much of Africa. Europeans sometimes assumed black-skinned peoples were followers of the devil and therefore sinful, since black was the traditional color associated with lies, sin, and evil in the Western world. Differences in religion, language, and customs also led to misunderstandings, even while economic similarities favored trade between African kingdoms and European empires.

When European merchants brought the first enslaved Africans to work in their New World colonies, they justified the enslavement on the premise that Africans were not Christian and were supposedly not civilized; in other words, the Africans were considered culturally inferior. By the 18th century, many enslaved African Americans had converted to Protestant Christianity, spoke English, and expressed a desire for freedom. A few people of African descent had, against all the odds, become poets, doctors, almanac publishers, plantation owners, and antislavery activists. It became harder for whites to claim that Africans would always be culturally inferior. Pro-slavery whites then began to justify permanent enslavement by asserting that Africans were somehow biologically inferior to Europeans. Whites claimed that anything accomplished by people with black skin was inferior, that blacks were intellectually and morally incapable of self-government, and that blacks needed to be controlled by whites. This so-called scientific racism based on presumed biological differences was useful in slaveholding areas for protecting the economic interests of slaveholders and useful in non-slaveholding areas for uniting all the different, and potentially conflicting, European ethnicities under the label “white.”

Racial discrimination grew out of the practice of enslavement but outlasted the institution of slavery. European newcomers could find common ground with the majority of Americans by joining in the denigration of African Americans. Poorer whites or socially marginal whites could feel superior by virtue of their skin color, even if they were not economically successful or fully accepted by their peers. Racism helped to create a sense of unity among white Americans by defining who was a full citizen. At the same time, white racism promoted a sense of unity among black Americans, no matter what their backgrounds, through shared experiences of discrimination and suffering.

Freedom in the wake of the Civil War was a first step in eradicating this prejudice. The civil rights era of the mid-20th century saw even more advancement, but prejudice against black Americans has not been entirely eliminated. At the beginning of the 21st century, a relatively small number of white people still opposed a race-blind America that would deny them a feeling of racial superiority. Some of these people form militia, fascist, and vigilante groups that use violence against African Americans, the federal government, and others who challenge their restrictive views. The majority of Americans, however, while sometimes reluctant to change, believe that all people are created equal.

Americans tend to think in terms of a biracial, separated society, even though whites and blacks have jointly built the United States, and even though the family histories of whites, blacks, and other races are often intermixed. In addition, the two groups share many beliefs (such as freedom, liberty, and civil rights) and customs (from poetry to sports and from work to holidays). Yet the idea of racial difference, of superiority and inferiority, still provides the basis for many social, cultural, political, economic, and religious divisions in the United States.

Growth through Natural Increase: Births

While the influx of immigrants contributed to the growth of the American population and helped build American society, the major factor affecting population growth in the United States has always been the surplus of births over deaths, or the natural increase of the population. American women at the beginning of the 21st century bear an average of two children over the course of their lives. Their great-grandmothers and great-great-grandmothers in 1890 had an average of four children, because in the 19th century fewer women had access to reliable methods for controlling fertility. A century earlier, around 1790, women might expect seven births throughout their lives, if they survived into their late 40s.

Growth through Natural Increase: Births - Birthrates in Early America

Little is known of the birthrates of Native American societies before the arrival of Europeans. There are hints that the birthrate was relatively low because Native American women often breastfed their infants for three or four years. Since breastfeeding has a contraceptive effect, it appears that women gave birth about every four years. On the other hand, since many Native American women traditionally married soon after the onset of puberty, at around age 15, they might have had six or seven children if they lived to at least age 45. Some researchers have suggested that when European diseases and warfare killed large numbers of native peoples, women increased their childbearing in order to compensate for the excessive deaths in the community. Native Americans may have gone from low birthrates to high birthrates, but any increases in fertility could not make up for the deaths from disease, starvation, and war. The birthrate among Native Americans would not produce population growth until the 20th century.

European colonists had high birthrates compared with the birthrates in Europe at the time. Free, white colonial women typically bore children every two years and had an average of eight children, four of whom might survive to adulthood. This was twice as many children as European families had. Fertility was higher in the colonies because of the need for labor on colonial farms, the availability of land to support the larger numbers of children, and early and nearly universal marriage.

The enslaved African American population in the 17th century had more men than women and more deaths than births. By the 18th century the ratio of black men to black women was more equal and the population was holding its own. By the early 19th century the African American population was growing rapidly, but because of higher death rates and the absence of immigration after 1808, the overall growth of the African American population remained lower than that of the white population. African Americans became an increasingly smaller proportion of the population from the late 18th century to the early 20th century.

Growth through Natural Increase: Births - Declining Birthrates

The European American population doubled every 20 to 25 years until late in the 18th century, after which birthrates began to decrease and growth rates slowed. This decline in fertility rates early in America’s history is a distinctive characteristic of American society. In the early 19th century white women who lived through their childbearing years were bearing 7 children over the course of their lives; by 1850 it was 5.4 children, by 1950 it was 3.0, and in 2003 it was 2.1. While the longer-established American population experienced a decline in fertility and family size during the 19th century, newer immigrants had higher birthrates. It took two or three generations for these immigrants to conform to the prevailing American fertility standards.
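As a rough check on the arithmetic above, a population that doubles every 20 to 25 years implies an annual growth rate of roughly 3 percent under constant exponential growth. A minimal sketch of that conversion (the doubling times are from the text; the formula is the standard exponential-growth relation):

```python
import math

def annual_growth_rate(doubling_time_years):
    """Annual growth rate implied by a given doubling time,
    assuming constant exponential growth: (1 + r)^t = 2."""
    return math.exp(math.log(2) / doubling_time_years) - 1

# Doubling every 20 to 25 years corresponds to about 2.8 to 3.5 percent a year.
for t in (20, 25):
    print(f"doubling every {t} years -> {annual_growth_rate(t):.1%} per year")
```

By comparison, the U.S. growth rate at the end of the 20th century, with fertility near replacement level, was under 1 percent a year.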

Growth through Natural Increase: Births - Declining Birthrates - Women’s Education and Birthrates

Decisions to limit family size are based on complex personal, social, and economic factors. The beginning of any fertility decline is most strongly linked to increased education for women. Female academies appeared after the American Revolution, public schooling became common in the early 19th century, and the first women’s colleges and coeducational institutions were created in the mid-19th century. Women read novels, newspapers, magazines, and religious tracts. Women learned about individuality and self-control and about planning for the future, and they applied these concepts to fertility. They established reform groups and literary and religious societies, indicating their interest in the world outside of marriage and childbearing.

Although wives in early America had been most concerned with the production of food and clothing, 19th-century families became child-centered, and motherhood was exalted as a special calling requiring education. Women had fewer children in order to provide each child individualized attention and the best possible upbringing. Declining fertility rates also reflected the increased cost of child-rearing during the industrial age, as advanced education became increasingly necessary; housing, food, and clothing costs rose; land became scarcer and more expensive; and child labor became less acceptable. Instead of being a potential source of income, children became a major expense, as well as more cherished individuals who deserved every opportunity. African American birthrates, which were high under slavery, fell rapidly once freedom was achieved in the wake of the Civil War, when families could hope to provide the best possible education for their children. By the end of the 19th century, most families were investing substantial amounts of time and money in each child’s future. Parents did not want to shortchange their children and so had fewer.

Growth through Natural Increase: Births - Declining Birthrates - Birth Control

Women attempted to control childbearing in various ways, including prolonged breastfeeding, abstaining from sex, taking herbal remedies, jumping rope, horseback riding, and having abortions. By the early 19th century, condoms, originally intended to prevent the spread of sexually transmitted diseases, were being used to prevent pregnancy. The vulcanization of rubber after 1839 and the development of latex around the time of World War I (1914-1918) made condoms, cervical caps, and diaphragms more widely available. From 19th-century newspaper advertisements, it seems that abortion was a common method of controlling family size. Abortions were usually performed by untrained men and women, some of whom were skilled but many of whom were not. Doctors, who were organizing the first state and national professional organizations during the mid-19th century, saw these abortionists as unprofessional competitors and a public danger. Concern about the safety of abortion led to the first state laws restricting the practice, enacted just before the Civil War.


By the 1870s religious reformers who were worried about prostitution and the perceived spread of vice and sin began to connect contraception and abortion with immorality. The Comstock Law of 1873 declared birth control and abortion information obscene and banned it from the U.S. mail. Many states passed laws against contraception. One reason people supported bans on birth control was the fear that immigrant groups, who tended to have larger numbers of children than native-born white Americans, would come to dominate society if white, Protestant women did not have more babies. Despite the Comstock Law, birthrates continued to fall.

A small number of reformers spoke out publicly in favor of birth control. The most famous of these advocates was Margaret Sanger, who in 1921 founded the organization that would become Planned Parenthood. Sanger worked to help poorer women obtain what was still illegal information on birth control. Planned Parenthood led the fight to have the Comstock Law overturned.

The Comstock Law was declared unconstitutional in 1938, although state laws against birth control remained. In 1965 the Supreme Court of the United States struck down the last of the state laws against contraception, asserting that married men and women have a right to privacy. That right was extended to unmarried persons in 1972. In 1973 abortion was legalized in the United States. Since then various restrictions have been placed on abortion, and the issue remains one of the most divisive in contemporary America.

Growth through Natural Increase: Births - Birthrates Since World War II

Birthrates decreased steadily until the Great Depression in the 1930s, when they suddenly dropped 24 percent in a decade, reaching unprecedented lows in the mid-1930s. Families felt they could not afford more children during this prolonged economic crisis. There were also relatively few births during the crisis of World War II as couples feared for the future and as husbands and wives were separated because of military service. 

Growth through Natural Increase: Births - Birthrates Since World War II - Baby Boom

After World War II birthrates shot up, and by the mid-1950s were 30 percent higher than during the depths of the depression. This unprecedented upward movement in fertility levels produced a baby boom that was both a result of postwar prosperity and a reaction against the deprivations of the depression and war years. This boom helped fuel the growth of suburbs in the postwar period. The baby-boom generation had lasting effects on America. Education costs soared as this generation of children reached school age. The youth culture of the 1960s reflected, in part, the dominance of adolescent and young adult baby boomers. And because baby boomers will begin retiring in the early decades of the 21st century, the solvency of the Social Security system has become a major concern. Fertility rates declined again after the mid-1950s, although the 76 million baby boomers born between 1946 and 1964 contributed to a second, smaller baby boom in the 1970s and 1980s as they reached adulthood and started having children of their own.

A number of changes affected fertility rates in the 1950s. Many married women who had taken temporary jobs during the crisis of World War II now sought permanent positions. As these women moved into the workforce, they demanded more effective methods of birth control. By the 1960s new forms of contraception were available, including the birth control pill, intrauterine devices, and surgical techniques for permanently inducing infertility, such as tubal ligation and vasectomy. At the end of the 20th century, 64 percent of women between the ages of 15 and 44 reported using birth control. Since 1957, the trend in the total birthrate has been downward.

Growth through Natural Increase: Births - Birthrates Since World War II - New Attitudes Toward Sexuality

While these new technologies offered more effective control over fertility, new attitudes toward sexuality in the 1950s stressed impulsiveness, innovation, and experimentation—all of which discouraged the use of birth control devices, especially among young, unmarried couples. One result was that teenage pregnancies and births outside marriage soared in the 1950s. Teenage pregnancies declined in the 1960s and 1970s, surged again in the late 1980s, and then declined sharply in the 1990s. By 2000 the teenage birthrate was down to 49 births per 1,000 women aged 15 to 19. Out-of-wedlock births, once comparatively rare, increased dramatically after World War II, and more than a third of all infants in the United States are now born outside of marriage.

Growth through Natural Increase: Births - Educational and Racial Differences in Birthrates

Fertility rates declined among all major groups of Americans in the last decades of the 20th century, in keeping with the trend since the late 18th century. One reason has been the expansion of educational opportunities for women, because women's educational levels strongly affect childbearing. Most college-educated women who have children wait until their 30s to do so, after finishing their education and establishing a career. Women with less education tend to begin bearing children earlier and to continue bearing children later in life.

The education level of parents also affects childbearing. The children of college-educated parents are less likely to be sexually active at age 15 than the children of those who have not completed high school.

Births outside marriage differ significantly among American subcultures. From the 1930s through the 1970s, the rate of births to unmarried white women remained below 10 percent. It increased thereafter, but was still under 20 percent in the 1980s; it rose again in the 1990s, reaching 26.7 percent in 1999. The rate for black children born out of wedlock in 1999 was 68.8 percent; this percentage is high in part because married black women have relatively few children, which raises the share of black births occurring outside marriage. A desire to enhance the opportunities available to their children and fears about the discrimination their children might face inhibit many married African American couples from having more children. Unmarried couples of all races tend to be more impulsive about sexuality and childbearing. The percentage of births to unmarried Hispanics in 1999 was 42.1 percent.

Better-educated women and men of all groups—black, white, or Hispanic—are more likely to bear children within marriage than individuals with less education. Black women, married and unmarried, have a far higher rate of unintended or unwanted pregnancies than other groups: more than half of all their pregnancies are unintended. This may indicate less access to suitable birth control technologies. Hispanic women have the largest number of children among major groups—3.1 children on average, compared to 2.2 for blacks, 2.1 for Native Americans, 2.1 for Asians and Pacific Islanders, and 2.1 for whites.

The causes for the recent changes in births and marriage are poorly understood. But because births outside of marriage, early sexual experimentation, and early childbearing are so strongly linked to educational levels, and because educational achievement is itself linked to wealth, the rise in out-of-wedlock births may be a function of the changing U.S. economy. Since the 1970s the industrial base of the United States has been eroding, and with it many good-paying jobs. In 1979 the typical middle-class worker earned $498 a week. In 1995 he or she earned $475 a week (adjusted for inflation). Income for the poorest fifth of Americans fell 0.78 percent a year between 1973 and 1993. Industrial employment has been replaced by service work, which rewards highly educated, computer-savvy workers well but which tends to pay the majority of workers low wages. Rapid economic change, financial stress, and anxiety about the future may undermine the ability of couples to form more stable unions and have children within marriage.

Growth through Natural Increase: Deaths

Fertility rates are not the only factor influencing population growth. The population also grows when people are healthier and therefore live longer. Just as the birthrate has been steadily declining in the United States, so, too, has the death rate.

American babies are healthier than ever before in this country’s history and 99.3 percent will survive to their first birthday. Although the records from a century ago are incomplete, they indicate that only 84 percent of infants survived their first year. And a century before that, about 80 percent of infants may have lived to their first birthday. Most of the improvement in infant health has come in the 20th century and is due to improved childcare, better medical care for mothers and children, better sanitation, and the development of antibiotics.

Children born in 2003 can expect to reach age 74.4 if they are male and age 80 if they are female. Around the turn of the 20th century, the average life expectancy for women was 48, and for men it was 46. A century earlier, when childbirth was more dangerous, women had the lower life expectancy, around 35, compared with 37 for men.

Americans are living longer because medical care and public sanitation have improved substantially. However, infant survival and life expectancies are lower in the United States than in other developed countries because of disparities in wealth, education, and access to health care. In Japan in 2003, men could expect to live to age 77.6 and women to 84.4; in Sweden men could expect to live to age 77.3 and women to 82.8. In western Europe, the infant mortality rate is about 5 deaths per 1,000 births; in Japan it is 3.3; in the United States it is 6.8.

In the American population, wealthier people live longer, healthier lives than do poorer people. Great differences between rich and poor can produce poor health for the poorest citizens. From the 1920s to the early 1970s, America experienced an expansion of the middle class. Since then, the rich have nearly doubled their share of the country’s wealth. Hopelessness and rage can lead to substance abuse, violence, and mental depression, which can negatively affect health and longevity. More direct effects of poverty that shorten life spans for the poorest populations include malnutrition, exposure to extremes of heat and cold, and lack of medical attention.

More cohesive communities with a more equitable distribution of income and goods, even if relatively poor, tend to have better overall health than those with great disparities in wealth. For example, in the early 1990s the District of Columbia, where there are great disparities between the wealthy neighborhoods and the majority of poor neighborhoods, had an overall life expectancy of 62 for men and 74 for women. In Kansas, where the median household income was below that of Washington, D.C., but where the social differences are less sharply defined, the life expectancy was 73 for men and 80 for women.

Life expectancies also differ substantially by ethnicity and race. In 1999, whites, who tend to be wealthier, had a life expectancy of 77.3, and blacks, who tend to have less wealth, had a life expectancy of 71.4. This is, however, a smaller gap than once existed.

As noted earlier, women have a longer life expectancy than men. This is because women have a somewhat stronger immune system and suffer less from stress-related illnesses and from alcoholism, drug abuse, and violence. Because of the longer female life span, the U.S. population had more women than men in 2003—147.7 million women compared to 142.6 million men. Up to age 30, however, men outnumbered women in the United States, for two reasons: slightly more males are born than females, and slightly more young men immigrate into the United States than women.

Growth through Natural Increase: Deaths - Disease and Death in Early America

The small groups of people who migrated to the Americas from Asia thousands of years ago brought few germs with them. Although accidents and malnourishment were always possible, few infectious diseases were present in the Americas. When explorers and settlers arrived from densely populated Europe, they introduced diseases such as smallpox, measles, influenza, tuberculosis, whooping cough, scarlet fever, malaria, and gonorrhea. Africans brought smallpox as well, along with yellow fever, dengue fever, and malaria. Most Europeans and Africans had stronger immunities to the common diseases of their homelands, and Africans had discovered how to inoculate themselves against smallpox. Native Americans had no immunity to these imported diseases, and they died in large numbers. One estimate indicates disease was responsible for reducing native populations by 25 to 50 percent (in comparison, warfare reduced native populations by about 10 percent during the 18th and 19th centuries). Some Native American nations became extinct. Starvation and dislocation lasting into the 20th century also contributed to high death rates among Native Americans.

The earliest European settlers in the 17th century experienced high death rates. In Virginia, only about a third of the 104 people who came from England in May 1607 survived eight months after arriving. By 1624, about 7,000 settlers had come ashore, but only about 1,200 remained alive. The emphasis on searching for gold and quick profits meant that these first colonists paid little attention to producing food, building houses, or establishing permanent settlements. Starvation, exposure to the elements, and war with the native peoples caused large numbers of deaths. Half of the first settlers in New England did not survive the first winter, in 1620. However, the death rate decreased sharply in the north as colonists arrived in family groups and quickly created farms and towns to provide economic support. As a consequence of the low death rate, the population in the north grew rapidly without the need for many additional immigrants.

At first more European men than women lived in the south, and the southern population grew more slowly than the northern population. Deaths matched or surpassed births. The hotter climate in the south bred diseases such as malaria and dysentery, and European laborers frequently died of these and other semitropical diseases. Africans, who were imported to labor in the fields, were susceptible to lung diseases, but had some protection against malaria and yellow fever, and against smallpox if they had been inoculated in their homelands. African slaves shared their knowledge of smallpox inoculation during the 18th century, and the English developed a vaccine against smallpox at the end of the 18th century. Even so, most diseases remained untreatable because the causes of illness were not understood.

Another source of disease emerged when large cities grew up around northern ports in the 17th and 18th centuries. These early cities were dirty places that grew haphazardly, without provision for clean water or sewage disposal. They served as ports of entry not only for travelers and immigrants but also for the diseases these voyagers brought with them. Epidemics of smallpox, yellow fever, measles, mumps, scarlet fever, and influenza frequently swept through the cities, while the isolated countryside was often spared these devastating illnesses. Among the worst of these was a series of yellow fever epidemics that hit Philadelphia in the 1790s. Ten percent of the population died in 1793, and smaller epidemics occurred in New York, Harrisburg, and other cities.

Growth through Natural Increase: Deaths - Improved Sanitation

These outbreaks prompted the first concerted efforts at health reform in the late 18th and early 19th centuries. Major northern cities began constructing central water systems and collecting garbage. Central water systems meant that people in the largest cities had cleaner water for drinking and water for washing more frequently. Central water systems also made obsolete the rain barrels where disease-carrying mosquitoes bred. Cities invested in nuisance abatement, which included measures such as draining swamps and flooded areas, cleaning outhouses untended by landlords, tearing down abandoned housing, killing rats and mice, rounding up stray dogs, supervising cemeteries and burial practices, enforcing sanitation measures and market inspections, removing trash, and cleaning streets. Cities also enforced the quarantine of arriving passengers until all seemed healthy. Merchants often protested when their ships were quarantined. However, merchants were convinced of the effectiveness of such measures after quarantines helped diminish death rates during cholera epidemics in the 1840s.

By the middle of the 19th century, these civic reforms made the northern cities healthier than the countryside. Rural areas, however, could not afford the public health measures that improved conditions in the largest and most prosperous cities. Cholera was a major killer on wagon trains heading west. Yellow fever, malaria, hookworm, and other maladies still prevailed in the South, which experienced major yellow fever epidemics in the 1850s and in 1873. These epidemics led to the creation of the National Board of Health and a federal quarantine system.

In the mid-19th century, the development of the germ theory, which stated that microorganisms cause infectious diseases, helped people understand how diseases were transmitted. Antiseptic procedures began to be used, saving many lives in surgery and childbirth. Concerned individuals and private groups carried on much of the early fight against germs and disease. Mothers sought to improve health by attacking the germs that might harm their families. They taught their children to brush their teeth, use a handkerchief when blowing their nose, cover their mouths when coughing, wash with soap, and never spit. This concern for health and sanitation even helped fuel the woman suffrage movement, as many women demanded the right to vote in order to push for clean water, clean streets, and the pasteurization of milk. In the second half of the 19th century, the health and longevity of African Americans and their children improved substantially after the end of slavery enabled them to form permanent families. Enslaved children had been undernourished, poorly clothed, and denied education. When plantation owners no longer made the decisions about child care, children were healthier and better educated. And after 1867 the Granger movement, which brought farmers together to solve common problems, helped raise standards of sanitation on farms.

By the turn of the 20th century, the United States was a major center for medical research, and vaccines, antiseptic methods, and preventive measures substantially improved medical care. One estimate is that by 1910 a patient had a 50-50 chance of being cured by a doctor’s advice. As the 20th century began, deaths from communicable diseases were generally declining, although deaths from tuberculosis and influenza remained significant. At the same time, degenerative diseases of old age, such as heart disease, started to become more common causes of death.

Improvements in medicine, sanitation, and health, however, were countered by rapid industrialization of the United States in the late 19th century, which created air and water pollution, overcrowded cities, and substantial pockets of abject poverty in urban and rural areas. The Progressive movement of the late 19th and early 20th centuries addressed the health problems of the urban poor. Its many reforms included meat inspections, the Pure Food and Drug Act, and pasteurization of milk. State and federal governments began to enforce public health measures. The well-being of residents was no longer only a personal or a municipal matter, as state and federal agencies began to bring health reforms to larger numbers of Americans.

The New Deal, the government’s program in the 1930s to counteract the effects of the Great Depression, continued the Progressive agenda of improving health and sanitation. It was particularly effective in improving conditions in the South, which lagged behind the health advances made in the North. This regional disparity was largely because the rural, agricultural South lacked the financial resources of the industrial North. The Civil Works Administration, a New Deal agency that provided work relief in 1933 and 1934, targeted malaria as a severe problem in the South. One aspect of the agency’s activities was building improved housing with screened windows to keep out disease-carrying mosquitoes.

Growth through Natural Increase: Deaths - Better Health Care

Access to modern medicine also began to equalize with the New Deal. After 1935 the Social Security Administration began to provide medical aid to children, pregnant women, and the disabled. During this time, private, commercial health insurance began to be developed. In 1929 a group of schoolteachers in Dallas, Texas, contracted with a local hospital to provide health coverage for a fixed fee. Shortly thereafter, the American Hospital Association created Blue Cross and Blue Shield to offer health insurance policies for groups. Health maintenance organizations (HMOs) were developed in the 1940s but did not become widespread until the 1980s.

Higher levels of medical care reached millions as people joined the armed forces during World War II. Community health also improved in many rural areas near military bases, as the government modernized water systems and sewage plants, exterminated mosquitoes and other disease-carrying insects, campaigned against sexually transmitted diseases, and provided direct medical attention to civilian workers at the bases.

The federal Department of Health, Education, and Welfare (now the Department of Health and Human Services) was created in 1953. It underwrote the construction of hospitals and clinics and provided funds for medical research. Medicare and Medicaid were added to the Social Security laws in the mid-1960s to offer medical care to the elderly and to the needy. In the 1970s the federal government funded toxic waste cleanups and promoted clean air and water.

Modern antibiotics—including sulfa drugs and penicillin, first used during World War II—became available to the American public in the postwar years. These drugs provided the first effective weapons against bacterial infections, and their use transformed medicine in the 1950s. Medical researchers in the 1950s also developed new vaccines, including one against polio. The annual death rate in 1940 (age-adjusted to discount any effect of the postwar baby boom), before the availability of the new antibiotics, was 10.76 deaths per 1,000 people; by 1960 it was down to 7.6 per 1,000.
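Age adjustment of the kind mentioned above works by direct standardization: each age group's death rate is weighted by that group's share of a fixed standard population, so that comparisons across years are not distorted by shifts in the age structure. A minimal sketch, using invented age groups and rates purely for illustration:

```python
def age_adjusted_rate(age_specific_rates, standard_population):
    """Direct standardization: weight each age group's death rate
    by its share of a fixed standard population."""
    total = sum(standard_population.values())
    return sum(rate * standard_population[age] / total
               for age, rate in age_specific_rates.items())

# Invented illustrative numbers: deaths per 1,000 people in each age group,
# and the age distribution of a hypothetical standard population.
rates = {"0-24": 3.0, "25-64": 7.0, "65+": 60.0}
standard = {"0-24": 45, "25-64": 45, "65+": 10}
print(age_adjusted_rate(rates, standard))  # weighted average of the three rates
```

Because the standard weights stay fixed, a year with many babies (or many elderly people) is not unfairly credited with a low (or high) death rate simply because of its age mix.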

Growth through Natural Increase: Deaths - Current Trends

Since those days of miracle drugs, however, the rates for cancer have risen, despite considerable improvements in treatment. Cancer and heart disease were the leading causes of death in the United States at the beginning of the 21st century, in part due to the aging of the American population and the successes in curing other diseases. Another reason these diseases became more common is the unhealthy lifestyle of many Americans, who eat high-fat foods and high-calorie snacks and do not exercise enough. In addition, pollution is a suspected cause of cancer.

Additionally, new diseases emerged and old diseases resurfaced in the last quarter of the 20th century. The most serious of the new diseases was acquired immunodeficiency syndrome (AIDS). In 1995 it ranked as the eighth leading cause of death in the United States, but it has since declined significantly. Some diseases—such as tuberculosis, thought to be nearly wiped out because of antibiotics—developed resistance to the drugs most commonly used to treat them. Cases of tuberculosis increased during the 1980s, and decreased only after 1991, when the government started taking aggressive steps to halt the increase.

A significant cause of death in the United States in the 20th century is unrelated to disease. During the span of the 20th century, homicide rose from insignificant levels to become a major cause of death. It was, in 1998, the number-three cause of death among children from the ages of 1 to 4, the number-four cause of death among children from 5 to 14, and the number-two cause among young adults from 15 to 24. Only after age 45 does homicide disappear as a major cause of death. While homicide rates in the United States remain higher than in other industrialized nations, in the 1990s the homicide rate began to fall dramatically. In 1991 there were 9.8 homicide victims for every 100,000 people in the United States; by 1999 the rate had decreased to 5.7 victims per 100,000.
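Rates like these are simply the count of events divided by the population, scaled to a convenient base of 100,000. A small sketch of the arithmetic (the homicide count and population total are illustrative figures chosen to be consistent with the cited 1991 rate, not numbers from the text):

```python
def rate_per_100k(events, population):
    """Express a count of events as an incidence per 100,000 people."""
    return events / population * 100_000

# Illustrative: roughly 24,700 homicides in a population of about
# 252 million reproduces the cited 1991 rate of about 9.8 per 100,000.
print(round(rate_per_100k(24_700, 252_000_000), 1))
```

Scaling to a fixed base makes rates comparable across places and years with very different population sizes, which is why the text can meaningfully compare 1991 with 1999.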