Visual History of the World

(CONTENTS)
 

 


HISTORY OF CIVILIZATION & CULTURE

From Prehistoric to Romanesque Art
Gothic Art
Renaissance Art
Baroque and Rococo Art
The Art of Asia
Neoclassicism, Romanticism Art
Art Styles in the 19th century
Art of the 20th century
Artists that Changed the World
Design and Posters
Photography
Classical Music
Literature and Philosophy

Visual History of the World
Prehistory
First Empires
The Ancient World
The Middle Ages
The Early Modern Period
The Modern Era
The World Wars and Interwar Period
The Contemporary World

Dictionary of Art and Artists

 




The Contemporary World

1945 to the present



After World War II, a new world order came into being in which two superpowers, the United States and the Soviet Union, played the leading roles. Their ideological differences led to the arms race of the Cold War and fears of a global nuclear conflict. The rest of the world was also drawn into the bipolar bloc system, and very few nations were able to remain truly non-aligned. The East-West conflict came to an end in 1989–1991 with the downfall of the Eastern Bloc and the subsequent collapse of the Soviet Union. Since that time, the world has been driven by the globalization of economic and political systems. The world has, however, remained divided: The rich nations of Europe, North America, and East Asia stand in contrast to the developing nations of the Third World.



The first moon landing made science-fiction dreams a reality in 1969. Space technology has since made considerable progress as the search for new ways of using space continues.

 

 


Key Ideas:

Globalization

 

 

Since the end of the East-West conflict in the early 1990s, the buzzword "globalization" has been used to refer to various processes affecting almost all areas of life. There is no clear definition of this contentious term that does justice to all its interpretations, but it generally means the homogenization of standards and procedures throughout the world. Through the fast-paced progress of information technology, which makes worldwide communication possible in real time, distances and national borders are increasingly losing significance in financial, political, and cultural decision-making processes. Networks are created among national companies, and regional events now have increased economic and political effects on faraway parts of the world. In this respect, the acquisition of knowledge and media expertise is ever more important. Particularly in industrial countries, a "knowledge society" is replacing the "industrial society."

 


Economics and the Markets of the 21st Century
 


1 Thanks to modern technology, computer work can be done outside the traditional office;
2 Faster and more productive: producing computer chips in Shanghai;
3 Poster from the World Economic Forum in Davos, January 2005

 

With the end of the competition between ideological systems in the 1990s and the consequent opening of additional markets, the internationalization of the economic world has taken on a completely different quality. Finance, product, and service markets throughout the world are increasingly interwoven.

The most 2 advanced sector in the process of globalization is that of the international financial markets, where huge amounts of capital are transferred around the clock, within seconds, from one country to another and from one continent to another.

Transnational conglomerates coordinate their activities 1 worldwide and choose the most advantageous production and delivery bases.

Supply and demand are coordinated globally and 4 price regulation is left to the market.

Through this process, practically the whole world has become a 3 market.

Nations find themselves in harsh competition with one another for labor and the favor of mobile capital. Many states attempt to attract investors and "human capital" through the lowering of taxes and the creation of advantages in the basic economic framework, for example by deregulating the labor market and further liberalizing trade. How far a state should reinforce the disparity between rich and poor that results from the power of the mostly affluent businesses and investors, or balance it with sociopolitical measures, is one of the politically contentious questions posed by the new economic world order.


4 Hustle and bustle at the stock exchange in Kuwait,
March 6, 2005

 


Affluence and Poverty: The Consequences of a Globalized Economy
 

The global gross domestic product has multiplied by five since 1959. Since the mid- to late 1990s, global trade has been growing continually at a very fast pace, and external investments are booming. The majority of direct investment still takes place between the industrial countries, but capital is increasingly flowing into the developing countries as well.

Because 6 labor is relatively cheap there, these nations are being integrated into the global production system of the transnational companies.

Particularly in the newly industrializing nations, such as China or India, the opening of the markets has led to high growth rates and positive effects on the labor markets of the poorer countries.

On the other hand, this improvement is only relative. Sub-Saharan Africa, for example, remains cut off from the benefits of a global economy, and its population has 5 little access to the information technology and communication networks that are among the positive consequences of globalization.

Here, an increase in poverty is visible and economic output is in decline. Huge foreign debt burdens African state economies. Primary goods such as food and raw materials, which often constitute the wealth of the developing countries, are playing an ever smaller role compared to high-tech goods in the world market, and because the means for processing their own raw materials are underdeveloped in Africa, as well as in many of the countries of Latin America, these regions are increasingly reliant on imports from the developed industrial nations. Whether these developing countries will succeed in integrating themselves advantageously into the global world economy is very questionable, in view of the unstable conditions and the strong tribal and clan structures in these areas.


6 Chinese workers produce trousers for an international company


5 Egyptian farmers on a
donkey cart with cauliflowers, 2001

 


"One World" versus "Coca-Cola Imperialism"
 

Globalization is reflected in other areas of life such as culture and lifestyle.

8 Modern mass media and increased mobility favor a sort of cultural globalization.

African cooking and Indian films have become as common in Europe as Western fast food has become in Asia or Hollywood films in Arabia.

Optimists see this mingling of world society as a chance to integrate 9 "the foreign" into one's own cultural value system and in this way to increase mutual tolerance.

Growing commonalities in the sense of a recognized universal value system, such as human rights, can develop in this way. This perspective presupposes free access to information and knowledge.

In contrast, critics emphasize the economic dominance of the rich industrial nations in the media, through which they force their Western model of affluence on the weaker countries for their own economic advantage. This feeling of cultural hegemony is expressed in phrases such as "Coca-Cola imperialism" or the "McDonaldization" of the world. The general commercialization and reshaping of national or regional cultures through foreign influences have in many parts of the world provoked movements seeking a return to their own traditions and values.

One can trace the radical anti-Western or 7 anti-American movements—up to and including terrorism—back to these perceived causes.

The emphasis on regional, local, and new nationalist thinking can be seen as a reaction to globalization.


8 Two veiled Jordanian students surf the
World Wide Web in an Internet cafe in 2001

 


Global Domestic Policies
 

The challenges of globalization are diverse: concerns include the growing disparity between rich and poor and the protection of the environment. The capacity of national governments to intervene directly in the global economy is limited. Thus politics itself must be globalized if mankind is to meet these worldwide problems effectively.

In order to have a type of "world government" to guide the global economy, a strengthening of the system of the 10 United Nations and a further concentration and linking-up of international relations seem unavoidable.

An example of this process in action is the development of the European Union into a supranational organization: the European nation-states have given up a part of their sovereign rights to the union while still protecting national and regional identities. Non-governmental organizations such as Amnesty International also work through worldwide networks, in which democratic cooperation and the opportunity to influence the world outside state diplomacy develop. Examples of these non-governmental organizations (NGOs) are the worldwide action network "Attac," which is critical of globalization and fights for social control of the financial markets, and Greenpeace, which campaigns internationally against the negative environmental effects of a globalized economy.
 


7 Iranian demonstrators burn American and Israeli flags, 1997
9 Arabic Coca-Cola advertising in the Algerian capital Algiers, 2004
10 Vote in the UN Security Council in New York on an increase of peacekeeping troops in the former Yugoslavia, November 1992

 


Cultural Globalization


Travelers checking in for a flight.


Process by which the experience of everyday life, marked by the diffusion of commodities and ideas, is becoming standardized around the world.

Factors that have contributed to globalization include increasingly sophisticated communications and transportation technologies and services, mass migration and the movement of peoples, a level of economic activity that has outgrown national markets through industrial combinations and commercial groupings that cross national frontiers, and international agreements that reduce the cost of doing business in foreign countries. Globalization offers huge potential profits to companies and nations but has been complicated by widely differing expectations, standards of living, cultures and values, and legal systems as well as unexpected global cause-and-effect linkages. See also free trade.

Cultural globalization is a phenomenon by which the experience of everyday life, as influenced by the diffusion of commodities and ideas, reflects a standardization of cultural expressions around the world. Propelled by the efficiency or appeal of wireless communications, electronic commerce, popular culture, and international travel, globalization has been seen as a trend toward homogeneity that will eventually make human experience everywhere essentially the same. This appears, however, to be an overstatement of the phenomenon. Although homogenizing influences do indeed exist, they are far from creating anything akin to a single world culture.

Emergence of global subcultures
Some observers argue that a rudimentary version of world culture is taking shape among certain individuals who share similar values, aspirations, or lifestyles. The result is a collection of elite groups whose unifying ideals transcend geographical limitations.


“Davos” culture
One such cadre, according to political scientist Samuel Huntington in The Clash of Civilizations (1998), comprises an elite group of highly educated people who operate in the rarefied domains of international finance, media, and diplomacy. Named after the Swiss town that began hosting annual meetings of the World Economic Forum in 1971, these “Davos” insiders share common beliefs about individualism, democracy, and market economics. They are said to follow a recognizable lifestyle, are instantly identifiable anywhere in the world, and feel more comfortable in each other’s presence than they do among their less-sophisticated compatriots.


The international “faculty club”
The globalization of cultural subgroups is not limited to the upper classes. Expanding on the concept of Davos culture, sociologist Peter L. Berger observed that the globalization of Euro-American academic agendas and lifestyles has created a worldwide “faculty club”—an international network of people who share similar values, attitudes, and research goals. While not as wealthy or privileged as their Davos counterparts, members of this international faculty club wield tremendous influence through their association with educational institutions worldwide and have been instrumental in promoting feminism, environmentalism, and human rights as global issues. Berger cited the antismoking movement as a case in point: the movement began as a singular North American preoccupation in the 1970s and subsequently spread to other parts of the world, traveling along the contours of academe’s global network.


Nongovernmental organizations
Another global subgroup comprises “cosmopolitans” who nurture an intellectual appreciation for local cultures. As pointed out by Swedish anthropologist Ulf Hannerz, this group advocates a view of global culture based not on the “replication of uniformity” but on the “organization of diversity.” Often promoting this view are nongovernmental organizations (NGOs) that lead efforts to preserve cultural traditions in the developing world. By the beginning of the 21st century, institutions such as Cultural Survival were operating on a world scale, drawing attention to indigenous groups who are encouraged to perceive themselves as “first peoples”—a new global designation emphasizing common experiences of exploitation among indigenous inhabitants of all lands. By sharpening such identities, these NGOs have globalized the movement to preserve indigenous world cultures.


Transnational workers
Another group stems from the rise of a transnational workforce. Indian-born anthropologist Arjun Appadurai has studied English-speaking professionals who trace their origins to South Asia but who live and work elsewhere. They circulate in a social world that has multiple home bases, and they have gained access to a unique network of individuals and opportunities. For example, many software engineers and Internet entrepreneurs who live and work in Silicon Valley, California, maintain homes in—and strong social ties to—Indian states such as Maharashtra and Punjab.


The persistence of local culture
Underlying these various visions of globalization is a reluctance to define exactly what is meant by the term culture. During most of the 20th century, anthropologists defined culture as a shared set of beliefs, customs, and ideas that held people together in recognizable, self-identified groups. Scholars in many disciplines challenged this notion of cultural coherence, especially as it became evident that members of close-knit groups held radically different visions of their social worlds. Culture is no longer perceived as a knowledge system inherited from ancestors. As a result, many social scientists now treat culture as a set of ideas, attributes, and expectations that change as people react to changing circumstances. Indeed, by the turn of the 21st century, the collapse of barriers enforced by Soviet communism and the rise of electronic commerce have increased the perceived speed of social change everywhere.

The term local culture is commonly used to characterize the experience of everyday life in specific, identifiable localities. It reflects ordinary people’s feelings of appropriateness, comfort, and correctness—attributes that define personal preferences and changing tastes. Given the strength of local cultures, it is difficult to argue that an overarching global culture actually exists. Jet-setting sophisticates may feel comfortable operating in a global network disengaged from specific localities, but these people constitute a very small minority; their numbers are insufficient to sustain a coherent cultural system. It is more important to ask where these global operators maintain their families, what kind of kinship networks they rely upon, if any, and whether theirs is a transitory lifestyle or a permanent condition. For most people, place and locality still matter. Even the transnational workers discussed by Appadurai are rooted in local communities bound by common perceptions of what represents an appropriate and fulfilling lifestyle.


Indian businessman using a cell phone on a train.


Experiencing globalization

Research on globalization has shown that it is not an omnipotent, unidirectional force leveling everything in its path. Because a global culture does not exist, any search for it would be futile. It is more fruitful to instead focus on particular aspects of life that are indeed affected by the globalizing process.


The compression of time and space
The breakdown of time and space is best illustrated by the influential “global village” thesis posed by communications scholar Marshall McLuhan in The Gutenberg Galaxy (1962). Instantaneous communication, predicted McLuhan, would soon destroy geographically based power imbalances and create a global village. Later, geographer David Harvey argued that the postmodern condition is characterized by a “time-space compression” that arises from inexpensive air travel and the ever-present use of telephones, fax, and, more recently, e-mail.

There can be little doubt that people perceive the world today as a smaller place than it appeared to their grandparents. In the 1960s and ’70s immigrant workers in London relied on postal systems and personally delivered letters to send news back to their home villages in India, China, and elsewhere; it could take two months to receive a reply. The telephone was not an option, even in dire emergencies. By the late 1990s, the grandchildren of these first-generation migrants were carrying cellular phones that linked them to cousins in cities such as Calcutta (Kolkata), Singapore, or Shanghai. Awareness of time zones (when people will be awake; what time offices open) is now second nature to people whose work or family ties connect them to far-reaching parts of the world.

McLuhan’s notion of the global village presupposed the worldwide spread of television, which brings distant events into the homes of viewers everywhere. Building on this concept, McLuhan claimed that accelerated communications produce an “implosion” of personal experience—that is, distant events are brought to the immediate attention of people halfway around the world.

The spectacular growth of Cable News Network (CNN) is a case in point. CNN became an icon of globalization by broadcasting its U.S.-style news programming around the world, 24 hours a day. Live coverage of the fall of the Berlin Wall in 1989, the Persian Gulf War in 1991, and extended coverage of events surrounding the terrorist attacks in New York City and Washington, D.C., on September 11, 2001, illustrated television’s powerful global reach. Some governments have responded to such advances by attempting to restrict international broadcasting, but satellite communication makes these restrictions increasingly unenforceable.




The standardization of experience

Travel
Since the mid-1960s, the cost of international flights has declined, and foreign travel has become a routine experience for millions of middle- and working-class people. Diplomats, businesspeople, and ordinary tourists can feel “at home” in any city, anywhere in the world. Foreign travel no longer involves the challenge of adapting to unfamiliar food and living arrangements. CNN has been an essential feature of the standardized hotel experience since at least the 1990s. More significantly, Western-style beds, toilets, showers, fitness centres, and restaurants now constitute the global standard. A Japanese variant on the Westernized hotel experience, featuring Japanese-style food and accommodations, can also be found in most major cities. These developments are linked to the technology of climate control. In fact, the very idea of routine global travel was inconceivable prior to the universalization of air-conditioning. An experience of this nature would have been nearly impossible in the 1960s, when the weather, aroma, and noise of the local society pervaded one’s hotel room.


Clothing
Modes of dress can disguise an array of cultural diversity behind a facade of uniformity. The man’s business suit, with coloured tie and buttoned shirt, has become “universal” in the sense that it is worn just about everywhere, although variations have appeared in countries that are cautious about adopting global popular culture. Iranian parliamentarians, for example, wear the “Western” suit but forgo the tie, while Saudi diplomats alternate “traditional” Bedouin robes with tailored business suits, depending upon the occasion. In the early years of the 21st century, North Korea and Afghanistan were among the few societies holding out against these globalizing trends.

The emergence of women’s “power suits” in the 1980s signified another form of global conformity. Stylized trouser-suits, with silk scarves and colourful blouses (analogues of the male business suit), are now worldwide symbols of modernity, independence, and competence. Moreover, the export of used clothing from Western countries to developing nations has accelerated the adoption of Western-style dress by people of all socioeconomic levels around the world.

Some military fashions reflect a similar sense of convergence. Rebel fighters, such as those in Central Africa, South America, or the Balkans, seemed to take their style cue from the guerrilla garb worn by movie star Sylvester Stallone in his trilogy of Rambo films. In the 1990s the United States military introduced battle helmets that resembled those worn by the German infantry during World War II. Many older Americans were offended by the association with Nazism, but younger Americans and Europeans made no such connections. In 2001, a similar helmet style was worn by elite Chinese troops marching in a parade in Beijing’s Tiananmen Square.

Chinese fashion underwent sweeping change after the death in 1976 of Communist Party Chairman Mao Zedong and the resultant economic liberalization. Western suits or casual wear became the norm. The androgynous gray or blue Mao suit essentially disappeared in the 1980s, worn only by communist patriarch Deng Xiaoping and a handful of aging leaders who dressed in the uniform of the Cultural Revolution until their deaths in the 1990s—by which time Mao suits were being sold in Hong Kong and Shanghai boutiques as high-priced nostalgia wear, saturated with postmodern irony.


Entertainment
The power of media conglomerates and the ubiquity of entertainment programming have globalized television’s impact and made it a logical target for accusations of cultural imperialism. Critics cite a 1999 anthropological study that linked the appearance of anorexia in Fiji to the popularity of American television programs, notably Melrose Place and Beverly Hills 90210. Both series featured slender young actresses who, it was claimed, led Fijian women (who are typically fuller-figured) to question indigenous notions of the ideal body.

Anti-globalism activists contend that American television shows have corrosive effects on local cultures by highlighting Western notions of beauty, individualism, and sexuality. Although many of the titles exported are considered second-tier shows in the United States, there is no dispute that these programs are part of the daily fare for viewers around the world. Television access is widespread, even if receivers are not present in every household. In the small towns of Guatemala, the villages of Jiangxi province in China, or the hill settlements of Borneo, for instance, one television set—often a satellite system powered by a gasoline generator—may serve two or three dozen viewers, each paying a small fee. Collective viewing in bars, restaurants, and teahouses was common during the early stages of television broadcasting in Indonesia, Japan, Kenya, and many other countries. By the 1980s video-viewing parlours had become ubiquitous in many regions of the globe.

Live sports programs continue to draw some of the largest global audiences. The 1998 World Cup men’s football (soccer) final between Brazil and France was watched by an estimated two billion people. After the 1992 Olympic Games, when the American “Dream Team” of National Basketball Association (NBA) stars electrified viewers who had never seen the sport played to U.S. professional standards, NBA games were broadcast in Australia, Israel, Japan, China, Germany, and Britain. In the late 1990s Michael Jordan, renowned for leading the Chicago Bulls to six championships with his stunning basketball skills, became one of the world’s most recognized personalities.

Hollywood movies have had a similar influence, much to the chagrin of some countries. In early 2000 Canadian government regulators ordered the Canadian Broadcasting Corporation (CBC) to reduce the showing of Hollywood films during prime time and to instead feature more Canadian-made programming. CBC executives protested that their viewers would stop watching Canadian television stations and turn to satellite reception for international entertainment. Such objections were well grounded, given that, in 1998, 79 percent of English-speaking Canadians named a U.S. program when asked to identify their favourite television show.

Hollywood, however, does not hold a monopoly on entertainment programming. The world’s most prolific film industry is in Bombay (Mumbai), India (“Bollywood”), where as many as 1,000 feature films are produced annually in all of India’s major languages. Primarily love stories with heavy doses of singing and dancing, Bollywood movies are popular throughout Southeast Asia and the Middle East. State censors in Islamic countries often find the modest dress and subdued sexuality of Indian film stars acceptable for their audiences. Although the local appeal of Bollywood movies remains strong, exposure to Hollywood films such as Jurassic Park (1993) and Speed (1994) caused young Indian moviegoers to develop an appreciation for the special effects and computer graphics that had become the hallmarks of many American films.


Food
Food is the oldest global carrier of culture. In fact, food has always been a driving force for globalization, especially during earlier phases of European trade and colonial expansion. The hot red pepper was introduced to the Spanish court by Christopher Columbus in 1493. It spread rapidly throughout the colonial world, transforming cuisines and farming practices in Africa, Asia, and the Middle East. It might be difficult to imagine Korean cuisine without red pepper paste or Szechuan food without its fiery hot sauce, but both are relatively recent innovations—probably from the 17th century. Other New World crops, such as corn (maize), cassava, sweet potatoes, and peanuts (groundnuts), were responsible for agricultural revolutions in Asia and Africa, opening up terrain that had previously been unproductive.

One century after the sweet potato was introduced into south China (in the mid-1600s), it had become a dominant crop and was largely responsible for a population explosion that created what today is called Cantonese culture. It is the sweet potato, not the more celebrated white rice, which sustained generations of southern Chinese farmers.

These are the experiences that cause cultural meaning to be attached to particular foods. Today the descendants of Cantonese, Hokkien, and Hakka pioneers disdain the sweet potato as a “poverty food” that conjures images of past hardships. In Taiwan, by contrast, independence activists (affluent members of the rising Taiwanese middle class) have embraced the sweet potato as an emblem of identity, reviving old recipes and celebrating their cultural distinctions from “rice-eating mainlanders.”

While the global distribution of foods originated with the pursuit of exotic spices (such as black pepper, cinnamon, and cloves), contemporary food trading features more prosaic commodities, such as soybeans and apples. African bananas, Chilean grapes, and California oranges have helped to transform expectations about the availability and affordability of fresh produce everywhere in the world. Green beans are now grown in Burkina Faso in West Africa and shipped by express air cargo to Paris, where they end up on the plates of diners in the city’s top restaurants. This particular exchange system is based on a “nontraditional” crop that was not grown in Burkina Faso until the mid-1990s, when the World Bank encouraged its cultivation as a means of promoting economic development. The country soon became Africa’s second largest exporter of green beans. West African farmers consequently found themselves in direct competition with other “counter-season” growers of green beans from Brazil and Florida.

The average daily diet has also undergone tremendous change, with all nations converging on a diet high in meat, dairy products, and processed sugars. Correlating closely to a worldwide rise in affluence, the new “global diet” is not necessarily a beneficial trend, as it can increase the risk of obesity and diabetes. Now viewed as a global health threat, obesity has been dubbed “globesity” by the World Health Organization. To many observers, the homogenization of human diet appears to be unstoppable. Vegetarians, environmental activists, and organic food enthusiasts have organized rearguard actions to reintroduce “traditional” and more wholesome dietary practices, but these efforts have been concentrated among educated elites in industrial nations.

Western food corporations are often blamed for these dietary trends. McDonald’s, KFC (Kentucky Fried Chicken), and Coca-Cola are primary targets of anti-globalism demonstrators (who are themselves organized into global networks, via the Internet). McDonald’s has become a symbol of globalism for obvious reasons: on an average day in 2001, the company served nearly 45 million customers at more than 25,000 restaurants in 120 countries. It succeeds in part by adjusting its menu to local needs. In India, for example, no beef products are sold.

McDonald’s also succeeds in countries that might be expected to disdain fast food. In France, for example, food, especially haute cuisine, is commonly regarded as the core element of French culture. Nevertheless, McDonald’s continues to expand in the very heartland of opposition: by the turn of the 21st century there were more than 850 McDonald’s restaurants in France, employing over 30,000 people. Not surprisingly, many European protest movements have targeted McDonald’s as an agent of cultural imperialism. French intellectuals may revile the Big Mac sandwich for all that it symbolizes, but the steady growth of fast-food chains demonstrates that anti-globalist attitudes do not always affect economic behaviour, even in societies (such as France) where these sentiments are nearly universal. Like their counterparts in the United States, French workers are increasingly pressed for time. The two-hour lunch is largely a thing of the past.

Food and beverage companies attract attention because they cater to the most elemental form of human consumption. We are what we eat, and when diet changes, notions of national and ethnic identity are affected. Critics claim that the spread of fast food undermines indigenous cuisines by forcing a homogenization of world dietary preferences, but anthropological research in Russia, Japan, and Hong Kong does not support this view.

Close study of cultural trends at the local level, however, shows that the globalization of fast food can influence public conduct. Fast-food chains have introduced practices that changed some consumer behaviours and preferences. For example, in Japan, where using one’s hands to eat prepared foods was considered a gross breach of etiquette, the popularization of McDonald’s hamburgers has had such a dramatic impact on popular etiquette that it is now common to see Tokyo commuters eating in public without chopsticks or spoons.

In late-Soviet Russia, rudeness had become a high art form among service personnel. Today customers expect polite, friendly service when they visit Moscow restaurants—a social revolution initiated by McDonald’s and its employee training programs. Since its opening in 1990, Moscow’s Pushkin Square restaurant has been one of the busiest McDonald’s in the world.

The social atmosphere in colonial Hong Kong of the 1960s was anything but genteel. Cashing a check, boarding a bus, or buying a train ticket required brute force. When McDonald’s opened in 1975, customers crowded around the cash registers, shouting orders and waving money over the heads of people in front of them. McDonald’s responded by introducing queue monitors—young women who channeled customers into orderly lines. Queuing subsequently became a hallmark of Hong Kong’s cosmopolitan, middle-class culture. Older residents credit McDonald’s for introducing the queue, a critical element in this social transition.

Yet another innovation, in some areas of Asia, Latin America, and Europe, was McDonald’s provision of clean toilets and washrooms. In this way the company was instrumental in setting new cleanliness standards (and thereby raising consumer expectations) in cities that had never offered public facilities. Wherever McDonald’s has set up business, it rapidly has become a haven for an emerging class of middle-income urbanites.

The introduction of fast food has been particularly influential on children, especially since so many advertisements are designed to appeal to them. Largely as a consequence of such advertising, American-style birthday parties have spread to many parts of the world where individual birth dates previously had never been celebrated. McDonald’s and KFC have become the leading venues for birthday parties throughout East Asia, with special rooms and services provided for the events. These and other symbolic effects make fast food a powerful force for dietary and social change, because a meal at these restaurants will introduce practices that younger consumers may not experience at home—most notably, the chance to choose one’s own food. The concept of personal choice is symbolic of Western consumer culture. Visits to McDonald’s and KFC have become signal events for children who approach fast-food restaurants with a heady sense of empowerment.


Religion and globalization
Central to Huntington’s thesis in The Clash of Civilizations is the assumption that the post-Cold War world would regroup into regional alliances based on religious beliefs and historical attachments to various “civilizations.” Identifying three prominent groupings—Western Christianity (Roman Catholicism and Protestantism), Orthodox Christianity (Russian and Greek), and Islam, with additional influences from Hinduism and Confucianism—he predicted that the progress of globalization would be severely constrained by religio-political barriers. The result would be a “multipolar world.” Huntington’s view differed markedly from those who prophesied a standardized, homogenized global culture.

There is, however, considerable ethnographic evidence, gathered by anthropologists and sociologists, that refutes this model of civilizational clash and suggests instead a rapid diffusion of religious and cultural systems throughout the world. Islam is one case in point, given that it constitutes one of the fastest-growing religions in the United States, France, and Germany—supposed bastions of Western Christianity. Before the end of the 20th century, entire arrondissements (districts) of Paris were dominated by Muslims, the majority of them French citizens born and reared in France. Thirty-five percent of students in the suburban Dearborn, Michigan, public school system were Muslim in 2001, making the provision of ḥalāl (“lawful” under Islam) meals at lunchtime a hot issue in local politics. By the start of the 21st century, Muslims of Turkish origin constituted the fastest-growing sector of Berlin’s population, and, in northern England, the old industrial cities of Bradford and Newcastle had been revitalized by descendants of Pakistani and Indian Muslims who immigrated during the 1950s and ’60s.

From its inception, Christianity has been an aggressively proselytizing religion with a globalizing agenda. Indeed, the Roman Catholic Church was arguably the first global institution, having spread rapidly throughout the European colonial world and beyond. Today, perhaps the fastest-growing religion is evangelical Christianity. Stressing the individual’s personal experience of divinity (as opposed to priestly intercession), evangelicalism has gained wide appeal in regions such as Latin America and sub-Saharan Africa, presenting serious challenges to established Catholic churches. Following the collapse of Soviet power in 1991, the Russian Orthodox church began the process of rebuilding after more than seven decades of repression. At the same time, evangelical missionaries from the United States and Europe shifted much of their attention from Latin America and Africa to Russia, alarming Russian Orthodox leaders. By 1997, under pressure from Orthodox clergy, the Russian government promoted legislation to restrict the activities of religious organizations that had operated in Russia for less than 15 years, effectively banning Western evangelical missionaries. The debate over Russian religious unity continues, however, and, if China is any guide, such legislation could have little long-term effect.

In China, unauthorized “house churches” became a major concern for Communist Party officials who attempted to control Muslim, Christian, and Buddhist religious activity through state-sponsored organizations. Many of the unrecognized churches are syncretic in the sense that they combine aspects of local religion with Christian ideas. As a result they have been almost impossible to organize, let alone control.

Social scientists confirm the worldwide resurgence, since the late 20th century, of conservative religion among faiths such as Islam, Hinduism, Buddhism, and even Shinto in Japan and Sikhism in India. The social and political connotations of these conservative upsurges are unique to each culture and religion. For example, some sociologists have identified Christian evangelicalism as a leading carrier of modernization: its emphasis on the Bible is thought to encourage literacy, while involvement in church activities can teach administrative skills that are applicable to work environments. As a sociologist of religion, Berger argues that “there may be other globalizing popular movements [today], but evangelicalism is clearly the most dynamic.”


Demographic influences
Huntington’s “clash of civilizations” thesis assumes that the major East Asian societies constitute an alliance of “Confucian” cultures that share a common heritage in the teachings of Confucius, the ancient Chinese sage. Early 21st-century lifestyles in Tokyo, Seoul, Beijing, Taipei, and Hong Kong, however, show far more evidence of globalization than Confucianization. The reputed hallmarks of Confucianism—respect for parental authority and ancestral traditions—are no more salient in these cities than in Boston, London, or Berlin. This is a consequence of (among other things) a steady reduction in family size that has swept through East Asian societies since the 1980s. State-imposed restrictions on family size, late childbearing, and resistance to marriage among highly educated, working women have undermined the basic tenets of the Confucian family in Asia.

Birth rates in Singapore and Japan, in fact, have fallen below replacement levels and are at record low levels in Hong Kong; birth rates in Beijing, Shanghai, and other major Chinese cities are also declining rapidly. These developments mean that East Asia—like Europe—could face a fiscal crisis as decreasing numbers of workers are expected to support an ever-growing cohort of retirees. By 2025, China is projected to have 274 million people over age 60—more than the entire 1998 population of the United States. The prospects for other East Asian countries are far worse: 17.2 percent of Japan’s 127 million people were over age 65 in 2000; by 2020 that percentage could rise to 27.

Meanwhile, Asia’s “Confucian” societies face a concurrent revolution in family values: the conjugal family (centring on the emotional bond between wife and husband) is rapidly replacing the patriarchal joint family (focused on support of aged parents and grandparents). This transformation is occurring even in remote, rural regions of northwest China where married couples now expect to reside in their own home (“neolocal” residence) as opposed to the house or compound of the groom’s parents (“patrilocal” residence). The children produced by these conjugal units are very different from their older kin who were reared in joint families: today’s offspring are likely to be pampered only children known as “Little Emperors” or “Little Empresses.” Contemporary East Asian families are characterized by an ideology of consumerism that is diametrically opposed to the neo-authoritarian Confucian rhetoric promoted by political leaders such as Singapore’s Lee Kuan Yew and Hong Kong’s Tung Chee-hwa at the turn of the 21st century.

Italy, Mexico, and Sweden (among other countries) also experienced dramatic reductions in family size and birth rates during the late 20th century. Furthermore, new family formations are taking root, such as those of the transnational workers who maintain homes in more than one country. Multi-domiciled families were certainly evident before the advent of cheap air travel and cellular phones, but new technologies have changed the quality of life (much for the better) in diaspora communities. Thus, the globalization of family life is no longer confined to migrant workers from developing economies who take low-paying jobs in advanced capitalist societies. The transnational family is increasingly a mark of high social status and affluence.



Japanese McDonald's fast food as evidence of corporate globalization and its integration into different cultures.
 


Political consequences of globalization

Challenges to national sovereignty and identity
Anti-globalism activists often depict the McDonald’s, Disney, and Coca-Cola corporations as agents of globalism or cultural imperialism—a new form of economic and political domination. Critics of globalism argue that any business enterprise capable of manipulating personal tastes will thrive, whereas state authorities everywhere will lose control over the distribution of goods and services. According to this view of world power, military force is perceived as hopelessly out of step or even powerless; the control of culture (and its production) is seen as far more important than the control of political and geographic borders. Certainly, it is true that national boundaries are increasingly permeable and any effort by nations to exclude global pop culture usually makes the banned objects all the more irresistible.

The commodities involved in the exchange of popular culture are related to lifestyle, especially as experienced by young people: pop music, film, video, comics, fashion, fast foods, beverages, home decorations, entertainment systems, and exercise equipment. Millions of people obtain the unobtainable by using the Internet to breach computer security systems and import barriers. “Information wants to be free” was the clarion call of software designers and aficionados of the World Wide Web in the 1990s. This code of ethics takes its most creative form in societies where governments try hardest to control the flow of information (e.g., China and Iran). In 1999, when Serbian officials shut down the operations of Radio B92, the independent station continued its coverage of events in the former Republic of Yugoslavia by moving its broadcasts to the Internet.

The idea of a borderless world is reflected in theories of the “virtual state,” a new system of world politics that is said to reflect the essential chaos of 21st-century capitalism. In Out of Control (1994), author Kevin Kelly predicted that the Internet would gradually erode the power of governments to control citizens; advances in digital technology would instead allow people to follow their own interests and form trans-state coalitions. Similarly, Richard Rosecrance, in The Rise of the Virtual State (1999), wrote that military conflicts and territorial disputes would be superseded by the flow of information, capital, technology, and manpower between states. Many scholars disagreed, insisting that the state was unlikely to disappear and could continue to be an essential and effective basis of governance.

Arguments regarding the erosion of state sovereignty are particularly unsettling for nations that have become consumers rather than producers of digital technology. Post-Soviet Russia, post-Mao China, and post-Gaullist France are but three examples of Cold War giants facing uncertain futures in the emerging global system. French intellectuals and politicians have seized upon anti-globalism as an organizing ideology in the absence of other unifying themes. In Les cartes de la France à l’heure de la mondialisation (2000; “France’s Assets in the Era of Globalization”), French Foreign Minister Hubert Vedrine denounced the United States as a “hyperpower” that promotes “uniformity” and “unilateralism.” Speaking for the French intelligentsia, he argued that France should take the lead in building a “multipolar world.” Ordinary French citizens also were concerned about losing their national identity, particularly as the regulatory power of the European Union began to affect everyday life. Sixty percent of respondents in a 1999 L’Expansion poll agreed that globalization represented the greatest threat to the French way of life.


Anti-globalism movements and the Internet
Anti-globalism organizers are found throughout the world, not least in many nongovernmental organizations. They are often among the world’s most creative and sophisticated users of Internet technology. This is doubly ironic, because even as NGOs contest the effects of globalization, they exhibit many of the characteristics of a global, transnational subculture; the Internet, moreover, is one of the principal tools that makes globalization feasible and organized protests against it possible. For example, Greenpeace, an environmentalist NGO, has orchestrated worldwide protests against genetically modified (GM) foods. Highly organized demonstrations appeared, seemingly overnight, in many parts of the world, denouncing GM products as “Frankenfoods” that pose unknown (and undocumented) dangers to people and to the environment. The bioengineering industry, supported by various scientific organizations, launched its own Internet-based counterattack, but the response was too late and too disorganized to outflank Greenpeace and its NGO allies. Sensational media coverage had already turned consumer sentiment against GM foods before the scientific community even entered the debate.

The anti-GM food movement demonstrates the immense power of the Internet to mobilize political protests. This power derives from the ability of a few determined activists to communicate with thousands (indeed millions) of potential allies in an instant. The Internet’s power as an organizing tool became evident during the World Trade Organization (WTO) protests in Seattle, Washington, in 1999, in which thousands of activists converged on the city, disrupting the WTO meetings and drawing the world’s attention to criticisms of global trade practices. The Seattle protests set the stage for similar types of activism in succeeding years.



HSBC is one of the largest banks in the world and operates across the globe.


The illusion of global culture

Localized responses
For hundreds of millions of urban people, the experience of everyday life has become increasingly standardized since the 1960s. Household appliances, utilities, and transportation facilities are increasingly universal. Technological “marvels” that North Americans and Europeans take for granted have had even more profound effects on the quality of life for billions of people in the less-developed world. Everyday life is changed by the availability of cold beverages, hot water, frozen fish, screened windows, bottled cooking-gas, or the refrigerator. It would be a mistake, however, to assume that these innovations have an identical, homogenizing effect wherever they appear. For most rural Chinese, the refrigerator has continued to be seen as a status symbol. They use it to chill beer, soft drinks, and fruit, but they dismiss the refrigeration of vegetables, meat, and fish as unhealthy. Furthermore, certain foods (notably bean curd dishes) are thought to taste better when cooked with more traditional fuels such as coal or wood, as opposed to bottled gas.

It remains difficult to argue that the globalization of technologies is making the world everywhere the same. The “sameness” hypothesis is only sustainable if one ignores the internal meanings that people assign to cultural innovations.


Borrowing and “translating” popular culture
The domain of popular music illustrates how difficult it is to unravel cultural systems in the contemporary world: Is rock music a universal language? Do reggae and ska have the same meaning to young people everywhere? American-inspired hip-hop (rap) swept through Brazil, Britain, France, China, and Japan in the 1990s. Yet Japanese rappers developed their own, localized versions of this art form. Much of the music of hip-hop, grounded in urban African American experience, is defiantly antiestablishment, but the Japanese lyric content is decidedly mild, celebrating youthful solidarity and exuberance. Similar “translations” between form and content have occurred in the pop music of Indonesia, Mexico, and Korea. Even a casual listener of U.S. radio can hear the profound effects that Brazilian, South African, Indian, and Cuban forms have had on the contemporary American pop scene. An earlier example of splashback—when a cultural innovation returns, somewhat transformed, to the place of its origin—was the British Invasion of the American popular music market in the mid-1960s. Forged in the United States from blues and country music, rock and roll crossed the Atlantic in the 1950s to captivate a generation of young Britons who, forming bands such as the Beatles and the Rolling Stones, made the music their own, then reintroduced it to American audiences with tremendous success. The flow of popular culture is rarely, if ever, unidirectional.


Subjectivity of meaning—the case of Titanic
A cultural phenomenon does not convey the same meaning everywhere. In 1998, the drama and special effects of the American movie Titanic created a sensation among Chinese fans. Scores of middle-aged Chinese returned to the theatres over and over—crying their way through the film. Enterprising hawkers began selling packages of facial tissue outside Shanghai theatres. The theme song of Titanic became a best-selling CD in China, as did posters of the young film stars. Chinese consumers purchased more than 25 million pirated (and 300,000 legitimate) video copies of the film.

One might ask why middle-aged Chinese moviegoers became so emotionally involved with the story told in Titanic. Interviews among older residents of Shanghai revealed that many people had projected their own, long-suppressed experiences of lost youth onto the film. From 1966 to 1976 the Cultural Revolution convulsed China, destroying any possibility of educational or career advancement for millions of people. At that time, communist authorities had also discouraged romantic love and promoted politically correct marriages based on class background and revolutionary commitment. Improbable as it might seem to Western observers, the story of lost love on a sinking cruise ship hit a responsive chord among the veterans of the Cultural Revolution. Their passionate, emotional response had virtually nothing to do with the Western cultural system that framed the film. Instead, Titanic served as a socially acceptable vehicle for the public expression of regret by a generation of aging Chinese revolutionaries who had devoted their lives to building a form of socialism that had long since disappeared.

Chinese President Jiang Zemin invited the entire Politburo of the Chinese Communist Party to a private screening of Titanic so that they would understand the challenge. He cautioned that Titanic could be seen as a Trojan horse, carrying within it the seeds of American cultural imperialism.

Chinese authorities were not alone in their mistrust of Hollywood. There are those who suggest, as did China’s Jiang, that exposure to Hollywood films will cause people everywhere to become more like Americans. Yet anthropologists who study television and film are wary of such suggestions. They emphasize the need to study the particular ways in which consumers make use of popular entertainment. The process of globalization looks far from hegemonic when one focuses on ordinary viewers and their efforts to make sense of what they see.

Another case in point is anthropologist Daniel Miller’s study of television viewing in Trinidad, which demonstrated that viewers are not passive observers. In 1988, 70 percent of Trinidadians who had access to a television watched daily episodes of The Young and the Restless, a series that emphasized family problems, sexual intrigue, and gossip. Miller discovered that Trinidadians had no trouble relating to the personal dramas portrayed in American soap operas, even though the lifestyles and material circumstances differed radically from life in Trinidad. Local people actively reinterpreted the episodes to fit their own experience, seeing the televised dramas as commentaries on contemporary life in Trinidad. The portrayal of American material culture, notably women’s fashions, was a secondary attraction. In other words, it is a mistake to treat television viewers as passive.


The ties that still bind
Local culture remains a powerful influence in daily life. People are tied to places, and those places continue to shape particular norms and values. The fact that residents of Moscow, Beijing, and New Delhi occasionally eat at McDonald’s, watch Hollywood films, and wear Nike athletic shoes (or copies thereof) does not make them “global.” The appearance of homogeneity is the most salient, and ultimately the most deceptive, feature of globalization. Outward appearances do not reveal the internal meanings that people assign to a cultural innovation. True, the standardization of everyday life will likely accelerate as digital technology comes to approximate the toaster in “user-friendliness.” But technological breakthroughs are not enough to create a world culture. People everywhere show a desire to partake of the fruits of globalization, but they just as earnestly want to celebrate the distinctiveness of their own cultures.

James L. Watson

Encyclopaedia Britannica



 

 

 


20th century events



From Wikipedia, the free encyclopedia
 

Contents

1 The world at the beginning of the century
2 "The war to end all wars": World War I (1914–1918)
3 The Russian Revolution and communism
4 Between the wars
4.1 Economic depression
4.2 The rise of dictatorship
5 Global war: World War II (1939–1945)
5.1 The war in Europe
5.2 The war in the Pacific
5.3 The Holocaust
5.4 The Nuclear Age begins
6 The post-war world
6.1 The end of empires: decolonization
6.2 Mutually assured destruction: the Cold War (1947–1991)
6.3 War by proxy
6.4 The space race
6.5 The end of the Cold War
6.6 Information and communications technology
7 The world at the end of the century
8 World population

 

The 20th century, which began on January 1, 1901 and ended on December 31, 2000 according to the Gregorian calendar, was marked by many notable events.



The world at the beginning of the century
In Europe, the British Empire achieved the height of its power. Germany and Italy, which had come into existence as unified nations in the second half of the 19th century, grew in power, challenging the traditional hegemony of Britain and France. With nationalism in full force at this time, the European powers competed with each other for land, military strength, and economic power.

Asia and Africa were for the most part still under the control of their European colonizers. The major exceptions were China and Japan. The Russo-Japanese War of 1904–1905 was the first major instance of a European power being defeated by an Asian nation. The war strengthened Japanese militarism and confirmed Japan's rise to the status of a world power. Tsarist Russia, on the other hand, did not handle the defeat well. The war exposed the country's military weakness and increasing economic backwardness, and contributed to the Russian Revolution of 1905, often described as a dress rehearsal for the decisive revolution of 1917.

Already in the 19th century, the United States had become an influential actor in world politics. It had made its presence known on the world stage by challenging Spain in the Spanish-American War, gaining Puerto Rico and the Philippines as overseas possessions. With continuing immigration and the question of national unity settled by the bloody American Civil War, America was emerging as an industrial power as well, rivaling Britain, Germany, and France.

With increasing rivalry among the European powers, and the rise of Japan and the United States, the stage was set for a major upheaval in world affairs.

 

"The war to end all wars": World War I (1914–1918)
The First World War, termed "The Great War" (or simply WWI) by contemporaries, started in 1914 and ended in 1918. It was ignited by the assassination in Sarajevo of the Austro-Hungarian heir to the throne, Archduke Franz Ferdinand, by Gavrilo Princip of the Serbian nationalist organization "Black Hand". Bound by Slavic nationalism to help the small Serbian state, Russia came to the aid of the Serbs when they were attacked. Interwoven alliances, an escalating arms race, and old hatreds dragged Europe into war. The Allies, known initially as "The Triple Entente", comprised the British Empire, Russia, and France, joined later in the war by Italy and the United States. On the other side, Germany, Austria-Hungary, the Ottoman Empire, and later Bulgaria were known as "The Central Powers".

In late 1917, after the fall of the Tsar and the Bolshevik seizure of power, Russia ended hostile actions against the Central Powers. The Bolsheviks negotiated the Treaty of Brest-Litovsk with Germany, signed in March 1918 at a huge territorial cost to Russia. Although Germany shifted huge forces from the eastern to the western front after signing the treaty, it was unable to stop the Allied advance, especially after the arrival of American troops in 1918.

The war was also a chance for the combatant nations to show off their military strength and technological ingenuity. The Germans made devastating use of the machine gun and introduced U-boats and poison gas; the British first used the tank. Both sides tested their new aircraft to see whether they could be used in warfare. It was widely believed that the war would be short. Instead, because entrenched defense dominated the battlefield, advances on both sides were very slow and came at a terrible cost in lives.

When the war finally ended in 1918, the results set the stage for the next twenty years. First and foremost, the Germans were forced to sign the Treaty of Versailles, requiring them to make exorbitant reparations payments for damage caused during the war. Many Germans felt these reparations were unfair, because they did not believe they had actually "lost" the war or that they had caused it (see Stab-in-the-back legend). German territory had never been occupied by Allied troops during the war, yet Germany had to accept a liberal democratic government imposed on it by the victors after the abdication of Kaiser Wilhelm.

Much of the map of Europe was redrawn by the victors based upon the theory that future wars could be prevented if all ethnic groups had their own "homeland". New states like Yugoslavia and Czechoslovakia were created out of the former Austro-Hungarian Empire to accommodate the nationalist aspirations of these groups. An international body called the League of Nations was formed to mediate disputes and prevent future wars, although its effectiveness was severely limited by, among other things, its reluctance and inability to act.



The Russian Revolution and communism
The Russian Revolution of 1917 sparked a wave of communist revolutions across Europe, prompting many to believe that a socialist world revolution could be realized in the near future. However, the European revolutions were defeated, Lenin died in 1924, and within a few years Joseph Stalin displaced Leon Trotsky as the de facto leader of the Soviet Union. The idea of worldwide revolution was no longer in the forefront, as Stalin concentrated on "socialism in one country" and embarked on a bold plan of collectivization and industrialization. The majority of socialists and even many communists became disillusioned with Stalin's autocratic rule, his purges and the assassination of his "enemies", as well as the news of famines he imposed on his own people.

Communism was strengthened as a force in Western democracies when the global economy crashed in 1929 in what became known as the Great Depression. Many people saw this as the first stage of the end of the capitalist system and were attracted to Communism as a solution to the economic crisis, especially as the Soviet Union's economic development in the 1930s was strong, unaffected by the capitalist world's crisis.
 

Between the wars

Economic depression
After World War I, the global economy remained strong through the 1920s. The war had provided a stimulus for industry and for economic activity in general. There were many warning signs foretelling the collapse of the global economic system in 1929 that were generally not understood by the political leadership of the time. The responses to the crisis often made the situation worse, as millions of people watched their savings become next to worthless and the prospect of a steady job with a reasonable income fade away.

Many sought answers in alternative ideologies such as communism and fascism. They believed that the capitalist economic system was collapsing, and that new ideas were required to meet the crisis. The early responses to the crisis were based upon the assumption that the free market would correct itself. This, however, did very little to correct the crisis or to alleviate the suffering of many ordinary people. Thus, the idea that the existing system could be reformed by government intervention in the economy rather than by continuing the laissez-faire approach became prominent as a solution to the crisis. Democratic governments assumed the responsibility to provide needed services in society and alleviate poverty. Thus was born the welfare state. These two politico-economic principles, the belief in government intervention and the welfare state, as opposed to the belief in the free market and private institutions, would define many political battles for the rest of the century.

The rise of dictatorship
Fascism first appeared in Italy with the rise to power of Benito Mussolini in 1922. The ideology was supported by a large proportion of the upper classes as a strong challenge to the threat of communism.

When Adolf Hitler came to power in Germany in 1933, a new variant of fascism called Nazism took over Germany and ended the German experiment with democracy. The National Socialist party in Germany was dedicated to the restoration of German honor and prestige, the unification of German-speaking peoples, and the annexation of Central and Eastern Europe as vassal states, with the Slavic population to act as slave labor to serve German economic interests. There was also a strong appeal to a mythical racial purity (the idea that Germans were the Herrenvolk or the "master race"), and a vicious anti-Semitism which promoted the idea of Jews as subhuman (Untermensch) and worthy only of extermination.

Many people in Western Europe and the United States greeted the rise of Hitler with relief or indifference. They could see nothing wrong with a strong Germany ready to take on the communist menace to the east. Anti-Semitism was widespread during the Great Depression, as many were content to blame the Jews for the economic downturn.

Hitler began to put his plan in motion, annexing Austria in the Anschluss, or reunification of Austria with Germany, in 1938. He then negotiated the annexation of the Sudetenland, a German-speaking mountainous area of Czechoslovakia, at the Munich Conference. The British were eager to avoid war and believed Hitler's assurance that he would respect the security of the remaining Czech state. Hitler annexed the rest of that state shortly afterwards. It could no longer be argued that he was interested solely in unifying the German people.

Fascism was not the only form of dictatorship to rise in the post-war period. Almost all of the new democracies in the nations of Eastern Europe collapsed and were replaced by authoritarian regimes. Spain also became a dictatorship under the leadership of General Francisco Franco after the Spanish Civil War. Totalitarian states attempted to achieve total control over their subjects as well as their total loyalty. They held the state above the individual, and were often responsible for some of the worst acts in history, such as the Holocaust Adolf Hitler perpetrated on European Jews, or the Great Purge Stalin perpetrated in the Soviet Union in the 1930s.



Global war: World War II (1939–1945)

The war in Europe
This section provides a conversational overview of World War II in Europe. See main article for a fuller discussion.

Soon after the events in Czechoslovakia, Britain and France issued assurances of protection to Poland, which seemed to be next on Hitler's list. World War II officially began on September 1, 1939, when Hitler unleashed his Blitzkrieg, or lightning war, against Poland. Britain and France, much to Hitler's surprise, immediately declared war on Germany, but the help they could give Poland was negligible. After only a few weeks the Polish forces were overwhelmed, and the Polish government fled into exile, eventually settling in London (see Polish government-in-exile).

In starting World War II, the Germans unleashed a new type of warfare, characterized by highly mobile forces and the use of massed aircraft. German strategy concentrated on the use of tank formations, called panzer divisions, and groups of mobile infantry, in concert with relentless attacks from the air; encirclement was also a major part of the strategy. This change smashed any expectation that the Second World War would be fought in the trenches like the first.

As Hitler's forces conquered Poland, the Soviet Union, under General Secretary Joseph Stalin, was acting on guarantees of territory made in a secret part of the nonaggression pact between the USSR and Germany known as the Nazi-Soviet Pact. The treaty gave Stalin free rein to take the Baltic republics of Estonia, Latvia, and Lithuania, as well as eastern Poland, all of which would remain in Soviet possession after the war. Stalin also launched an attack on Finland, which he hoped to reduce to little more than a Soviet puppet state, but the Red Army met staunch Finnish resistance in what became known as the Winter War and succeeded in gaining only limited territory from the Finns. This action would later cause the Finns to ally with Germany when Germany's attack on the Soviet Union came in 1941.

After the defeat of Poland, a period known as the Phony War ensued during the winter of 1939–1940. Beginning on April 9, 1940, Hitler occupied Denmark and Norway; Norway was strategically important because of the sea routes that supplied crucial Swedish ore to the German war machine. Denmark capitulated almost immediately, while Norway held out for two months. Then, on May 10, 1940, the Germans launched a massive attack on the Low Countries (Belgium, the Netherlands, and Luxembourg) in order to outflank the Maginot Line of defenses on the Franco-German border. The campaign saw the astonishing fall of Eben Emael, a Belgian fort considered impregnable and garrisoned by some 600 men, to a force of only 88 German paratroopers. Worse, King Léopold III of Belgium surrendered to the Germans on May 28 without warning his allies, exposing the entire flank of the Allied forces to German panzer groups.

With the disaster in the Low Countries, France, considered at the time to have the finest army in the world, collapsed in a matter of weeks; Paris was occupied on June 14. Three days later, Marshal Philippe Pétain's government asked the Germans for an armistice. The debacle in France also produced one of the war's great mysteries, and Hitler's first great blunder: Dunkirk, where a third of a million trapped British and French soldiers were evacuated not only by warships but by every vessel that could be found, including fishing boats. Hitler refused to "risk" his panzers at Dunkirk, listening instead to the advice of Air Minister Hermann Göring and allowing the Luftwaffe, the German air force, to handle the job. The irony was that the men who escaped would form the core of the army that invaded the beaches of Normandy in 1944. Hitler did not occupy all of France; the north and the entire Atlantic coast came under German control, while Marshal Pétain was allowed to remain as head of the collaborationist regime known as Vichy France. Members of the escaped French army, however, rallied around General Charles de Gaulle to create the Free French forces, which would continue to fight Hitler in the name of an independent France. At this point Italy, under Benito Mussolini, declared war on the Allies (June 10), thinking the war was almost over, but Italian troops managed to occupy only a few hundred yards of French territory. Throughout the war the Italians would be more of a burden to the Germans than a help, and would later cost them precious time in Greece.

Hitler now turned his eyes on Great Britain, which stood alone against him. He ordered his generals to draw up plans for an invasion, code-named Operation Sea Lion, and ordered the Luftwaffe to launch a massive air war against the British Isles, which came to be known as the Battle of Britain. The British at first suffered steady losses, but eventually turned the air war against Germany, downing 2,698 German planes over the summer of 1940 against 915 Royal Air Force (RAF) losses. The key turning point came when the Germans broke off their effective attacks on British aircraft factories and on the radar stations that coordinated the defense, and turned instead to the terror bombing of cities. The switch came after a small British bombing force attacked Berlin, infuriating Hitler. His decision to change the focus of the attacks allowed the British to rebuild the RAF and eventually forced the Germans to postpone Sea Lion indefinitely.

The Battle of Britain was important for two reasons. First, it marked the beginning of Hitler's defeats. Second, it marked the advent of radar as a major weapon in modern air war. With radar, squadrons of fighters could be quickly assembled to respond to incoming bombers attempting to strike civilian targets. It also allowed the defenders to identify the type, and estimate the number, of incoming enemy aircraft, as well as to track friendly airplanes.

Hitler, taken aback by his defeat in the skies over Britain, now turned his gaze eastward to the Soviet Union. Despite having signed the non-aggression pact with Stalin, Hitler despised communism and wished to destroy it in the land of its birth. He originally planned to launch the attack in the early spring of 1941 to avoid the disastrous Russian winter. However, a pro-Allied coup in Yugoslavia and Mussolini's near-total defeat in his invasion of Greece from occupied Albania prompted Hitler to launch a punitive campaign in Yugoslavia and to occupy Greece at the same time. The Greeks thereby had a bitter revenge of sorts: the Balkan campaign delayed the invasion of Russia by several crucial weeks.

On June 22, 1941, Hitler attacked the Soviet Union with the largest invasion force the world had ever seen: over three million men and their weapons were thrown against the USSR. Stalin had been warned about the attack, both by other countries and by his own intelligence network, but he had refused to believe it. The Red Army was therefore largely unprepared and suffered terrible setbacks in the early part of the war, despite Stalin's orders to counterattack. Throughout 1941, German forces, divided into three army groups (Army Group North, Army Group Center, and Army Group South), overran Ukraine and Belarus, laid siege to Leningrad (present-day Saint Petersburg), and advanced to within 15 miles of Moscow. At this critical moment the Russian winter, which began early that year, brought the Wehrmacht to a halt at the gates of Moscow. Stalin had planned to evacuate the city, and had already moved important government functions, but decided to stay and rally its defense. Recently arrived troops from the east under the command of Marshal Georgi Zhukov counterattacked and drove the Germans back from Moscow. The German army then dug in for the winter.

Here was another of Hitler's great blunders. Several factors cost him a possible victory in the USSR. First, he started the campaign too late to avoid the Russian winter. Second, he tried to capture too much too fast: he wanted the German army to advance all the way to the Urals, an expanse of roughly one million square miles (2,600,000 km²), when he probably should have concentrated on taking Moscow and thereby driving a wedge into the heart of the Soviet Union. Third, he ignored the similar experience of Napoleon Bonaparte, whose attempt to conquer Russia had failed some 130 years earlier. Even so, Stalin was not in a good position. Roughly two-fifths of the USSR's industrial might was in German hands, and in some regions the Germans were at first welcomed as liberators from communist rule. Stalin, like Hitler, at first tried to direct the war personally as a military strategist, with poor results. Yet Hitler managed to turn all of his advantages against himself, and lost Germany's only remaining hope: seizing the Caucasus and taking control of North Africa and the oil-rich Middle East.

Mussolini had launched an offensive in North Africa from Italian-controlled Libya into British-controlled Egypt. However, the British smashed the Italians and were on the verge of taking Libya. Hitler decided to help by sending in a few thousand troops, a Luftwaffe division, and the first-rate general Erwin Rommel. Rommel managed to use his small force to repeatedly smash massively superior British forces and to recapture the port city of Tobruk and advance into Egypt. However, Hitler, embroiled in his invasion of the Soviet Union, refused to send Rommel any more troops. If he had, Rommel might have been able to seize the Middle East, where Axis-friendly regimes had taken root in Iraq and Persia (present-day Iran). Here, Rommel could have cut the major supply route of the Soviets through Persia, and helped take the Caucasus, virtually neutralizing Britain's effectiveness in the war and potentially sealing the fate of the USSR. However, Hitler blundered again, throwing away the last vestiges of the German advantage on his coming offensive in 1942.

After the winter, Hitler launched a fresh offensive in the spring of 1942, with the aim of capturing the oil-rich Caucasus and the city of Stalingrad. However, he repeatedly switched his troops to where they were not needed. The offensive bogged down, and the entire 6th Army, considered among the best of the German troops, was trapped in Stalingrad. Hitler refused to let the 6th Army break out, insisting that the German army would force its way in to relieve it. Hermann Göring assured Hitler that the Luftwaffe could supply the 6th Army adequately, when in reality it could deliver only a minute fraction of the needed ammunition and rations. Eventually the starved 6th Army surrendered, dealing a severe blow to the Germans. The defeat at Stalingrad proved the turning point of the war in the east.

Meanwhile, the Japanese had attacked the United States at Pearl Harbor in Hawaii on December 7, 1941, bringing the Americans into the war. Hitler was under no obligation to declare war on the United States, and refraining might have kept America neutral in Europe, but he and Mussolini declared war only a few days after the attack. At the time, most German generals, preoccupied with the war in Russia, paid little attention to America's entry. It was to prove a crucial blunder.

Throughout the rest of 1942 and 1943, the Soviets began to gain ground against the Germans, the great tank battle of Kursk being one example. By this time Rommel had been forced to abandon North Africa after his defeat by Montgomery at El Alamein, and the Wehrmacht had suffered serious casualties that it could not replace. Hitler insisted on a "hold at all costs" policy which forbade giving up any ground, and followed a "fight to the last man" approach that was completely ineffective. By the beginning of 1944, Hitler had lost all initiative in Russia and was struggling even to hold back the tide turning against him.

From 1942 to 1944, the United States and Britain acted in only a limited manner in the European theater, much to the chagrin of Stalin. They drove the Germans out of Africa after invading Morocco and Algeria on November 8, 1942. Then, on July 10, 1943, the Allies invaded Sicily in preparation for an advance through Italy, the "soft underbelly" of the Axis, as Winston Churchill called it. In September the invasion of the Italian mainland began, and by the winter of 1943 the southern half of Italy was in Allied hands. The Italians, most of whom had never really supported the war, had already turned against Mussolini: in July he had been stripped of power and taken prisoner, though Italy feigned continued support of the Axis. On September 8 the Italians formally surrendered, but most of the Italy not yet in Allied hands was controlled by German troops and by forces loyal to Mussolini's new Italian Social Republic (Mussolini had been freed by German paratroopers), which in reality consisted of the shrinking zone of German control. The Germans offered staunch resistance, but by June 4, 1944, Rome had fallen.

The Battle of the Atlantic, the second such struggle after that of World War I, reached its climax between 1942 and 1943. The Germans hoped to sever the vital supply lines between Britain and America, sinking millions of tons of shipping with their U-boats. However, improved escort vessels and aircraft with longer patrol ranges proved effective at countering the U-boat threat, and by 1944 the Germans had lost the battle.

On June 6, 1944, the Western Allies finally launched the long-awaited assault on "Fortress Europe" that Stalin had so long demanded. The offensive, code-named Operation Overlord, began in the early morning hours of June 6, a day known as D-Day and marked by foul weather. Rommel, who was now in charge of defending France against a possible Allied landing, did not expect an attack in such stormy weather and was away in Germany. Moreover, the Germans were expecting an attack at the natural harbor of Calais rather than on the beaches of Normandy, a miscalculation that helped seal the operation's success: they did not know about the Allies' artificial harbors, and clues planted by the Allies had pointed to Calais as the landing site.

By this time the war was looking ever darker for Germany. On July 20, 1944, a group of conspiring German officers attempted to assassinate Hitler. Their bomb injured him, but a second charge was never planted, and by luck a heavy conference table shielded Hitler from the blast. The plotters might still have launched a coup, but only the military commander of occupied Paris acted, arresting SS and Gestapo forces in the city. The German propaganda minister, Joseph Goebbels, rallied the Nazis and saved the day for Hitler.

In France, the Allies took Normandy and finally Paris on August 25. In the east, the Russians had advanced almost to the former Polish-Soviet border. At this time Hitler introduced the V-weapons: the V-1 flying bomb and, later, the V-2, the first ballistic missile used in warfare. The V-1 could often be intercepted by fighter pilots, but the V-2 was extremely fast and carried a large warhead. These weapons came too late in the war to have any real effect, as did other new German developments, including advanced jet aircraft, too fast for ordinary propeller fighters, and submarine improvements that might have allowed the Germans to fight effectively in the Atlantic again. Although a September airborne offensive into the Netherlands failed, the Allies made steady advances. In the winter of 1944, Hitler staked everything on one last desperate gamble in the West, the Battle of the Bulge, which, despite an initial advance, failed as German fuel and manpower shortages and Allied reinforcements halted the attack.

In early February 1945, the three Allied leaders, Franklin Roosevelt, Winston Churchill, and Joseph Stalin, met at Yalta in the Soviet Crimea for the Yalta Conference. There they agreed upon a plan for post-war Europe. Most of the east fell within Stalin's sphere, and he agreed to allow free elections in Eastern Europe, a promise he never kept; the west fell to Britain, France, and the U.S. Post-war Germany would be divided among the four powers, as would Berlin. In this way the ground of the Cold War was laid, and the boundaries of a new Europe, stripped of some of its oldest ruling families, were drawn up by the three men at Yalta.

At the beginning of 1945, Hitler's position was collapsing. The Russians launched a devastating offensive from Poland, where they had taken Warsaw, into Germany and Eastern Europe, intending to capture Berlin. The Germans also collapsed in the West, allowing the Allies to fan out across Germany. However, the Supreme Allied Commander, the American general Dwight D. Eisenhower, declined to strike for Berlin, concerning himself instead with reports of possible guerrilla resistance in southern Germany, which in reality existed only in the propaganda of Joseph Goebbels. By April 25 the Russians had surrounded Berlin. Hitler remained in the city in a bunker under the Chancellery garden, and on April 30 he committed suicide, after marrying his longtime mistress Eva Braun. The Germans held out another week under Admiral Dönitz, their new leader, before surrendering unconditionally on May 7, 1945, ending the war in Europe (see V-E Day).

Rivalries that had begun during the war, combined with the sense of strength in the victorious powers, laid the foundations of the Iron Curtain and of the Cold War.



The war in the Pacific


The Holocaust


Slave laborers at the Buchenwald concentration camp.


The Holocaust (a term derived from the Greek for "sacrifice by fire") was the deliberate, systematic murder of millions of Jews and other minorities during World War II by the Nazi regime in Germany. Differing views exist as to whether it was intended from the war's beginning or whether the plans for it came about later. Regardless, persecution of the Jews extended well before the war even started, as in the Kristallnacht of 1938 (literally "Crystal Night", the Night of Broken Glass). The Nazis used propaganda to great effect to stir up anti-Semitic feeling among ordinary Germans.

After the conquest of Poland, the Third Reich, which had previously deported Jews and other "undesirables," suddenly had within its borders the largest concentration of Jews in the world. The initial solution was to round up Jews and confine them in concentration camps or in ghettos, cordoned-off sections of cities where Jews were forced to live in deplorable conditions, with tens of thousands starving to death and bodies decaying in the streets. Appalling as this was, they were in a sense the lucky ones: after the invasion of the Soviet Union, mobile killing squads of SS men known as Einsatzgruppen systematically rounded up and murdered an estimated one million Jews in the occupied Soviet territories. Even this, by Nazi standards, was too slow and inefficient.

In January 1942, the Nazi leadership met at Wannsee, a suburb of Berlin, to plan a more efficient way to slaughter the Jews. The Nazis created a system of extermination camps in occupied Poland and began rounding up Jews from across occupied Europe and from the ghettos. Jews were not only shot or gassed to death en masse; they were also forced to provide slave labor and were used in horrific medical experiments (see Human experimentation in Nazi Germany). Out of the widespread condemnation of these experiments, the Nuremberg Code of medical ethics was devised.


The Nazis took a sadistic pleasure in the death camps; the entrance to the most notorious camp, Auschwitz, bore the slogan "Arbeit Macht Frei"—"Work sets you free". In the end, some six million Jews, along with homosexuals, Jehovah's Witnesses, Roma (Gypsies), and political prisoners, were killed by various means, mainly in the death camps. Millions of Soviet and other Allied prisoners of war also died in camps and holding areas.

There is some controversy over how much ordinary Germans knew about the Holocaust. It appears that many Germans knew of the concentration camps, whose existence was reported in magazines and newspapers. In many places Jews had to walk past towns and villages on their way to forced labor in German industry, and Allied soldiers reported that the smell of the camps carried for miles. A very small number of people deny outright that the Holocaust occurred, though such claims have been thoroughly discredited by mainstream historians.

 

The Nuclear Age begins


The first nuclear explosion, named "Trinity", was detonated July 16, 1945.

During the 1930s, innovations in physics made it apparent that nuclear reactions could be harnessed to build weapons of incredible power. When World War II broke out, scientists and advisers among the Allies feared that Nazi Germany might be trying to develop its own atomic weapons, and the United States and the United Kingdom pooled their efforts in what became known as the Manhattan Project to beat them to it. At the secret Los Alamos laboratory in New Mexico, the physicist J. Robert Oppenheimer led a team of the world's top scientists in developing the first nuclear weapons, the first of which was tested at the Trinity site in July 1945. By then Germany had already surrendered, in May 1945, and it was discovered that the German atomic bomb program had never come close to success.

The Allied team produced two nuclear weapons for use in the war, one using uranium-235 and the other plutonium as fissionable material, named "Little Boy" and "Fat Man". These were dropped on the Japanese cities of Hiroshima and Nagasaki in August 1945. This, in combination with the Soviet entry into the war against Japan, convinced the Japanese to surrender unconditionally. These remain the only nuclear weapons ever used in war.

Nuclear weapons brought an entirely new and terrifying possibility to warfare: a nuclear holocaust. While at first the United States held a monopoly on the production of nuclear weapons, the Soviet Union, with some assistance from espionage, managed to detonate its first weapon (dubbed "Joe-1" by the West) in August 1949. The post-war relations between the two, which had already been deteriorating, began to rapidly disintegrate. Soon the two were locked in a massive stockpiling of nuclear weapons. The United States began a crash-program to develop the first hydrogen bomb in 1950, and detonated its first thermonuclear weapon in 1952. This new weapon was alone over 400 times as powerful as the weapons used against Japan. The Soviet Union detonated a primitive thermonuclear weapon in 1953 and a full-fledged one in 1955.

The conflict continued to escalate, with the major superpowers developing long-range missiles (such as the ICBM) and a nuclear strategy which guaranteed that any use of nuclear weapons would be suicide for the attacking nation (Mutually Assured Destruction). The creation of early warning systems placed heavy reliance on newly created computers, and nuclear arsenals served as a tense backdrop throughout the Cold War.

Since the 1940s there had been concerns about the proliferation of nuclear weapons to new countries, which was seen as destabilizing international relations, spurring regional arms races, and generally increasing the likelihood of some form of nuclear war. Eventually, seven nations overtly developed nuclear weapons and still maintain stockpiles today: the United States, the Soviet Union (whose arsenal Russia later inherited), the United Kingdom, France, China, India, and Pakistan. South Africa developed six crude weapons in the 1980s (which it later dismantled), and Israel almost certainly developed nuclear weapons, though it has never confirmed or denied doing so. The Nuclear Non-Proliferation Treaty of 1968 was an attempt to curtail such proliferation, but a number of countries have developed nuclear weapons since it was signed (and some never signed it), and several others, including Libya, Iran, and North Korea, have been suspected of clandestine nuclear weapons programs.



Nuclear missiles and computerized launch systems increased the range and scope of possible nuclear war.




The post-war world
Following World War II, much of the industrialized world lay in ruins as a result of aerial bombing, naval bombardment, and protracted land campaigns. The United States was a notable exception: apart from Pearl Harbor and some minor incidents, it had suffered no attacks upon its own territory. The United States and the Soviet Union, the latter of which rebuilt quickly despite the devastation of its most populated areas, found themselves the world's two dominant superpowers.

Much of Western Europe was rebuilt after the war with assistance from the Marshall Plan. Germany, the chief instigator of the war, was placed under joint military occupation by the United States, Great Britain, France, and the Soviet Union; Berlin, although deep in Soviet-controlled territory, was likewise divided among the four powers, and its special occupation status lasted until 1990. Japan was placed under U.S. occupation, which continued until 1952. Remarkably, these two Axis powers, despite military occupation, soon rose to become the second (Japan) and third (West Germany) largest economies in the world.

Following the end of the war, the Allies prosecuted numerous German officials for war crimes and other offenses in the Nuremberg Trials. Although Adolf Hitler had committed suicide, many of his closest associates, including Hermann Göring, were convicted. Less well-known trials of other Axis officials also took place, including the Tokyo War Crimes Trial.

The failure of the League of Nations to prevent World War II essentially discredited the organization, and it was dissolved. A new attempt at securing world peace began with the founding of the United Nations on October 24, 1945, in San Francisco. Today nearly all countries are members, but despite many achievements the organization's record in preserving world peace is mixed, for it was never given enough power to overcome the conflicting interests and priorities of its member nations.

The end of empires: decolonization

Almost all of the major nations that were involved in World War II began shedding their overseas colonies soon after the conflict. In Africa, nationalists such as Jomo Kenyatta of Kenya and Kwame Nkrumah of Ghana led their respective nations to independence from foreign rule. The tactics employed by the revolutionaries ranged from non-violent forms of protest to armed rebellions, depending on the nation involved. The United States granted independence to the Philippines, its major Pacific possession. European powers also began withdrawing from their possessions in Africa and Asia. France was forced out of both Indochina and, later, Algeria.

 

Mutually assured destruction: the Cold War (1947–1991)
 

War by proxy
Two wars and one near-war became focal points of the struggle between capitalism and communism. The first was the Korean War, fought between communist North Korea, backed by the Soviet Union and the People's Republic of China, and South Korea, backed mainly by the United States. North Korea's invasion of the South led to United Nations intervention: General Douglas MacArthur led troops from the United States, Canada, Australia, Great Britain, and other countries in repulsing the invasion. The war reached a stalemate after Chinese intervention pushed the U.N. forces back, and a cease-fire ended hostilities, leaving the two Koreas divided and tense for the rest of the century.

The second war, the Vietnam War, was perhaps the second most visible war of the 20th century, after World War II. After the French withdrawal from its former colony, Vietnam became partitioned into two halves, much like Korea. Fighting between North and South eventually escalated into a regional war. The United States provided aid to South Vietnam, but was not directly involved until the Gulf of Tonkin Resolution, passed in reaction to a supposed North Vietnamese attack upon American destroyers, brought the U.S. into the war as a belligerent. The war was initially viewed as a fight to contain communism (see containment, Truman Doctrine, and Domino Theory), but, as more Americans were drafted and news of events such as the Tet Offensive and My Lai massacre leaked out, American sentiment turned against the war. U.S. President Richard Nixon was elected partially on claims of a "secret plan" to stop the war. This Nixon Doctrine involved a gradual pullout of American forces; South Vietnamese units were supposed to replace them, backed up by American air power. Unfortunately, the plan went awry, and the war spilled into neighboring Cambodia while South Vietnamese forces were pushed further back. Eventually, the U.S. and North Vietnam signed the Paris Peace Accords, ending U.S. involvement in the war. With the threat of U.S. retaliation gone, the North proceeded to violate the ceasefire and invaded the South with full military force. Saigon was captured on April 30, 1975, and Vietnam was unified under Communist rule a year later, effectively bringing an end to one of the most unpopular wars of all time.

The Cuban Missile Crisis illustrates just how close to the brink of nuclear war the world came during the Cold War. Cuba, under Fidel Castro's socialist government, had formed close ties with the Soviet Union. This was obviously disquieting to the United States, given Cuba's proximity. When Lockheed U-2 spy plane flights over the island revealed that Soviet missile launchers were being installed, U.S. President John F. Kennedy instituted a naval blockade and publicly confronted the Soviet Union. After a tense week, the Soviet Union backed down and ordered the launchers removed, not wanting to risk igniting a new world war.
 

The space race


In 1969, humans first set foot on the Moon.

With Cold War tensions running high, the Soviet Union and the United States took their rivalry into space in 1957 with the Soviet launch of Sputnik, and a "space race" between the two powers followed. The USSR reached several important milestones, such as the first spacecraft to reach the Moon (Luna 2) and the first human in space (Yuri Gagarin), but the U.S. eventually pulled ahead with its Mercury, Gemini, and Apollo programs, which culminated in Apollo 11's manned landing on the Moon in 1969. Five more manned landings followed (Apollo 13 was forced to abort its mission). The Soviet program nonetheless recorded major achievements of its own, such as the unmanned Lunokhod rovers on the Moon and the first transmission of images from the surface of another planet by the Venera landers on Venus.

In addition, both countries launched numerous probes into space, such as the Venera 7 and Voyager 2.

In later decades, space became a somewhat friendlier place. Regular manned space flights were made possible with the American space shuttle, which was the first reusable spacecraft to be successfully used. Mir and Skylab enabled prolonged human habitation in space. In the 1990s, work on the International Space Station began, and by the end of the century, while still incomplete, it was in continual use by astronauts from the United States, Europe, Russia, Japan, and Canada.

 

The end of the Cold War


In 1989, the Berlin Wall separating West and East Berlin fell.

By the 1980s, the Soviet Union was weakening. The Sino-Soviet split had removed its most powerful ally, the People's Republic of China; the arms race with the U.S. was draining the country of funds; and it was further weakened by internal ethnic and political pressures. Mikhail Gorbachev, its last leader, attempted to reform the country through glasnost and perestroika, but the rise of Solidarity in Poland, the fall of the Berlin Wall, and the breaking away of several Soviet republics, such as Lithuania, set off a chain of events that culminated in an attempted coup against Gorbachev organized by Communist Party hard-liners. Boris Yeltsin, president of Russia, organized mass opposition, and the coup failed. On December 26, 1991, the Soviet Union was officially dissolved into its constituent republics, drawing a final line under the already exhausted Cold War.
 

Information and communications technology

The creation of the transistor revolutionized the development of the computer. The first computers, room-sized machines built to break enemy codes during World War II, shrank dramatically once transistors replaced vacuum tubes and relays. Computers became reprogrammable rather than fixed-purpose devices. The invention of programming languages meant that programmers could concentrate on solving problems at a high level, without having to think in terms of the individual instructions executed by the machine itself, and the creation of operating systems further improved programming productivity. Building on this, computer pioneers could begin to realize what they had envisioned. The graphical user interface, controlled with a computer mouse, made it simple to harness the power of the computer. Storage for programs and data progressed from punch cards and paper tape to magnetic tape, floppy disks, and hard disks, while core memory and bubble memory gave way to semiconductor random-access memory.
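As a rough illustration of this shift (a sketch added here, not part of the original article; Python is used only as a representative modern high-level language), the fragment below expresses a whole computation in two lines, where a machine-level program would have to spell out each load, add, divide, and store individually:

# Illustrative sketch only: a complete computation written in a high-level
# language. The programmer states the problem; the interpreter supplies the
# individual machine instructions.
def average(measurements):
    return sum(measurements) / len(measurements)

print(average([12.0, 15.5, 9.25]))  # prints 12.25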

The invention of the word processor, spreadsheet and database greatly improved office productivity over the old paper, typewriter and filing cabinet methods. The economic advantage given to businesses led to economic efficiencies in computers themselves. Cost-effective CPUs led to thousands of industrial and home-brew computer designs, many of which became successful; a home-computer boom was led by the Apple II, the ZX80 and the Commodore PET.


IBM, seeking to embrace the microcomputer revolution, devised its IBM Personal Computer (PC). Crucially, IBM developed the PC from third-party components that were available on the open market. The only impediment to another company duplicating the system's architecture was the proprietary BIOS software. Other companies, starting with Compaq, reverse engineered the BIOS and released PC compatible computers that soon became the dominant architecture. Microsoft, which produced an operating system for the PC, rode this wave of popularity to become the world's leading software company.

The 1980s heralded the Information Age. The rise of computer applications and data processing made intangible "information" as valuable as physical commodities, bringing new concerns about intellectual property. The U.S. government allowed algorithms to be patented, forming the basis of software patents. The controversy over these and over proprietary software led Richard Stallman to begin the GNU Project and found the Free Software Foundation.

Computers also became a usable platform for entertainment. Computer games were first developed by software programmers exercising their creativity on large systems at universities, but these efforts became commercially successful in arcade games such as Pong and Space Invaders. Once the home computer market was established, young programmers in their bedrooms became the core of a youthful games industry. In order to take advantage of advancing technology, games consoles were created. Like arcade systems, these machines had custom hardware designed to do game-oriented operations (such as sprites and parallax scrolling) in preference to general purpose computing tasks.

Computer networks appeared in two main forms: the local area network, linking computers in an office or school to each other, and the wide area network, linking local area networks together. Initially, computers depended on the telephone network to link to each other, spawning the bulletin-board subculture. A DARPA project to create resilient computer networks, however, led to the creation of the Internet, a network of networks, built on the robust TCP/IP family of protocols. Aided by legislation championed by Al Gore, the Internet grew beyond its military role when universities and commercial businesses were permitted to connect their networks to it. The main attractions were electronic mail, a far faster and more convenient form of communication than conventional letters and memos, and the File Transfer Protocol (FTP). Even so, the Internet remained largely unknown to the general public, who were used to bulletin boards and services like CompuServe and America Online. This changed when Tim Berners-Lee devised a simple form of hypertext, inspired by ideas going back to Vannevar Bush, which he dubbed the World Wide Web. "The Web" turned the Internet into a printing press that reached beyond the geographic boundaries of physical countries; the resulting space was termed "cyberspace". Anyone with a computer and an Internet connection could write pages in the simple HTML format and publish their thoughts to the world.
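To give a concrete sense of how low the barrier to publishing became (an illustrative sketch, not part of the original article; it assumes a modern Python 3 installation and only its standard library), the following few lines serve a tiny HTML page that any nearby web browser can read:

# Illustrative sketch only: a minimal web page served over HTTP using
# nothing but Python 3's standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"<html><body><h1>Hello, World Wide Web</h1></body></html>"

class PageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every request receives the same small HTML document.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    # Serve on localhost port 8000; point a browser at http://localhost:8000/
    HTTPServer(("localhost", 8000), PageHandler).serve_forever()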

The Web's immense success also fueled the commercial use of the Internet. Convenient home shopping had been an element of "visions of the future" since the development of the telephone, but now the race was on to provide convenient, interactive consumerism. Companies trading through web sites became known as "dot coms", due to the ".com" suffix of commercial Internet addresses.

 

The world at the end of the century


Map caption: the geographic distribution of surface warming during the 21st century as calculated by the HadCM3 climate model, assuming a business-as-usual scenario for economic growth and greenhouse gas emissions; in this projection the globally averaged warming corresponds to 3.0 °C (5.4 °F).


By the end of the century, more technological advances had been made than in all of preceding history. Communications and information technology, transportation technology, and medical advances had radically altered daily lives. Europe appeared to be at a sustainable peace for the first time in recorded history. The people of the Indian subcontinent, a sixth of the world population at the end of the century, had attained an indigenous independence for the first time in centuries. China, an ancient nation comprising a fifth of the world population, was finally open to the world in a new and powerful synthesis of west and east, creating a new state after the near-complete destruction of the old cultural order. With the end of colonialism and the Cold War, nearly a billion people in Africa were left with truly independent new nation states, some cut from whole cloth, standing up after centuries of foreign domination.

The world was undergoing its second major period of globalization; the first, which had begun in the 19th century, had been terminated by World War I. Since the United States was in a position of almost unchallenged dominance, a major part of the process was Americanization. This fostered anti-Western and anti-American feelings in parts of the world, especially the Middle East. The influence of China and India was also rising, as the world's two most populous nations, long marginalized by the West and by their own rulers, rapidly integrated with the world economy.

However, several problems faced the world. The gap between rich and poor nations continued to widen. Some argued that the problem could not be fixed, that there was a fixed amount of wealth that could only be shared among so many. Others argued that the powerful nations with large economies were not doing enough to help improve the rapidly evolving economies of the Third World. Developing countries faced many challenges of their own, including the sheer scale of the task, rapidly growing populations, and the need to protect the environment, along with the costs that this entails.

Terrorism, dictatorship, and the spread of nuclear weapons were other issues requiring attention. The world was still blighted by small-scale wars and other violent conflicts, fueled by competition over resources and by ethnic conflicts. Despots such as Kim Jong-il of North Korea continued to lead their nations toward the development of nuclear weapons.

Disease threatened to destabilize many regions of the world. New viruses such as West Nile, and soon after the century's end SARS, continued to spread. In poor nations, malaria and other diseases affected the majority of the population. Millions were infected with HIV, the virus that causes AIDS, which was reaching epidemic proportions in southern Africa.


Perhaps most importantly, it was speculated that in the long term environmental problems threatened the planet's livability. The most serious problem was global warming, driven by human-caused emissions of greenhouse gases, particularly the carbon dioxide produced by burning fossil fuels, and predicted to raise sea levels and flood coastal areas. This prompted many nations to negotiate and sign the Kyoto Protocol, which set binding limits on greenhouse gas emissions for the industrialized countries that ratified it.

World population
A significant driver of many of the problems at the end of the 20th century was overpopulation. At the century's end, the global population was 6.1 billion and rising. There was some hope on this score, because the number of children per woman had been decreasing throughout the world, not only in the rich countries. In the long term, it was predicted that the population would probably reach a plateau of nine billion around 2050. However, it remained doubtful whether the planet had the long-term capacity to sustain such numbers.
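The shape of such a projection can be illustrated with a simple logistic-growth sketch (added here for illustration only; the growth rate and ceiling below are hypothetical values chosen to mimic the plateau described above, not demographic data):

# Illustrative sketch only: a discrete logistic-growth projection in which
# each year's growth shrinks as the population approaches an assumed ceiling.
def project_population(p0=6.1e9, ceiling=9.0e9, rate=0.05, start=2000, years=50):
    population = p0
    series = [(start, population)]
    for year in range(start + 1, start + years + 1):
        population += rate * population * (1.0 - population / ceiling)
        series.append((year, population))
    return series

for year, pop in project_population()[::10]:
    print(f"{year}: {pop / 1e9:.2f} billion")  # levels off toward the assumed 9-billion ceiling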

 

 

 

Developments in brief

Wars and politics

-After decades of struggle by the women's suffrage movement, all Western countries gave women the right to vote.

-Rising nationalism and increasing national awareness were among the many causes of World War I (1914–1918), the first of two wars to involve many major world powers including Germany, France, Italy, Japan, Russia/USSR, the United States and the British Empire. World War I led to the creation of many new countries, especially in Eastern Europe. At the time it was said by many to be the "war to end all wars".

-Civil wars occurred in many nations. A violent civil war broke out in Spain in 1936 when General Francisco Franco rebelled against the Second Spanish Republic. Many consider this war a testing ground for World War II, as fascist forces bombed Spanish cities such as Guernica.

-The economic and political aftermath of World War I and the Great Depression in the 1930s led to the rise of fascism and nazism in Europe, and subsequently to World War II (1939–1945). This war also involved Asia and the Pacific, in the form of Japanese aggression against China and the United States. Civilians also suffered greatly in World War II, due to the aerial bombing of cities on both sides, and the German genocide of the Jews and others, known as the Holocaust. In 1945, Hiroshima and Nagasaki were bombed with nuclear weapons.

-During World War I, the Bolsheviks seized control of the Russian Revolution of 1917, precipitating the founding of the Soviet Union and the rise of communism. After the Soviet Union's involvement in World War II, communism became a major force in global politics, notably in Eastern Europe, China, Indochina, and Cuba, where communist parties gained near-absolute power. This led to the Cold War and proxy wars with the West, including wars in Korea (1950–1953) and Vietnam (1957–1975).

-The Soviet authorities caused the deaths of millions of their own citizens in order to eliminate domestic opposition. More than 18 million people passed through the Gulag, with a further 6 million being exiled to remote areas of the Soviet Union.

-The civil rights movement in the United States and the movement against apartheid in South Africa challenged racial segregation in those countries.

-The two world wars led to efforts to increase international cooperation, notably through the founding of the League of Nations after World War I, and its successor, the United Nations, after World War II.

-The creation of Israel, a Jewish state carved out of the former British Mandate of Palestine, fueled many regional conflicts. These were also influenced by the vast oil fields in many of the other countries of the mostly Arab region.

-The end of colonialism led to the independence of many African and Asian countries. During the Cold War, many of these aligned with the United States, the USSR, or China for defense.

-After a long period of civil wars and conflicts with European powers, China's last imperial dynasty ended in 1912. The resulting republic was replaced, after yet another civil war, by a communist People's Republic in 1949. At the end of the century, though still ruled by a communist party, China's economic system had been transformed largely into a market economy.

-The Great Chinese Famine was a direct cause of the death of tens of millions of Chinese peasants between 1959 and 1962. It is thought to be the largest famine in human history.

-The Tiananmen Square protests of 1989, culminating in the deaths of hundreds of civilian protestors, were a series of demonstrations in and near Tiananmen Square in Beijing, China. Led mainly by students and intellectuals, the protests occurred in a year that saw the collapse of a number of communist governments around the world.

-The revolutions of 1989 released Eastern and Central Europe from Soviet supremacy. Soon thereafter, the Soviet Union, Czechoslovakia, and Yugoslavia dissolved, the latter violently over several years, into successor states, many rife with ethnic nationalism. East Germany and West Germany were reunified in 1990.

-European integration began in earnest in the 1950s, and eventually led to the European Union, a political and economic union that comprised 15 countries at the end of the century.



Culture and entertainment

-As the century began, Paris was the artistic capital of the world, where both French and foreign writers, composers and visual artists gathered.

-Movies, music, and the media had a major influence on fashion and trends in all aspects of life. Because many movies and much music originated in the United States, American culture spread rapidly across the world.

-Computer games and Internet surfing became new and popular forms of entertainment during the last 25 years of the century.

-In literature, science fiction, fantasy (with richly detailed fictional worlds), and alternative history gained unprecedented popularity, as did detective fiction between the two world wars.

-Blues and jazz music became popularized during the 1910s and 1920s in the United States. Blues went on to influence rock and roll in the 1950s, which only increased in popularity with the British Invasion of the mid-to-late '60s. Rock soon branched into many different genres, including heavy metal, punk rock, and alternative rock and became the dominant genre of popular music. This was challenged with the rise of hip hop in the 1980s and 1990s. Other genres such as house, techno, reggae, and soul all developed during the latter half of the 20th century and went through various periods of popularity.

-In classical music, composition branched out into many completely new domains, including dodecaphony, aleatoric (chance) music, and minimalism.
Synthesizers began to be employed widely in music and crossed over into the mainstream with new wave music in the 1980s. Electronic instruments have since been deployed in all manner of popular music and have led to the development of such genres as house, synthpop, electronic dance music, and industrial.

-The art world experienced the development of new styles and explorations such as expressionism, Dadaism, cubism, De Stijl, abstract expressionism, and surrealism.

-The modern art movement revolutionized art and culture and set the stage for both Modernism and its counterpart postmodern art as well as other contemporary art practices.

-In Europe, modern architecture departed radically from the excess decoration of the Victorian era. Streamlined forms inspired by machines became more commonplace, enabled by developments in building materials and technologies. Before World War II, many European architects moved to the United States, where modern architecture continued to develop.

-After gaining political rights in the United States and much of Europe in the first part of the century, and with the advent of new birth control techniques, women became more independent throughout the century.

-The automobile vastly increased the mobility of people in the Western countries in the early to mid-century, and in many other places by the end of the century. City design throughout most of the West became focused on transport via car.

-The popularity of sport increased considerably—both as an activity for all, not just the elite, and as entertainment, particularly on television.



Science

-Starting with the invention of the Turing machine, new fields of mathematics studying computability and computational complexity were developed.

-Gödel's incompleteness theorems were formulated and proven.

-New areas of physics, like special relativity, general relativity, and quantum mechanics, were developed during the first half of the century.

-While some pioneering experiments on the internal structure of atoms had been made at the end of the 19th century, it was only in the 20th century that the structure of the atom was clearly understood, followed by the discovery of elementary particles.

-It was found that all known forces can be traced to only four fundamental interactions. It was further discovered that two of them, electromagnetism and the weak interaction, can be merged into the electroweak interaction, leaving only three distinct fundamental interactions.

-The discovery of nuclear reactions, in particular nuclear fusion, finally solved the problem of the source of solar energy. The age of the solar system, including Earth, was determined and turned out to be far greater than previously believed (more than 4 billion years, rather than the roughly 20 million years suggested by Lord Kelvin in 1862[3]).

-Radiocarbon dating became a powerful technique for determining the age of prehistoric animals and plants as well as historical objects; no such technique existed in the 19th century (a brief sketch of the underlying decay arithmetic follows this list).

-In astronomy, a much better understanding of the evolution of the Universe was achieved, its age was determined, and the Big Bang theory was proposed. The planets of the solar system and their moons were closely observed by space probes, and no sentient (or complex animal or plant) life was found on their surfaces.

-In biology, genetics was universally accepted and significantly developed. The structure of DNA was determined in 1953 by Watson and Crick, followed by the development of techniques for reading DNA sequences, culminating in the launch of the Human Genome Project (not finished in the 20th century) and the cloning of the first mammal in 1996.

-The role of sexual reproduction in evolution was understood, and bacterial conjugation was discovered.
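
Radiocarbon dating, mentioned above, rests on simple exponential-decay arithmetic: carbon-14 in dead tissue halves roughly every 5,730 years. The short Python sketch below illustrates only that calculation (the 25 percent figure is an invented example), not actual laboratory practice.

    import math

    C14_HALF_LIFE_YEARS = 5730  # approximate half-life of carbon-14

    def radiocarbon_age(remaining_fraction):
        """Estimate age in years from the fraction of carbon-14 still present."""
        # N(t) = N0 * (1/2) ** (t / half_life)  =>  t = half_life * log2(N0 / N)
        return C14_HALF_LIFE_YEARS * math.log2(1.0 / remaining_fraction)

    # Hypothetical sample retaining 25% of its original carbon-14:
    print(round(radiocarbon_age(0.25)))  # two half-lives, about 11,460 years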



Technology

-The number and types of home appliances increased dramatically due to advancements in technology, electricity availability, and increases in wealth and leisure time. Such basic appliances as washing machines, clothes dryers, exercise machines, refrigerators, freezers, electric stoves, and vacuum cleaners all became popular from the 1920s through the 1950s. The microwave oven became popular during the 1980s. Radios were popularized as a form of entertainment during the 1920s, which extended to television during the 1950s. Cable television spread rapidly during the 1980s. Personal computers began to enter the home during the 1970s and 1980s as well. Portable music players came of age during the 1960s with the development of 8-track and cassette tapes, which slowly began to replace record players; these were in turn replaced by the CD during the late 1980s and 1990s. The proliferation of the Internet in the mid-to-late 1990s made digital distribution of music (MP3s) possible. VCRs were popularized in the 1970s, but by the end of the millennium, DVDs were beginning to replace them.

-The first airplane was flown in 1903. With the engineering of the faster jet engine in the 1940s, mass air travel became commercially viable.

-The assembly line made mass production of the automobile viable. By the end of the century, hundreds of millions of people had automobiles for personal transportation. The combination of the automobile, motor boats, and air travel allowed for unprecedented personal mobility. In Western nations, motor vehicle accidents became the greatest cause of death for young people, although the expansion of divided highways reduced the death rate.

-The triode tube (Audion), transistor and integrated circuit revolutionized computers, leading to the proliferation of the personal computer in the 1980s and cell phones and the public-use Internet in the 1990s.

-New materials, most notably plastics such as polyethylene, along with Velcro and Teflon, came into widespread use for a wide variety of applications.

-Thousands of chemicals were developed for industrial processing and home use.

-The Space Race between the United States and the Soviet Union gave a peaceful outlet to the political and military tensions of the Cold War, leading to the first human spaceflight with the Soviet Union's Vostok 1 mission in 1961, and man's first landing on another world—the Moon—with America's Apollo 11 mission in 1969. Later, the first space station was launched by the Soviet space program. The United States developed the first (and to date only) reusable spacecraft system with the Space Shuttle program, first launched in 1981. As the century ended, a permanent manned presence in space was being founded with the ongoing construction of the International Space Station.

-In addition to human spaceflight, unmanned space probes became a practical and relatively inexpensive form of exploration. The first artificial satellite, Sputnik 1, was launched by the Soviet Union in 1957. Over time, a massive system of artificial satellites was placed into orbit around Earth. These satellites greatly advanced navigation, communications, military intelligence, geology, climate science, and numerous other fields. Also, by the end of the century, unmanned probes had visited the Moon, Mercury, Venus, Mars, Jupiter, Saturn, Uranus, Neptune, and various asteroids and comets. The Hubble Space Telescope, launched in 1990, greatly expanded our understanding of the Universe and brought brilliant images to TV and computer screens around the world.



Medicine

-Placebo-controlled, randomized, blinded clinical trials became a powerful tool for testing new medicines.

-Antibiotics drastically reduced mortality from bacterial diseases and their prevalence.

-A vaccine was developed for polio, ending a worldwide epidemic. Effective vaccines were also developed for a number of other serious infectious diseases, including influenza, diphtheria, pertussis (whooping cough), tetanus, measles, mumps, rubella (German measles), chickenpox, hepatitis A, and hepatitis B.

-A successful application of epidemiology and vaccination led to the eradication of the smallpox virus in humans.

-X-rays became a powerful diagnostic tool for a wide spectrum of conditions, from bone fractures to cancer. In the 1960s, computerized tomography was invented. Other important diagnostic tools developed later were sonography and magnetic resonance imaging.

-The discovery of vitamins virtually eliminated scurvy and other vitamin-deficiency diseases from industrialized societies.

-New psychiatric drugs were developed. These include antipsychotics for treating hallucinations and delusions, and antidepressants for treating depression.

-The role of tobacco smoking in the causation of cancer and other diseases was proven during the 1950s (see British Doctors Study).

-New methods for cancer treatment, including chemotherapy, radiation therapy, and immunotherapy, were developed. As a result, cancer could often be cured or placed in remission.

-The development of blood typing and blood banking made blood transfusion safe and widely available.

-The invention and development of immunosuppressive drugs and tissue typing made organ and tissue transplantation a clinical reality.

-Research on sleep and circadian rhythms led to the discovery of sleep disorders.

-New methods for heart surgery were developed, including pacemakers and artificial hearts.

-Cocaine/crack and heroin were recognized as dangerous, addictive drugs, and their widespread use was outlawed; mind-altering drugs such as LSD and MDMA were discovered and later outlawed. In many countries, a war on drugs caused prices to soar 10 to 20 times higher, leading to a profitable black market in drug dealing and, by the 1990s, to some 80 percent of prison sentences being related to drug use.

-Contraceptive drugs were developed, which reduced population growth rates in industrialized countries.

-The development of medical insulin during the 1920s helped raise the life expectancy of diabetics to three times what it had been earlier.
-The elucidation of the structure and function of DNA initiated the development of genetic engineering and the mapping of the human genome.

-Masturbation was found to be a harmless activity; the belief that it seriously harms physical and mental health, shared by many 19th-century physicians, was shown to be wrong.

-As a result of some of the above developments, most notably antibiotics and vaccines, mortality among children and young people decreased drastically.


Notable diseases

-An influenza pandemic, the Spanish Flu, killed anywhere from 20 to 100 million people between 1918 and 1919.

-A new viral disease, AIDS, arose in Africa and subsequently killed millions of people throughout the world. AIDS treatments remained inaccessible to many people living with AIDS in developing countries, and a cure has yet to be discovered.

-Because of increased life spans, the prevalence of cancer, Alzheimer's disease, Parkinson's disease, and other diseases of old age increased slightly.

-Sedentary lifestyles, due to labor-saving devices and technology, contributed to an "epidemic" of obesity, at first in the rich countries, but by the end of the century, increasingly in the developing world, too.


Energy and the environment

-Fossil fuels and nuclear power, regarded as the conventional energy sources, dominated energy production.

-Widespread use of petroleum in industry—both as a chemical precursor to plastics and as a fuel for the automobile and airplane—led to the vital geopolitical importance of petroleum resources. The Middle East, home to many of the world's oil deposits, became a center of geopolitical and military tension throughout the latter half of the century. (For example, oil was a factor in Japan's decision to go to war against the United States in 1941, and the oil cartel, OPEC, used an oil embargo of sorts in the wake of the Yom Kippur War in the 1970s).

-A vast increase in fossil fuel consumption caused smog and other forms of air pollution, as well as global warming and local and global climate change.

-Pesticides, herbicides and other toxic chemicals accumulated in the environment, including the bodies of humans and other animals.

-Overpopulation and worldwide deforestation diminished the quality of the environment.

-Rapidly falling fertility rates among Americans and Europeans began to cause what has been called a "demographic winter," raising concerns about the long-term continuity of these societies.

 


The first model of the IBM PC, the personal computer whose successors would fill the world.
 

 


Internet

Overview
A publicly accessible computer network connecting many smaller networks from around the world.

It grew out of a U.S. Defense Department program called ARPANET (Advanced Research Projects Agency Network), established in 1969 with connections between computers at the University of California at Los Angeles, Stanford Research Institute, the University of California-Santa Barbara, and the University of Utah. ARPANET’s purpose was to conduct research into computer networking in order to provide a secure and survivable communications system in case of war. As the network quickly expanded, academics and researchers in other fields began to use it as well. In 1971 the first program for sending e-mail over a distributed network was developed; by 1973, the year international connections to ARPANET were made (from Britain and Norway), e-mail represented most of the traffic on ARPANET. The 1970s also saw the development of mailing lists, newsgroups and bulletin-board systems, and the TCP/IP communications protocols, which were adopted as standard protocols for ARPANET in 1982–83, leading to the widespread use of the term Internet. In 1984 the domain name addressing system was introduced. In 1986 the National Science Foundation established the NSFNET, a distributed network of networks capable of handling far greater traffic, and within a year more than 10,000 hosts were connected to the Internet. In 1988 real-time conversation over the network became possible with the development of Internet Relay Chat protocols (see chat). In 1990 ARPANET ceased to exist, leaving behind the NSFNET, and the first commercial dial-up access to the Internet became available. In 1991 the World Wide Web was released to the public (via FTP). The Mosaic browser was released in 1993, and its popularity led to the proliferation of World Wide Web sites and users. In 1995 the NSFNET reverted to the role of a research network, leaving Internet traffic to be routed through network providers rather than NSF supercomputers. That year the Web became the most popular part of the Internet, surpassing the FTP protocols in traffic volume. By 1997 there were more than 10 million hosts on the Internet and more than 1 million registered domain names. Internet access can now be gained via radio signals, cable-television lines, satellites, and fibre-optic connections, though most traffic still uses a part of the public telecommunications (telephone) network. The Internet is widely regarded as a development of vast significance that will affect nearly every aspect of human culture and commerce in ways still only dimly discernible.

A system architecture that has revolutionized communications and methods of commerce by allowing various computer networks around the world to interconnect. Sometimes referred to as a “network of networks,” the Internet emerged in the United States in the 1970s but did not become visible to the general public until the early 1990s. By the beginning of the 21st century, approximately 360 million people, or roughly 6 percent of the world’s population, were estimated to have access to the Internet. It is widely assumed that at least half of the world’s population will have some form of Internet access by 2010 and that wireless access will play a growing role.

The Internet provides a capability so powerful and general that it can be used for almost any purpose that depends on information, and it is accessible by every individual who connects to one of its constituent networks. It supports human communication via electronic mail (e-mail), “chat rooms,” newsgroups, and audio and video transmission and allows people to work collaboratively at many different locations. It supports access to digital information by many applications, including the World Wide Web. The Internet has proved to be a spawning ground for a large and growing number of “e-businesses” (including subsidiaries of traditional “brick-and-mortar” companies) that carry out most of their sales and services over the Internet. (See electronic commerce.) Many experts believe that the Internet will dramatically transform business as well as society.



Computer users at an Internet café in Saudi Arabia.
 

Origin and development

Early networks
The first computer networks were dedicated special-purpose systems such as SABRE (an airline reservation system) and AUTODIN I (a defense command-and-control system), both designed and implemented in the late 1950s and early 1960s. By the early 1960s computer manufacturers had begun to use semiconductor technology in commercial products, and both conventional batch-processing and time-sharing systems were in place in many large, technologically advanced companies. Time-sharing systems allowed a computer’s resources to be shared in rapid succession with multiple users, cycling through the queue of users so quickly that the computer appeared dedicated to each user’s tasks despite the existence of many others accessing the system “simultaneously.” This led to the notion of sharing computer resources (called host computers or simply hosts) over an entire network. Host-to-host interactions were envisioned, along with access to specialized resources (such as supercomputers and mass storage systems) and interactive access by remote users to the computational powers of time-sharing systems located elsewhere. These ideas were first realized in ARPANET, which established the first host-to-host network connection on Oct. 29, 1969. It was created by the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense. ARPANET was one of the first general-purpose computer networks. It connected time-sharing computers at government-supported research sites, principally universities in the United States, and it soon became a critical piece of infrastructure for the computer science research community in the United States. Tools and applications—such as the simple mail transfer protocol (SMTP, commonly referred to as e-mail), for sending short messages, and the file transfer protocol (FTP), for longer transmissions—quickly emerged. In order to achieve cost-effective interactive communications between computers, which typically communicate in short bursts of data, ARPANET employed the new technology of packet switching. Packet switching takes large messages (or chunks of computer data) and breaks them into smaller, manageable pieces (known as packets) that can travel independently over any available circuit to the target destination, where the pieces are reassembled. Thus, unlike traditional voice communications, packet switching does not require a single dedicated circuit between each pair of users.
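
To make the packet-switching idea concrete, the following sketch (purely illustrative, not ARPANET's actual software) breaks a message into numbered packets, lets them arrive out of order as if they had traveled over different circuits, and reassembles them by sequence number; Python is used here only for readability.

    import random

    def to_packets(message, size=8):
        """Split a message into (sequence_number, chunk) packets."""
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets):
        """Rebuild the original message regardless of arrival order."""
        return "".join(chunk for _, chunk in sorted(packets))

    text = "Packets may take independent routes to their destination."
    packets = to_packets(text)
    random.shuffle(packets)          # simulate out-of-order arrival
    assert reassemble(packets) == text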

Commercial packet networks were introduced in the 1970s, but these were designed principally to provide efficient access to remote computers by dedicated terminals. Briefly, they replaced long-distance modem connections by less-expensive “virtual” circuits over packet networks. In the United States, Telenet and Tymnet were two such packet networks. Neither supported host-to-host communications; in the 1970s this was still the province of the research networks, and it would remain so for many years.

DARPA (Defense Advanced Research Projects Agency; formerly ARPA) supported initiatives for ground-based and satellite-based packet networks. The ground-based packet radio system provided mobile access to computing resources, while the packet satellite network connected the United States with several European countries and enabled connections with widely dispersed and remote regions. With the introduction of packet radio, connecting a mobile terminal to a computer network became feasible. However, time-sharing systems were then still too large, unwieldy, and costly to be mobile or even to exist outside a climate-controlled computing environment. A strong motivation thus existed to connect the packet radio network to ARPANET in order to allow mobile users with simple terminals to access the time-sharing systems for which they had authorization. Similarly, the packet satellite network was used by DARPA to link the United States with satellite terminals serving the United Kingdom, Norway, Germany, and Italy. These terminals, however, had to be connected to other networks in European countries in order to reach the end users. Thus arose the need to connect the packet satellite net, as well as the packet radio net, with other networks.


Foundation of the Internet
The Internet resulted from the effort to connect various research networks in the United States and Europe. First, DARPA established a program to investigate the interconnection of “heterogeneous networks.” This program, called Internetting, was based on the newly introduced concept of open architecture networking, in which networks with defined standard interfaces would be interconnected by “gateways.” A working demonstration of the concept was planned. In order for the concept to work, a new protocol had to be designed and developed; indeed, a system architecture was also required.

In 1974 Vinton Cerf, then at Stanford University in California, and this author, then at DARPA, collaborated on a paper that first described such a protocol and system architecture—namely, the transmission control protocol (TCP), which enabled different types of machines on networks all over the world to route and assemble data packets. TCP, which originally included the Internet protocol (IP), a global addressing mechanism that allowed routers to get data packets to their ultimate destination, formed the TCP/IP standard, which was adopted by the U.S. Department of Defense in 1980. By the early 1980s the “open architecture” of the TCP/IP approach was adopted and endorsed by many other researchers and eventually by technologists and businessmen around the world.
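
The division of labour described here survives in everyday programming: IP moves addressed packets between hosts, while TCP presents applications with a reliable, ordered byte stream. A minimal sketch using Python's standard socket library (the host name is only an example) might look like this:

    import socket

    # Open a TCP connection; IP routing and packet reassembly happen beneath this call.
    with socket.create_connection(("example.com", 80), timeout=5) as conn:
        conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        reply = conn.recv(1024)
    print(reply.decode(errors="replace").splitlines()[0])  # e.g. "HTTP/1.1 200 OK"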

By the 1980s other U.S. governmental bodies were heavily involved with networking, including the National Science Foundation (NSF), the Department of Energy, and the National Aeronautics and Space Administration (NASA). While DARPA had played a seminal role in creating a small-scale version of the Internet among its researchers, NSF worked with DARPA to expand access to the entire scientific and academic community and to make TCP/IP the standard in all federally supported research networks. In 1985–86 NSF funded the first five supercomputing centres—at Princeton University, the University of Pittsburgh, the University of California, San Diego, the University of Illinois, and Cornell University. In the 1980s NSF also funded the development and operation of the NSFNET, a national “backbone” network to connect these centres. By the late 1980s the network was operating at millions of bits per second. NSF also funded various nonprofit local and regional networks to connect other users to the NSFNET. A few commercial networks also began in the late 1980s; these were soon joined by others, and the Commercial Internet Exchange (CIX) was formed to allow transit traffic between commercial networks that otherwise would not have been allowed on the NSFNET backbone. In 1995, after extensive review of the situation, NSF decided that support of the NSFNET infrastructure was no longer required, since many commercial providers were now willing and able to meet the needs of the research community, and its support was withdrawn. Meanwhile, NSF had fostered a competitive collection of commercial Internet backbones connected to one another through so-called network access points (NAPs).

From the Internet’s origin in the early 1970s, control of it steadily devolved from government stewardship to private-sector participation and finally to private custody with government oversight and forbearance. Today a loosely structured group of several thousand interested individuals known as the Internet Engineering Task Force participates in a grassroots development process for Internet standards. Internet standards are maintained by the nonprofit Internet Society, an international body with headquarters in Reston, Virginia. The Internet Corporation for Assigned Names and Numbers (ICANN), another nonprofit, private organization, oversees various aspects of policy regarding Internet domain names and numbers.




Commercial expansion
The rise of commercial Internet services and applications helped to fuel a rapid commercialization of the Internet. This phenomenon was the result of several other factors as well. One important factor was the introduction of the personal computer and the workstation in the early 1980s—a development that in turn was fueled by unprecedented progress in integrated circuit technology and an attendant rapid decline in computer prices. Another factor, which took on increasing importance, was the emergence of ethernet and other “local area networks” to link personal computers. But other forces were at work too. Following the restructuring of AT&T in 1984, NSF took advantage of various new options for national-level digital backbone services for the NSFNET. In 1988 the Corporation for National Research Initiatives received approval to conduct an experiment linking a commercial e-mail service (MCI Mail) to the Internet. This application was the first Internet connection to a commercial provider that was not also part of the research community. Approval quickly followed to allow other e-mail providers access, and the Internet began its first explosion in traffic.

In 1993 federal legislation allowed NSF to open the NSFNET backbone to commercial users. Prior to that time, use of the backbone was subject to an “acceptable use” policy, established and administered by NSF, under which commercial use was limited to those applications that served the research community. NSF recognized that commercially supplied network services, now that they were available, would ultimately be far less expensive than continued funding of special-purpose network services.

Also in 1993 the University of Illinois made widely available Mosaic, a new type of computer program, known as a browser, that ran on most types of computers and, through its “point-and-click” interface, simplified access, retrieval, and display of files through the Internet. Mosaic incorporated a set of access protocols and display standards originally developed at the European Organization for Nuclear Research (CERN) by Tim Berners-Lee for a new Internet application called the World Wide Web (WWW). In 1994 Netscape Communications Corporation (originally called Mosaic Communications Corporation) was formed to further develop the Mosaic browser and server software for commercial use. Shortly thereafter, the software giant Microsoft Corporation became interested in supporting Internet applications on personal computers (PCs) and developed its Internet Explorer Web browser (based initially on Mosaic) and other programs. These new commercial capabilities accelerated the growth of the Internet, which as early as 1988 had already been growing at the rate of 100 percent per year.

By the late 1990s there were approximately 10,000 Internet service providers (ISPs) around the world, more than half located in the United States. However, most of these ISPs provided only local service and relied on access to regional and national ISPs for wider connectivity. Consolidation began at the end of the decade, with many small to medium-size providers merging or being acquired by larger ISPs. Among these larger providers were groups such as America Online, Inc. (AOL), which started as a dial-up information service with no Internet connectivity but made a transition in the late 1990s to become the leading provider of Internet services in the world—with more than 25 million subscribers by 2000 and with branches in Australia, Europe, South America, and Asia. Widely used Internet “portals” such as AOL, Yahoo!, Excite, and others were able to command advertising fees owing to the number of “eyeballs” that visited their sites. Indeed, during the late 1990s advertising revenue became the main quest of many Internet sites, some of which began to speculate by offering free or low-cost services of various kinds that were visually augmented with advertisements. By 2001 this speculative bubble had burst.


Future directions
While the precise structure of the future Internet is not yet clear, many directions of growth seem apparent. One is the increased availability of wireless access. Wireless services enable applications not previously possible in any economical fashion. For example, global positioning systems (GPS) combined with wireless Internet access would help mobile users to locate alternate routes, generate precise accident reports and initiate recovery services, and improve traffic management and congestion control. In addition to wireless laptop computers and personal digital assistants (PDAs), wearable devices with voice input and special display glasses are under development.

Another future direction is toward higher backbone and network access speeds. Backbone data rates of 10 billion bits (10 gigabits) per second are readily available today, but data rates of 1 trillion bits (1 terabit) per second or higher will eventually become commercially feasible. If the development of computer hardware, software, applications, and local access keeps pace, it may be possible for users to access networks at speeds of 100 gigabits per second. At such data rates, high-resolution video—indeed, multiple video streams—would occupy only a small fraction of available bandwidth. Remaining bandwidth could be used to transmit auxiliary information about the data being sent, which in turn would enable rapid customization of displays and prompt resolution of certain local queries. Much research, both public and private, has gone into integrated broadband systems that can simultaneously carry multiple signals—data, voice, and video. In particular, the U.S. government has funded research to create new high-speed network capabilities dedicated to the scientific-research community.
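
The claim that video would occupy only a small fraction of such a link is easy to check with rough numbers; in the sketch below the 25-megabit-per-second stream rate is an assumed, illustrative figure.

    stream_bps = 25e6    # assumed bit rate of one high-resolution video stream
    link_bps = 100e9     # access link of 100 gigabits per second, as discussed above
    print(f"{stream_bps / link_bps:.4%} of the link per stream")         # 0.0250%
    print(int(link_bps // stream_bps), "such streams fit in principle")  # 4000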

It is clear that communications connectivity will be an important function of a future Internet as more machines and devices are interconnected. In 1998, after four years of study, the Internet Engineering Task Force published a new 128-bit IP address standard intended to replace the conventional 32-bit standard. By allowing a vast increase in the number of available addresses (2^128, as opposed to 2^32), this standard will make it possible to assign unique addresses to almost every electronic device imaginable. Thus, the expressions “wired” office, home, and car may all take on new meanings, even if the access is really wireless.
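
The scale of that expansion is easiest to appreciate numerically; the lines below simply evaluate the two address-space sizes mentioned above.

    ipv4_addresses = 2 ** 32
    ipv6_addresses = 2 ** 128
    print(f"IPv4: {ipv4_addresses:,}")                        # 4,294,967,296
    print(f"IPv6: {ipv6_addresses:.3e}")                       # about 3.403e+38
    print(f"Ratio: {ipv6_addresses // ipv4_addresses:.3e}")    # about 7.923e+28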

The dissemination of digitized text, pictures, and audio and video recordings over the Internet, primarily available today through the World Wide Web, has resulted in an information explosion. Clearly, powerful tools are needed to manage network-based information. Information available on the Internet today may not be available tomorrow without careful attention’s being paid to preservation and archiving techniques. The key to making information persistently available is infrastructure and the management of that infrastructure. Repositories of information, stored as digital objects, will soon populate the Internet. At first these repositories may be dominated by digital objects specifically created and formatted for the World Wide Web, but in time they will contain objects of all kinds in formats that will be dynamically resolvable by users’ computers in real time. Movement of digital objects from one repository to another will still leave them available to users who are authorized to access them, while replicated instances of objects in multiple repositories will provide alternatives to users who are better able to interact with certain parts of the Internet than with others. Information will have its own identity and, indeed, become a “first-class citizen” on the Internet.

Robert Kahn



Society and the Internet
What began as a largely technical and limited universe of designers and users became one of the most important mediums of the late 20th and early 21st centuries. As the Pew Charitable Trust observed in 2004, it took 46 years to wire 30 percent of the United States for electricity; it took only 7 years for the Internet to reach that same level of connection to American homes. By 2005, 68 percent of American adults and 90 percent of American teenagers had used the Internet. Europe and Asia were at least as well connected as the United States. Nearly half of the citizens of the European Union are online, and even higher rates are found in the Scandinavian countries. There is a wide variance in Asian countries; for example, by 2005 Taiwan, Hong Kong, and Japan had at least half of their populations online, whereas India, Pakistan, and Vietnam had less than 10 percent. South Korea was the world leader in connecting its population to the Internet through high-speed broadband connections.

Such statistics can chart the Internet’s growth, but they offer few insights into the changes wrought as users—individuals, groups, corporations, and governments—have embedded the technology into everyday life. The Internet is now as much a lived experience as a tool for performing particular tasks, offering the possibility of creating an environment or virtual reality in which individuals might work, socially interact with others, and perhaps even live out their lives.


History, community, and communications

Two agendas
The Internet has evolved from the integration of two very different technological agendas—the Cold War networking of the U.S. military and the personal computer (PC) revolution. The first agenda can be dated to 1973, when the Defense Advanced Research Projects Agency (DARPA) sought to create a communications network that would support the transfer of large data files between government and government-sponsored academic-research laboratories. The result was the ARPANET, a robust decentralized network that supported a vast array of computer hardware. Initially, ARPANET was the preserve of academics and corporate researchers with access to time-sharing mainframe computer systems. Computers were large and expensive; most computer professionals could not imagine anyone needing, let alone owning, his own “personal” computer. And yet Joseph Licklider, one of the driving forces at DARPA for computer networking, stated that online communication would “change the nature and value of communication even more profoundly than did the printing press and the picture tube.”

The second agenda began to emerge in 1977 with the introduction of the Apple II, the first affordable computer for individuals and small businesses. Created by Apple Computer, Inc. (now Apple Inc.), the Apple II was popular in schools by 1979, but in the corporate market it was stigmatized as a game machine. The task of cracking the business market fell to IBM. In 1981 the IBM PC was released and immediately standardized the PC’s basic hardware and operating system—so much so that first PC-compatible and then simply PC came to mean any personal computer built along the lines of the IBM PC. A major centre of the PC revolution was the San Francisco Bay area, where several major research institutions funded by DARPA—Stanford University, the University of California, Berkeley, and Xerox PARC—provided much of the technical foundation for Silicon Valley. It was no small coincidence that Apple’s two young founders—Steven Jobs and Stephen Wozniak—worked as interns in the Stanford University Artificial Intelligence Laboratory and at the nearby Hewlett-Packard Company. The Bay Area’s counterculture also figured prominently in the PC’s history. Electronic hobbyists saw themselves in open revolt against the “priesthood” of the mainframe computer and worked together in computer-enthusiast groups to spread computing to the masses.


The WELL
Why does this matter? The military played an essential role in shaping the Internet’s architecture, but it was through the counterculture that many of the practices of contemporary online life emerged. A telling example is the early electronic bulletin board system (BBS), such as the WELL (Whole Earth ’Lectronic Link). Established in 1985 by American publisher Stewart Brand, who viewed the BBS as an extension of his Whole Earth Catalog, the WELL was one of the first electronic communities organized around forums dedicated to particular subjects such as parenting and Grateful Dead concerts. The latter were an especially popular topic of online conversation, but it was in the parenting forum where a profound sense of community and belonging initially appeared. For example, when one participant’s child was diagnosed with leukemia, members of the forum went out of their way either to find health resources or to comfort the distressed parents. In this one instance, several features still prevalent in the online world can be seen. First, geography was irrelevant. WELL members in California and New York could bring their knowledge together within the confines of a forum—and could do so collectively, often exceeding the experience available to any local physician or medical centre. This marshaling of shared resources persists to this day as many individuals use the Internet to learn more about their ailments, find others who suffer from the same disease, and learn about drugs, physicians, and alternative therapies.

Another feature that distinguished the WELL forums was the use of moderators who could interrupt and focus discussion while also disciplining users who broke the rather loose rules. “Flame wars” (crass, offensive, or insulting exchanges) were possible, but anyone dissatisfied in one forum was free to organize another. In addition, the WELL was intensely democratic. WELL forums were the original chat rooms—online spaces where individuals possessing similar interests might congregate, converse, and even share their physical locations to facilitate meeting in person. Finally, the WELL served as a template for other online communities dedicated to subjects as diverse as Roman Catholicism, liberal politics, gardening, and automobile modification.
 





Instant broadcast communication
For the individual the Internet opened up new communication possibilities. E-mail has already led to a substantial decline in traditional “snail mail.” Instant messaging (IM), or text messaging, continues to expand, especially among youth, with the convergence of the Internet and cellular telephone access to the Web. Indeed, IM has become a particular problem in classrooms, where students often surreptitiously exchange notes via wireless communication devices. More than 50 million American adults, including 11 million at work, use IM.

From mailing lists to “buddy lists,” e-mail and IM have been used to create “smart mobs” that converge in the physical world. Examples include protest organizing, spontaneous performance art, and shopping. Obviously, people congregated before the Internet existed; the change wrought by mass e-mailings has been in the speed of assembling such events. For example, in February 1999 activists began planning protests against the November 1999 World Trade Organization (WTO) meetings in Seattle, Washington. Using the Internet, organizers mobilized more than 50,000 individuals from around the world to engage in demonstrations—at times violent—that effectively altered the WTO’s agenda.

In the wake of catastrophic disasters, citizens have used the Internet to donate to charities in an unprecedented fashion. Others have used the Internet to reunite family members or to match lost pets with their owners. The role of the Internet in responding to disasters, both natural and deliberate, remains the topic of much discussion, as it is unclear whether the Internet actually can function in a disaster area when much of the infrastructure is destroyed. Certainly during the September 11, 2001, attacks, people found it easier to communicate with loved ones in New York City via e-mail than through the overwhelmed telephone network.


Social gaming
One-to-one or even one-to-many communication is only the most elementary form of Internet social life. The very nature of the Internet makes spatial distances largely irrelevant for social interactions. Online gaming has moved from simply playing a game with friends to a rather complex form of social life in which the game’s virtual reality spills over into the physical world. The case of EverQuest, a popular electronic game with several hundred thousand players, is one example. Property acquired in the game has been sold on the online auction site eBay, and characters with particular skill sets are also available for sale. What does it mean that one can own virtual property and that someone is willing to pay for this property with real money? Economists have begun studying such virtual economies, some of which now exceed the gross national product of countries in Africa and Asia. In fact, virtual economies finally have given economists a means of running controlled experiments.

Millions of people have created online game characters for entertainment purposes. Gaming creates an online community, but it also allows for a blurring of the boundaries between the real world and the virtual one. In Shanghai one gamer stabbed and killed another one in the real world over a virtual sword used in Legend of Mir 3. Although attempts were made to involve the authorities in the original dispute, the police found themselves at a loss prior to the murder because the law did not acknowledge the existence of virtual property. In South Korea violence surrounding online gaming happens often enough that police refer to such murders as “off-line PK,” a reference to player killing (PK), or player-versus-player lethal contests, which are allowed or encouraged in some games. By 2001 crime related to Lineage had forced South Korean police to create special cybercrime units to patrol both within the game and off-line. Potential problems from such games are not limited to crime. Virtual life can be addictive. Reports of players neglecting family, school, work, and even their health to the point of death have become more common.




Love and sex
By the start of the 21st century, approximately 20 percent of the Internet population had used it at some time to meet others, with Internet dating services collecting nearly half a billion dollars per year in matchmaking fees. Dating sites capture an important aspect of the Web economy—the ability to appeal to particular niche groups. Of the myriads of dating Web sites, many cater to individuals of particular ethnic or national identities and thereby preselect people along some well-defined axes of interest. Because of the low costs involved in setting up a Web site, the possibilities for “nichification” are nearly endless.

Pornography is another domain in which nichification is prevalent. By the 21st century, there were some four million Web sites devoted to pornography, containing more than a quarter of a billion pages—in other words, more than 10 percent of the Web. Forty million American adults regularly visit pornographic sites, which generate billions of dollars in yearly revenues. In response to this proliferation, a suggestion was made to place pornographic sites in a special “xxx” Web domain so that software could easily trace them and render them invisible to children. However, with no organization to ensure voluntary compliance, nothing came of the proposal. All of society’s vices, as well as its virtues, have manifested themselves on the Internet.


Advertising and e-commerce
Nichification allows for consumers to find what they want, but it also provides opportunities for advertisers to find consumers. For example, most search engines generate revenue by matching ads to an individual’s particular search query. Among the greatest challenges facing the Internet’s continued development is the task of reconciling advertising and commercial needs with the right of Internet users not to be bombarded by “pop-up” Web pages and spam (unsolicited e-mail).

Nichification also opens up important e-commerce opportunities. A bookstore can carry only so much inventory on its shelves, which thereby limits its collection to books with broad appeal. An online bookstore can “display” nearly everything ever published. Although traditional bookstores often have a special-order department, consumers have taken to searching and ordering from online stores from the convenience of their homes and offices.

Although books can be made into purely digital artifacts, “e-books” have not sold nearly as well as digital music. In part, this disparity is due to the need for an e-book reader to have a large, bright screen, which adds to the display’s cost and weight and leads to more-frequent battery replacement. Also, it is difficult to match the handy design and low cost of an old-fashioned paperback book. Interestingly, it turns out that listeners download from online music vendors as many obscure songs as big record company hits. Just a few people interested in some obscure song are enough to make it worthwhile for a vendor to store it electronically for sale over the Internet. What makes the Internet special here is not only its ability to match buyers and sellers quickly and relatively inexpensively but also that the Internet and the digital economy in general allow for a flowering of multiple tastes—in games, people, and music.





Information and copyright

Education
Commerce and industry are certainly arenas in which the Internet has had a profound effect, but what of the foundational institutions of any society—namely, those related to education and the production of knowledge? Here the Internet has had a variety of effects, some of which are quite disturbing. There are more computers in the classroom than ever before, but there is scant evidence that they enhance the learning of basic skills in reading, writing, and arithmetic. And while access to vast amounts of digital information is convenient, it has also become apparent that most students now see libraries as antiquated institutions better used for their computer terminals than for their book collections. As teachers at all education levels can attest, students typically prefer to research their papers by reading online rather than wandering through a library’s stacks.

In a related effect the Internet has brought plagiarism into the computer era in two distinct senses. First, electronic texts have made it simple for students to “cut and paste” published sources (e.g., encyclopaedia articles) into their own papers. Second, although students could always get someone to write their papers for them, it is now much easier to find and purchase anonymous papers at Web sites and to even commission original term papers for a fixed fee. Ironically, what the Internet gives, it also takes away. Teachers now have access to databases of electronically submitted papers and can easily compare their own students’ papers against a vast archive of sources. Even a simple online search can sometimes find where one particularly well-turned phrase originally appeared.
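
Mechanically, such comparisons often come down to finding shared word sequences. The toy sketch below is purely illustrative (real plagiarism-detection services are far more sophisticated) and flags five-word phrases that two texts have in common; the sample sentences are invented.

    def ngrams(text, n=5):
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def shared_phrases(submission, source, n=5):
        """Return word n-grams that appear verbatim in both texts."""
        return ngrams(submission, n) & ngrams(source, n)

    a = "the quick brown fox jumps over the lazy dog near the river"
    b = "a fox jumps over the lazy dog near a quiet stream"
    print(shared_phrases(a, b))   # three overlapping five-word phrases (order may vary)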


File sharing
College students have been at the leading edge of the growing awareness of the centrality of intellectual property in a digital age. When American college student Shawn Fanning invented Napster in 1999, he set in motion an ongoing legal battle over digital rights. Napster was a file-sharing system that allowed users to share electronic copies of music online. The problem was obvious: recording companies were losing revenues as one legal copy of a song was shared among many people. Although the record companies succeeded in shutting down Napster, they found themselves having to contend with a new form of file sharing, P2P (“peer-to-peer”). In P2P there is no central administrator to shut down as there had been with Napster. Initially, the recording industry sued the makers of P2P software and a few of the most prolific users—often students located on university campuses with access to high-speed connections for serving music and, later, movie files—in an attempt to discourage the millions of people who regularly used the software. Still, even while some P2P software makers have been held liable for losses that the copyright owners have incurred, more-devious schemes for circumventing apprehension have been invented.

The inability to prevent file sharing has led the recording and movie industries to devise sophisticated copy protection on their CDs and DVDs. In a particularly controversial incident, Sony Corporation introduced CDs into the market in 2005 with copy protection that involved a special viruslike code that hid on a user’s computer. This code, however, also was open to being exploited by virus writers to gain control of users’ machines.


Electronic publishing
The Internet has become an invaluable and discipline-transforming environment for scientists and scholars. In 2004 Google began digitizing public-domain and out-of-print materials from several cooperating libraries in North America and Europe, such as the University of Michigan library, which made some seven million volumes available. Although some authors and publishers challenged the project for fear of losing control of copyrighted material, similar digitization projects were launched by Microsoft Corporation and the online book vendor Amazon.com, although the latter company proposed that each electronic page would be retrieved for a small fee shared with the copyright holders.

The majority of academic journals are now online and searchable. This has created a revolution in scholarly publishing, especially in the sciences and engineering. For example, arXiv.org has transformed the rate at which scientists publish and react to new theories and experimental data. Begun in 1991, arXiv.org is an online archive in which physicists, mathematicians, computer scientists, and computational biologists upload research papers long before they will appear in a print journal. The articles are then open to the scrutiny of the entire scientific community, rather than to one or two referees selected by a journal editor. In this way scientists around the world can receive an abstract of a paper as soon as it has been uploaded into the depository. If the abstract piques a reader’s interest, the entire paper can be downloaded for study. Cornell University in Ithaca, New York, and the U.S. National Science Foundation support arXiv.org as an international resource.

While arXiv.org deals with articles that might ultimately appear in print, it is also part of a larger shift in the nature of scientific publishing. In the print world a handful of companies control the publication of most scientific journals, and the price of institutional subscriptions is frequently exorbitant. This has led to a growing movement to create online-only journals that are accessible for free to the entire public—a public that often supports the original research with its taxes. For example, the Public Library of Science publishes online journals of biology and medicine that compete with traditional print journals. There is no difference in how their articles are vetted for publication; the difference is that the material is made available for free. Unlike other creators of content, academics are not paid for what they publish in scholarly journals, nor are those who review the articles. Journal publishers, on the other hand, have long received subsidies from the scientific community, even while charging that community high prices for its own work. Although some commercial journals have reputations that can advance the careers of those who publish in them, the U.S. government has taken the side of the “open source” publishers and demanded that government-financed research be made available to taxpayers as soon as it has been published.

In addition to serving as a medium for the exchange of articles, the Internet can facilitate the discussion of scientific work long before it appears in print. Scientific blogs—online journals kept by individuals or groups of researchers—have flourished as a form of online salon for the discussion of ongoing research. There are pitfalls to such practices, though. Astronomers who in 2005 posted abstracts detailing the discovery of a potential 10th planet found that other researchers had used their abstracts to find the new astronomical body themselves. In order to claim priority of discovery, the original group rushed to hold a news conference rather than waiting to announce their work at an academic conference or in a peer-reviewed journal.






Politics and culture

Free speech
The Internet has broadened political participation by ordinary citizens, especially through the phenomenon of blogs. Many blogs are simply online diaries or journals, but others have become sources of information and opinion that challenge official government pronouncements or the mainstream news media. By 2005 there were approximately 15 million blogs, a number that was doubling roughly every six months. The United States dominates the blog universe, or “blogosphere,” with English as the lingua franca, but blogs in other languages are proliferating. In one striking development, the Iranian national language, Farsi, has become the commonest Middle Eastern language in the blogosphere. Despite the Iranian government’s attempts to limit access to the Internet, some 60,000 active Farsi blogs are hosted at a single service provider, PersianBlog.

The Internet poses a particular problem for autocratic regimes that restrict access to independent sources of information. The Chinese government has been particularly successful at policing the public’s access to the Internet, beginning with its “Great Firewall of China” that automatically blocks access to undesirable Web sites. The state also actively monitors Chinese Web sites to ensure that they adhere to government limits on acceptable discourse and tolerable dissent. In 2000 the Chinese government banned nine types of information, including postings that might “harm the dignity and interests of the state” or “disturb social order.” Users must enter their national identification number in order to access the Internet at cybercafés. Also, Internet service providers are responsible for the content on their servers. Hence, providers engage in a significant amount of self-censorship in order to avoid problems with the law, which may result in losing access to the Internet or even serving jail time. Finally, the authorities are willing to shut Web sites quickly and with no discussion. Of course, the state’s efforts are not completely effective. Information can be smuggled into China on DVDs, and creative Chinese users can circumvent the national firewall with proxy servers—Web sites that allow users to move through the firewall to an ostensibly acceptable Web site where they can connect to the rest of the Internet.

Others have taken advantage of the Internet’s openness to spread a variety of political messages. The Ukrainian Orange Revolution of 2004 had a significant Internet component. More troubling is the use of the Internet by terrorist groups such as al-Qaeda to recruit members, pass along instructions to sleeper cells, and celebrate their own horrific activities. The Iraq War was fought not only on the ground but also online as al-Qaeda operatives used specific Web sites to call their followers to jihad. Al-Qaeda used password-protected chat rooms as key recruitment centres, as well as Web sites to test potential recruits before granting them access to the group’s actual network. On the other hand, posting material online is also a potential vulnerability. Gaining access to the group’s “Jihad Encyclopaedia” has enabled security analysts to learn about potential tactics, and Arabic-speaking investigators have learned to infiltrate chat rooms and gain access to otherwise hidden materials.


Political campaigns and muckraking
During the 2004 U.S. presidential campaign, blogs became a locus for often heated exchanges about the candidates. In fact, the candidates themselves used blogs and Web sites for fund-raising and networking. One of the first innovators was Howard Dean, an early front-runner in the Democratic primaries, whose campaign used a Web site for fund-raising and organizing local meetings. In particular, Dean demonstrated that a modern presidential campaign could use the Internet to galvanize volunteer campaign workers and to raise significant sums from many small donations. In a particularly astute move, Dean’s campaign set up a blog for comments from his supporters, and it generated immediate feedback on certain proposals such as refusing to accept public campaign funding. Both the George W. Bush and the John Kerry presidential campaigns, as well as the Democratic and Republican parties, came to copy the practices pioneered by Dean and his advisers. In addition, changes in U.S. campaign finance laws allowed for the creation of “527s,” independent action groups such as Moveon.org that used the Internet to raise funds and rally support for particular issues and candidates.

By 2005 it was widely agreed that politicians would have to deal not only with the mainstream media (i.e., newspapers, magazines, radio, and television) but also with a new phenomenon—the blogosphere. Although blogs do not have editors or fact checkers, they have benefited from scandals in the mainstream media, which have made many readers more skeptical of all sources of information. Also, bloggers have forced mainstream media to confront topics they might otherwise ignore. Some pundits have gone so far as to predict that blogs and online news sources will replace the mainstream media, but it is far more likely that these diverse sources of information will complement each other. Indeed, falling subscription rates have led many newspaper publishers to branch into electronic editions and to incorporate editorial blogs and forums for reader feedback; thus, some of the distinctions between the media have already been blurred.

Michael Aaron Dennis

Encyclopaedia Britannica
 

 


NASA Tests First Deep-Space Internet

 

 
