Copyright © 2015 Bert N. Langford (Images may be subject to copyright. Please send feedback)
Welcome to Our Generation USA!
Culture
Culture is the way of life of a specific group of people, including their behaviors, race, ethnicity, beliefs, values, and symbols, which they accept generally without thinking about them, and which are passed along by communication and imitation from one generation to the next.
Culture
YouTube Video about Vietnamese Culture
Pictured: LEFT: Sri Mariamman Temple, Singapore (photo by AngMoKio, own work, CC BY-SA 3.0); RIGHT: Celebrations, rituals, and patterns of consumption are important aspects of folk culture.

Culture is, in the words of E.B. Tylor, "that complex whole which includes knowledge, belief, art, morals, law, custom and any other capabilities and habits acquired by man as a member of society."
The Cambridge English Dictionary states that culture is "the way of life, especially the general customs and beliefs, of a particular group of people at a particular time." Terror management theory posits that culture is a series of activities and worldviews that provide humans with the illusion of being individuals of value in a world of meaning, raising themselves above the merely physical aspects of existence in order to deny the animal insignificance and death that Homo sapiens became aware of when they acquired a larger brain.
As a defining aspect of what it means to be human, culture is a central concept in anthropology, encompassing the range of phenomena that are transmitted through social learning in human societies.
The word is used in a general sense as the evolved ability to categorize and represent experiences with symbols and to act imaginatively and creatively. This ability arose with the evolution of behavioral modernity in humans around 50,000 years ago. This capacity is often thought to be unique to humans, although some other species have demonstrated similar, though much less complex abilities for social learning.
It is also used to denote the complex networks of practices and accumulated knowledge and ideas that are transmitted through social interaction and exist in specific human groups, or cultures, in the plural sense. Some aspects of human behavior, such as language, social practices such as kinship, gender, and marriage, expressive forms such as art, music, dance, ritual, and religion, and technologies such as cooking, shelter, and clothing, are said to be cultural universals, found in all human societies.
The concept of material culture covers the physical expressions of culture, such as technology, architecture, and art, whereas the immaterial aspects of culture, such as principles of social organization (including practices of political organization and social institutions), mythology, philosophy, literature (both written and oral), and science, make up the intangible cultural heritage of a society.
In the humanities, one sense of culture, as an attribute of the individual, has been the degree to which a person has cultivated a particular level of sophistication in the arts, sciences, education, or manners. The level of cultural sophistication has also sometimes been used to distinguish civilizations from less complex societies.
Such hierarchical perspectives on culture are also found in class-based distinctions between a high culture of the social elite and a low culture, popular culture or folk culture of the lower classes, distinguished by the stratified access to cultural capital.
In common parlance, culture is often used to refer specifically to the symbolic markers used by ethnic groups to distinguish themselves visibly from each other such as body modification, clothing or jewelry.
Mass culture refers to the mass-produced and mass-mediated forms of consumer culture that emerged in the 20th century. Some schools of philosophy, such as Marxism and critical theory, have argued that culture is often used politically as a tool of the elites to manipulate the lower classes and create a false consciousness; such perspectives are common in the discipline of cultural studies.
In the wider social sciences, the theoretical perspective of cultural materialism holds that human symbolic culture arises from the material conditions of human life, as humans create the conditions for physical survival, and that the basis of culture is found in evolved biological dispositions.
When used as a count noun, "a culture" is the set of customs, traditions, and values of a society or community, such as an ethnic group or nation. In this sense, multiculturalism is a concept that values the peaceful coexistence of and mutual respect between different cultures inhabiting the same territory.
Sometimes "culture" is also used to describe specific practices within a subgroup of a society, a subculture (e.g., "bro culture"), or a counterculture. Within cultural anthropology, the ideology and analytical stance of cultural relativism holds that cultures cannot easily be objectively ranked or evaluated, because any evaluation is necessarily situated within the value system of a given culture.
Click on any of the following blue hyperlinks for amplification:
- Etymology
- Change
- Early modern discourses
- German Romanticism
- English Romanticism
- Anthropology
- Sociology
- Cultural studies
- Cultural dynamics
Cities with Significant Ethnic Culture
YouTube Video of Chinatown, San Francisco, CA
Pictured: LEFT: Chinatown in San Francisco, CA: RIGHT: Hispanic Community in Chicago, IL
The above link provides a list of United States cities with large ethnic minority populations. (There are many cities in the US with no ethnic majority.)
Every large society contains ethnic minorities and linguistic minorities. Their way of life, language, culture, and origin can differ from those of the majority. Minority status is conditioned not only by numerical relations but also by questions of political power.
In some places, subordinate ethnic groups may constitute a numerical majority, such as Blacks in South Africa under apartheid. In addition to "traditional" (longtime resident) minorities, there may be migrant, indigenous, or landless nomadic communities.
There is no legal definition of national (ethnic) minorities in international law. Only in Europe is an approximate definition provided, by the European Charter for Regional or Minority Languages and by the Framework Convention for the Protection of National Minorities.
In the United States, for example, white Americans constitute the majority (72.4%) and all other racial groups (African Americans, Asian Americans, Native Americans, and Native Hawaiians) are classified as "minorities".
If the non-Hispanic white population falls below 50%, that group will only be the plurality, not the majority. However, a national minority can be theoretically (though not legally) defined as a group of people within a given nation state:
- which is numerically smaller than the rest of the population of the state or of a part of the state;
- which is not in a dominant position;
- which has a culture, language, religion, race, etc., distinct from that of the majority of the population;
- whose members have a will to preserve their specificity;
- whose members are citizens of the state where they have the status of a minority; and
- which has a long-term presence on the territory where it has lived.
International criminal law can protect the rights of racial or ethnic minorities in a number of ways. The right to self-determination is a key issue. The formal level of protection of national (ethnic) minorities is highest in European countries.
Civil Rights Act of 1960
YouTube Video about the History of the Civil Rights Movement
Pictured: LEFT: Civil Rights March, 1960; RIGHT: President Dwight D. Eisenhower signs Civil Rights Act 1960 into law.
The Civil Rights Act of 1960 (Pub.L. 86–449, 74 Stat. 89, enacted May 6, 1960) was a United States federal law that established federal inspection of local voter registration polls and introduced penalties for anyone who obstructed someone's attempt to register to vote.
It was designed to deal with discriminatory laws and practices in the segregated South, by which Blacks and Mexican Texans had been effectively disfranchised since the late 19th and early 20th centuries. It extended the life of the Civil Rights Commission, previously limited to two years, to oversee registration and voting practices. The act was signed into law by President Dwight D. Eisenhower and served to eliminate certain loopholes left by the Civil Rights Act of 1957.
Toward the end of his presidency, President Eisenhower supported civil rights legislation. In his message to Congress, he proposed seven recommendations for the protection of civil rights:
- Strengthen the laws that would root out threats to obstruct court orders in school desegregation cases
- Provide more investigative authority to the Federal Bureau of Investigation in crimes involving the destruction of schools/churches
- Grant Attorney General power to investigate Federal election records
- Provide temporary program for aid to agencies to assist changes necessary for school desegregation decisions
- Authorize provision of education for children of the armed forces
- Consider establishing a statutory Commission on Equal Job Opportunity Under Government Contracts (later mandated in the Civil Rights Act of 1964 to create the Equal Employment Opportunity Commission)
- Extend the Civil Rights Commission an additional two years
Subsequent history:
After the more far-reaching acts of 1964 and 1965, the Civil Rights Acts of 1957 and 1960 were deemed ineffective for the firm establishment of civil rights. The later legislation provided firmer ground for the enforcement and protection of a variety of civil rights, whereas the acts of 1957 and 1960 were largely limited to voting rights. The Civil Rights Act of 1960 dealt with race and color but omitted coverage of those discriminated against on the basis of national origin, although Eisenhower had called for it in his message to Congress.
The Civil Rights Act of 1964 and the Voting Rights Act of 1965 worked to fulfill the seven goals suggested by President Eisenhower in 1959, satisfying the civil rights movement's aims of ending racial discrimination and protecting legal equality in the United States.
Civil Rights Act of 1964
YouTube Video: President Lyndon B. Johnson Signs Civil Rights Act, Gives Pen to Dr. Martin Luther King Jr.*
Pictured: President Lyndon B. Johnson signing Civil Rights Act of 1964 into law
*-- Dr. Martin Luther King, Jr.
The Civil Rights Act of 1964 (enacted July 2, 1964) is a landmark piece of civil rights legislation in the United States that outlawed discrimination based on race, color, religion, sex, or national origin.
It ended unequal application of voter registration requirements and racial segregation in schools, at the workplace and by facilities that served the general public (known as "public accommodations").
Powers given to enforce the act were initially weak, but were supplemented during later years. Congress asserted its authority to legislate under several different parts of the United States Constitution, principally its power to regulate interstate commerce under Article One (section 8), its duty to guarantee all citizens equal protection of the laws under the Fourteenth Amendment and its duty to protect voting rights under the Fifteenth Amendment.
The Act was signed into law by President Lyndon B. Johnson on July 2, 1964, at the White House.
Contributors to African-American Culture
YouTube Video: Evolution of African American Music
Pictured: Vogue Magazine Covers of LEFT: Beyoncé and RIGHT: Rihanna
African-American culture, also known as Black-American culture, in the United States refers to the cultural contributions of African Americans to the culture of the United States, either as part of or distinct from American culture. The distinct identity of African-American culture is rooted in the historical experience of the African-American people, including the Middle Passage. The culture is both distinct and enormously influential to American culture as a whole.
African-American culture is rooted in West and Central Africa. Within the culture of the United States, it is, in the anthropological sense, conscious of its origins as largely a blend of West and Central African cultures. Although slavery greatly restricted the ability of African Americans to practice their original cultural traditions, many practices, values, and beliefs survived, and over time have been modified or blended with European cultures and other cultures, such as those of Native Americans.
African-American identity was established during the slavery period, producing a dynamic culture that has had and continues to have a profound impact on American culture as a whole, as well as that of the broader world.
Elaborate rituals and ceremonies were a significant part of African Americans' ancestral culture. Many West African societies traditionally believed that spirits dwelled in the surrounding nature, and from this disposition they treated their environment with mindful care. They also generally believed that a spiritual life source existed after death, and that ancestors in this spiritual realm could mediate between the supreme creator and the living. Honor and prayer were offered to these "ancient ones," the spirits of those past. West Africans also believed in spiritual possession.
At the beginning of the eighteenth century, Christianity began to spread across North Africa; this shift in religion began displacing traditional African spiritual practices. Enslaved Africans brought this complex religious dynamic with them to America, and the fusion of traditional African beliefs with Christianity provided common ground for those practicing religion in Africa and America.
After emancipation, unique African-American traditions continued to flourish, as distinctive traditions or radical innovations in music, art, literature, religion, cuisine, and other fields. 20th-century sociologists, such as Gunnar Myrdal, believed that African Americans had lost most cultural ties with Africa.
However, anthropological field research by Melville Herskovits and others demonstrated a continuum of African traditions among Africans of the diaspora. The greatest influence of African cultural practices on European culture is found below the Mason-Dixon line, in the American South.
For many years African-American culture developed separately from European-American culture, both because of slavery and the persistence of racial discrimination in America, as well as African-American slave descendants' desire to create and maintain their own traditions. Today, African-American culture has become a significant part of American culture and yet, at the same time, remains a distinct cultural body.
Pop Culture in the United States
YouTube Video from "The Many Loves of Dobie Gillis" (3/9) An Honest and Decent Man (1959)
Pictured: LEFT: Young hippies near the Woodstock festival in August 1969; RIGHT: a picture of Jack Kerouac, who is considered a literary iconoclast and, alongside William S. Burroughs and Allen Ginsberg, is a pioneer of the Beat Generation.
Popular culture or pop culture is the entirety of ideas, perspectives, attitudes, images, and other phenomena that are within the mainstream of a given culture, especially Western culture of the early to mid 20th century and the emerging global mainstream of the late 20th and early 21st century.
Heavily influenced by mass media, this collection of ideas permeates the everyday lives of the society. The most common pop culture categories are: entertainment (movies, music, TV), sports, news (as in people/places in news), politics, fashion/clothes, technology, and slang.
Popular culture is often viewed as being trivial and "dumbed down" in order to find consensual acceptance throughout the mainstream. As a result, it comes under heavy criticism from various non-mainstream sources (most notably religious groups and countercultural groups) which deem it superficial, consumerist, sensationalist, and/or corrupt.
From the end of World War II, following major cultural and social changes brought by mass media innovations, the meaning of popular culture began to overlap with those of mass culture, media culture, image culture, consumer culture, and culture for mass consumption.
The United States pioneered these social and cultural changes with respect to other Western countries.
The abbreviated form "pop" for popular, as in pop music, dates from the late 1950s. Although the terms "pop" and "popular" are in some cases used interchangeably, and their meanings partially overlap, the term "pop" is narrower. Pop specifically denotes something with qualities of mass appeal, while "popular" refers to what has gained popularity, regardless of its style.
According to John Storey, there are six definitions of popular culture. The quantitative definition of culture has the problem that much "high culture" (e.g., television dramatizations of Jane Austen) is also "popular."
"Pop culture" is also defined as the culture that is "left over" when we have decided what high culture is. However, many works straddle the boundaries, e.g., Shakespeare and Charles Dickens.
A third definition equates pop culture with "mass culture": a commercial culture, mass-produced for mass consumption by mass media. From a Western European perspective, this may be compared to American culture. Alternatively, "pop culture" can be defined as an "authentic" culture of the people, but this can be problematic because there are many ways of defining "the people."
Storey argued that there is a political dimension to popular culture; neo-Gramscian hegemony theory "... sees popular culture as a site of struggle between the 'resistance' of subordinate groups in society and the forces of 'incorporation' operating in the interests of dominant groups in society." A postmodernist approach to popular culture would "no longer recognize the distinction between high and popular culture."
Storey claims that popular culture emerges from the urbanization of the Industrial Revolution. Studies of Shakespeare (by Weimann, Barber or Bristol, for example) locate much of the characteristic vitality of his drama in its participation in Renaissance popular culture, while contemporary practitioners like Dario Fo and John McGrath use popular culture in its Gramscian sense that includes ancient folk traditions (the commedia dell'arte for example).
Popular culture changes constantly and occurs uniquely in place and time. It forms currents and eddies, and represents a complex of mutually interdependent perspectives and values that influence society and its institutions in various ways.
For example, certain currents of pop culture may originate in (or diverge into) a subculture, representing perspectives with which the mainstream popular culture has only limited familiarity.
Items of popular culture most typically appeal to a broad spectrum of the public. Important contemporary contributions to understanding what popular culture means have been made by the German researcher Ronald Daus, who studies the impact of extra-European cultures in North America, Asia, and especially Latin America.
Click here to read more.
Heavily influenced by mass media, this collection of ideas permeates the everyday lives of the society. The most common pop culture categories are: entertainment (movies, music, TV), sports, news (as in people/places in news), politics, fashion/clothes, technology, and slang.
Popular culture is often viewed as being trivial and "dumbed down" in order to find consensual acceptance throughout the mainstream. As a result, it comes under heavy criticism from various non-mainstream sources (most notably religious groups and countercultural groups) which deem it superficial, consumerist, sensationalist, and/or corrupt.
From the end of World War II, following major cultural and social changes brought by mass media innovations, the meaning of popular culture began to overlap with those of mass culture, media culture, image culture, consumer culture, and culture for mass consumption.
Social and cultural changes in the United States were a pioneer in this with respect to other western countries.
The abbreviated form "pop" for popular. as in pop music, dates from the late 1950s. Although terms "pop" and "popular" are in some cases used interchangeably, and their meaning partially overlap, the term "pop" is narrower. Pop is specific of something containing qualities of mass appeal, while "popular" refers to what has gained popularity, regardless of its style.
According to John Storey, there are six definitions of popular culture. The quantitative definition of culture has the problem that much "high culture" (e.g., television dramatizations of Jane Austen) is also "popular."
"Pop culture" is also defined as the culture that is "left over" when we have decided what high culture is. However, many works straddle the boundaries, e.g., Shakespeare and Charles Dickens.
A third definition equates pop culture with "mass culture". This is seen as a commercial culture, mass-produced for mass consumption by mass media. From a Western European perspective, this may be compared to American culture. Alternatively, "pop culture" can be defined as an "authentic" culture of the people, but this can be problematic because there are many ways of defining the "people."
Storey argued that there is a political dimension to popular culture; neo-Gramscian hegemony theory "... sees popular culture as a site of struggle between the 'resistance' of subordinate groups in society and the forces of 'incorporation' operating in the interests of dominant groups in society." A postmodernist approach to popular culture would "no longer recognize the distinction between high and popular culture."
Storey claims that popular culture emerges from the urbanization of the Industrial Revolution. Studies of Shakespeare (by Weimann, Barber or Bristol, for example) locate much of the characteristic vitality of his drama in its participation in Renaissance popular culture, while contemporary practitioners like Dario Fo and John McGrath use popular culture in its Gramscian sense that includes ancient folk traditions (the commedia dell'arte for example).
Popular culture changes constantly and occurs uniquely in place and time. It forms currents and eddies, and represents a complex of mutually interdependent perspectives and values that influence society and its institutions in various ways.
For example, certain currents of pop culture may originate from (or diverge into) a subculture, representing perspectives with which the mainstream popular culture has only limited familiarity.
Items of popular culture most typically appeal to a broad spectrum of the public. Important contemporary contributions for understanding what popular culture means have been given by the German researcher Ronald Daus, who studies the impact of extra-European cultures in North America, Asia and especially in Latin America.
Click here to read more.
Cultural History of the United States
YouTube Video: the Rolling Stones performing "Miss You" (Live in St. Louis -- 1997)
Pictured: LEFT: Thomas Jefferson designed his Georgian style Monticello estate in Virginia, the only World Heritage Site home in the United States; RIGHT: Apple pie is one of a number of American cultural icons.
The cultural history of the United States covers the development of American culture since the nation's founding in the late 18th century.
Various immigrant groups have been at play in the formation of the nation's culture. While different ethnic groups may display their own insular cultural aspects, throughout time a broad American culture has developed that encompasses the entire country. Developments in the culture of the United States in modern history have often been followed by similar changes in the rest of the world (American cultural imperialism).
This includes the knowledge, customs, and arts of Americans, as well as events in the nation's social, cultural, and political history.
See Also:
- Architecture of the United States
- Christianity in the United States
- Counterculture of the 1960s
- Cuisine of the United States
- History of education in the United States
- History of women in the United States
- United States religious history
- Entertainment:
- Fine arts:
Cultural Events: 1969 Woodstock Festival
- YouTube Video of the Woodstock Festival: Woodstock 1969: The Music
- YouTube Video: Janis Joplin singing Ball & Chain Live At Woodstock 1969
- YouTube Video of Jimi Hendrix performing Purple Haze Live at Woodstock 1969
The Woodstock Music & Art Fair—informally, the Woodstock Festival or simply Woodstock—was a music festival that attracted an audience of 400,000 people. Scheduled for three days on a dairy farm in New York State, from August 15 to 17, 1969, it ran over into a fourth day, ending on August 18, 1969.
Billed as "An Aquarian Exposition: 3 Days of Peace & Music", it was held at Max Yasgur's 600-acre (240 ha; 0.94 sq mi) dairy farm in the Catskills near the hamlet of White Lake in the town of Bethel. Bethel, in Sullivan County, is 43 miles (69 km) southwest of the town of Woodstock, New York, in adjoining Ulster County.
During the sometimes rainy weekend, 32 acts performed outdoors before an audience of 400,000 people. It is widely regarded as a pivotal moment in popular music history, as well as the definitive nexus for the larger counterculture generation.
Rolling Stone listed it as one of the 50 Moments That Changed the History of Rock and Roll.
The event was captured in the Academy Award-winning 1970 documentary film Woodstock, an accompanying soundtrack album, and Joni Mitchell's song "Woodstock", which commemorated the event and became a major hit for both Crosby, Stills, Nash & Young and Matthews Southern Comfort.
Click Here to Read More.
Counterculture of the 1960s
YouTube Video: John Lennon - "Give Peace A Chance" (1969)
Pictured: LEFT: “The Summer of Love” 1967 Festival in San Francisco; RIGHT: Scene from the 1967 March on the Pentagon anti-Vietnam War Protest.
The counterculture of the 1960s refers to an anti-establishment cultural phenomenon that developed first in the United States and the United Kingdom, and then spread throughout much of the Western world between the early 1960s and the mid-1970s, with London, New York City, and San Francisco being hotbeds of early counter-cultural activity.
The aggregate movement gained momentum as the American Civil Rights Movement continued to grow, and became revolutionary with the expansion of the US government's extensive military intervention in Vietnam.
As the 1960s progressed, widespread social tensions also developed concerning other issues, and tended to flow along generational lines regarding human sexuality, women's rights, traditional modes of authority, experimentation with psychoactive drugs, and differing interpretations of the American Dream.
As the era unfolded, new cultural forms and a dynamic subculture which celebrated experimentation, modern incarnations of Bohemianism, and the rise of the hippie and other alternative lifestyles, emerged.
This embracing of creativity is particularly notable in the works of British Invasion bands such as the Beatles, and filmmakers whose works became far less restricted by censorship. In addition to the trendsetting Beatles, many other creative artists, authors, and thinkers, within and across many disciplines, helped define the counterculture movement.
Several factors distinguished the counterculture of the 1960s from the anti-authoritarian movements of previous eras. The post-World War II "baby boom" generated an unprecedented number of potentially disaffected young people as prospective participants in a rethinking of the direction of American and other democratic societies.
Post-war affluence allowed many of the counterculture generation to move beyond a focus on the provision of the material necessities of life that had preoccupied their Depression-era parents.
The era was also notable in that a significant portion of the array of behaviors and "causes" within the larger movement were quickly assimilated within mainstream society, particularly in the US, even though counterculture participants numbered in the clear minority within their respective national populations.
The counterculture era commenced in earnest with the assassination of John F. Kennedy in November 1963. It became absorbed into the popular culture with the termination of U.S. combat-military involvement in Southeast Asia and the end of the draft in 1973, and ultimately with the resignation of President Richard M. Nixon in August 1974.
Many key movements were born of, or were advanced within, the counterculture of the 1960s. Each movement is relevant to the larger era. The most important stand alone, irrespective of the larger counterculture.
In the broadest sense, 1960s counterculture grew from a confluence of people, ideas, events, issues, circumstances, and technological developments which served as intellectual and social catalysts for exceptionally rapid change during the era.
Click Here to Read More.
Culture of the United States
YouTube Video of the Star Spangled Banner by Lady Gaga - Live at Super Bowl 50
Pictured: United States National Symbols include LEFT: The Official Seal; RIGHT: the Official Bird: The Bald Eagle
The culture of the United States of America is primarily Western, but is influenced by African, Native American, Asian, Polynesian, and Latin American cultures.
A strand of what may be described as American culture started its formation over 10,000 years ago with the migration of Paleo-Indians from Asia, Oceania, and Europe, into the region that is today the continental United States.
The United States of America has its own unique social and cultural characteristics such as dialect, music, arts, social habits, cuisine, and folklore.
The United States of America is an ethnically and racially diverse country as a result of large-scale migration from many ethnically and racially different countries throughout its history. Differing birth and death rates among natives, settlers, and immigrants are also a factor.
Its chief early European influences came from English settlers of colonial America during British rule. Colonial ties with Britain spread the English language, British culture, the legal system, and other cultural inheritances, which had a formative influence. Other important influences came from other parts of Europe, especially Germany.
Original elements also play a strong role, such as Jeffersonian democracy. Thomas Jefferson's Notes on the State of Virginia was perhaps the first influential domestic cultural critique by an American and a reactionary piece to the prevailing European consensus that America's domestic originality was degenerate.
American culture includes both conservative and liberal elements, scientific and religious competitiveness, political structures, risk taking and free expression, materialist and moral elements. Despite certain consistent ideological principles (e.g., individualism, egalitarianism, and faith in freedom and democracy), American culture has a variety of expressions due to its geographical scale and demographic diversity. The flexibility of U.S. culture and its highly symbolic nature lead some researchers to categorize American culture as a mythic identity; others see it as American exceptionalism.
It also includes elements that evolved from Indigenous Americans, and other ethnic cultures—most prominently the culture of African Americans, cultures from Latin America, and Asian American cultures. Many American cultural elements, especially from popular culture, have spread across the globe through modern mass media.
The United States has traditionally been thought of as a melting pot. However, beginning in the 1960s and continuing to the present day, the country has trended toward cultural diversity, pluralism, and the image of a salad bowl instead.
Due to the extent of American culture, there are many integrated but unique social subcultures within the United States. The cultural affiliations an individual in the United States may have commonly depend on social class, political orientation and a multitude of demographic characteristics such as religious background, occupation and ethnic group membership.
Click here for further amplification.
Cultural Diversity
YouTube Video: Cultural Diversity Examples: Avoid Stereotypes while communicating [sic]
Pictured: Images from many cultures
Cultural diversity is the quality of diverse or different cultures, as opposed to monoculture, as in the global monoculture, or a homogenization of cultures, akin to cultural decay. The phrase cultural diversity can also refer to having different cultures respect each other's differences. The phrase "cultural diversity" is also sometimes used to mean the variety of human societies or cultures in a specific region, or in the world as a whole. The culturally destructive action of globalization is often said to have a negative effect on the world's cultural diversity.
Overview:
The many separate societies that emerged around the globe differed markedly from each other, and many of these differences persist to this day. As well as the more obvious cultural differences that exist between people, such as language, dress and traditions, there are also significant variations in the way societies organize themselves, in their shared conception of morality, and in the ways they interact with their environment. Cultural diversity can be seen as analogous to biodiversity.
Opposition and support:
By analogy with biodiversity, which is thought to be essential to the long-term survival of life on earth, it can be argued that cultural diversity may be vital for the long-term survival of humanity; and that the conservation of indigenous cultures may be as important to humankind as the conservation of species and ecosystems is to life in general.
The General Conference of UNESCO took this position in 2001, asserting in Article 1 of the Universal Declaration on Cultural Diversity that "...cultural diversity is as necessary for humankind as biodiversity is for nature."
This position is rejected by some people, on several grounds. Firstly, like most evolutionary accounts of human nature, the importance of cultural diversity for survival may be an un-testable hypothesis, which can neither be proved nor disproved. Secondly, it can be argued that it is unethical deliberately to conserve "less developed" societies, because this will deny people within those societies the benefits of technological and medical advances enjoyed by those in the "developed" world.
In the same manner that the promotion of poverty in underdeveloped nations as "cultural diversity" is unethical, it is similarly unethical to promote all religious practices simply because they are seen to contribute to cultural diversity. Particular religious practices are recognized by the WHO and UN as unethical, including female genital mutilation (FGM), polygamy, child brides, and human sacrifice.
With the onset of globalization, traditional nation-states have been placed under enormous pressure. Today, with the development of technology, information and capital are transcending geographical boundaries and reshaping the relationships between the marketplace, states, and citizens. In particular, the growth of the mass media industry has had a major impact on individuals and societies across the globe.
Although beneficial in some ways, this increased accessibility has the capacity to negatively affect a society's individuality. With information being so easily distributed throughout the world, cultural meanings, values and tastes run the risk of becoming homogenized. As a result, the strength of identity of individuals and societies may begin to weaken.
Some individuals, particularly those with strong religious beliefs, maintain that it is in the best interests of individuals and of humanity as a whole that all people adhere to a specific model for society or specific aspects of such a model.
Today, communication between countries is increasingly frequent, and more and more students choose to study overseas to experience cultural diversity, aiming to broaden their horizons and develop themselves through learning abroad.
For example, in their paper "Academic Freedom in the People's Republic of China and the United States of America," Fengling Chen, Du Yanjun, and Yu Ma point out that Chinese education focuses more on tradition: "traditionally, teaching has consisted of spoon feeding, and learning has been largely by rote. China's traditional system of education has sought to make students accept fixed and ossified content," and "In the classroom, Chinese professors are the laws and authorities; students in China show great respect to their teachers in general."
In the United States, by contrast, "American students treat college professors as equals" and "are encouraged to debate topics. The free open discussion on various topics is due to the academic freedom which most American colleges and universities enjoy." This comparison gives an overall idea of the differences between Chinese and American education, but we cannot simply judge which is better, because each culture has its own advantages and features.
These differences contribute to cultural diversity and make our world more colorful. Students who study abroad and can combine positive elements from two different cultures into their self-development gain a competitive advantage over the course of their careers; with the globalization of the economy, people who command perspectives from different cultures stand in a more competitive position in today's world.
Quantification:
Cultural diversity is tricky to quantify, but a good indication is thought to be a count of the number of languages spoken in a region or in the world as a whole. By this measure, we may be going through a period of precipitous decline in the world's cultural diversity. Research carried out in the 1990s by David Crystal (Honorary Professor of Linguistics at the University of Wales, Bangor) suggested that at that time, on average, one language was falling into disuse every two weeks. He calculated that if that rate of language death were to continue, then by the year 2100 more than 90% of the languages currently spoken in the world would have become extinct.
Overpopulation, immigration and imperialism (of both the militaristic and cultural kind) are reasons that have been suggested to explain any such decline. However, it could also be argued that with the advent of globalism, a decline in cultural diversity is inevitable because information sharing often promotes homogeneity.
Cultural heritage:
The Universal Declaration on Cultural Diversity adopted by UNESCO in 2001 is a legal instrument that recognizes cultural diversity as "common heritage of humanity" and considers its safeguarding to be a concrete and ethical imperative inseparable from respect for human dignity.
Beyond the Declaration of Principles adopted in 2003 at the Geneva Phase of the World Summit on the Information Society (WSIS), the UNESCO Convention on the Protection and Promotion of the Diversity of Cultural Expressions, adopted in October 2005, is also regarded as a legally binding instrument that recognizes the distinctive nature of cultural goods, services, and activities as vehicles of identity, values, and meaning.
It was adopted in response to "growing pressure exerted on countries to waive their right to enforce cultural policies and to put all aspects of the cultural sector on the table when negotiating international trade agreements". To date, 116 member states as well as the European Union have ratified the Convention; the US, Australia, and Israel have not.
It is instead a clear recognition of the specificity of cultural goods and services, as well as state sovereignty and public services in this area. Conceived with world trade in mind, this soft-law instrument (whose strength lies in not being binding) clearly became a crucial reference for the definition of European policy choices. In 2009, the European Court of Justice favoured a broad view of culture, extending beyond cultural values to the protection of film and the objective of promoting linguistic diversity, which had previously been recognized.
Moreover, under this Convention, the EU and China have committed to fostering more balanced cultural exchanges, strengthening international cooperation and solidarity with business and trade opportunities in cultural and creative industries. The most motivating factor behind Beijing's willingness to work in partnership at the business level may well be access to creative talents and skills from foreign markets.
There is also the Convention for the Safeguarding of the Intangible Cultural Heritage, ratified by 78 states as of June 20, 2007, which states: "The intangible cultural heritage, transmitted from generation to generation, is constantly recreated by communities and groups in response to their environment, their interaction with nature and their history, and gives them a sense of identity and continuity, thus promoting respect for cultural diversity and human creativity."
Cultural diversity was also promoted by the Montreal Declaration of 2007, and by the European Union. The idea of a global multicultural heritage covers several ideas, which are not exclusive (see multiculturalism). In addition to language, diversity can also include religious or traditional practice.
On a local scale, Agenda 21 for culture, the first document of world scope that establishes the foundations for a commitment by cities and local governments to cultural development, supports local authorities committed to cultural diversity.
Ocean Model of One Human Civilization:
Philosopher Nayef Al-Rodhan argues that previous concepts of civilizations, such as Samuel P. Huntington's arguments supporting a coming "clash of civilizations," are misconstrued. Human civilization should not be thought of as consisting of numerous separate and competing civilizations, but rather it should be thought of collectively as only one human civilization.
Within this civilization are many geo-cultural domains that comprise sub-cultures. This concept presents human history as one fluid story and encourages a philosophy of history that encompasses the entire span of human time as opposed to thinking about civilization in terms of single time periods.
Al-Rodhan envisions human civilization as an ocean into which the different geo-cultural domains flow like rivers. According to him, at points where geo-cultural domains first enter the ocean of human civilization, there is likely to be a concentration or dominance of that culture.
However, over time, all the rivers of geo-cultural domains become one. Therefore, an equal mix of all cultures will exist at the middle of the ocean, although the mix might be weighted towards the dominant culture of the day. Al-Rodhan maintains that there is fluidity at the ocean's center and that cultures will have the opportunity to borrow between cultures, especially when that culture's domain or "river" is in geographical proximity to the other's. However, Al-Rodhan warns that geographical proximity can also lead to friction and conflict.
Al-Rodhan maintains that sustainable civilisational triumph will occur when all components of the geo-cultural domains can flourish, even if they flourish in different degrees. Human civilization should indeed be considered as an ocean, where the various geo-cultural domains add depth whenever the conditions for the most advanced forms of human enterprise to thrive are met. This means it is necessary to focus on boundary marking practices and concrete situations. Moreover, civilisational triumph requires some degree of socio-economic equality as well as multilateral institutions that are premised on rules and practices perceived to be fair. Finally, Al-Rodhan notes that it demands conditions under which innovation and learning can thrive. He argues that there needs to be an emphasis on expanding the boundaries of geo-cultural identities and on encouraging greater acceptance of overlapping identities.
Cultural Vigor:
"Cultural vigor" is a concept proposed by philosopher Nayef Al-Rodhan. He defines cultural vigor as the cultural resilience and strength that result from mixing and exchanges between various cultures and sub-cultures around the world.
In his general theory of human nature, which he calls "emotional amoral egoism," Al-Rodhan argues that all humans are motivated, among other things, by arrogance, injustice, exceptionalism, and exclusion. According to him, these particular motivating factors are unfounded, misguided, and hinder humankind's potential for synergistic progress and prosperity. To combat these tendencies, Al-Rodhan argues that cultural vigor and ethnic and cultural diversity must be actively promoted by governments and civil society.
Al-Rodhan compares cultural vigor to the natural phenomenon of "hybrid vigor", arguing that in nature, molecular and genetic diversity produce stronger and more resilient organisms that are less susceptible to disease and mutational challenges.
Similar resilience can be produced through fostering cultural and ethnic diversity. Ultimately, Al-Rodhan maintains that cultural vigor will ensure humanity's future and will improve humans' ability to survive and thrive.
Defense:
The defense of cultural diversity can take several meanings:
Cultural uniformity:
Cultural diversity is presented as the antithesis of cultural uniformity.
Some (including UNESCO) fear this hypothesis of a trend towards cultural uniformity. To support this argument they emphasize different aspects:
There are several international organizations that work towards protecting threatened societies and cultures, including Survival International and UNESCO. The UNESCO Universal Declaration on Cultural Diversity, adopted by 185 Member States in 2001, represents the first international standard-setting instrument aimed at preserving and promoting cultural diversity and intercultural dialogue.
Indeed, the notion of "cultural diversity" has been echoed by more neutral organizations, particularly within the UNESCO. Beyond the Declaration of Principles adopted in 2003 at the Geneva Phase of the World Summit on the information Society (WSIS), the UNESCO Convention on the Protection and Promotion of the Diversity of Cultural Expressions was adopted on 20 October 2005, but neither ratified by the US, Australia nor by Israel.
It is instead a clear recognition of the specificity of cultural goods and services, as well as state sovereignty and public services in this area. Thought for world trade, this soft law instrument (strength in not binding) clearly became a crucial reference to the definition of the European policy choice.
In 2009, the European Court of Justice favored a broad view of culture — beyond cultural values — through the protection of film or the objective of promoting linguistic diversity yet previously recognized. On top of it, under this Convention, the EU and China have committed to fostering more balanced cultural exchanges, strengthening international cooperation and solidarity with business and trade opportunities in cultural and creative industries.
The European Commission-funded Network of Excellence on "Sustainable Development in a Diverse World" (known as "SUS.DIV") builds upon the UNESCO Declaration to investigate the relationship between cultural diversity and sustainable development.
See Also:
Overview:
The many separate societies that emerged around the globe differed markedly from each other, and many of these differences persist to this day. As well as the more obvious cultural differences that exist between people, such as language, dress and traditions, there are also significant variations in the way societies organize themselves, in their shared conception of morality, and in the ways they interact with their environment. Cultural diversity can be seen as analogous to biodiversity.
Opposition and support:
By analogy with biodiversity, which is thought to be essential to the long-term survival of life on earth, it can be argued that cultural diversity may be vital for the long-term survival of humanity; and that the conservation of indigenous cultures may be as important to humankind as the conservation of species and ecosystems is to life in general.
The General Conference of UNESCO took this position in 2001, asserting in Article 1 of the Universal Declaration on Cultural Diversity that "...cultural diversity is as necessary for humankind as biodiversity is for nature".
This position is rejected by some people, on several grounds. Firstly, like most evolutionary accounts of human nature, the importance of cultural diversity for survival may be an untestable hypothesis, which can neither be proved nor disproved. Secondly, it can be argued that it is unethical to deliberately conserve "less developed" societies, because this denies people within those societies the benefits of technological and medical advances enjoyed by those in the "developed" world.
In the same manner that the promotion of poverty in underdeveloped nations as "cultural diversity" is unethical, it is similarly unethical to promote all religious practices simply because they are seen to contribute to cultural diversity. Particular religious practices are recognized by the WHO and UN as unethical, including female genital mutilation (FGM), polygamy, child brides, and human sacrifice.
With the onset of globalization, traditional nation-states have been placed under enormous pressure. Today, with the development of technology, information and capital are transcending geographical boundaries and reshaping the relationships between the marketplace, states and citizens. In particular, the growth of the mass media industry has had a major impact on individuals and societies across the globe.
Although beneficial in some ways, this increased accessibility has the capacity to negatively affect a society's individuality. With information being so easily distributed throughout the world, cultural meanings, values and tastes run the risk of becoming homogenized. As a result, the strength of identity of individuals and societies may begin to weaken.
Some individuals, particularly those with strong religious beliefs, maintain that it is in the best interests of individuals and of humanity as a whole that all people adhere to a specific model for society or specific aspects of such a model.
Communication between countries is now more frequent than ever, and growing numbers of students choose to study overseas in order to experience cultural diversity, with the goal of broadening their horizons and developing themselves through learning abroad.
For example, in their paper "Academic Freedom in the People's Republic of China and the United States of America", Fengling Chen, Du Yanjun, and Yu Ma point out that Chinese education has focused more on rote learning: "traditionally, teaching has consisted of spoon feeding, and learning has been largely by rote. China's traditional system of education has sought to make students accept fixed and ossified content." And: "In the classroom, Chinese professors are the laws and authorities; Students in China show great respect to their teachers in general."
In the United States, by contrast, "American students treat college professors as equals" and "American students' are encouraged to debate topics. The free open discussion on various topics is due to the academic freedom which most American colleges and universities enjoy." This comparison gives an overall idea of the differences between Chinese and American education, but we cannot simply judge which is better, because each culture has its own advantages and features.
Such differences are part of what constitutes cultural diversity, and they make the world more colorful. Students who go abroad for their education can gain a competitive advantage throughout their careers if they combine positive cultural elements from two different cultures in their self-development; in today's globalized economy, people who command multiple cultural perspectives stand in a more competitive position.
Quantification:
Cultural diversity is tricky to quantify, but a good indication is thought to be a count of the number of languages spoken in a region or in the world as a whole. By this measure, we may be going through a period of precipitous decline in the world's cultural diversity. Research carried out in the 1990s by David Crystal (Honorary Professor of Linguistics at the University of Wales, Bangor) suggested that at that time, on average, one language was falling into disuse every two weeks. He calculated that if that rate of language death were to continue, then by the year 2100 more than 90% of the languages currently spoken in the world would be extinct.
Overpopulation, immigration and imperialism (of both the militaristic and cultural kind) are reasons that have been suggested to explain any such decline. However, it could also be argued that with the advent of globalism, a decline in cultural diversity is inevitable because information sharing often promotes homogeneity.
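Beyond raw language counts, one standard way linguists turn speaker populations into a diversity score (not mentioned above, but a common measure) is Greenberg's linguistic diversity index: the probability that two randomly chosen people in a region have different mother tongues. A minimal sketch, using made-up speaker counts:

```python
def linguistic_diversity_index(speaker_counts):
    """Greenberg's linguistic diversity index: the probability that two
    randomly selected people have different mother tongues.
    0 means everyone shares one language; values near 1 mean high diversity."""
    total = sum(speaker_counts)
    return 1 - sum((n / total) ** 2 for n in speaker_counts)

# Hypothetical region with three languages of 700, 200 and 100 speakers.
print(round(linguistic_diversity_index([700, 200, 100]), 3))  # 0.46
```

As languages fall into disuse, their speaker counts shrink toward zero and the index declines, which is one way the loss of linguistic (and hence cultural) diversity can be tracked numerically.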
Cultural heritage:
The Universal Declaration on Cultural Diversity adopted by UNESCO in 2001 is a legal instrument that recognizes cultural diversity as "common heritage of humanity" and considers its safeguarding to be a concrete and ethical imperative inseparable from respect for human dignity.
Beyond the Declaration of Principles adopted in 2003 at the Geneva Phase of the World Summit on the Information Society (WSIS), the UNESCO Convention on the Protection and Promotion of the Diversity of Cultural Expressions, adopted in October 2005, is also regarded as a legally binding instrument that recognizes:
- The distinctive nature of cultural goods, services and activities as vehicles of identity, values and meaning;
- That while cultural goods, services and activities have important economic value, they are not mere commodities or consumer goods that can only be regarded as objects of trade.
It was adopted in response to "growing pressure exerted on countries to waive their right to enforce cultural policies and to put all aspects of the cultural sector on the table when negotiating international trade agreements". To date, 116 member states as well as the European Union have ratified the Convention; the US, Australia and Israel have not.
It is instead a clear recognition of the specificity of cultural goods and services, as well as of state sovereignty and public services in this area. Conceived with world trade in mind, this soft-law instrument (whose strength lies in its non-binding character) clearly became a crucial reference point in the definition of European policy choices. In 2009, the European Court of Justice favoured a broad view of culture, extending beyond cultural values to the protection of film and the objective of promoting linguistic diversity, both previously recognized.
Moreover, under this Convention, the EU and China have committed to fostering more balanced cultural exchanges, strengthening international cooperation and solidarity with business and trade opportunities in the cultural and creative industries. The most important factor behind Beijing's willingness to work in partnership at the business level may well be access to creative talents and skills from foreign markets.
There is also the Convention for the Safeguarding of the Intangible Cultural Heritage, ratified on June 20, 2007 by 78 states, which says: "The intangible cultural heritage, transmitted from generation to generation, is constantly recreated by communities and groups in response to their environment, their interaction with nature and their history, and gives them a sense of identity and continuity, thus promoting respect for cultural diversity and human creativity."
Cultural diversity was also promoted by the Montreal Declaration of 2007, and by the European Union. The idea of a global multicultural heritage covers several ideas, which are not exclusive (see multiculturalism). In addition to language, diversity can also include religious or traditional practice.
On a local scale, Agenda 21 for culture, the first document of world scope that establishes the foundations for a commitment by cities and local governments to cultural development, supports local authorities committed to cultural diversity.
Ocean Model of One Human Civilization:
Philosopher Nayef Al-Rodhan argues that previous concepts of civilizations, such as Samuel P. Huntington's arguments supporting a coming "clash of civilizations," are misconstrued. Human civilization should not be thought of as consisting of numerous separate and competing civilizations, but rather it should be thought of collectively as only one human civilization.
Within this civilization are many geo-cultural domains that comprise sub-cultures. This concept presents human history as one fluid story and encourages a philosophy of history that encompasses the entire span of human time as opposed to thinking about civilization in terms of single time periods.
Al-Rodhan envisions human civilization as an ocean into which the different geo-cultural domains flow like rivers. According to him, at points where geo-cultural domains first enter the ocean of human civilization, there is likely to be a concentration or dominance of that culture.
However, over time, all the rivers of geo-cultural domains become one. Therefore, an equal mix of all cultures will exist at the middle of the ocean, although the mix might be weighted towards the dominant culture of the day. Al-Rodhan maintains that there is fluidity at the ocean's center and that cultures will have the opportunity to borrow from one another, especially when one culture's domain or "river" is in geographical proximity to another's. However, Al-Rodhan warns that geographical proximity can also lead to friction and conflict.
Al-Rodhan maintains that sustainable civilisational triumph will occur when all components of the geo-cultural domains can flourish, even if they flourish to different degrees. Human civilization should indeed be considered an ocean, in which the various geo-cultural domains add depth whenever the conditions are met for the most advanced forms of human enterprise to thrive. This means it is necessary to focus on boundary-marking practices and concrete situations. Moreover, civilisational triumph requires some degree of socio-economic equality, as well as multilateral institutions that are premised on rules and practices perceived to be fair. Finally, Al-Rodhan notes that it demands conditions under which innovation and learning can thrive. He argues that there needs to be an emphasis on expanding the boundaries of geo-cultural identities and on encouraging greater acceptance of overlapping identities.
Cultural Vigor:
"Cultural vigor" is a concept proposed by philosopher Nayef Al-Rodhan. He defines cultural vigor as the cultural resilience and strength that result from mixing and exchanges between various cultures and sub-cultures around the world. In his general theory of human nature, which he calls "emotional amoral egoism", Al-Rodhan argues that all humans are motivated by, among other things, arrogance, injustice, exceptionalism, and exclusion. According to him, these particular motivating factors are unfounded and misguided, and they hinder humankind's potential for synergistic progress and prosperity. In order to combat these tendencies, Al-Rodhan argues that cultural vigor and ethnic and cultural diversity must be actively promoted by governments and civil society.
Al-Rodhan compares cultural vigor to the natural phenomenon of "hybrid vigor", arguing that in nature, molecular and genetic diversity produce stronger and more resilient organisms that are less susceptible to disease and mutational challenges.
Similar resilience can be produced through fostering cultural and ethnic diversity. Ultimately, Al-Rodhan maintains that cultural vigor will ensure humanity's future and will improve humans' ability to survive and thrive.
Defense:
The defense of cultural diversity can take several meanings:
- A balance to be achieved: thus, the idea of defense of cultural diversity through the promotion of actions in favor of "cultural minorities" said to be disadvantaged;
- Preservation of "cultural minorities" thought to be endangered;
- In other cases, one speaks of "cultural protection", which refers to the concept of the "cultural exception". This links the social vision of culture with the vision inherent in its commercialisation. The cultural exception highlights the specificity of cultural products and services, including special recognition by the European Union in its Declaration on Cultural Diversity. In this context, the objective is to defend against what is seen as a "commodification", considered harmful to a "disadvantaged" culture, by supporting its development through grants, promotion operations, etc. This approach is also known as "cultural protectionism".
- This defense may also refer to attempts, conducted unsuccessfully in Europe in the early 1990s, to incorporate "cultural rights" provisions into a layer of human rights.
Cultural uniformity:
Cultural diversity is presented as the antithesis of cultural uniformity.
Some (including UNESCO) fear a trend towards cultural uniformity. To support this argument, they point to several developments:
- The disappearance of many languages and dialects, for example the regional languages of France, which lack legal status or protection (Basque, Breton, Corsican, Occitan, Catalan, Alsatian, Flemish, Poitou, Saintonge, etc.);
- People's anxiety about the preservation of their traditions, as in New Zealand, coastal regions of Australia, North America, and Central America;
- The increasing cultural preeminence of the United States through the distribution of its film, television, music, clothing and food products promoted in audio-visual media, and through consumer products virtually standardized across the planet (pizza, restaurant chains, fast food, etc.).
There are several international organizations that work towards protecting threatened societies and cultures, including Survival International and UNESCO. The UNESCO Universal Declaration on Cultural Diversity, adopted by 185 Member States in 2001, represents the first international standard-setting instrument aimed at preserving and promoting cultural diversity and intercultural dialogue.
Indeed, the notion of "cultural diversity" has been echoed by more neutral organizations, particularly within UNESCO.
The European Commission-funded Network of Excellence on "Sustainable Development in a Diverse World" (known as "SUS.DIV") builds upon the UNESCO Declaration to investigate the relationship between cultural diversity and sustainable development.
See Also:
- Cross-cultural communication
- Multiculturalism
- Respect diversity
- Social cohesion
- Solidarity
- Foundation for Endangered Languages
- Mondialogo
- Purple economy
- Convention on the Protection and Promotion of the Diversity of Cultural Expressions
- World Day for Cultural Diversity for Dialogue and Development
- Intercultural relations
- Coolitude
- Cultural rights
- Cultural safety
Cultural Globalization
YouTube Video of Globalization: Past, Present, and Future
Cultural globalization refers to the transmission of ideas, meanings and values around the world in such a way as to extend and intensify social relations. This process is marked by the common consumption of cultures that have been diffused by the Internet, popular culture media, and international travel.
This has added to processes of commodity exchange and colonization which have a longer history of carrying cultural meaning around the globe. The circulation of cultures enables individuals to partake in extended social relations that cross national and regional borders. The creation and expansion of such social relations is not merely observed on a material level. Cultural globalization involves the formation of shared norms and knowledge with which people associate their individual and collective cultural identities. It brings increasing interconnectedness among different populations and cultures.
A visible aspect of cultural globalization is the diffusion of certain cuisines such as American fast food chains. The two most successful global food and beverage outlets, McDonald's and Starbucks, are American companies often cited as examples of globalization, with over 32,000 and 18,000 locations operating worldwide, respectively as of 2008. The Big Mac Index is an informal measure of purchasing power parity among world currencies.
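The Big Mac Index arithmetic is simple to illustrate (a minimal sketch; the prices and exchange rate below are invented, not actual Economist data): the implied purchasing-power-parity rate is the local Big Mac price divided by the U.S. price, and comparing it with the market exchange rate suggests how over- or undervalued the currency is.

```python
def big_mac_valuation(local_price, us_price, market_rate):
    """Return the implied PPP rate (local units per dollar) and the
    percentage over- (+) or under- (-) valuation of the currency
    versus the U.S. dollar, Big Mac Index style."""
    implied_ppp = local_price / us_price
    valuation = (implied_ppp / market_rate - 1) * 100
    return implied_ppp, valuation

# Invented figures: a Big Mac costs 60.0 local units at home, $5.00 in
# the U.S., and the market exchange rate is 15.0 local units per dollar.
ppp, pct = big_mac_valuation(60.0, 5.00, 15.0)
print(ppp, round(pct, 1))  # 12.0 -20.0 -> the currency looks ~20% undervalued
```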
Measurement:
There have been numerous attempts to measure globalization, typically using indices that capture quantitative data for trade flows, political integration, and other measures. The two most prominent are the AT Kearney/Foreign Policy Globalization index and the KOF Globalization Index.
Cultural globalization, however, is much more difficult to capture using quantitative data, because it is difficult to find easily verifiable data of the flow of ideas, opinions, and fashions.
One attempt to do so was the Cultural Globalization Index, proposed by Randolph Kluver and Wayne Fu in 2004, and initially published by Foreign Policy Magazine. This effort measured cultural flow by using global trade in media products (books, periodicals, and newspapers) as a proxy for cultural flow. Kluver and Fu followed up with an extended analysis, using this method to measure cultural globalization in Southeast Asia.[6]
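Composite indices of this kind are typically built by normalizing raw indicators to a common scale and combining them as a weighted sum. The following is a schematic sketch of that general approach, not the actual AT Kearney or KOF methodology; the indicator names and weights are invented:

```python
def min_max_normalize(values):
    """Rescale raw indicator values to a common 0-100 scale."""
    lo, hi = min(values), max(values)
    return [100 * (v - lo) / (hi - lo) for v in values]

def composite_index(scores, weights):
    """Weighted sum of already-normalized indicator scores."""
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical country: normalized scores for trade flows, political
# integration and media exports, weighted 0.5 / 0.25 / 0.25.
scores = [80.0, 60.0, 40.0]
print(composite_index(scores, [0.5, 0.25, 0.25]))  # 65.0
```

The difficulty with cultural globalization, as noted above, is upstream of this arithmetic: finding verifiable indicators of idea and fashion flows to feed into such an index in the first place.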
Perspectives:
Hybridization
Many writers suggest that cultural globalization is a long-term historical process of bringing different cultures into interrelation. Jan Pieterse suggests that cultural globalization involves human integration and hybridization, arguing that it is possible to detect cultural mixing across continents and regions going back many centuries. Such writers refer, for example, to the movement of religious practices, language and culture brought by the Spanish colonization of the Americas. The Indian experience, to take another example, reveals both the pluralization of the impact of cultural globalization and its long-term history.
The work of such cultural historians qualifies the lineage of writers—predominantly economists and sociologists—who trace the origins of globalization to recent capitalism, facilitated through technological advances.
Homogenization:
An alternative perspective on cultural globalization emphasizes the transfiguration of worldwide diversity into a pandemic of Westernized consumer culture.[9] Some critics argue that the dominance of American culture influencing the entire world will ultimately result in the end of cultural diversity.
This process, understood as cultural imperialism, is associated with the destruction of cultural identities, which come to be dominated by a homogenized, westernized consumer culture. The global influence of American products, businesses and culture in other countries around the world has been referred to as Americanization.
This influence is exemplified by American-based television programs that are rebroadcast throughout the world. Major American companies such as McDonald's and Coca-Cola have played a major role in the spread of American culture around the globe. Terms such as Coca-colonization have been coined to refer to the dominance of American products in foreign countries, which some critics of globalization view as a threat to the cultural identity of these nations.
Conflict intensification:
Another alternative perspective argues that in reaction to the process of cultural globalization, a "Clash of Civilizations" might appear. Indeed, Samuel Huntington emphasizes that while the world is becoming smaller and more interconnected, the interactions between peoples of different cultures enhance civilizational consciousness, which in turn invigorates differences.
On this view, rather than a global cultural community emerging, the differences in culture sharpened by this very process of cultural globalization will be a source of conflict. While few commentators agree that this should be characterized as a "Clash of Civilizations", there is general concurrence that cultural globalization is an ambivalent process, bringing an intense sense of local difference and ideological contest.
Alternatively, Benjamin Barber, in his book “Jihad vs. McWorld”, argues for a different “cultural division” of the world. In his book, “McWorld” represents a world of globalization, global connectivity and interdependence, looking to create a “commercially homogeneous global network”.
This global network is divided into four imperatives: the market, resource, information-technology and ecological imperatives. “Jihad”, on the other hand, represents traditionalism and the maintenance of one's identity.
Whereas “Clash of Civilizations” portrays a world with five coalitions of nation-states, “Jihad vs. McWorld” shows a world where struggles take place on a sub-national level. Although most of the western nations are capitalist and can be seen as “McWorld” countries, societies within these nations might be considered “Jihad” and vice versa.
See Also:
- Military globalization
- Engaged theory
- Globalism
- Globalization
- Cultural homogenization
- Cultural imperialism
- Dimensions of globalization
Major Types of Culture
YouTube Video about Linguistics and the Nature of Language
Culture – set of patterns of human activity within a community or social group and the symbolic structures that give such activity significance. Customs, laws, dress, architectural style, social standards, religious beliefs, and traditions are all examples of cultural elements.
Click on any of the following links for amplification of the different types of culture:
- Cultural groups
- Elements of culture
- Types of cultures
- Academic disciplines that study culture
- Cultures of the world
- History of culture
- Politics of culture
- Sociology of culture
- Research fields
- See also
Museums in the United States including Categories of Museums
YouTube Video: "Welcome to the National Air and Space Museum"
Pictured: LEFT: National Museum of Natural History (Smithsonian – Washington, D.C.); RIGHT: The Spirit of St. Louis (flown by Charles Lindbergh on May 20–21, 1927, for the first non-stop flight from New York to Paris), which is on display at the National Air and Space Museum in Washington, D.C.
A museum is an institution that cares for (conserves) a collection of artifacts and other objects of artistic, cultural, historical, or scientific importance; many public museums make these objects available for public viewing through exhibits that may be permanent or temporary.
Most large museums are located in major cities throughout the world and more local ones exist in smaller cities, towns and even the countryside. Museums have varying aims, ranging from serving researchers and specialists to serving the general public. The goal of serving researchers is increasingly shifting to serving the general public.
Some of the most attended museums include the Louvre in Paris, the National Museum of China in Beijing, the Smithsonian Institution in Washington, D.C., the British Museum in London, the National Gallery in London and The Metropolitan Museum of Art in New York City.
There are many types of museums, including art museums, natural history museums, science museums, war museums and children's museums.
As of the 2010s, the continuing acceleration in the digitization of information, combined with the increasing capacity of digital information storage, is causing the traditional model of museums (i.e. as static bricks-and-mortar "collections of collections" of three-dimensional specimens and artifacts) to expand to include virtual exhibits and high-resolution images of their collections that patrons can peruse, study, and explore from any place with Internet.
The city with the largest number of museums is Mexico City with over 128 museums.
According to The World Museum Community, there are more than 55,000 museums in 202 countries.
Click here for an alphabetical listing of museums in the United States, broken down by state.
Mardi Gras
YouTube Video Celebrating Mardi Gras in New Orleans (courtesy of National Geographic Society)
Pictured: Mardi Gras Celebration in New Orleans, Louisiana
Mardi Gras, also called Shrove Tuesday or Fat Tuesday in English, refers to events of the Carnival celebrations, beginning on or after the Christian feast of the Epiphany (Three Kings' Day) and culminating on the day before Ash Wednesday. Mardi Gras is French for "Fat Tuesday", reflecting the practice of the last night of eating rich, fatty foods before the ritual fasting of the Lenten season.
Related popular practices are associated with Shrovetide celebrations before the fasting and religious obligations associated with the penitential season of Lent. In countries such as England, Mardi Gras is also known as Shrove Tuesday, which is derived from the word shrive, meaning "confess".
Traditions:
Popular practices on Mardi Gras include wearing masks and costumes, overturning social conventions, dancing, sports competitions, parades, debauchery, etc. Similar expressions to Mardi Gras appear in other European languages sharing the Christian tradition, as it is associated with the religious requirement for confession before Lent begins. In many areas, the term "Mardi Gras" has come to mean the whole period of activity related to the celebratory events, beyond just the single day. In some American cities, it is now called "Mardi Gras Day".
The festival season varies from city to city, as some traditions, such as the one in New Orleans, Louisiana, consider Mardi Gras to stretch the entire period from Twelfth Night (the last night of Christmas which begins Epiphany) to Ash Wednesday. Others treat the final three-day period before Ash Wednesday as the Mardi Gras.
In Mobile, Alabama, Mardi Gras-associated social events begin in November, followed by mystic society balls on Thanksgiving, then New Year's Eve, followed by parades and balls in January and February, celebrating up to midnight before Ash Wednesday. In earlier times, parades were held on New Year's Day.
Other cities famous for Mardi Gras celebrations include Rio de Janeiro; Barranquilla, Colombia; George Town, Cayman Islands; Port of Spain, Trinidad and Tobago; Quebec City, Quebec, Canada; Mazatlán, and Sinaloa, Mexico.
Carnival is an important celebration in Anglican and Catholic European nations. In the United Kingdom and Ireland, the week before Ash Wednesday is called "shrovetide", ending on Shrove Tuesday. It has its popular celebratory aspects, as well. Pancakes are a traditional food. Pancakes and related fried breads or pastries made with sugar, fat, and eggs are also traditionally consumed at this time in many parts of Latin America and the Caribbean.
As Celebrated in the United States:
While not observed nationally throughout the United States, a number of traditionally ethnic French cities and regions in the country have notable celebrations. Mardi Gras arrived in North America as a French Catholic tradition with the Le Moyne brothers, Pierre Le Moyne d'Iberville and Jean-Baptiste Le Moyne de Bienville, in the late 17th century, when King Louis XIV sent the pair to defend France's claim on the territory of Louisiane, which included what are now the U.S. states of Alabama, Mississippi, Louisiana and part of eastern Texas.
The expedition, led by Iberville, entered the mouth of the Mississippi River on the evening of March 2, 1699, Lundi Gras. They did not yet know it was the river explored and claimed for France by René-Robert Cavelier, Sieur de La Salle in 1682. The party proceeded upstream to a place on the east bank about 60 miles downriver from where New Orleans is today, and made camp.
This was on March 3, 1699, Mardi Gras, so in honor of this holiday, Iberville named the spot Point du Mardi Gras (French: "Mardi Gras Point") and called the nearby tributary Bayou Mardi Gras. Bienville went on to found the settlement of Mobile, Alabama in 1702 as the first capital of French Louisiana.
In 1703 French settlers in Mobile established the first organized Mardi Gras celebration tradition in what was to become the United States. The first informal mystic society, or krewe, was formed in Mobile in 1711, the Boeuf Gras Society. By 1720, Biloxi had been made capital of Louisiana. The French Mardi Gras customs had accompanied the colonists who settled there.
In 1723, the capital of Louisiana was moved to New Orleans, founded in 1718.
The first Mardi Gras parade held in New Orleans is recorded to have taken place in 1837. The tradition in New Orleans expanded to the point that it became synonymous with the city in popular perception, and embraced by residents of New Orleans beyond those of French or Catholic heritage. Mardi Gras celebrations are part of the basis of the slogan, Laissez les bons temps rouler, (Let the good times roll).
Other cities along the Gulf Coast with early French colonial heritage, from Pensacola, Florida; Galveston, Texas; to Lake Charles and Lafayette, Louisiana; and north to Natchez, Mississippi, have active Mardi Gras celebrations.
In the rural Acadiana area, many Cajuns celebrate with the Courir de Mardi Gras, a tradition that dates to medieval celebrations in France.
The American rock band Creedence Clearwater Revival created an album in 1972 called Mardi Gras.
Maya Angelou
YouTube Video and Pictured: Angelou reciting her poem "On the Pulse of Morning" at President Bill Clinton's inauguration, January 1993
Maya Angelou (born Marguerite Annie Johnson; April 4, 1928 – May 28, 2014) was an American poet, memoirist, and civil rights activist.
She published seven autobiographies, three books of essays, several books of poetry, and was credited with a list of plays, movies, and television shows spanning over 50 years. She received dozens of awards and more than 50 honorary degrees.
Angelou is best known for her series of seven autobiographies, which focus on her childhood and early adult experiences. The first, I Know Why the Caged Bird Sings (1969), tells of her life up to the age of 17 and brought her international recognition and acclaim.
She became a poet and writer after a series of occupations as a young adult, including fry cook, prostitute, nightclub dancer and performer, cast member of the opera Porgy and Bess, coordinator for the Southern Christian Leadership Conference, and journalist in Egypt and Ghana during the decolonization of Africa.
She was an actor, writer, director, and producer of plays, movies, and public television programs. In 1982, she earned the first lifetime Reynolds Professorship of American Studies at Wake Forest University in Winston-Salem, North Carolina. She was active in the Civil Rights movement and worked with Martin Luther King, Jr. and Malcolm X.
Beginning in the 1990s, she made around 80 appearances a year on the lecture circuit, something she continued into her eighties.
In 1993, Angelou recited her poem "On the Pulse of Morning" at President Bill Clinton's inauguration, making her the first poet to make an inaugural recitation since Robert Frost at President John F. Kennedy's inauguration in 1961.
With the publication of I Know Why the Caged Bird Sings, Angelou publicly discussed aspects of her personal life. She was respected as a spokesperson for black people and women, and her works have been considered a defense of Black culture.
Attempts have been made to ban her books from some U.S. libraries, but her works are widely used in schools and universities worldwide. Angelou's major works have been labeled as autobiographical fiction, but many critics have characterized them as autobiographies. She made a deliberate attempt to challenge the common structure of the autobiography by critiquing, changing, and expanding the genre. Her books center on themes such as racism, identity, family, and travel.
Most Commonly Spoken Languages around the World
YouTube Video: United Nations, A Day in the Life of Real Interpreters
Pictured: Screen from Google Translator Website*
* -- The above Google website enables one to translate words from one language to another (for English, Spanish and French).
The languages under the above topic are listed as having 50 million or more native speakers in the 2015 edition of Ethnologue, a language reference published by SIL International.
Speaker totals are generally not reliable, as they add together estimates from different dates and (usually uncited) sources; language information is not collected on most national censuses.
Click here for a listing of languages spoken around the world by 50 million or more people.
Morality by Culture
YouTube Video: A War for the Soul of America: A History of the Culture Wars (w/ Andrew Hartman)
Illustration: Kohlberg Model of Moral Development
Morality is the differentiation of intentions, decisions, and actions between those that are distinguished as proper and those that are improper. Morality can be a body of standards or principles derived from a code of conduct from a particular philosophy, religion, or culture, or it can derive from a standard that a person believes should be universal.
Morality may also be specifically synonymous with "goodness" or "rightness."
Moral philosophy includes moral ontology, or the origin of morals, as well as moral epistemology, or knowledge about morals.
Different systems of expressing morality have been proposed, including deontological ethical systems which adhere to a set of established rules, and normative ethical systems which consider the merits of actions themselves. An example of normative ethical philosophy is the Golden Rule, which states that: "One should treat others as one would like others to treat oneself."
Immorality is the active opposition to morality (i.e. opposition to that which is good or right), while amorality is variously defined as an unawareness of, indifference toward, or disbelief in any set of moral standards or principles.
For amplification for any of the following topics, click on the hyperlink:
- 1 Philosophy
- 2 Anthropology
- 3 Evolution
- 4 Neuroscience
- 5 Psychology
- 6 Morality and politics
- 7 Morality and religion
Native Americans in the United States including Native American Tribes and their Gaming Casinos
Pictured: A map of North America with the locations of the major tribes identified
Click here for an alphabetical List of Federally-recognized Native Tribes.
Click here for a Listing of Casinos owned by Native American Tribes.
Native Americans, also known as American Indians, Indigenous Americans and other terms, are the indigenous peoples of the United States, except Hawaii. More than 570 federally recognized tribes live within the US, about half of which are associated with Indian reservations.
The term "American Indian" excludes Native Hawaiians and some Alaska Natives, while "Native Americans" (as defined by the US Census) are American Indians plus Alaska Natives of all ethnicities. The US Census does not count Native Hawaiians or Chamorro in this category; they are instead included in the Census grouping of "Native Hawaiian and other Pacific Islander".
The ancestors of living Native Americans arrived in what is now the United States at least 15,000 years ago, possibly much earlier, from Asia via Beringia. A vast variety of peoples, societies and cultures subsequently developed. Native Americans were greatly affected by the European colonization of the Americas, which began in 1492, and their population declined precipitously mainly due to introduced diseases as well as warfare, including biological warfare, territorial confiscation and slavery.
After its creation, the United States, as part of its policy of settler colonialism, waged war and perpetrated massacres against many Native American peoples, removed them from their ancestral lands, and subjected them to one-sided treaties and to discriminatory government policies into the 20th century.
Since the 1960s, Native American self-determination movements have resulted in changes to the lives of Native Americans, though there are still many contemporary issues faced by Native Americans. Today, there are over five million Native Americans in the United States, 78% of whom live outside reservations.
When the United States was created, established Native American tribes were generally considered semi-independent nations, as they generally lived in communities separate from British settlers.
The federal government signed treaties at a government-to-government level until the Indian Appropriations Act of 1871 ended recognition of independent native nations, and started treating them as "domestic dependent nations" subject to federal law. This law did preserve the rights and privileges agreed to under the treaties, including a large degree of tribal sovereignty. For this reason, many (but not all) Native American reservations are still independent of state law and actions of tribal citizens on these reservations are subject only to tribal courts and federal law.
The Indian Citizenship Act of 1924 granted U.S. citizenship to all Native Americans born in the United States who had not yet obtained it. This emptied the "Indians not taxed" category established by the United States Constitution, allowed natives to vote in state and federal elections, and extended the Fourteenth Amendment protections granted to people "subject to the jurisdiction" of the United States.
However, some states continued to deny Native Americans voting rights for several decades. Bill of Rights protections do not apply to tribal governments, except for those mandated by the Indian Civil Rights Act of 1968.
Click on any of the following blue hyperlinks for more about Native Americans in the United States:
- Background
- History
- Demographics
- Tribal sovereignty
- Civil rights movement
- Contemporary issues
- Societal discrimination and racism
- Native American mascots in sports
- Historical depictions in art
- Terminology differences
- Gambling industry
- Financial services
- Crime on reservations
- Barriers to economic development
- Discourse in Native American economic development
- Landownership Challenges
- Land Ownership and Bureaucratic Challenges in Historical Context
- Geographic Poverty
- Trauma
- Society, language, and culture
- Interracial relations
- Racial identity
- See also:
- Indian Actors Association
- Indigenous peoples of the Americas
- Indigenous peoples of Canada
- Indigenous peoples of Mexico
- List of Alaska Native tribal entities
- List of federally recognized tribes
- List of historical Indian reservations in the United States
- List of Indian reservations in the United States
- List of Native Americans of the United States (notable Native Americans)
- List of unrecognized tribes in the United States
- List of writers from peoples indigenous to the Americas
- Mythologies of the indigenous peoples of the Americas
- Native American civil rights
- Native American Heritage Sites (National Park Service)
- Native Americans in popular culture
- Outline of United States federal Indian law and policy
- State recognized tribes in the United States
- Suicide among Native Americans in the United States
- Sexual victimization of Native American women
- Native Americans in the United States at Curlie Government
- Bureau of Indian Affair official website – Bureau of Indian Affairs, part of the U.S. Department of the Interior
- Organizations and media:
- National Congress of American Indians official website – National Congress of American Indians
- Indian Country Today Media Network official website – Indian Country Today Media Network
- First Nations Experience (FNX) – multi-media platform that is a partnership between the San Manuel Band of Mission Indians and KVCR, a PBS member station located in California's Inland Empire
- Academic collections and other resources:
- American Indian Records in the National Archives from the National Archives and Records Administration National Museum of the American Indian official website – National Museum of the American Indian, part of the Smithsonian Institution
- National Indian Law Library of the Native American Rights Fund – a law library of federal Indian and tribal law
- 1904–1924 'The North American Indian' – collection of Edward Sheriff Curtis photographs
- Southeastern Native American Documents, 1730–1842 – online collection from several archives, museums, and libraries
- Bonneville Collection of 19th-century photographs of Native Americans, University of South Carolina Library's Digital Collections Page
- Selected treaties from the Avalon Project of Yale Law School's Lillian Goldman Law Library
Hispanic and Latino Americans
YouTube Video: Carlos Santana -- Black Magic Woman [[ Official Live Video ]] HQ
Pictured: Map of the United States with Hispanic/Latino Population by State (By Ali Zifan - Own work; CC BY-SA 4.0)
Hispanic Americans and Latino Americans are American citizens who are descendants of the peoples of the Spanish-speaking countries of Latin America and the Iberian Peninsula.
More generally, it includes all persons in the United States who self-identify as Hispanic or Latino, whether of full or partial ancestry.
For the 2010 US Census and the American Community Survey (ACS), people counted as "Hispanic" or "Latino" are those who identify as one of the specific Hispanic or Latino categories listed on the census or ACS questionnaire ("Mexican," "Puerto Rican," or "Cuban") as well as those who indicate that they are "other Spanish, Hispanic, or Latino."
The countries or people who are in the Hispanic or Latino American groups as classified by the Census Bureau are the following: Spain, Puerto Rico, Mexico, Cuba, Dominican Republic, Costa Rica, Guatemala, Honduras, Nicaragua, Panama, El Salvador, Argentina, Bolivia, Chile, Colombia, Ecuador, Paraguay, Peru, Uruguay, and Venezuela.
The Census Bureau uses the terms Hispanic and Latino interchangeably. It is also important to note that the Census Bureau excludes Brazilian Americans from the Hispanic and Latino American population (Brazil is part of Latin America, but has a Portuguese-language culture rather than a Spanish-language culture).
Other U.S. government agencies use slightly different definitions of the term, some of which include Brazilians and other Portuguese-speaking groups.
Origin can be viewed as the ancestry, nationality group, lineage, or country of birth of the person or the person's parents or ancestors before their arrival in the United States. People who identify as Spanish, Hispanic, or Latino may be of any race.
As the only specifically designated category of ethnicity in the United States (other than non-Hispanic/Latino), Hispanics form a pan-ethnicity incorporating a diversity of inter-related cultural and linguistic heritages.
Most Hispanic Americans are of Puerto Rican, Mexican, Cuban, Salvadoran, Dominican, Guatemalan, or Colombian origin. The predominant Hispanic origin varies widely across the country.
Hispanic Americans are the second fastest-growing ethnic group in the United States after Asian Americans. As of 2014, Hispanics constitute 17.37% of the United States population, or 55.3 million people.
The United States has the second-largest population of people of Hispanic origin in the world, after Mexico, having surpassed Argentina, Colombia, and Spain within the last decade. This figure includes 38 million Spanish-speaking Americans.
Hispanic/Latinos overall are the second-largest ethnic group in the United States, after non-Hispanic Whites (a group which, like Hispanics and Latinos, is composed of dozens of sub-groups of differing national origin).
Hispanics have occupied territory of the present-day United States continuously since the Spanish founding of Saint Augustine, Florida, in the sixteenth century. After Native Americans, Hispanics are the oldest ethnic group to inhabit what is today the United States. Many have Native American ancestry.
Spain colonized large areas of what is today the American Southwest and West Coast, including present-day California, New Mexico, Arizona, and Texas, all of which were under the Republic of Mexico after its independence in the 19th century and until the end of the Mexican–American War.
Click here for further amplification.
Asian Americans
YouTube Video of Bruce Lee, movie star and martial artist
Pictured: Map identifying Asian Americans as a percentage of the overall American Population (By Ali Zifan - Own work; Map is based on here., CC BY-SA 4.0)
Asian Americans are Americans of Asian descent.
As used by the U.S. Census Bureau, Asian refers to a person having ancestral origins in any of the original peoples of East Asia, Southeast Asia, or South Asia.
The term includes people who indicate their race(s) as "Asian" or reported entries such as:
- "Indian",
- "Sri Lankan",
- "Chinese",
- "Filipino",
- "Korean",
- "Japanese",
- "Vietnamese",
- "Pakistani",
- "Cambodian",
- "Hmong",
- and "Other Asian"
Asian Americans with no other ancestry comprise 4.8% of the U.S. population, while people who are Asian alone or combined with at least one other race make up 5.6%.
As of 2012, Asian Americans had the highest educational attainment level and median household income of any racial demographic in the country, and in 2008 they had the highest median household income overall of any racial demographic.
Despite Asians holding the highest educational attainment level and median household income of any racial demographic in American society, a 2014 U.S. Census Bureau survey reported that 12% of Asians in the U.S. were living below the poverty line, a higher rate than the 10.1% of non-Hispanic White Americans living below the poverty line.
This is largely due to the fact that a high percentage of Asian Americans are immigrants, and independently of race, immigrants are more likely than the native-born to be poor. Once country of birth and other demographic factors are taken into account, Asian Americans are no more likely than non-Hispanic whites to live in poverty.
Race and Ethnicity in the United States
YouTube Video: Heated Debate on Race Relations in the United States: Racism and Discrimination (1994)
Illustration: U.S. real median household income by race and ethnicity from 1967 to 2011, with the intra-group differences illustrated through history
The United States has a racially and ethnically diverse population.
The census officially recognizes six ethnic and racial categories: White American, Black or African American, Native American and Alaska Native, Asian American, Native Hawaiian and Other Pacific Islander, and people of two or more races; a race called "Some other race" is also used in the census and other surveys, but is not official.
The United States Census Bureau also classifies Americans as "Hispanic or Latino" and "Not Hispanic or Latino", which identifies Hispanic and Latino Americans as a racially diverse ethnicity that composes the largest minority group in the nation.
White Americans are the racial majority. African Americans are the largest racial minority, amounting to 13.2% of the population. Hispanic and Latino Americans amount to 17% of the population, making up the largest ethnic minority. The White, non-Hispanic or Latino population makes up 62.6% of the nation's total, with the total White population (including White Hispanics and Latinos) being 77%.
White Americans are the majority in every region, but constitute the highest proportion of the population in the Midwestern United States, at 85% per the Population Estimates Program (PEP), or 83% per the American Community Survey (ACS).
Non-Hispanic Whites make up 79% of the Midwest's population, the highest ratio of any region. However, 35% of White Americans (whether all White Americans or non-Hispanic/Latino only) live in the South, the most of any region.
55% of the African American population lives in the South. A plurality or majority of the other official groups reside in the West. This region is home to 42% of Hispanic and Latino Americans, 46% of Asian Americans, 48% of American Indians and Alaska Natives, 68% of Native Hawaiians and Other Pacific Islanders, 37% of the "two or more races" population (Multiracial Americans), and 46% of those designated "some other race".
Roots TV Mini-series (ABC: 1977)
YouTube Video: Roots: mutiny on a slave ship
Pictured: Roots Miniseries 25th Anniversary DVD cover, 2001 By Source, Fair use
Roots is an American television miniseries based on Alex Haley's 1976 novel, Roots: The Saga of an American Family; the series first aired, on ABC-TV, in 1977.
Roots received 37 Emmy Award nominations and won nine. It also won a Golden Globe and a Peabody Award.
It received unprecedented Nielsen ratings for the finale, which still ranks as the third-highest-rated episode of any type of television series and the second-most-watched series finale in U.S. television history.
It was produced on a budget of $6.6 million. The series introduced LeVar Burton in the role of Kunta Kinte.
A sequel, Roots: The Next Generations, first aired in 1979, and a second sequel, Roots: The Gift, a Christmas TV movie starring Burton and Louis Gossett Jr., first aired in 1988.
A related film, Alex Haley's Queen, is based on the life of Queen Jackson Haley, who was Alex Haley's paternal grandmother.
For amplification about the TV mini-series, click on any of the following hyperlinks:
- Plot
- Cast
- Production
- Legal issues
- Broadcast history
- DVD release
- Awards and nominations
- Historical accuracy
- Remake
- See also
Rosa Parks
YouTube Video: Rosa Parks - Civil Rights Activist | Mini Bio | BIO
Pictured: Rosa Parks in 1955, with Martin Luther King, Jr. in the background
Rosa Louise McCauley Parks (February 4, 1913 – October 24, 2005) was an African American civil rights activist, whom the United States Congress called "the first lady of civil rights" and "the mother of the freedom movement". Her birthday, February 4, and the day she was arrested, December 1, have both become Rosa Parks Day, commemorated in California and Missouri (February 4), and Ohio and Oregon (December 1).
On December 1, 1955, in Montgomery, Alabama, Parks refused to obey bus driver James F. Blake's order to give up her seat in the colored section to a white passenger, after the white section was filled. Parks was not the first person to resist bus segregation.
Others had taken similar steps, including Bayard Rustin in 1942, Irene Morgan in 1946, Sarah Louise Keys in 1952, and the members of the ultimately successful Browder v. Gayle lawsuit (Claudette Colvin, Aurelia Browder, Susie McDonald, and Mary Louise Smith) who were arrested in Montgomery for not giving up their bus seats months before Parks.
NAACP organizers believed that Parks was the best candidate for seeing through a court challenge after her arrest for civil disobedience in violating Alabama segregation laws, although eventually her case became bogged down in the state courts while the Browder v. Gayle case succeeded.
Parks' act of defiance and the Montgomery Bus Boycott became important symbols of the modern Civil Rights Movement. She became an international icon of resistance to racial segregation. She organized and collaborated with civil rights leaders, including Edgar Nixon, president of the local chapter of the NAACP; and Martin Luther King, Jr., a new minister in town who gained national prominence in the civil rights movement.
At the time, Parks was secretary of the Montgomery chapter of the NAACP. She had recently attended the Highlander Folk School, a Tennessee center for training activists for workers' rights and racial equality. She acted as a private citizen "tired of giving in". Although widely honored in later years, she also suffered for her act; she was fired from her job as a seamstress in a local department store, and received death threats for years afterwards. Her situation also opened doors.
Shortly after the boycott, she moved to Detroit, where she briefly found similar work. From 1965 to 1988 she served as secretary and receptionist to John Conyers, an African-American U.S. Representative. She was also active in the Black Power movement and the support of political prisoners in the US.
After retirement, Parks wrote her autobiography and lived a largely private life in Detroit. In her final years, she suffered from dementia. Parks received national recognition, including the NAACP's 1979 Spingarn Medal, the Presidential Medal of Freedom, the Congressional Gold Medal, and a posthumous statue in the United States Capitol's National Statuary Hall.
Upon her death in 2005, she was the first woman and third non-U.S. government official to lie in honor at the Capitol Rotunda.
Martin Luther King, Jr.
YouTube Video: Martin Luther King - "I Have a Dream" Speech given on August 28, 1963
Pictured: U.S. President Lyndon B. Johnson hands a pen to civil rights leader Rev. Martin Luther King Jr. after signing the Civil Rights Act of 1964
Martin Luther King, Jr. (January 15, 1929 – April 4, 1968) was an American Baptist minister, activist, humanitarian, and leader in the African-American Civil Rights Movement.
He is best known for his role in the advancement of civil rights using nonviolent civil disobedience based on his Christian beliefs.
King became a civil rights activist early in his career. He led the 1955 Montgomery Bus Boycott and helped found the Southern Christian Leadership Conference (SCLC) in 1957, serving as its first president.
With the SCLC, King led an unsuccessful 1962 struggle against segregation in Albany, Georgia (the Albany Movement), and helped organize the 1963 nonviolent protests in Birmingham, Alabama. King also helped to organize the 1963 March on Washington, where he delivered his famous "I Have a Dream" speech. There, he established his reputation as one of the greatest orators in American history.
On October 14, 1964, King received the Nobel Peace Prize for combating racial inequality through nonviolence. In 1965, he helped to organize the Selma to Montgomery marches, and the following year he and SCLC took the movement north to Chicago to work on segregated housing.
In the final years of his life, King expanded his focus to include poverty and to speak out against the Vietnam War, alienating many of his liberal allies with a 1967 speech titled "Beyond Vietnam".
In 1968, King was planning a national occupation of Washington, D.C., to be called the Poor People's Campaign, when he was assassinated on April 4 in Memphis, Tennessee. His death was followed by riots in many U.S. cities.
King was posthumously awarded the Presidential Medal of Freedom and the Congressional Gold Medal. Martin Luther King, Jr. Day was established as a holiday in numerous cities and states beginning in 1971, and as a U.S. federal holiday in 1986.
Hundreds of streets in the U.S. have been renamed in his honor, and a county in Washington State was also renamed for him.
The Martin Luther King, Jr. Memorial on the National Mall in Washington, D.C., was dedicated in 2011.
Social Influence of Rock Music in the United States
YouTube Video of Ross and Rachel (From the TV comedy “Friends”) with U2’s song “With or Without You” in the background
Pictured: LEFT: Chuck Berry made his national TV debut on American Bandstand (ABC:1957–1987, then Syndicated: 1987–1988) and USA Network (1989);
RIGHT: From Left: “The Band” members Richard Manuel, Robbie Robertson, Rick Danko, Levon Helm and Garth Hudson performing in the movie “The Last Waltz” (1978, and directed by Martin Scorsese)
The popularity and worldwide scope of rock music resulted in a powerful impact on society. Rock and roll influenced daily life, fashion, attitudes and language in a way few other social developments have equalled.
As the original generations of rock and roll fans matured, the music became an accepted and deeply interwoven thread in popular culture. Beginning in the early 1970s, rock songs and acts began to be used in a few television commercials; within a decade this practice became widespread. Starting in the 1980s, rock music was often featured in film and television program soundtracks.
Race:
In the cross-over of African American "race music" to a growing white youth audience, the popularization of rock and roll involved both black performers reaching a white audience and white performers appropriating African-American music.
Rock and roll appeared at a time when racial tensions in the United States were entering a new phase, with the beginnings of the civil rights movement for desegregation, which led to the 1954 Supreme Court ruling that abolished the doctrine of "separate but equal"; that ruling would nonetheless prove extremely difficult to enforce in parts of the United States.
The coming together of white youth audiences and black music in rock and roll inevitably provoked strong white racist reactions within the US, with many whites condemning its breaking down of barriers based on color. Many observers saw rock and roll as heralding the way for desegregation, in creating a new form of music that encouraged racial cooperation and shared experience.
Many authors have argued that early rock and roll was instrumental in the way both white and black teenagers identified themselves.
Sex and drugs:
See also: Wine, women and song
The rock and roll lifestyle was popularly associated with sex and drugs. Many of rock and roll's early stars (as well as their jazz and blues counterparts) were known as hard-drinking, hard-living characters.
During the 1960s the lifestyles of many stars became more publicly known, aided by the growth of the underground rock press. Musicians had always attracted attention of "groupies" (girls who followed musicians) who spent time with and often did sexual favors for band members.
As the stars' lifestyles became more public, the popularity and promotion of recreational drug use by musicians may have influenced use of drugs and the perception of acceptability of drug use among the youth of the period.
For example, when in the late 1960s the Beatles, who had previously been marketed as clean-cut youths, started publicly acknowledging using LSD, many fans followed. Journalist Al Aronowitz wrote "...whatever the Beatles did was acceptable, especially for young people." Jerry Garcia of the rock band the Grateful Dead said, "For some people, taking LSD and going to Grateful Dead show functions like a rite of passage ... we don't have a product to sell; but we do have a mechanism that works."
In the late 1960s and early 1970s, much of the rock and roll cachet associated with drug use dissipated as rock music suffered a series of drug-related deaths, including the 27 Club-member deaths of Jimi Hendrix, Janis Joplin and Jim Morrison. Although some amount of drug use remained common among rock musicians, a greater respect for the dangers of drug consumption was observed, and many anti-drug songs became part of the rock lexicon, notably "The Needle and the Damage Done" by Neil Young (1972).
Many rock musicians have acknowledged battling addictions to many substances including alcohol, cocaine and heroin; many of these have successfully undergone drug rehabilitation programs, but others have died.
In the early 1980s, along with the rise of the band Minor Threat, a straight edge lifestyle became popular. The straight edge philosophy of abstinence from recreational drugs, alcohol, tobacco and sex became associated with some hardcore punks through the years, and both remain popular with youth today.
Fashion:
Rock music and fashion have been inextricably linked. In the UK in the mid-1960s, rivalry arose between "Mods" (who favored 'modern' Italian-led fashion) and "Rockers" (who wore motorcycle leathers), with each style having its own favored musical acts. (The controversy would form the backdrop for The Who's rock opera Quadrophenia.)
In the 1960s, The Beatles brought mop-top haircuts, collarless blazers, and Beatle Boots into fashion.
Rock musicians were also early adopters of hippie fashion and popularised such styles as long hair and the Nehru jacket. As rock music genres became more segmented, what an artist wore became as important as the music itself in defining the artist's intent and relationship to the audience.
In the early 1970s, glam rock became widely influential, featuring glittery fashions, high heels and camp. In the late 1970s, disco acts helped bring flashy urban styles to the mainstream, while punk groups began wearing mock-conservative attire (including suit jackets and skinny ties) in deliberate contrast to mainstream rock musicians, who still favored blue jeans and hippie-influenced clothes.
Heavy Metal bands in the 1980s often favored a strong visual image. For some bands, this consisted of leather or denim jackets and pants, spike/studs and long hair. Visual image was a strong component of the glam metal movement.
In the early 1990s, the popularity of grunge brought in a punk-influenced fashion of its own, including torn jeans, old shoes, flannel shirts, backwards baseball hats, and hair grown long against the clean-cut image that was popular at the time in heavily commercialized pop music culture.
Musicians continue to be fashion icons; pop-culture magazines such as Rolling Stone often include fashion layouts featuring musicians as models.
Authenticity:
Rock musicians and fans have consistently struggled with the paradox of "selling out"—to be considered "authentic", rock music must keep a certain distance from the commercial world and its constructs; however it is widely believed that certain compromises must be made in order to become successful and to make music available to the public.
This dilemma has created friction between musicians and fans, with some bands going to great lengths to avoid the appearance of "selling out" (while still finding ways to make a lucrative living). In some styles of rock, such as punk and heavy metal, a performer who is believed to have "sold out" to commercial interests may be labelled with the pejorative term "poseur".
If a performer first comes to public attention with one style, any further stylistic development may be seen as selling out to long-time fans. On the other hand, managers and producers may progressively take more control of the artist, as happened, for instance, in Elvis Presley's swift transition from "The Hillbilly Cat" to "your teddy bear". It can be difficult to define the difference between seeking a wider audience and selling out.
Ray Charles left behind his classic formulation of rhythm and blues to sing country music, pop songs and soft-drink commercials. In the process, he went from a niche audience to worldwide fame. In the end, it is a moral judgement made by the artist, the management, and the audience.
Charitable and social causes:
Love and peace were very common themes in rock music during the 1960s and 1970s. Rock musicians have often attempted to address social issues directly as commentary or as calls to action. During the Vietnam War the first rock protest songs were heard, inspired by the songs of folk musicians such as Woody Guthrie and Bob Dylan, which ranged from abstract evocations of peace (Peter, Paul and Mary's "If I Had a Hammer") to blunt anti-establishment diatribes (Crosby, Stills, Nash & Young's "Ohio"). Other musicians, notably John Lennon and Yoko Ono, were vocal in their anti-war sentiment both in their music and in public statements.
Famous rock musicians have adopted causes ranging from the environment (Marvin Gaye's "Mercy Mercy Me (The Ecology)") and the Anti-Apartheid Movement (Peter Gabriel's "Biko"), to violence in Northern Ireland (U2's "Sunday Bloody Sunday") and worldwide economic policy (the Dead Kennedys' "Kill the Poor"). Another notable protest song is Patti Smith's recording "People Have the Power." On occasion this involvement would go beyond simple songwriting and take the form of sometimes-spectacular concerts or televised events, often raising money for charity and awareness of global issues.
Rock and roll as social activism reached a milestone in the Live Aid concerts, held July 13, 1985, which were an outgrowth of the 1984 charity single "Do They Know It's Christmas?" and became the largest musical concert in history with performers on two main stages, one in London, England and the other in Philadelphia, USA (plus some other acts performing in other countries) and televised worldwide.
The concert lasted 16 hours and featured nearly everybody who was in the forefront of rock and pop in 1985. The charity event raised millions of dollars for famine relief in Africa. Live Aid became a model for many other fund-raising and consciousness-raising efforts, including the Farm Aid concerts for family farmers in North America, and televised performances benefiting victims of the September 11 attacks. Live Aid itself was reprised in 2005 with the Live 8 concert, to raise awareness of global economic policy. Environmental issues have also been a common theme, one example being Live Earth.
As the original generations of rock and roll fans matured, the music became an accepted and deeply interwoven thread in popular culture. Beginning in the early 1970s, rock songs and acts began to be used in a few television commercials; within a decade this practice became widespread. Starting in the 1980s, rock music was often featured in film and television soundtracks.
Race:
In the cross-over of African American "race music" to a growing white youth audience, the popularization of rock and roll involved both black performers reaching a white audience and white performers appropriating African-American music.
Rock and roll appeared at a time when racial tensions in the United States were entering a new phase, with the beginnings of the civil rights movement for desegregation and the 1954 Supreme Court ruling that abolished the policy of "separate but equal", a ruling that would prove extremely difficult to enforce in parts of the United States.
The coming together of white youth audiences and black music in rock and roll inevitably provoked strong white racist reactions within the US, with many whites condemning its breaking down of barriers based on color. Many observers, however, saw rock and roll as heralding the way for desegregation, in creating a new form of music that encouraged racial cooperation and shared experience.
Many authors have argued that early rock and roll was instrumental in the way both white and black teenagers identified themselves.
Sex and drugs:
See also: Wine, women and song
The rock and roll lifestyle was popularly associated with sex and drugs. Many of rock and roll's early stars (as well as their jazz and blues counterparts) were known as hard-drinking, hard-living characters.
During the 1960s, the lifestyles of many stars became more publicly known, aided by the growth of the underground rock press. Musicians had always attracted the attention of "groupies" (girls who followed musicians), who spent time with and often performed sexual favors for band members.
As the stars' lifestyles became more public, the popularity and promotion of recreational drug use by musicians may have influenced the use of drugs, and the perception of their acceptability, among the youth of the period.
For example, when in the late 1960s the Beatles, who had previously been marketed as clean-cut youths, started publicly acknowledging using LSD, many fans followed. Journalist Al Aronowitz wrote, "...whatever the Beatles did was acceptable, especially for young people." Jerry Garcia of the rock band the Grateful Dead said, "For some people, taking LSD and going to a Grateful Dead show functions like a rite of passage ... we don't have a product to sell; but we do have a mechanism that works."
In the late 1960s and early 1970s, much of the rock and roll cachet associated with drug use dissipated as rock music suffered a series of drug-related deaths, including the 27 Club-member deaths of Jimi Hendrix, Janis Joplin and Jim Morrison. Although some amount of drug use remained common among rock musicians, a greater respect for the dangers of drug consumption was observed, and many anti-drug songs became part of the rock lexicon, notably "The Needle and the Damage Done" by Neil Young (1972).
Many rock musicians have acknowledged battling addictions to many substances including alcohol, cocaine and heroin; many of these have successfully undergone drug rehabilitation programs, but others have died.
In the early 1980s, along with the rise of the band Minor Threat, the straight edge lifestyle became popular. The straight edge philosophy of abstinence from recreational drugs, alcohol, tobacco, and sex became associated with some hardcore punks through the years, and both remain popular with youth today.
Fashion:
Rock music and fashion have been inextricably linked. In the mid-1960s UK, rivalry arose between "Mods" (who favored 'modern' Italian-led fashion) and "Rockers" (who wore motorcycle leathers), and each style had its own favored musical acts. (The controversy would form the backdrop for The Who's rock opera Quadrophenia.)
In the 1960s, The Beatles brought mop-top haircuts, collarless blazers, and Beatle Boots into fashion.
Rock musicians were also early adopters of hippie fashion and popularised such styles as long hair and the Nehru jacket. As rock music genres became more segmented, what an artist wore became as important as the music itself in defining the artist's intent and relationship to the audience.
In the early 1970s, glam rock became widely influential, featuring glittery fashions, high heels and camp. In the late 1970s, disco acts helped bring flashy urban styles to the mainstream, while punk groups began wearing mock-conservative attire (including suit jackets and skinny ties) in an attempt to be as unlike mainstream rock musicians, who still favored blue jeans and hippie-influenced clothes, as possible.
Heavy metal bands in the 1980s often favored a strong visual image. For some bands, this consisted of leather or denim jackets and pants, spikes or studs, and long hair. Visual image was a strong component of the glam metal movement.
In the early 1990s, the popularity of grunge brought in a punk-influenced fashion of its own, including torn jeans, old shoes, flannel shirts, backwards baseball hats, and hair grown long in defiance of the clean-cut image that was popular at the time in heavily commercialized pop music culture.
Musicians continue to be fashion icons; pop-culture magazines such as Rolling Stone often include fashion layouts featuring musicians as models.
Authenticity:
Rock musicians and fans have consistently struggled with the paradox of "selling out": to be considered "authentic", rock music must keep a certain distance from the commercial world and its constructs; however, it is widely believed that certain compromises must be made in order to become successful and to make the music available to the public.
This dilemma has created friction between musicians and fans, with some bands going to great lengths to avoid the appearance of "selling out" (while still finding ways to make a lucrative living). In some styles of rock, such as punk and heavy metal, a performer who is believed to have "sold out" to commercial interests may be labelled with the pejorative term "poseur".
If a performer first comes to public attention with one style, any further stylistic development may be seen by long-time fans as selling out. On the other hand, managers and producers may progressively take more control of the artist, as happened, for instance, in Elvis Presley's swift transition from "The Hillbilly Cat" to "your teddy bear". It can be difficult to define the difference between seeking a wider audience and selling out.
Ray Charles left behind his classic formulation of rhythm and blues to sing country music, pop songs and soft-drink commercials. In the process, he went from a niche audience to worldwide fame. In the end, it is a moral judgement made by the artist, the management, and the audience.
Charitable and social causes:
Love and peace were very common themes in rock music during the 1960s and 1970s. Rock musicians have often attempted to address social issues directly as commentary or as calls to action. During the Vietnam War, the first rock protest songs were heard, inspired by the songs of folk musicians such as Woody Guthrie and Bob Dylan; these ranged from abstract evocations of peace (Peter, Paul and Mary's "If I Had a Hammer") to blunt anti-establishment diatribes (Crosby, Stills, Nash & Young's "Ohio"). Other musicians, notably John Lennon and Yoko Ono, were vocal in their anti-war sentiment both in their music and in public statements.
Famous rock musicians have adopted causes ranging from the environment (Marvin Gaye's "Mercy Mercy Me (The Ecology)") and the Anti-Apartheid Movement (Peter Gabriel's "Biko"), to violence in Northern Ireland (U2's "Sunday Bloody Sunday") and worldwide economic policy (the Dead Kennedys' "Kill the Poor"). Another notable protest song is Patti Smith's recording "People Have the Power." On occasion this involvement would go beyond simple songwriting and take the form of sometimes-spectacular concerts or televised events, often raising money for charity and awareness of global issues.
Rock and roll as social activism reached a milestone in the Live Aid concerts, held July 13, 1985, which were an outgrowth of the 1984 charity single "Do They Know It's Christmas?" and became the largest musical concert in history with performers on two main stages, one in London, England and the other in Philadelphia, USA (plus some other acts performing in other countries) and televised worldwide.
The concert lasted 16 hours and featured nearly everybody who was in the forefront of rock and pop in 1985. The charity event raised millions of dollars for famine relief in Africa. Live Aid became a model for many other fund-raising and consciousness-raising efforts, including the Farm Aid concerts for family farmers in North America, and televised performances benefiting victims of the September 11 attacks. Live Aid itself was reprised in 2005 with the Live 8 concert, to raise awareness of global economic policy. Environmental issues have also been a common theme, one example being Live Earth.
Social Stereotypes in Art and Culture
YouTube Video with Examples of Stereotypes
Pictured: Cartoon about Social Manipulation (Sic)
In social psychology, a stereotype is a thought that can be adopted about specific types of individuals or certain ways of doing things.
These thoughts or beliefs may or may not accurately reflect reality. However, this is only a fundamental psychological definition of a stereotype.
Within psychology and spanning across other disciplines, there are different conceptualizations and theories of stereotyping that provide their own expanded definition. Some of these definitions share commonalities, though each one may also harbor unique aspects that may contradict the others.
Topics covered include:
- 1 Etymology
- 2 Relationship with other types of intergroup attitudes
- 3 Content
- 4 Functions
- 5 Formation
- 6 Activation
- 7 Accuracy
- 8 Effects
- 9 Role in art and culture
- 10 See also
Counterculture of the 1960s
YouTube Video 1960s Counterculture Documentary
Pictured: LEFT: A small part of the crowd of 400,000, after the rain, Woodstock, United States, August 1969; RIGHT: The Doors performing for Danish television in 1968 (photo by Polfoto/Jan Persson - Den Store Danske - The Doors)
The counterculture of the 1960s refers to an anti-establishment cultural phenomenon that developed first in the United States and the United Kingdom, and then spread throughout much of the Western world between the early 1960s and the mid-1970s, with London, New York City, and San Francisco being hotbeds of early countercultural activity.
The aggregate movement gained momentum as the American Civil Rights Movement continued to grow, and became revolutionary with the expansion of the US government's extensive military intervention in Vietnam.
As the 1960s progressed, widespread social tensions also developed concerning other issues, and tended to flow along generational lines regarding human sexuality, women's rights, traditional modes of authority, experimentation with psychoactive drugs, and differing interpretations of the American Dream.
As the era unfolded, new cultural forms and a dynamic subculture which celebrated experimentation, modern incarnations of Bohemianism, and the rise of the hippie and other alternative lifestyles, emerged.
This embracing of creativity is particularly notable in the works of British Invasion bands such as the Beatles, and filmmakers whose works became far less restricted by censorship. In addition to the trendsetting Beatles, many other creative artists, authors, and thinkers, within and across many disciplines, helped define the counterculture movement.
Several factors distinguished the counterculture of the 1960s from the anti-authoritarian movements of previous eras. The post-World War II "baby boom" generated an unprecedented number of potentially disaffected young people as prospective participants in a rethinking of the direction of American and other democratic societies.
Post-war affluence allowed many of the counterculture generation to move beyond a focus on the provision of the material necessities of life that had preoccupied their Depression-era parents. The era was also notable in that a significant portion of the array of behaviors and "causes" within the larger movement were quickly assimilated within mainstream society, particularly in the US, even though counterculture participants numbered in the clear minority within their respective national populations.
The counterculture era commenced in earnest with the assassination of John F. Kennedy in November 1963. It became absorbed into the popular culture with the end of U.S. combat involvement in Southeast Asia and the end of the draft in 1973, and ultimately with the resignation of President Richard M. Nixon in August 1974.
Many key movements were born of, or were advanced within, the counterculture of the 1960s. Each movement is relevant to the larger era, and the most important stand alone, irrespective of the larger counterculture.
In the broadest sense, 1960s counterculture grew from a confluence of people, ideas, events, issues, circumstances, and technological developments which served as intellectual and social catalysts for exceptionally rapid change during the era.
The Beatnik Era
YouTube Video from "Dobie Gillis" (TV beatnik): The World According to Maynard G. Krebs - "I'm Like Lost... Doomed"
Pictured: Beatniks, late 1950s - the look consisted of a turtleneck, beret, and long hair for men; leotards and ballet slippers for women.
Beatnik was a media stereotype prevalent throughout the 1950s to mid-1960s that displayed the more superficial aspects of the Beat Generation literary movement of the 1950s. Elements of the beatnik trope included pseudo-intellectualism, drug use, and a cartoonish depiction of real-life people along with the spiritual quest of Jack Kerouac's autobiographical fiction.
History
Kerouac introduced the phrase "Beat Generation" in 1948, generalizing from his social circle to characterize the underground, anticonformist youth gathering in New York at that time.
The name came up in conversation with the novelist John Clellon Holmes, who published an early Beat Generation novel, Go (1952), along with a manifesto in The New York Times Magazine, "This Is the Beat Generation".
In 1954, Nolan Miller published his third novel, Why I Am So Beat (Putnam), detailing the weekend parties of four students.
The adjective "beat" was introduced to the group by Herbert Huncke, though Kerouac expanded the meaning of the term. "Beat" came from underworld slang—the world of hustlers, drug addicts, and petty thieves, where Allen Ginsberg and Kerouac sought inspiration.
"Beat" was slang for "beaten down" or downtrodden, but to Kerouac and Ginsberg, it also had a spiritual connotation as in "beatitude". Other adjectives discussed by Holmes and Kerouac were "found" and "furtive". Kerouac felt he had identified (and was the embodiment of) a new trend analogous to the influential Lost Generation.
In "Aftermath: The Philosophy of the Beat Generation", Kerouac criticized what he saw as a distortion of his visionary, spiritual ideas: "The Beat Generation, that was a vision that we had, John Clellon Holmes and I, and Allen Ginsberg in an even wilder way, in the late Forties, of a generation of crazy, illuminated hipsters suddenly rising and roaming America, serious, bumming and hitchhiking everywhere, ragged, beatific, beautiful in an ugly graceful new way—a vision gleaned from the way we had heard the word "beat" spoken on street corners on Times Square and in the Village, in other cities in the downtown city night of postwar America—beat, meaning down and out but full of intense conviction."
"We'd even heard old 1910 Daddy Hipsters of the streets speak the word that way, with a melancholy sneer. It never meant juvenile delinquents, it meant characters of a special spirituality who didn't gang up but were solitary Bartlebies staring out the dead wall window of our civilization..."
Kerouac explained what he meant by "beat" at a Brandeis Forum, "Is There A Beat Generation?", on November 8, 1958, at New York's Hunter College Playhouse. Panelists for the seminar were Kerouac, James A. Wechsler, Princeton anthropologist Ashley Montagu, and author Kingsley Amis. Wechsler, Montagu, and Amis all wore suits, while Kerouac was clad in black jeans, ankle boots, and a checkered shirt.
Reading from a prepared text, Kerouac reflected on his beat beginnings: "It is because I am Beat, that is, I believe in beatitude and that God so loved the world that He gave His only begotten son to it... Who knows, but that the universe is not one vast sea of compassion actually, the veritable holy honey, beneath all this show of personality and cruelty?"
Kerouac's address was later published as "The Origins of the Beat Generation" (Playboy, June 1959). In that article, Kerouac noted how his original beatific philosophy had been ignored amid maneuvers by several pundits, among them Herb Caen, the San Francisco newspaperman, to alter Kerouac's concept with jokes and jargon:
"I went one afternoon to the church of my childhood and had a vision of what I must have really meant with "Beat"... the vision of the word Beat as being to mean beatific... People began to call themselves beatniks, beats, jazzniks, bopniks, bugniks and finally I was called the "avatar" of all this."
In light of what he considered beat to mean and what beatnik had come to mean, he once observed to a reporter, "I'm not a beatnik, I'm a Catholic", showing the reporter a painting of Pope Paul VI and saying, "You know who painted that? Me."
Stereotype:
In her memoir, Minor Characters, Joyce Johnson described how the stereotype was absorbed into American culture: "'Beat Generation' sold books, sold black turtleneck sweaters and bongos, berets and dark glasses, sold a way of life that seemed like dangerous fun—thus to be either condemned or imitated. Suburban couples could have beatnik parties on Saturday nights and drink too much and fondle each other's wives."
Kerouac biographer Ann Charters noted that the term "Beat" was appropriated to become a Madison Avenue marketing tool: "The term caught on because it could mean anything. It could even be exploited in the affluent wake of the decade’s extraordinary technological inventions. Almost immediately, for example, advertisements by "hip" record companies in New York used the idea of the Beat Generation to sell their new long playing vinyl records."
Lee Streiff, an acquaintance of many members of the movement who went on to become one of its chroniclers, believed that the news media saddled the movement for the long term with a set of false images: "Reporters are not generally well versed in artistic movements, or the history of literature or art. And most are certain that their readers, or viewers, are of limited intellectual ability and must have things explained simply, in any case."
Thus, the reporters in the media tried to relate something that was new to already preexisting frameworks and images that were only vaguely appropriate in their efforts to explain and simplify.
With a variety of oversimplified and conventional formulas at their disposal, they fell back on the nearest stereotypical approximation of what the phenomenon resembled, as they saw it. And even worse, they did not see it clearly and completely at that. They got a quotation here and a photograph there — and it was their job to wrap it up in a comprehensible package — and if it seemed to violate the prevailing mandatory conformist doctrine, they would also be obliged to give it a negative spin as well.
And in this, they were aided and abetted by the Poetic Establishment of the day. Thus, what came out in the media (newspapers, magazines, TV, and the movies) was a product of the stereotypes of the 30s and 40s, though garbled: a cross between a 1920s Greenwich Village bohemian artist and a Bop musician, whose visual image was completed by mixing in Daliesque paintings, a beret, a Vandyck beard, a turtleneck sweater, a pair of sandals, and a set of bongo drums.
A few authentic elements were added to the collective image (poets reading their poems, for example), but even this was made unintelligible by making all of the poets speak in some kind of phony Bop idiom. The consequence is that even though we may know now that these images do not accurately reflect the reality of the Beat movement, we still subconsciously look for them when we look back at the 50s. We have not yet completely escaped the visual imagery that has been so insistently forced upon us.
What is a Beatnik?
The word "beatnik" was coined by Herb Caen in an article in the San Francisco Chronicle on April 2, 1958. Caen coined the term by adding the Russian suffix -nik to the Beat Generation. Caen's column with the word came six months after the launch of Sputnik.
Objecting to the term, Allen Ginsberg wrote to the New York Times to deplore "the foul word beatnik", commenting, "If beatniks and not illuminated Beat poets overrun this country, they will have been created not by Kerouac but by industries of mass communication which continue to brainwash man."
History
Kerouac introduced the phrase "Beat Generation" in 1948, generalizing from his social circle to characterize the underground, anticonformist youth gathering in New York at that time.
The name came up in conversation with the novelist John Clellon Holmes, who published an early Beat Generation novel, Go (1952), along with a manifesto in The New York Times Magazine: "This Is the Beat Generation"
In 1954, Nolan Miller published his third novel, Why I Am So Beat (Putnam), detailing the weekend parties of four students.
The adjective "beat" was introduced to the group by Herbert Huncke, though Kerouac expanded the meaning of the term. "Beat" came from underworld slang—the world of hustlers, drug addicts, and petty thieves, where Allen Ginsberg and Kerouac sought inspiration.
"Beat" was slang for "beaten down" or downtrodden, but to Kerouac and Ginsberg, it also had a spiritual connotation as in "beatitude". Other adjectives discussed by Holmes and Kerouac were "found" and "furtive". Kerouac felt he had identified (and was the embodiment of) a new trend analogous to the influential Lost Generation.
In "Aftermath: The Philosophy of the Beat Generation", Kerouac criticized what he saw as a distortion of his visionary, spiritual ideas: "The Beat Generation, that was a vision that we had, John Clellon Holmes and I, and Allen Ginsberg in an even wilder way, in the late Forties, of a generation of crazy, illuminated hipsters suddenly rising and roaming America, serious, bumming and hitchhiking everywhere, ragged, beatific, beautiful in an ugly graceful new way—a vision gleaned from the way we had heard the word "beat" spoken on street corners on Times Square and in the Village, in other cities in the downtown city night of postwar America—beat, meaning down and out but full of intense conviction."
"We'd even heard old 1910 Daddy Hipsters of the streets speak the word that way, with a melancholy sneer. It never meant juvenile delinquents, it meant characters of a special spirituality who didn't gang up but were solitary Bartlebies staring out the dead wall window of our civilization..."
Kerouac explained what he meant by "beat" at a Brandeis Forum, "Is There A Beat Generation?", on November 8, 1958, at New York's Hunter College Playhouse. Panelists for the seminar were Kerouac, James A. Wechsler, Princeton anthropologist Ashley Montagu, and author Kingsley Amis. Wechsler, Montague, and Amis all wore suits, while Kerouac was clad in black jeans, ankle boots, and a checkered shirt.
Reading from a prepared text, Kerouac reflected on his beat beginnings: "It is because I am Beat, that is, I believe in beatitude and that God so loved the world that He gave His only begotten son to it... Who knows, but that the universe is not one vast sea of compassion actually, the veritable holy honey, beneath all this show of personality and cruelty?"
Kerouac's address was later published as "The Origins of the Beat Generation" (Playboy, June 1959). In that article, Kerouac noted how his original beatific philosophy had been ignored amid maneuvers by several pundits, among them Herb Caen, the San Francisco newspaperman, to alter Kerouac's concept with jokes and jargon:
"I went one afternoon to the church of my childhood and had a vision of what I must have really meant with "Beat"... the vision of the word Beat as being to mean beatific... People began to call themselves beatniks, beats, jazzniks, bopniks, bugniks and finally I was called the "avatar" of all this."
In light of what he considered beat to mean and what beatnik had come to mean, he once observed to a reporter, "I'm not a beatnik, I'm a Catholic", showing the reporter a painting of Pope Paul VI and saying, "You know who painted that? Me."
Stereotype:
In her memoir, Minor Characters, Joyce Johnson described how the stereotype was absorbed into American culture: "Beat Generation" sold books, sold black turtleneck sweaters and bongos, berets and dark glasses, sold a way of life that seemed like dangerous fun—thus to be either condemned or imitated. Suburban couples could have beatnik parties on Saturday nights and drink too much and fondle each other’s wives."
Kerouac biographer Ann Charters noted that the term "Beat" was appropriated to become a Madison Avenue marketing tool: "The term caught on because it could mean anything. It could even be exploited in the affluent wake of the decade’s extraordinary technological inventions. Almost immediately, for example, advertisements by "hip" record companies in New York used the idea of the Beat Generation to sell their new long playing vinyl records."
Lee Streiff, an acquaintance of many members of the movement who went on to become one of its chroniclers, believed that the news media saddled the movement for the long term with a set of false images: "Reporters are not generally well versed in artistic movements, or the history of literature or art. And most are certain that their readers, or viewers, are of limited intellectual ability and must have things explained simply, in any case."
Thus, the reporters in the media tried to relate something that was new to already preexisting frameworks and images that were only vaguely appropriate in their efforts to explain and simplify.
With a variety of oversimplified and conventional formulas at their disposal, they fell back on the nearest stereotypical approximation of what the phenomenon resembled, as they saw it. And even worse, they did not see it clearly and completely at that. They got a quotation here and a photograph there — and it was their job to wrap it up in a comprehensible package — and if it seemed to violate the prevailing mandatory conformist doctrine, they would also be obliged to give it a negative spin as well.
And in this, they were aided and abetted by the Poetic Establishment of the day. Thus, what came out in the media: from newspapers, magazines, TV, and the movies, was a product of the stereotypes of the 30s and 40s — though garbled — of a cross between a 1920s Greenwich Village bohemian artist and a Bop musician, whose visual image was completed by mixing in Daliesque paintings, a beret, a Vandyck beard, a turtleneck sweater, a pair of sandals, and set of bongo drums.
A few authentic elements were added to the collective image — poets reading their poems, for example — but even this was made unintelligible by making all of the poets speak in some kind of phony Bop idiom. The consequence is that even though we may know now that these images do not accurately reflect the reality of the Beat movement, we still subconsciously look for them when we look back to the 50s. We have not yet completely escaped the visual imagery that has been so insistently forced upon us.
What is a Beatnik?
The word "beatnik" was coined by Herb Caen in an article in the San Francisco Chronicle on April 2, 1958. Caen formed the term by adding the Russian suffix -nik to "Beat Generation"; his column appeared six months after the launch of Sputnik.
Objecting to the term, Allen Ginsberg wrote to the New York Times to deplore "the foul word beatnik", commenting, "If beatniks and not illuminated Beat poets overrun this country, they will have been created not by Kerouac but by industries of mass communication which continue to brainwash man."
The Hippies Era
YouTube Video: San Francisco, Haight Ashbury, 1967 Summer of Love, Verano del Amor
Pictured: LEFT: A Hippie-Painted VW Beetle; RIGHT: Janis Joplin “The Queen Hippie herself”
A hippie (or hippy) is a member of a liberal counterculture, originally a youth movement that started in the United States and United Kingdom during the mid-1960s and spread to other countries around the world.
The word hippie came from hipster and was initially used to describe beatniks who had moved into New York City's Greenwich Village and San Francisco's Haight-Ashbury district.
The term hippie was first popularized in San Francisco by Herb Caen, who was a journalist for the San Francisco Chronicle. The origins of the terms hip and hep are uncertain, although by the 1940s both had become part of African American jive slang and meant "sophisticated; currently fashionable; fully up-to-date".
The Beats adopted the term hip, and early hippies inherited the language and countercultural values of the Beat Generation. Hippies created their own communities, listened to psychedelic music, embraced the sexual revolution, and used drugs such as cannabis, LSD, peyote and psilocybin mushrooms to explore altered states of consciousness.
In January 1967, the Human Be-In in Golden Gate Park in San Francisco popularized hippie culture, leading to the Summer of Love on the West Coast of the United States, and the 1969 Woodstock Festival on the East Coast. Hippies in Mexico, known as jipitecas, formed La Onda and gathered at Avándaro, while in New Zealand, nomadic housetruckers practiced alternative lifestyles and promoted sustainable energy at Nambassa.
In the United Kingdom in 1970 many gathered at the gigantic Isle of Wight Festival with a crowd of around 400,000 people.
In later years, mobile "peace convoys" of New Age travelers made summer pilgrimages to free music festivals at Stonehenge and elsewhere.
In Australia, hippies gathered at Nimbin for the 1973 Aquarius Festival and the annual Cannabis Law Reform Rally or MardiGrass. "Piedra Roja Festival", a major hippie event in Chile, was held in 1970.
Hippie fashion and values had a major effect on culture, influencing popular music, television, film, literature, and the arts. Since the 1960s, many aspects of hippie culture have been assimilated by mainstream society.
The religious and cultural diversity espoused by the hippies has gained widespread acceptance, and Eastern philosophy and spiritual concepts have reached a larger audience.
The Information Age
YouTube Video: Claude Shannon* - his work ushered in the Digital Revolution. This fascinating program explores his life and the major influence his work had on today's digital world through interviews with his friends and colleagues.
* -- Claude Shannon
Pictured: Nothing embodies today's Information Age like Smartphones!
The Information Age (also known as the Computer Age, Digital Age, or New Media Age) is a period in human history characterized by the shift from the traditional industry that the Industrial Revolution brought through industrialization to an economy based on information technology and computerization.
The onset of the Information Age is associated with the Digital Revolution, just as the Industrial Revolution marked the onset of the Industrial Age.
During the Information Age, the digital industry has created a knowledge-based society within a high-tech global economy, extending its influence over how manufacturing and the service sector operate efficiently and conveniently.
In a commercialized society, the information industry allows individuals to explore their personalized needs, simplifying transaction decisions and significantly lowering costs for both producers and buyers. Participants across economic activity have overwhelmingly embraced this efficiency, encouraging new economic models such as the knowledge economy.
The Information Age formed by capitalizing on advances in computer micro-miniaturization.
As this technology has spread through daily life and social organization, the modernization of information and communication processes has become a driving force of social evolution.
For amplification, click on any of the following hyperlinks:
Internet
Progression
- Library expansion
- Information storage
- Information transmission
- Computation
- Relation to economics
- Innovations
- See also:
- Attention economy
- Big data
- Cognitive-cultural economy
- Computer crime
- Cyberterrorism
- Cyberwarfare
- Datamation - First print magazine dedicated solely to covering information technology.
- Digital dark age
- Digital detox
- Digital divide
- Digital transformation
- Human timeline
- Imagination age – hypothesized successor of the information age: a period in which creativity and imagination become the primary creators of economic value
- Information revolution
- Information society
- Internet governance
- Netocracy
- Social Age
- Technological determinism
- The Hacker Ethic and the Spirit of the Information Age
Generation X
YouTube Video from the Gen X-inspired Movie "The Breakfast Club" (1985)
Pictures: LEFT: Illustration of "Gen X"; and RIGHT: Time Magazine Cover
Generation X, commonly abbreviated to Gen X, is the generation born after the Western Post–World War II baby boom. Most demographers and commentators use birth dates ranging from the early 1960s to the early 1980s.
The term Generation X was coined by the Magnum photographer Robert Capa in the early 1950s. He used it later as a title for a photo-essay about young men and women growing up immediately after the Second World War.
The project first appeared in Picture Post in the UK and Holiday in the U.S. in 1953. Describing his intention, Capa said "We named this unknown generation, The Generation X, and even in our first enthusiasm we realised that we had something far bigger than our talents and pockets could cope with."
The name was popularized by Canadian author Douglas Coupland's 1991 novel Generation X: Tales for an Accelerated Culture, concerning young adults during the late 1980s and their lifestyles.
While Coupland's book helped to popularize the phrase Generation X, he erroneously attributed it to English rock musician Billy Idol in a 1989 magazine article. In fact, Idol had been a member of the punk band Generation X from 1976 to 1981, which was named after Deverson and Hamblett's 1965 sociology book on British youth, Generation X—a copy of which was owned by Idol's mother.
Gen X is the generation born after the Western post–World War II baby boom, marking a generational change from the Baby Boomers.
In a 2012 article for the Joint Center for Housing Studies of Harvard University, George Masnick wrote that the "Census counted 82.1 million" Gen Xers in the U.S. The Harvard Center uses 1965 to 1984 to define Gen X so that Boomers, Xers, and Millennials "cover equal 20-year age spans".
Masnick concluded that immigration filled in any birth year deficits during low fertility years of the late 1960s and early 1970s.
Jon Miller at the Longitudinal Study of American Youth at the University of Michigan wrote that "Generation X refers to adults born between 1961 and 1981" and it "includes 84 million people" in the U.S.
John Markert at Cumberland University takes a similar approach. In a 2004 article he argued that "Generations should be discrete twenty-year periods", subdivided into "ten-year cohorts" and five-year "bihorts" (his word), and classified Generation X as those born in the years 1966 to 1985. Markert censures other methods of defining Generation X in his article, stating that "inconsistent use of dates by the same author" simply results "in an apple to lemon measurement standard".
In contrast to this 20-year approach to defining generations, which, when applied to Generation X, can push the age of the conventionally post-war baby boomers well into or before World War II, or, conversely, set the upper-age of generation X as 1986 or later, many writers have adopted a more conservative span of 15 years or fewer.
Some, such as Tamara Erickson in What's Next Gen X? and Elwood Watson in Generation X Professors Speak, use the dates 1965-1979, while the Pew Research Center, in at least one of its studies, defines Generation X births as from 1965 to 1980.
In 2011, "The Generation X Report", based on annual surveys used in the longitudinal study of today's adults, found Gen Xers, defined in the report as people born between 1961 and 1981, to be highly educated, active, balanced, happy, and family-oriented.
The study contrasted with the slacker, disenfranchised stereotype associated with youth in the 1970s and 1980s. It drew on questions and responses from approximately 4,000 people surveyed each year from 1987 through 2010. Clive Thompson, writing in Wired in 2014, claimed that the differences between Generation X and its predecessors had been over-hyped, quoting Kali Trzesniewski, a scholar of life-span changes, as saying that the basic personality metrics of Americans had remained stable for decades.
In 2012, the Corporation for National and Community Service ranked Gen X volunteer rates in the U.S. at "29.4% per year", the highest compared with other generations. The rankings were based on a three-year moving average between 2009 and 2011.
In the preface to Generation X Goes Global: Mapping a Youth Culture in Motion, a collection of global essays, Professor Christine Henseler summarizes it as "a generation whose worldview is based on change, on the need to combat corruption, dictatorships, abuse, AIDS, a generation in search of human dignity and individual freedom, the need for stability, love, tolerance, and human rights for all".
In cinema, directors Quentin Tarantino, David Fincher, Jane Campion, Steven Soderbergh, Kevin Smith, Richard Linklater, and Todd Solondz have been called Generation X filmmakers.
Smith is most known for his View Askewniverse films, the flagship film being Clerks, which is set in New Jersey circa 1994, and focuses on two bored, convenience-store clerks in their twenties. Linklater's Slacker similarly explores young adult characters who were more interested in philosophizing than settling with a long-term career and family.
Solondz' Welcome to the Dollhouse touched on themes of school bullying, school violence, teen drug use, peer pressure and broken or dysfunctional families, set in a junior high school environment in New Jersey during the early to mid-1990s.
While not a member of Gen X himself, director John Hughes has been recognized as having created a series of classics "that an entire generation took ownership of with films like The Breakfast Club (see YouTube Video above), Sixteen Candles and Weird Science".
Gen Xers are often called the MTV Generation. They experienced the emergence of music videos, grunge, alternative rock and hip hop. Some called Xers the "latchkey generation" because their personal identity was in part shaped by the independence of being left alone after school when they were children.
Compared with previous generations, Generation X represents a more heterogeneous generation, embracing social diversity in terms of such characteristics as race, class, religion, ethnicity, culture, language, gender identity, and sexual orientation.
Unlike their parents who challenged leaders with an intent to replace them, Gen Xers are less likely to idolize leaders and are more inclined to work toward long-term institutional and systematic change through economic, media and consumer actions.
The U.S. Census Bureau reports that Generation X holds the highest education levels when looking at current age groups.
Citing a study by Elwood Carlson on "how different generations respond in unique ways to common problems in some political, social, and consumption choices", the Population Reference Bureau, a private demographic research organization based in Washington, D.C., gave Generation X birth years as falling between 1965 and 1982.
On the first page of the study, authors William Strauss and Neil Howe's definition of a "cohort generation" is cited. They define Generation X by the years 1961 to 1981.
In 2008, Details magazine editor-at-large Jeff Gordinier released his book X Saves the World: How Generation X Got the Shaft but Can Still Keep Everything from Sucking.
The Millennials (aka "Gen Y")
YouTube Video from NBC's Saturday Night Live's take on Millennials
Millennials (also known as the Millennial Generation or Generation Y, abbreviated to Gen Y) are the demographic cohort following Generation X. There are no precise dates for when the generation starts and ends; most researchers and commentators use birth years ranging from the early 1980s to around 2000.
Terminology:
Authors William Strauss and Neil Howe are widely credited with naming the Millennials. They coined the term in 1987, around the time the children born in 1982 were entering preschool, and the media were first identifying their prospective link to the millennial year as the high school graduating class of the year 2000. They wrote about the cohort in their 1991 book Generations: The History of America's Future, 1584 to 2069, and released a book in 2000 titled Millennials Rising: The Next Great Generation.
In August 1993, an Ad Age editorial coined the phrase Generation Y to describe those then aged 11 or younger, as well as the teenagers of the coming decade, who were defined as different from Generation X.
Since then, Ad Age has sometimes used 1982 as the starting birth year. According to USA Today journalist Bruce Horovitz, in 2012 Ad Age "threw in the towel by conceding that Millennials is a better name than Gen Y", and by 2014, a past director of data strategy at Ad Age told NPR "the Generation Y label was a placeholder until we found out more about them".
Alternative names for this group proposed in the past include Generation We, Global Generation, Generation Next and the Net Generation. Millennials are sometimes also called Echo Boomers, referring to the generation's size relative to the Baby Boomer generation and due to the significant increase in birth rates during the 1980s and into the 1990s.
In the United States, birth rates peaked in August 1990 and a 20th-century trend toward smaller families in developed countries continued. Newsweek used the term Generation 9/11 to refer to young people who were between the ages of 10 and 20 years during the September 11 attacks.
The first reference to "Generation 9/11" was made in the cover story of the November 12, 2001 issue of Newsweek. In his book The Lucky Few: Between the Greatest Generation and the Baby Boom, author Elwood Carlson called the generation the "New Boomers" (born 1983–2001), based on the upswing in births after 1983, finishing with the "political and social challenges" that occurred after the terrorist acts of September 11, 2001, and the "persistent economic difficulties" of the time.
Chinese Millennials (in China, commonly called the 1980s and 1990s generations) were examined and contrasted with American Millennials at a 2015 conference in Shanghai organized by University of Southern California's US-China Institute. Findings included Millennials' marriage, childbearing, and child raising preferences, life and career ambitions, and attitudes towards volunteerism and activism.
Date and age range definitions:
Authors Strauss and Howe use 1982 as the Millennials' starting birth year and 2004 as the last birth year, but Howe described the dividing line between Millennials and the following Generation Z as "tentative", saying, "you can’t be sure where history will someday draw a cohort dividing line until a generation fully comes of age."
In 2009, Australian McCrindle Research Center used 1980–1994 as Generation Y birth dates, starting with a recorded rise in birth rates, and fitting their newer definition of a generational span as 15 years. An earlier McCrindle report in 2006 gave a range of 1982–2000, in a document titled "Report on the Attitudes and Views of Generations X and Y on Superannuation".
In 2013, a global generational study conducted by PricewaterhouseCoopers with the University of Southern California and the London Business School defined Millennials as those born between 1980–1995.
In May 2013, a Time magazine cover story identified Millennials as those born from 1980 or 1981 to the year 2000.
For the purpose of a 2015 study, the Pew Research Center, an American think tank, defined Millennials as those born between 1981 and 1997. According to Pew, as the 16-year span of Millennial birth years is "already about as wide a range as those of the other living generations, [...] it seems likely that in the near future the youngest adults will be members of a post-Millennial generation."
In 2014, a comparative study from Dale Carnegie Training and MSW Research was released which studies Millennials compared to other generations in the workplace. This study described "Millennial" birth years as being between 1980–1996. Gallup Inc., an American research-based global performance-management consulting company, also tends to use 1980–1996 as birth years.
In 2015, Statistics Canada, Canada's official statistical agency, defined 1992 as the last year of birth for Generation Y.
Various other sources put the births of Millennials between 1983 and 2000, particularly in the United States and Canada.
Traits:
Authors William Strauss and Neil Howe believe that each generation has common characteristics that give it a specific character, with four basic generational archetypes repeating in a cycle. According to their theory, Millennials will become more like the "civic-minded" G.I. Generation, with a strong sense of community both local and global.
Strauss and Howe's research has been influential, but it also has critics. Jean Twenge, the author of the 2006 book Generation Me, considers Millennials, along with younger members of Generation X, to be part of what she calls "Generation Me". Twenge attributes Millennials with the traits of confidence and tolerance, but also identifies a sense of entitlement and narcissism based on personality surveys that showed increasing narcissism among Millennials compared to preceding generations when they were teens and in their twenties. She questions the predictions of Strauss and Howe that this generation will come out civic-minded.
The University of Michigan's "Monitoring the Future" study of high school seniors (conducted continually since 1975) and the American Freshman survey, conducted by UCLA's Higher Education Research Institute of new college students since 1966, showed an increase in the proportion of students who consider wealth a very important attribute, from 45% for Baby Boomers (surveyed between 1967 and 1985) to 70% for Gen Xers, and 75% for Millennials.
The percentage who said it was important to keep abreast of political affairs fell, from 50% for Baby Boomers to 39% for Gen Xers, and 35% for Millennials. The notion of "developing a meaningful philosophy of life" decreased the most across generations, from 73% for Boomers to 45% for Millennials. The willingness to be involved in an environmental cleanup program dropped from 33% for Baby Boomers to 21% for Millennials.
In March 2014, the Pew Research Center issued a report about how "Millennials in adulthood" are "detached from institutions and networked with friends." The report said Millennials are somewhat more upbeat than older adults about America's future, with 49% of Millennials saying the country’s best years are ahead though they're the first in the modern era to have higher levels of student loan debt and unemployment.
Fred Bonner, a Samuel DeWitt Proctor Chair in Education at Rutgers University and author of Diverse Millennial Students in College: Implications for Faculty and Student Affairs, believes that much of the commentary on the Millennial Generation may be partially accurate, but overly general and that many of the traits they describe apply primarily to "white, affluent teenagers who accomplish great things as they grow up in the suburbs, who confront anxiety when applying to super-selective colleges, and who multitask with ease as their helicopter parents hover reassuringly above them."
During class discussions, Bonner listened to black and Hispanic students describe how some or all of the so-called core traits did not apply to them. They often said that the "special" trait, in particular, is unrecognizable. Other socio-economic groups often do not display the same attributes commonly attributed to Millennials. "It's not that many diverse parents don't want to treat their kids as special," he says, "but they often don't have the social and cultural capital, the time and resources, to do that."
In 2008, author Ron Alsop called the Millennials "Trophy Kids," a term reflecting a trend in competitive sports, as well as many other aspects of life, where mere participation is frequently enough for a reward. This has also been reported as an issue in corporate environments.
Some employers are concerned that Millennials have unrealistically high expectations of the workplace. Some studies predict they will switch jobs frequently, holding many more jobs than Gen Xers, because of those expectations. Newer research, however, shows that Millennials change jobs for the same reasons as other generations—namely, more money and a more innovative work environment.
They look for versatility and flexibility in the workplace, strive for a strong work–life balance, and have career aspirations similar to those of other generations, valuing financial security and a diverse workplace just as much as their older colleagues.
Educational sociologist Andy Furlong described Millennials as optimistic, engaged, and team players.
In his book, Fast Future, author David Burstein describes Millennials' approach to social change as "pragmatic idealism" with a deep desire to make the world a better place combined with an understanding that doing so requires building new institutions while working inside and outside existing institutions.
Political views:
According to a 2013 article in The Economist, surveys of political attitudes among Millennials in the United Kingdom suggest increasingly liberal attitudes with regard to social and cultural issues, as well as higher overall support for classical liberal economic policies than preceding generations. They are more likely to support same-sex marriage and the legalization of drugs.
The Economist parallels this with Millennials in the United States, whose attitudes are more supportive of social liberal policies and same-sex marriage relative to other demographics, though less supportive of abortion than Gen X were in the early 1990s. They are also more likely to oppose animal testing for medical purposes.
A 2014 poll for the libertarian Reason magazine suggested that US Millennials were social liberals and fiscal centrists more often than their global peers. The magazine predicted that Millennials would become more conservative on fiscal issues once they started paying taxes.
Demographics in the United States:
William Strauss and Neil Howe projected in their 1991 book Generations that the U.S. Millennial population would be 76 million.
Neil Howe later revised the number to over 95 million people in the U.S. As of 2012, it was estimated that there were approximately 80 million U.S. Millennials; in 2015 the estimate was 83.1 million people.
Economic prospects:
Economic prospects for some Millennials have declined largely due to the Great Recession in the late 2000s.
Several governments have instituted major youth employment schemes out of fear of social unrest due to the dramatically increased rates of youth unemployment. In Europe, youth unemployment levels were very high (56% in Spain, 44% in Italy, 35% in the Baltic states, 19.1% in Britain, and more than 20% in many other countries).
In 2009, leading commentators began to worry about the long-term social and economic effects of this unemployment. Unemployment levels in other areas of the world were also high, with the youth unemployment rate in the U.S. reaching 19.1% in July 2010, the highest recorded since the statistic was first gathered in 1948.
In Canada, youth unemployment in July 2009 was 15.9%, the highest it had been in 11 years. Underemployment is also a major factor. In the U.S., the economic difficulties have led to dramatic increases in youth poverty, unemployment, and the number of young people living with their parents. In April 2012, it was reported that half of all new college graduates in the US were still either unemployed or underemployed. It has been argued that this unemployment and poor economic situation gave Millennials a rallying cry in the 2011 Occupy Wall Street movement. However, according to Christine Kelly, Occupy is not a youth movement, and its participants range from the very young to the very old.
A variety of names have emerged in the European countries hit particularly hard by the financial crisis of 2007–2008 to designate young people with limited employment and career prospects. These groups can be considered more or less synonymous with Millennials, or at least major sub-groups of them in those countries.
The Generation of €700 is a term popularized by the Greek mass media and refers to educated Greek twixters of urban centers who generally fail to establish a career. In Greece, young adults are being "excluded from the labor market" and some "leave their country of origin to look for better options". They're being "marginalized and face uncertain working conditions" in jobs that are unrelated to their educational background, and receive the minimum allowable base salary of €700. This generation evolved in circumstances leading to the Greek debt crisis and some participated in the 2010–2011 Greek protests.
In Spain, they are referred to as the mileurista (for €1,000); in France, "The Precarious Generation"; and Italy, as in Spain, has its milleurista: the generation of €1,000.
In 2015, Millennials in New York City were reported to be earning 20% less than the generation before them, as a result of entering the workforce during the Great Recession. Despite higher college attendance rates than Generation X, many were stuck in low-paid jobs, with the percentage of degree-educated young adults working in low-wage industries rising from 23% to 33% between 2000 and 2014.
"Generation Flux" is a psychographic (not demographic) designation coined by Fast Company for American employees who need to make several career changes throughout their working lives because of the chaotic nature of the job market following the Great Recession.
Societal change has been accelerated by the use of social media, smartphones, mobile computing, and other new technologies. Those in "Generation Flux" have birth years in the ranges of both Generation X and Millennials. "Generation Sell" was used by author William Deresiewicz to describe Millennials' interest in small businesses.
According to Forbes, Millennials will make up approximately half of the U.S. workforce by 2020. Millennials are the most educated and culturally diverse group of all generations and they are also hard to please when it comes to employers.
To address these new challenges, many large firms are currently studying the social and behavioral patterns of Millennials and are trying to devise programs that decrease inter-generational estrangement, and increase relationships of reciprocal understanding between older employees and Millennials.
The UK's Institute of Leadership & Management researched the gap in understanding between Millennial recruits and their managers in collaboration with Ashridge Business School. The findings included high expectations for advancement, salary and for a coaching relationship with their manager, and suggested that organizations will need to adapt to accommodate and make the best use of Millennials.
In an example of a company trying to do just this, Goldman Sachs conducted training programs that used actors to portray Millennials who assertively sought more feedback, responsibility, and involvement in decision making. After the performance, employees discussed and debated the generational differences they saw played out.
According to a Bloomberg L.P. article, Millennials have benefited the least from the economic recovery following the Great Recession, as average incomes for this generation have fallen at twice the general adult population's total drop and are likely to be on a path toward lower incomes for at least another decade: "Three and a half years after the worst recession since the Great Depression, the earnings and employment gap between those in the under-35 population and their parents and grandparents threatens to unravel the American dream of each generation doing better than the last. The nation's younger workers have benefited least from an economic recovery that has been the most uneven in recent history."
USA Today reported in 2014 that Millennials were "entering the workplace in the face of demographic change and an increasingly multi-generational workplace". Even though research has shown that Millennials are joining the workforce during a tough economic time, they have remained optimistic: about nine out of ten Millennials surveyed by the Pew Research Center said that they currently have enough money or that they will eventually reach their long-term financial goals.
Peter Pan generation:
American sociologist Kathleen Shaputis labeled Millennials the boomerang generation or Peter Pan generation because of their perceived tendency to delay some rites of passage into adulthood longer than most generations before them. These labels also refer to a trend of members living with their parents for longer periods than previous generations.
According to Kimberly Palmer, "High housing prices, the rising cost of higher education, and the relative affluence of the older generation are among the factors driving the trend." However, other explanations are seen as contributing.
Questions about what clearly defines adulthood also shape the debate over delayed transitions into adulthood and the emergence of a new life stage, Emerging Adulthood. For instance, a 2012 study by professors at Brigham Young University found that college students are more likely to define "adult" based on certain personal abilities and characteristics rather than on more traditional "rite of passage" events.
Larry Nelson, one of the three marriage, family, and human development professors to perform the study, also noted that "In prior generations, you get married and you start a career and you do that immediately. What young people today are seeing is that approach has led to divorces, to people unhappy with their careers … The majority want to get married […] they just want to do it right the first time, the same thing with their careers."
The economy has had a dampening effect on Millennials' ability to date and get married. In 2012, the average American couple spent an average of over $27,000 on their wedding. A 2013 joint study by sociologists at the University of Virginia and Harvard University found that the decline and disappearance of stable full-time jobs with health insurance and pensions for people who lack a college degree has had profound effects on working-class Americans, who now are less likely to marry and have children within marriage than those with college degrees.
Data from a 2014 study of US Millennials revealed that over 56% consider themselves part of the working class, while only about 35% consider themselves part of the middle class; this middle-class identification polled lower than in any other generation.
Religion:
In the United States, Millennials are the least likely to be religious. There is a trend towards irreligion that has been increasing since the 1940s. 29 percent of Americans born between 1983 and 1994 are irreligious, as opposed to 21 percent born between 1963 and 1981, 15 percent born between 1948 and 1962 and only 7 percent born before 1948. A 2005 study looked at 1,385 people aged 18 to 25 and found that more than half of those in the study said that they pray regularly before a meal.
One-third said that they discussed religion with friends, attended religious services, and read religious material weekly. Twenty-three percent of those studied did not identify themselves as religious practitioners. A Pew Research Center study on Millennials shows that of those between 18–29 years old, only 3% of these emerging adults self-identified as "atheists" and only 4% self-identified as "agnostics". Overall, 25% of Millennials are "Nones" and 75% are religiously affiliated.
Over half of Millennials polled in the United Kingdom in 2013 said they had 'no religion nor attended a place of worship', other than for a wedding or a funeral. 25% said they 'believe in a God', while 19% believed in a 'spiritual greater power' and 38% said they did not believe in God or any other 'greater spiritual power'. The poll also found that 41% thought religion was more often 'the cause of evil' in the world than of good.
Digital technology:
In their 2007 book, authors Junco and Mastrodicasa expanded on the work of William Strauss and Neil Howe to include research-based information about the personality profiles of Millennials, especially as it relates to higher education. They conducted a large-sample (7,705) research study of college students, finding that Next Generation college students, born between 1983 and 1992, were frequently in touch with their parents and used technology at higher rates than people from other generations.
In their survey, they found that 97% of these students owned a computer, 94% owned a mobile phone, and 56% owned an MP3 player. They also found that students spoke with their parents an average of 1.5 times a day about a wide range of topics. Other findings in the Junco and Mastrodicasa survey revealed 76% of students used instant messaging, 92% of those reported multitasking while instant messaging, 40% of them used television to get most of their news, and 34% of students surveyed used the Internet as their primary news source.
Gen Xers and Millennials were the first to grow up with computers in their homes. In a 1999 speech at the New York Institute of Technology, Microsoft Chairman and CEO Bill Gates encouraged America's teachers to use technology to serve the needs of the first generation of kids to grow up with the Internet.
Many Millennials enjoy a 250+-channel home cable TV universe. One of the more popular forms of media use by Millennials is social networking. In 2010, research published in the Elon Journal of Undergraduate Research claimed that students who used social media and decided to quit showed the same withdrawal symptoms as a drug addict quitting a stimulant.
Marc Prensky coined the term "digital native" to describe "K through college" students in 2001, explaining they "represent the first generations to grow up with this new technology." Millennials are identified as "digital natives" by the Pew Research Center which conducted a survey titled Millennials in Adulthood.
Millennials use social networking sites, such as Facebook, to create a different sense of belonging, make acquaintances, and remain connected with friends. Millennials are the generation that uses social media the most, with 59% using it to find information on people and events, compared with only 29% of previous generations; 88% of Millennials use Facebook as their primary source of news.
In the Frontline episode "Generation Like", there is discussion about Millennials and their dependence on technology and ways to commoditize the social media sphere.
Cultural identity:
Strauss & Howe's book titled Millennials Rising: The Next Great Generation describes the Millennial generation as "civic-minded", rejecting the attitudes of the Baby Boomers and Generation X.
Since the 2000 U.S. Census, which allowed people to select more than one racial group, Millennials have asserted in large numbers the ideal that all of their heritages should be respected, counted, and acknowledged.
Generally speaking, Millennials are the children of Baby Boomers or Generation Xers, while some older members may have parents from the Silent Generation. A 2013 poll in the United Kingdom found that Generation Y was more "open-minded than their parents on controversial topics". Of those surveyed, nearly 65% supported same-sex marriage.
A 2013 Pew Research Poll concluded that 84% of Millennials, born since 1980 and then between the ages of 18 and 32, favor legalizing the use of marijuana.
In 2015, the Pew Research Center also conducted research on generational identity, finding that Millennials, or members of Generation Y, are less likely to strongly identify with the generational term than Generation X or the baby boomers. It also found that Millennials most often chose to define themselves with negative terms such as self-absorbed, wasteful, or greedy. In this 2015 report, Pew defined Millennials as those born from 1981 onward.
Millennials came of age at a time when the entertainment industry began to be affected by the Internet.
In addition to being the most ethnically and racially diverse of the generations, Millennials are also on pace to be the most educated. As of 2008, 39.6% of Millennials between the ages of 18 and 24 were enrolled in college, an American record.
Along with being educated, Millennials are also very upbeat. As stated above in the economic prospects section, about 9 out of 10 Millennials feel as though they have enough money or that they will reach their long-term financial goals, even during the tough economic times, and they are more optimistic about the future of the U.S.
Additionally, Millennials are more open to change than older generations. According to a 2008 Pew Research Center survey, Millennials are the most likely of any generation to self-identify as liberal and are more supportive of a progressive domestic social agenda than older generations. Finally, Millennials are the least overtly religious of the generations: about one in four is unaffiliated with any religion, a far higher share than among older generations when they were the same age.
Inclusion and self-identification debate:
Due in part to the frequent birth-year overlap, and the resulting incongruence, between attempts to define Generation X and Millennials, a growing number of individuals born in the late 1970s or early 1980s identify with neither of these traditional generations, or see themselves as trans-generational, identifying with both in whole or in part. Some attempts to define those born in the overlapping years have given rise to inter- or micro-generations, such as Xennials, The Lucky Ones, Generation Catalano, and the Oregon Trail Generation.
In an article written for Popsugar, author Ashley Paige states that while Generation Z birth dates are commonly described as starting in 1995, she relates to them more than Millennials: "With a birth date near the end of 1994, I feel like I can't fully identify with millennials, especially in regards to 'the golden days.' Yeah, I remember when no one had cell phones — but only till sixth grade. Yeah, I thought Titanic was amazing — in 2005, when I was first allowed to see it. And no, I can't recall a time in my life when there wasn't a computer in my house. In second grade, I learned to use Google."
Terminology:
Authors William Strauss and Neil Howe are widely credited with naming the Millennials. They coined the term in 1987, around the time the children born in 1982 were entering preschool, and the media were first identifying their prospective link to the millennial year as the high school graduating class of the year 2000. They wrote about the cohort in their 1991 book Generations: The History of America's Future, 1584 to 2069, and released a book in 2000 titled Millennials Rising: The Next Great Generation.
In August 1993, an Ad Age editorial coined the phrase Generation Y to describe those who were aged 11 or younger as well as the teenagers of the upcoming ten years who were defined as different from Generation X.
Since then, Ad Age has sometimes used 1982 as the starting birth year. According to Horovitz, in 2012, Ad Age "threw in the towel by conceding that Millennials is a better name than Gen Y", and by 2014, a past director of data strategy at Ad Age said to NPR "the Generation Y label was a placeholder until we found out more about them".
Alternative names for this group proposed in the past include Generation We, Global Generation, Generation Next and the Net Generation. Millennials are sometimes also called Echo Boomers, referring to the generation's size relative to the Baby Boomer generation and due to the significant increase in birth rates during the 1980s and into the 1990s.
In the United States, birth rates peaked in August 1990 and a 20th-century trend toward smaller families in developed countries continued. Newsweek used the term Generation 9/11 to refer to young people who were between the ages of 10 and 20 years during the September 11 attacks.
The first reference to "Generation 9/11" was made in the cover story of the November 12, 2001 issue of Newsweek. In his book The Lucky Few: Between the Greatest Generation and the Baby Boom, author Elwood Carlson called the generation the "New Boomers" (born 1983–2001), based on the upswing in births after 1983, finishing with the "political and social challenges" that occurred after the terrorist acts of September 11, 2001, and the "persistent economic difficulties" of the time.
Chinese Millennials (in China, commonly called the 1980s and 1990s generations) were examined and contrasted with American Millennials at a 2015 conference in Shanghai organized by University of Southern California's US-China Institute. Findings included Millennials' marriage, childbearing, and child raising preferences, life and career ambitions, and attitudes towards volunteerism and activism.
Date and age range defining:
Authors Strauss and Howe use 1982 as the Millennials' starting birth year and 2004 as the last birth year, but Howe described the dividing line between Millennials and the following Generation Z as "tentative", saying, "you can’t be sure where history will someday draw a cohort dividing line until a generation fully comes of age."
In 2009, Australian McCrindle Research Center used 1980–1994 as Generation Y birth dates, starting with a recorded rise in birth rates, and fitting their newer definition of a generational span as 15 years. An earlier McCrindle report in 2006 gave a range of 1982–2000, in a document titled "Report on the Attitudes and Views of Generations X and Y on Superannuation".
In 2013, a global generational study conducted by PricewaterhouseCoopers with the University of Southern California and the London Business School defined Millennials as those born between 1980 and 1995.
In May 2013, a Time magazine cover story identified Millennials as those born from 1980 or 1981 to the year 2000.
For the purpose of a 2015 study, the Pew Research Center, an American think tank, defined Millennials as those born between 1981 and 1997. According to Pew, as the 16-year span of Millennial birth years is "already about as wide a range as those of the other living generations, [...] it seems likely that in the near future the youngest adults will be members of a post-Millennial generation."
In 2014, Dale Carnegie Training and MSW Research released a comparative study of Millennials and other generations in the workplace, describing "Millennial" birth years as 1980 to 1996. Gallup Inc., an American research-based global performance-management consulting company, also tends to use 1980–1996 as birth years.
In 2015, Statistics Canada, the country's official statistical agency, defined 1992 as the last year of birth for Generation Y.
Various other sources put the births of Millennials between 1983 and 2000, particularly in the United States and Canada.
Traits:
Authors William Strauss and Neil Howe believe that each generation has common characteristics that give it a specific character, with four basic generational archetypes repeating in a cycle. According to their theory, Millennials will become more like the "civic-minded" G.I. Generation, with a strong sense of community both local and global.
Strauss and Howe's research has been influential, but it also has critics. Jean Twenge, the author of the 2006 book Generation Me, considers Millennials, along with younger members of Generation X, to be part of what she calls "Generation Me". Twenge attributes Millennials with the traits of confidence and tolerance, but also identifies a sense of entitlement and narcissism based on personality surveys that showed increasing narcissism among Millennials compared to preceding generations when they were teens and in their twenties. She questions the predictions of Strauss and Howe that this generation will come out civic-minded.
The University of Michigan's "Monitoring the Future" study of high school seniors (conducted continually since 1975) and the American Freshman survey, conducted by UCLA's Higher Education Research Institute of new college students since 1966, showed an increase in the proportion of students who consider wealth a very important attribute, from 45% for Baby Boomers (surveyed between 1967 and 1985) to 70% for Gen Xers, and 75% for Millennials.
The percentage who said it was important to keep abreast of political affairs fell, from 50% for Baby Boomers to 39% for Gen Xers, and 35% for Millennials. The notion of "developing a meaningful philosophy of life" decreased the most across generations, from 73% for Boomers to 45% for Millennials. The willingness to be involved in an environmental cleanup program dropped from 33% for Baby Boomers to 21% for Millennials.
In March 2014, the Pew Research Center issued a report about how "Millennials in adulthood" are "detached from institutions and networked with friends." The report said Millennials are somewhat more upbeat than older adults about America's future, with 49% of Millennials saying the country’s best years are ahead though they're the first in the modern era to have higher levels of student loan debt and unemployment.
Fred Bonner, a Samuel DeWitt Proctor Chair in Education at Rutgers University and author of Diverse Millennial Students in College: Implications for Faculty and Student Affairs, believes that much of the commentary on the Millennial Generation may be partially accurate, but overly general and that many of the traits they describe apply primarily to "white, affluent teenagers who accomplish great things as they grow up in the suburbs, who confront anxiety when applying to super-selective colleges, and who multitask with ease as their helicopter parents hover reassuringly above them."
During class discussions, Bonner listened to black and Hispanic students describe how some or all of the so-called core traits did not apply to them. They often said that the "special" trait, in particular, is unrecognizable. Other socio-economic groups often do not display the same attributes commonly attributed to Millennials. "It's not that many diverse parents don't want to treat their kids as special," he says, "but they often don't have the social and cultural capital, the time and resources, to do that."
In 2008, author Ron Alsop called the Millennials "Trophy Kids," a term that reflects a trend in competitive sports, as well as many other aspects of life, where mere participation is frequently enough for a reward. It has been reported that it's an issue in corporate environments.
Some employers are concerned that Millennials have too great expectations from the workplace. Some studies predict they will switch jobs frequently, holding many more jobs than Gen Xers due to their great expectations. Newer research shows that Millennials change jobs for the same reasons as other generations—namely, more money and a more innovative work environment.
They look for versatility and flexibility in the workplace, and strive for a strong work–life balance in their jobs and have similar career aspirations to other generations, valuing financial security and a diverse workplace just as much as their older colleagues.
Educational sociologist Andy Furlong described Millennials as optimistic, engaged, and team players.
In his book, Fast Future, author David Burstein describes Millennials' approach to social change as "pragmatic idealism" with a deep desire to make the world a better place combined with an understanding that doing so requires building new institutions while working inside and outside existing institutions.
Political viewsAccording to a 2013 article in The Economist, surveys of political attitudes among Millennials in the United Kingdom suggest increasingly liberal attitudes with regard to social and cultural issues, as well as higher overall support for classical liberal economic policies than preceding generations. They are more likely to support same-sex marriage and the legalization of drugs.
The Economist parallels this with Millennials in the United States, whose attitudes are more supportive of social liberal policies and same-sex marriage relative to other demographics, though less supportive of abortion than Gen X were in the early 1990s. They are also more likely to oppose animal testing for medical purposes.
A 2014 poll for the libertarian Reason magazine suggested that US Millennials were social liberals and fiscal centrists more often than their global peers. The magazine predicted that millennials would become more conservative on fiscal issues once they started paying taxes.
Demographics in the United States William Strauss and Neil Howe projected in their 1991 book Generations that the U.S. Millennial population would be 76 million.
Later Neil Howe revised the number to over 95 million people (in the U.S.). As of 2012, it was estimated that there were approximately 80 million U.S. Millennials. The estimated number of the U.S. Millennials in 2015 is 83.1 million people.
Economic prospects:
Economic prospects for some Millennials have declined largely due to the Great Recession in the late 2000s.
Several governments have instituted major youth employment schemes out of fear of social unrest due to the dramatically increased rates of youth unemployment. In Europe, youth unemployment levels were very high (56% in Spain, 44% in Italy, 35% in the Baltic states, 19.1% in Britain and more than 20% in many more).
In 2009, leading commentators began to worry about the long term social and economic effects of the unemployment. Unemployment levels in other areas of the world were also high, with the youth unemployment rate in the U.S. reaching a record 19.1% in July 2010 since the statistic started being gathered in 1948.
In Canada, unemployment among youths in July 2009 was 15.9%, the highest it had been in 11 years. Underemployment is also a major factor. In the U.S. the economic difficulties have led to dramatic increases in youth poverty, unemployment, and the numbers of young people living with their parents. In April 2012, it was reported that half of all new college graduates in the US were still either unemployed or underemployed. It has been argued that this unemployment rate and poor economic situation has given Millennials a rallying call with the 2011 Occupy Wall Street movement. However, according to Christine Kelly, Occupy is not a youth movement and has participants that vary from the very young to very old.
A variety of names have emerged in different European countries particularly hard hit following the financial crisis of 2007–2008 to designate young people with limited employment and career prospects. These groups can be considered to be more or less synonymous with Millennials, or at least major sub-groups in those countries.
The Generation of €700 is a term popularized by the Greek mass media and refers to educated Greek twixters of urban centers who generally fail to establish a career. In Greece, young adults are being "excluded from the labor market" and some "leave their country of origin to look for better options". They're being "marginalized and face uncertain working conditions" in jobs that are unrelated to their educational background, and receive the minimum allowable base salary of €700. This generation evolved in circumstances leading to the Greek debt crisis and some participated in the 2010–2011 Greek protests.
In Spain, they are known as the mileurista (for a monthly salary of €1,000); in France, "The Precarious Generation"; and in Italy, similarly, the milleurista: the generation of 1,000 euros.
In 2015, Millennials in New York City were reported as earning 20% less than the generation before them, as a result of entering the workforce during the great recession. Despite higher college attendance rates than Generation X, many were stuck in low-paid jobs, with the percentage of degree-educated young adults working in low-wage industries rising from 23% to 33% between 2000 and 2014.
"Generation Flux" is a psychographic (not demographic) designation coined by Fast Company for American employees who need to make several career changes throughout their working lives because of the chaotic nature of the job market following the Great Recession.
Societal change has been accelerated by the use of social media, smartphones, mobile computing, and other new technologies. Those in "Generation Flux" have birth years in the ranges of both Generation X and Millennials. "Generation Sell" was used by author William Deresiewicz to describe Millennials' interest in small businesses.
According to Forbes, Millennials will make up approximately half of the U.S. workforce by 2020. Millennials are the most educated and culturally diverse group of all generations and they are also hard to please when it comes to employers.
To address these new challenges, many large firms are currently studying the social and behavioral patterns of Millennials and are trying to devise programs that decrease inter-generational estrangement, and increase relationships of reciprocal understanding between older employees and Millennials.
The UK's Institute of Leadership & Management researched the gap in understanding between Millennial recruits and their managers in collaboration with Ashridge Business School. The findings included high expectations for advancement, salary and for a coaching relationship with their manager, and suggested that organizations will need to adapt to accommodate and make the best use of Millennials.
In an example of a company trying to do just this, Goldman Sachs conducted training programs that used actors to portray Millennials who assertively sought more feedback, responsibility, and involvement in decision making. After the performance, employees discussed and debated the generational differences they saw played out.
According to a Bloomberg L.P. article, Millennials have benefited the least from the economic recovery following the Great Recession, as average incomes for this generation have fallen at twice the general adult population's total drop and are likely to be on a path toward lower incomes for at least another decade: "Three and a half years after the worst recession since the Great Depression, the earnings and employment gap between those in the under-35 population and their parents and grandparents threatens to unravel the American dream of each generation doing better than the last. The nation's younger workers have benefited least from an economic recovery that has been the most uneven in recent history."
USA Today reported in 2014 that Millennials were "entering the workplace in the face of demographic change and an increasingly multi-generational workplace". Even though research has shown that Millennials are joining the workforce during a tough economic time, they have remained optimistic: about nine out of ten Millennials surveyed by the Pew Research Center said that they currently have enough money or will eventually reach their long-term financial goals.
Peter Pan generation:
American sociologist Kathleen Shaputis labeled Millennials the boomerang generation or Peter Pan generation, because of its members' perceived tendency to delay some rites of passage into adulthood for longer periods than most generations before them. These labels also refer to a trend of members living with their parents for longer periods than previous generations did.
According to Kimberly Palmer, "High housing prices, the rising cost of higher education, and the relative affluence of the older generation are among the factors driving the trend." However, other explanations are seen as contributing.
Questions regarding a clear definition of what it means to be an adult also bear on the debate about delayed transitions into adulthood and the emergence of a new life stage, Emerging Adulthood. For instance, a 2012 study by professors at Brigham Young University found that college students are more likely to define "adult" based on certain personal abilities and characteristics rather than on more traditional "rite of passage" events.
Larry Nelson, one of the three marriage, family, and human development professors to perform the study, also noted that "In prior generations, you get married and you start a career and you do that immediately. What young people today are seeing is that approach has led to divorces, to people unhappy with their careers … The majority want to get married […] they just want to do it right the first time, the same thing with their careers."
The economy has had a dampening effect on Millennials' ability to date and get married. In 2012, the average American couple spent an average of over $27,000 on their wedding. A 2013 joint study by sociologists at the University of Virginia and Harvard University found that the decline and disappearance of stable full-time jobs with health insurance and pensions for people who lack a college degree has had profound effects on working-class Americans, who now are less likely to marry and have children within marriage than those with college degrees.
Data from a 2014 study of US Millennials revealed that over 56% of the group consider themselves part of the working class, while only about 35% consider themselves middle class; this class identity is the lowest polled of any generation.
Religion:
In the United States, Millennials are the least likely to be religious. There is a trend towards irreligion that has been increasing since the 1940s. 29 percent of Americans born between 1983 and 1994 are irreligious, as opposed to 21 percent born between 1963 and 1981, 15 percent born between 1948 and 1962 and only 7 percent born before 1948. A 2005 study looked at 1,385 people aged 18 to 25 and found that more than half of those in the study said that they pray regularly before a meal.
One-third said that they discussed religion with friends, attended religious services, and read religious material weekly. Twenty-three percent of those studied did not identify themselves as religious practitioners. A Pew Research Center study on Millennials shows that of those between 18–29 years old, only 3% of these emerging adults self-identified as "atheists" and only 4% self-identified as "agnostics". Overall, 25% of Millennials are "Nones" and 75% are religiously affiliated.
Over half of Millennials polled in the United Kingdom in 2013 said they had no religion and did not attend a place of worship, other than for a wedding or a funeral. 25% said they 'believe in a God', while 19% believed in a 'spiritual greater power' and 38% said they did not believe in God or any other 'greater spiritual power'. The poll also found 41% thought religion is 'the cause of evil' in the world more often than good.
Digital technology:
In their 2007 book, authors Junco and Mastrodicasa expanded on the work of William Strauss and Neil Howe to include research-based information about the personality profiles of Millennials, especially as it relates to higher education. They conducted a large-sample (7,705) research study of college students and found that Next Generation college students, born between 1983 and 1992, were frequently in touch with their parents and used technology at higher rates than people from other generations.
In their survey, they found that 97% of these students owned a computer, 94% owned a mobile phone, and 56% owned an MP3 player. They also found that students spoke with their parents an average of 1.5 times a day about a wide range of topics. Other findings in the Junco and Mastrodicasa survey revealed 76% of students used instant messaging, 92% of those reported multitasking while instant messaging, 40% of them used television to get most of their news, and 34% of students surveyed used the Internet as their primary news source.
Gen Xers and Millennials were the first to grow up with computers in their homes. In a 1999 speech at the New York Institute of Technology, Microsoft Chairman and CEO Bill Gates encouraged America's teachers to use technology to serve the needs of the first generation of kids to grow up with the Internet.
Many Millennials enjoy a 250-plus-channel home cable TV universe. One of the more popular forms of media use by Millennials is social networking. In 2010, research published in the Elon Journal of Undergraduate Research claimed that students who used social media and then decided to quit showed the same withdrawal symptoms as a drug addict quitting a stimulant.
Marc Prensky coined the term "digital native" to describe "K through college" students in 2001, explaining they "represent the first generations to grow up with this new technology." Millennials are identified as "digital natives" by the Pew Research Center which conducted a survey titled Millennials in Adulthood.
Millennials use social networking sites, such as Facebook, to create a different sense of belonging, make acquaintances, and to remain connected with friends. The Millennials are the generation that uses social media the most, with 59% of Millennials using it to find information on people and events, compared to only 29% of previous generations. 88% of Millennials use Facebook as their primary source of news.
In the Frontline episode "Generation Like", there is discussion about Millennials and their dependence on technology and ways to commoditize the social media sphere.
Cultural identity:
Strauss & Howe's book titled Millennials Rising: The Next Great Generation describes the Millennial generation as "civic-minded", rejecting the attitudes of the Baby Boomers and Generation X.
Since the 2000 U.S. Census, which allowed people to select more than one racial group, Millennials in large numbers have asserted the ideal that all their heritages should be respected, counted, and acknowledged.
Generally speaking, Millennials are the children of Baby Boomers or Generation Xers, while some older members may have parents from the Silent Generation. A 2013 poll in the United Kingdom found that Generation Y was more "open-minded than their parents on controversial topics". Of those surveyed, nearly 65% supported same-sex marriage.
A 2013 Pew Research Poll concluded that 84% of Millennials, born since 1980 and then between the ages of 18 and 32, favor legalizing the use of marijuana.
In 2015, the Pew Research Center also conducted research on generational identity. It found that Millennials, or members of Generation Y, are less likely to strongly identify with the generational term than Generation X or the baby boomers, and that Millennials most often chose to define themselves with negative terms such as self-absorbed, wasteful, or greedy. In this 2015 report, Pew defined Millennials as those born from 1981 onward.
Millennials came of age at a time when the entertainment industry began to be affected by the Internet.
In addition to being the most ethnically and racially diverse generation to date, Millennials are on pace to be the most educated. As of 2008, 39.6% of Millennials between the ages of 18 and 24 were enrolled in college, an American record.
Along with being educated, Millennials are also very upbeat. As stated above in the economic prospects section, about 9 out of 10 Millennials feel as though they have enough money or that they will reach their long-term financial goals, even during the tough economic times, and they are more optimistic about the future of the U.S.
Additionally, Millennials are more open to change than older generations. According to a 2008 Pew Research Center survey, Millennials are the most likely of any generation to self-identify as liberal and are more supportive of a progressive domestic social agenda than older generations. Finally, Millennials are the least overtly religious of the generations: about one in four is unaffiliated with any religion, a far higher share than among older generations when they were the same age.
Inclusion and self-identification debate:
Due in part to the frequent birth-year overlap and resulting incongruence between attempts to define Generation X and Millennials, a growing number of individuals born in the late 1970s or early 1980s identify with neither of these traditional generations, or identify with both, whether wholly or in part. Some attempts to define those born in the overlapping Generation X and Millennial years have given rise to inter- or micro-generations, such as Xennials, The Lucky Ones, Generation Catalano, and the Oregon Trail Generation.
In an article written for Popsugar, author Ashley Paige states that while Generation Z birth dates are commonly described as starting in 1995, she relates to them more than Millennials: "With a birth date near the end of 1994, I feel like I can't fully identify with millennials, especially in regards to 'the golden days.' Yeah, I remember when no one had cell phones — but only till sixth grade. Yeah, I thought Titanic was amazing — in 2005, when I was first allowed to see it. And no, I can't recall a time in my life when there wasn't a computer in my house. In second grade, I learned to use Google."
See also:
Generation Z
YouTube Video: Introducing Generation Z
Generation Z (commonly abbreviated to Gen Z, also known as iGeneration, Homeland Generation or Plurals) are the generation of people born after Generation Y/Millennials.
The generation is most commonly defined with birth years starting in the mid-1990s, although the early or late 1990s and early 2000s have also been used as starting birth years for this generation.
A significant aspect of this generation is its widespread usage of the internet from a young age. Members of Generation Z are typically thought of as being comfortable with technology, and interacting on social media websites for a significant portion of their socializing. Some commentators have suggested that growing up through the Great Recession has given the cohort a feeling of unsettlement and insecurity.
Authors William Strauss and Neil Howe wrote several books on the subject of generations and are widely credited with coining the term Millennials. Howe has said "No one knows who will name the next generation after the Millennials". In 2005, their company sponsored an online contest in which respondents voted overwhelmingly for the name Homeland Generation. That was not long after the September 11th terrorist attacks, and one fallout of the disaster was that Americans may have felt safer staying at home. Howe has described himself as "not totally wed" to the name and cautioned that "names are being invented by people who have a great press release. Everyone is looking for a hook."
In 2012, USA Today sponsored an online contest for readers to choose the name of the next generation after the Millennials. The name Generation Z was suggested, although journalist Bruce Horovitz thought that some might find the term "off-putting". Some other names that were proposed included: iGeneration, Gen Tech, Gen Wii, Net Gen, Digital Natives, and Plurals.
iGeneration (or iGen) is a name that several individuals claim to have coined. Psychology professor and author Jean Twenge claims that the name iGen "just popped into her head" while she was driving near Silicon Valley, and that she had intended to use it as the title of her 2006 book Generation Me until it was overridden by her publisher. Demographer Cheryl Russell claims to have first used the term in 2009.
Matt Carmichael, a past director of data strategy at Ad Age, said in 2012 "we think iGen is the name that best fits and will best lead to understanding of this generation".
In 2014, an NPR news intern noted that iGeneration "seems to be winning" as the name for the post-Millennials. It has been described as "a wink and nod to Apple's iPod and iPhone", while former Ad Age writer Matt Carmichael notes that the lowercase "i" in iGeneration "leaves room for interpretation" and "could be any number of things: It could be for interactive, it could be for international, it could be for something we haven't thought of yet."
In response to naming a generation after a branded product, Randy Apuzzo, technologist and CEO of Zesty.io, published an article titled "Always Connected: Generation Z, the Digitarians", in which he calls the new generation 'Digitarians' because they are the first generation that has been "always connected to the internet" and were raised with touch devices.
Statistics Canada has noted that the cohort is sometimes referred to as "the Internet generation," as it is the first generation to have been born after the invention of the Internet. Criticism of the term has been that it could "sound like an adapter used to charge your phone on the bus".
Frank N. Magid Associates, an advertising and marketing agency, nicknamed this cohort "The Pluralist Generation" or 'Plurals'. Turner Broadcasting System also advocated calling the post-millennial generation 'Plurals'. The Futures Company has named this cohort "The Centennials".
In Japan, the cohort is described as "Neo-Digital Natives", a step beyond the previous cohort described as "Digital Natives". Digital Natives primarily communicate by text or voice, while neo-digital natives use video or movies. This emphasizes the shift from PC to mobile and text to video among the neo-digital population.
MTV has labeled the generation that follows Millennials "The Founders", based on the results of a survey they conducted in March 2015. MTV President Sean Atkins commented, "they have this self-awareness that systems have been broken, but they can't be the generation that says we'll break it even more."
Author Neil Howe defines the cohort as people born from approximately 2005–2025, but describes the dividing line between Generation Z and Millennials as "tentative", saying, "you can't be sure where history will someday draw a cohort dividing line until a generation fully comes of age". He noted that the Millennials' range beginning in 1982 would point to the next generation's window starting between 2000 and 2006.
Australia's McCrindle Research Center defines Generation Z as those born between 1995–2009, starting with a recorded rise in birth rates, and fitting their newer definition of a generational span as 15 years. A previous McCrindle report used 2001 as the starting point for this generation.
Adweek defines Generation Z as those born in 1995 or later. Many other researchers also put the starting date for Generation Z around the mid-1990s, in 1995 or 1996.
In Japan, generations are defined by ten-year spans, with "Neo-Digital Natives" beginning after 1996.
Statistics Canada defines Generation Z as starting with the birth year 1993. The Futures Company uses 1997 as the first year of birth for this cohort. Frank N. Magid Associates, an advertising and marketing agency, uses birth years starting from 1997 into the present day.
According to Forbes, in 2015 Generation Z made up 25% of the U.S. population, making them a larger cohort than the Baby Boomers or Millennials. Frank N. Magid Associates estimates that in the United States, 55% of Plurals are Caucasian, 24% are Hispanic, 14% are African-American, 4% are Asian, and 4% are multiracial or other.
The oldest members of Generation Z will be the first to come of age after same-sex marriage was legalized nationally in 2015. Generation Z is the first to overwhelmingly approve of same-sex marriage in their adolescence. According to a Frank N. Magid Associates white paper, generation Z exhibits positive feelings about the increasing ethnic diversity in the U.S., and they are more likely than older generations to have social circles that include people from different ethnic groups, races and religions.
Generation Z are predominantly the children of Generation X. According to marketing firm Frank N. Magid they are "the least likely to believe that there is such a thing as the American Dream", while Baby Boomers and their Millennial children are more likely to believe in it. According to Public Relations Society of America, the Great Recession has taught Generation Z to be independent, and has led to an entrepreneurial desire, after seeing their parents and older siblings struggle in the workforce.
Business Insider describes Generation Z as more conservative, more money-oriented, more entrepreneurial and pragmatic about money compared to Millennials. A 2013 survey by Ameritrade found that 46% of Generation Z in the United States (considered here to be those between the ages of 14 and 23) were concerned about student debt, while 36% were worried about being able to afford a college education at all. This generation is faced with a growing income gap and a shrinking middle-class, which all have led to increasing stress levels in families.
Both the September 11 terrorist attacks and the Great Recession have greatly influenced the attitudes of this generation in the United States. The oldest members of Generation Z were 7 to 8 years old when the 9/11 attacks occurred. Turner suggests it is likely that both events have produced a feeling of unsettlement and insecurity among Generation Z about the environment in which they were raised.
The economic recession of 2008 is particularly important among the historical events that have shaped Generation Z, because their childhoods may have been affected by the recession's shadow; that is, by the financial stresses felt by their parents. Although the Millennials experienced these events during their coming of age, Generation Z lived through them as part of its childhood, affecting its realism and worldview. Obama's rise to the presidency has also played a fundamental role in providing an identity to Generation Z.
A 2014 study, Generation Z Goes to College, found that Generation Z students self-identify as loyal, compassionate, thoughtful, open-minded, responsible, and determined. How they see their Generation Z peers is quite different from how they see themselves: they view their peers as competitive, spontaneous, adventuresome, and curious, all characteristics that they do not see readily in themselves.
Generation Z is generally more risk-averse in certain activities than the Millennials. In 2013, 66% of teenagers (older members of Generation Z) had tried alcohol, down from 82% in 1991. Also in 2013, 8% of Generation Z teenagers never or rarely wore a seat belt when riding in a car with someone else, as opposed to 26% in 1991.
A 2016 U.S. study found that church attendance during young adulthood was 41% among Generation Z, compared with 18% for Millennials at the same ages, 21% for Generation X, and 26% for baby boomers.
Generation Z is the first to have internet technology so readily available at a very young age. With the web revolution that occurred throughout the 1990s, they have been exposed to an unprecedented amount of technology in their upbringing.
As technology became more compact and affordable, the popularity of smartphones in the United States grew exponentially. With 77% of 12–17 year olds owning a cellphone in 2015, technology has strongly influenced Generation Z in terms of communication and education. Forbes magazine suggested that by the time Generation Z entered the workplace, digital technology would be an aspect of almost all career paths.
Anthony Turner characterizes Generation Z as having a 'digital bond to the internet', and argues that it may help youth to escape from emotional and mental struggles they face offline. According to US consultancy Sparks and Honey in 2014, 41% of Generation Z spend more than three hours per day using computers for purposes other than schoolwork, compared to 22% in 2004.
In 2015, Generation Z comprised the largest portion of the U.S. population, at nearly 26%, edging out Millennials (24.5%), and the generation is estimated to generate $44 billion in annual spending.
About three-quarters of 13-to-17-year-olds use their cellphones daily, more than they watch TV. Over half of surveyed mothers say this demographic influences their purchasing decisions for toys, apparel, dinner choices, entertainment, TV, mobile, and computers. Among social media platforms, only Instagram is rising in popularity in this demographic.
In 2015, an estimated 150,000 apps, 10% of those in Apple's app store, were educational and aimed at children up to college level. While researchers and parents agree the change in educational paradigm is significant, the results of the changes are mixed. On one hand, smartphones offer the potential for deeper involvement in learning and more individualized instruction, thereby making this generation potentially better educated and more well-rounded.
On the other hand, some researchers and parents are concerned that the prevalence of smart phones will cause technology dependence and a lack of self-regulation that will hinder child development.
An online newspaper about texting, SMS and MMS writes that teens own cellphones without necessarily needing them. As children become teenagers, receiving a phone is considered a rite of passage in some countries, allowing the owner to be further connected with their peers and it is now a social norm to have one at an early age.
An article from the Pew Research Center stated that "nearly three-quarters of teens have or have access to a smartphone and 30% have a basic phone, while just 12% of teens 13 to 17 say they have no cell phone of any type".
These numbers are only rising, and the fact that the majority of Gen Z'ers own a cell phone has become one of this generation's defining characteristics. As a result, "24% of teens go online 'almost constantly'".
Teens are much more likely to share different types of information, as of 2012, compared to in 2006. However, they will take certain steps to protect certain information that they do not want being shared. They are more likely to "follow" others on social media than "share" and use different types of social media for different purposes. Focus group testing found that while teens may be annoyed by many aspects of Facebook, they continue to use it because participation is important in terms of socializing with friends and peers.
Twitter and Instagram are gaining popularity among members of Generation Z, with 24% (and growing) of teens with access to the Internet having Twitter accounts. This is, in part, because parents do not typically use these social networking sites.
Snapchat has also gained traction in Generation Z because videos, pictures, and messages send much faster than through regular messaging. Speed and reliability are important factors in how members of Generation Z choose a social networking platform. This need for quick communication is reflected in popular Generation Z apps like Vine and in the prevalent use of emojis.
In a study performed by psychologists it was found that young people use the internet as a way to gain access to information and to interact with others. Mobile technology, social media, and internet usage have become increasingly important to modern adolescents over the past decade.
Very few, however, are changed by what they gain access to online. Youths use the internet as a tool to gain social skills that they then apply to real-life situations, and to learn about things that interest them. Teens spend most of their time online in private communication with people they interact with outside the internet on a regular basis.
While social media is used for keeping up with global news and connections, it is mainly used for developing and maintaining relationships with people with whom they are close in proximity. The use of social media has become integrated into the daily lives of most Gen Z'ers who have access to mobile technology.
They use it daily to keep in contact with friends and family, particularly those they see every day. As a result, the increased use of mobile technology has caused Gen Z'ers to spend more time on their smartphones and social media, and online relationship development has become a new generational norm. Gen Z'ers are generally against the idea of photoshopping and against changing themselves to be considered perfect.
The parents of Gen Z'ers fear their children's overuse of the internet, and dislike their access to inappropriate information and images, as well as to social networking sites where children can encounter people from all over. Children, conversely, felt annoyed by their parents and complained that parents were overly controlling about their internet usage. Gen Z uses social media and other sites to strengthen bonds with friends and to develop new ones. They interact with people they otherwise would not have met in the real world, making social media a tool for identity creation.
Social media is known to be a vehicle through which members of Generation Z express both their daily lives and their beliefs. On the one hand, this use of social media makes issues such as racism in society even more visible.
On the other hand, when people attend events in support of certain social justice movements, members of Generation Z are much more likely to post on their social media pages about the event. In part, this is to further prove that they stand by their beliefs. Moreover, this also spreads awareness of the movement and leads to the growth of a movement.
Jason Dorsey, a notable Gen Y speaker who runs the Center for Generational Kinetics, stated in a TEDxHouston talk that this generation spans from 1996 to the present. He stressed notable differences in the way each generation consumes technology, such as smartphone usage at an earlier age: 18% of Generation Z think it is okay for a 13-year-old to have a smartphone, compared with 4% of earlier generations.
Education:
According to a Northeastern University survey, 81% of Generation Z believe obtaining a college degree is necessary to achieving their career goals. As Generation Z enters high school and starts preparing for college, a primary concern is paying for a college education without acquiring debt. Students report working hard in high school in the hope of earning scholarships, and hoping that their parents will pay the college costs not covered by scholarships.
Students also report interest in ROTC programs as a means of covering college costs. According to NeaToday, a publication by the National Education Association, two thirds of Gen Zers entering college are concerned about affording college. One third plan to rely on grants and scholarships and one quarter hope that their parents will cover the bulk of college costs.
While the cost of attending college is incredibly high for most Gen Zers, according to NeaToday 65% say the benefits of graduating from college exceed the costs. Generation Z college students prefer intrapersonal and independent learning over group work, yet like to do their solo work alongside others in a social manner when studying. They like their learning to be practical and hands-on, and want their professors to help them engage with and apply the content rather than simply share what they could otherwise find on their own online.
"Generation Z" is revolutionizing the educational system in many aspects. Thanks in part to a rise in the popularity of entrepreneurship, high schools and colleges across the globe are including entrepreneurship in their curriculums.
Employment prospects:
According to Hal Brotheim in Introducing Generation Z, they will be better future employees. With the skills needed to take advantage of advanced technologies, they will be significantly more helpful to the typical company in today's high tech world. Brotheim argues that their valuable characteristics are their acceptance of new ideas and different conception of freedom from the previous generations.
Despite the technological proficiency they possess, members of Generation Z actually prefer person to person contact as opposed to online interaction. As a result of the social media and technology they are accustomed to, Generation Z is well prepared for a global business environment.
Another important note to point out is Generation Z no longer wants just a job: they seek more than that. They want a feeling of fulfillment and excitement in their job that helps move the world forward. Generation Z is eager to be involved in their community and their futures.
Before college, Generation Z is already out in their world searching how to take advantage of relevant professional opportunities that will give them experience for the future.
In India, a 2016 survey by India's employee engagement and employer rating platform, JobBuzz.in, showed Generation Z professionals started out better in the job market compared to Generation Y.
Successors:
Mark McCrindle has suggested "Generation Alpha" and "Generation Glass" as names for the generation following Generation Z. McCrindle has predicted that this next generation will be "the most formally educated generation ever, the most technology supplied generation ever, and globally the wealthiest generation ever". He chose the name "Generation Alpha", noting that scientific disciplines often move to the Greek alphabet after exhausting the Roman alphabet.
Author Alexandra Levit has suggested that there may not be a need to name the next generation, as technology has rendered the traditional 15–20 year cohorts obsolete. Levit notes that she "can't imagine my college student babysitter having the same experience as my four-year-old", despite both being in Generation Z.
Matt Carmichael, former director of data strategy at Advertising Age, noted in 2015 that many groups were "competing to come up with the clever name" for the generation following Generation Z.
The generation is most commonly defined by birth years starting in the mid-1990s, although starting years ranging from the early 1990s to the early 2000s have also been used for this generation.
A significant aspect of this generation is its widespread usage of the internet from a young age. Members of Generation Z are typically thought of as being comfortable with technology, and interacting on social media websites for a significant portion of their socializing. Some commentators have suggested that growing up through the Great Recession has given the cohort a feeling of unsettlement and insecurity.
Authors William Strauss and Neil Howe wrote several books on the subject of generations and are widely credited with coining the term Millennials. Howe has said, "No one knows who will name the next generation after the Millennials". In 2005, their company sponsored an online contest in which respondents voted overwhelmingly for the name Homeland Generation. That was not long after the September 11th terrorist attacks, and one fallout of the disaster was that Americans may have felt safer staying at home. Howe has described himself as "not totally wed" to the name and cautioned that "names are being invented by people who have a great press release. Everyone is looking for a hook."
In 2012, USA Today sponsored an online contest for readers to choose the name of the next generation after the Millennials. The name Generation Z was suggested, although journalist Bruce Horovitz thought that some might find the term "off-putting". Some other names that were proposed included: iGeneration, Gen Tech, Gen Wii, Net Gen, Digital Natives, and Plurals.
iGeneration (or iGen) is a name that several individuals claim to have coined. Psychology professor and author Jean Twenge claims that the name iGen "just popped into her head" while she was driving near Silicon Valley, and that she had intended to use it as the title of her 2006 book Generation Me until she was overruled by her publisher. Demographer Cheryl Russell claims to have first used the term in 2009.
Matt Carmichael, a past director of data strategy at Ad Age, said in 2012 "we think iGen is the name that best fits and will best lead to understanding of this generation".
In 2014, an NPR news intern noted that iGeneration "seems to be winning" as the name for the post-Millennials. It has been described as "a wink and nod to Apple's iPod and iPhone", while former Ad Age writer Matt Carmichael notes that the lowercase "i" in iGeneration "leaves room for interpretation" and "could be any number of things: It could be for interactive, it could be for international, it could be for something we haven't thought of yet."
In response to naming a generation after a branded product, Randy Apuzzo, technologist and CEO of Zesty.io, published an article titled "Always Connected: Generation Z, the Digitarians", in which he calls the new generation 'Digitarians' because they are the first generation that has been "always connected to the internet" and were raised with touch devices.
Statistics Canada has noted that the cohort is sometimes referred to as "the Internet generation," as it is the first generation to have been born after the invention of the Internet. Criticism of the term has been that it could "sound like an adapter used to charge your phone on the bus".
Frank N. Magid Associates, an advertising and marketing agency, nicknamed this cohort "The Pluralist Generation" or 'Plurals'. Turner Broadcasting System also advocated calling the post-millennial generation 'Plurals'. The Futures Company has named this cohort "The Centennials".
In Japan, the cohort is described as "Neo-Digital Natives", a step beyond the previous cohort described as "Digital Natives". Digital Natives primarily communicate by text or voice, while neo-digital natives use video or movies. This emphasizes the shift from PC to mobile and text to video among the neo-digital population.
MTV has labeled the generation that follows Millennials "The Founders", based on the results of a survey they conducted in March 2015. MTV President Sean Atkins commented, “they have this self-awareness that systems have been broken, but they can’t be the generation that says we’ll break it even more.”
Author Neil Howe defines the cohort as people born from approximately 2005 to 2025, but describes the dividing line between Generation Z and Millennials as "tentative", saying, "you can’t be sure where history will someday draw a cohort dividing line until a generation fully comes of age". He noted that the Millennials' range beginning in 1982 would point to the next generation's window starting between 2000 and 2006.
Australia's McCrindle Research Center defines Generation Z as those born between 1995 and 2009, starting with a recorded rise in birth rates, and fitting its newer definition of a generational span as 15 years. A previous McCrindle report used 2001 as the starting point for this generation.
Adweek defines Generation Z as those born in 1995 or later. Many other researchers also put the starting date for Generation Z at around the mid-1990s, in or after 1995 and 1996.
In Japan, generations are defined by ten-year spans, with the "Neo-Digital Natives" beginning after 1996.
Statistics Canada defines Generation Z as starting with the birth year 1993. The Futures Company uses 1997 as the first year of birth for this cohort. Frank N. Magid Associates, an advertising and marketing agency, uses birth years starting from 1997 into the present day.
According to Forbes, in 2015 Generation Z made up 25% of the U.S. population, making them a larger cohort than the Baby Boomers or Millennials. Frank N. Magid Associates estimates that in the United States, 55% of Plurals are Caucasian, 24% are Hispanic, 14% are African-American, 4% are Asian, and 4% are multiracial or other.
The oldest members of Generation Z will be the first to come of age after same-sex marriage was legalized nationally in 2015. Generation Z is the first to overwhelmingly approve of same-sex marriage in their adolescence. According to a Frank N. Magid Associates white paper, Generation Z exhibits positive feelings about the increasing ethnic diversity in the U.S., and they are more likely than older generations to have social circles that include people from different ethnic groups, races and religions.
Generation Z are predominantly the children of Generation X. According to marketing firm Frank N. Magid they are "the least likely to believe that there is such a thing as the American Dream", while Baby Boomers and their Millennial children are more likely to believe in it. According to Public Relations Society of America, the Great Recession has taught Generation Z to be independent, and has led to an entrepreneurial desire, after seeing their parents and older siblings struggle in the workforce.
Business Insider describes Generation Z as more conservative, more money-oriented, more entrepreneurial and pragmatic about money compared to Millennials. A 2013 survey by Ameritrade found that 46% of Generation Z in the United States (considered here to be those between the ages of 14 and 23) were concerned about student debt, while 36% were worried about being able to afford a college education at all. This generation faces a growing income gap and a shrinking middle class, which have led to increasing stress levels in families.
Both the September 11 terrorist attacks and the Great Recession have greatly influenced the attitudes of this generation in the United States. The oldest members of Generation Z were 7 to 8 years old when the 9/11 attacks occurred. Turner suggests it is likely that both events have resulted in a feeling of unsettlement and insecurity among the people of Generation Z about the environment in which they were being raised.
The economic recession of 2008 is particularly important among the historical events that have shaped Generation Z, due to the ways in which their childhoods may have been affected by the recession's shadow; that is, the financial stresses felt by their parents. Although the Millennials experienced these events during their coming of age, Generation Z lived through them as part of their childhood, affecting their realism and world-view. Obama's rise to the presidency has also played a fundamental role in providing an identity to Generation Z.
A 2014 study, Generation Z Goes to College, found that Generation Z students self-identify as loyal, compassionate, thoughtful, open-minded, responsible, and determined. How they see their Generation Z peers is quite different from their own self-identity: they view their peers as competitive, spontaneous, adventuresome, and curious, all characteristics that they do not readily see in themselves.
Generation Z is generally more risk-averse in certain activities than the Millennials. In 2013, 66% of teenagers (the older members of Generation Z) had tried alcohol, down from 82% in 1991. Also in 2013, 8% of Generation Z teenagers never or rarely wore a seat belt when riding in a car with someone else, as opposed to 26% in 1991.
A 2016 U.S. study found that church attendance during young adulthood was 41% among Generation Z, compared with 18 percent for millennials at the same ages, 21 percent of Generation X, and 26 percent of baby boomers.
Generation Z is the first to have internet technology so readily available at a very young age. With the web revolution that occurred throughout the 1990s, they have been exposed to an unprecedented amount of technology in their upbringing.
As technology became more compact and affordable, the popularity of smartphones in the United States grew exponentially. With 77% of 12–17 year olds owning a cellphone in 2015, technology has strongly influenced Generation Z in terms of communication and education. Forbes magazine suggested that by the time Generation Z entered the workplace, digital technology would be an aspect of almost all career paths.
Anthony Turner characterizes Generation Z as having a 'digital bond to the internet', and argues that it may help youth to escape from emotional and mental struggles they face offline. According to US consultancy Sparks and Honey in 2014, 41% of Generation Z spend more than three hours per day using computers for purposes other than schoolwork, compared to 22% in 2004.
In 2015, Generation Z comprised the largest portion of the U.S. population, at nearly 26%, edging out Millennials (24.5%), and the generation is estimated to generate $44 billion in annual spending.
About three-quarters of 13–17-year-olds use their cellphones daily, more often than they watch TV. Over half of surveyed mothers say this demographic influences their purchasing decisions for toys, apparel, dinner choices, entertainment, TV, mobile and computers. Among social media platforms, only Instagram is gaining in popularity with this demographic.
In 2015, an estimated 150,000 apps, 10% of those in Apple's app store, were educational and aimed at children up to college level. While researchers and parents agree the change in educational paradigm is significant, the results of the changes are mixed. On one hand, smartphones offer the potential for deeper involvement in learning and more individualized instruction, thereby making this generation potentially better educated and more well-rounded.
On the other hand, some researchers and parents are concerned that the prevalence of smartphones will cause technology dependence and a lack of self-regulation that will hinder child development.
An online newspaper about texting, SMS and MMS writes that teens own cellphones without necessarily needing them. As children become teenagers, receiving a phone is considered a rite of passage in some countries, allowing the owner to be further connected with their peers, and it is now a social norm to have one at an early age.
An article from the Pew Research Center stated that "nearly three-quarters of teens have or have access to a smartphone and 30% have a basic phone, while just 12% of teens 13 to 17 say they have no cell phone of any type".
These numbers are only on the rise, and the fact that the majority of Gen Zers own a cell phone has become one of this generation's defining characteristics. As a result, "24% of teens go online 'almost constantly'".
As of 2012, teens were much more likely to share different types of information than they were in 2006. However, they will take certain steps to protect information that they do not want shared. They are more likely to "follow" others on social media than to "share", and they use different types of social media for different purposes. Focus group testing found that while teens may be annoyed by many aspects of Facebook, they continue to use it because participation is important for socializing with friends and peers.
Twitter and Instagram are seen to be gaining popularity among members of Generation Z, with 24% (and growing) of teens with access to the Internet having Twitter accounts. This is, in part, due to parents not typically using these social networking sites.
Snapchat has also gained traction among Generation Z because videos, pictures, and messages can be sent much faster than by regular messaging. Speed and reliability are important factors in members of Generation Z's choice of social networking platform. This need for quick communication is reflected in popular Generation Z apps like Vine and in the prevalent use of emojis.
A study performed by psychologists found that young people use the internet as a way to gain access to information and to interact with others. Mobile technology, social media, and internet usage have become increasingly important to modern adolescents over the past decade.
Very few, however, are changed by what they gain access to online. Youths use the internet as a tool to gain social skills that they then apply to real-life situations, and to learn about things that interest them. Teens spend most of their time online in private communication with people they interact with outside the internet on a regular basis.
While social media is used for keeping up with global news and connections, it is mainly used for developing and maintaining relationships with people who are physically close by. The use of social media has become integrated into the daily lives of most Gen Zers who have access to mobile technology.
They use it on a daily basis to keep in contact with friends and family, particularly those whom they see every day. As a result, the increased use of mobile technology has caused Gen Zers to spend more time on their smartphones and social media, and has caused online relationship development to become a new generational norm. Gen Zers are generally against photoshopping and against changing themselves to be considered perfect.
The parents of Gen Zers fear their children's overuse of the internet. Parents dislike their children's access to inappropriate information and images, as well as to social networking sites where children can encounter people from all over. Children, conversely, felt annoyed with their parents and complained that parents were overly controlling when it came to their internet usage. Gen Z uses social media and other sites to strengthen bonds with friends and to develop new ones. They interact with people whom they otherwise would not have met in the real world, making social media a tool for identity creation.
Social media is known to be a vehicle through which members of Generation Z express how they go about their daily lives and also express their beliefs. On the one hand, this use of social media makes the issues of racism in society even more prevalent.
On the other hand, when people attend events in support of certain social justice movements, members of Generation Z are much more likely to post on their social media pages about the event. In part, this is to further prove that they stand by their beliefs. Moreover, this also spreads awareness of the movement and leads to the growth of a movement.
Jason Dorsey, a notable Gen Y speaker who runs the Center for Generational Kinetics, stated in a TEDxHouston talk that this generation begins with those born after 1996. He stressed notable differences in the way this generation consumes technology, particularly smartphone usage at an earlier age: 18% of Generation Z thinks it is okay for a 13-year-old to have a smartphone, compared with 4% of earlier generations.
Education:
According to a Northeastern University survey, 81% of Generation Z believes obtaining a college degree is necessary to achieving career goals. As Generation Z enters high school and starts preparing for college, a primary concern is paying for a college education without acquiring debt. Students report working hard in high school in hopes of earning scholarships, and in the hope that their parents will pay the college costs not covered by scholarships.
Students also report interest in ROTC programs as a means of covering college costs. According to NeaToday, a publication by the National Education Association, two thirds of Gen Zers entering college are concerned about affording college. One third plan to rely on grants and scholarships and one quarter hope that their parents will cover the bulk of college costs.
While the cost of attending college is incredibly high for most Gen Zers, according to NeaToday, 65% say the benefits of graduating from college exceed the costs. Generation Z college students prefer intrapersonal and independent learning over group work, yet like to do their solo work alongside others in a social manner when studying. They like their learning to be practical and hands-on, and want their professors to help them engage with and apply the content rather than simply share what they could otherwise find on their own online.
"Generation Z" is revolutionizing the educational system in many aspects. Thanks in part to a rise in the popularity of entrepreneurship, high schools and colleges across the globe are including entrepreneurship in their curriculums.
Employment prospects:
According to Hal Brotheim in Introducing Generation Z, they will make better future employees. With the skills needed to take advantage of advanced technologies, they will be significantly more helpful to the typical company in today's high-tech world. Brotheim argues that their most valuable characteristics are their acceptance of new ideas and a conception of freedom different from that of previous generations.
Despite the technological proficiency they possess, members of Generation Z actually prefer person to person contact as opposed to online interaction. As a result of the social media and technology they are accustomed to, Generation Z is well prepared for a global business environment.
Notably, Generation Z no longer wants just a job: they seek more than that. They want a feeling of fulfillment and excitement in a job that helps move the world forward. Generation Z is eager to be involved in their community and their futures.
Even before college, members of Generation Z are already searching for ways to take advantage of relevant professional opportunities that will give them experience for the future.
In India, a 2016 survey by JobBuzz.in, an employee engagement and employer rating platform, showed that Generation Z professionals started out better in the job market than Generation Y.
Successors:
Mark McCrindle has suggested "Generation Alpha" and "Generation Glass" as names for the generation following Generation Z. McCrindle has predicted that this next generation will be "the most formally educated generation ever, the most technology supplied generation ever, and globally the wealthiest generation ever". He chose the name "Generation Alpha", noting that scientific disciplines often move to the Greek alphabet after exhausting the Roman alphabet.
Author Alexandra Levit has suggested that there may not be a need to name the next generation, as technology has rendered the traditional 15–20 year cohorts obsolete. Levit notes that she "can't imagine my college student babysitter having the same experience as my four-year-old", despite both being in Generation Z.
Matt Carmichael, former director of data strategy at Advertising Age, noted in 2015 that many groups were "competing to come up with the clever name" for the generation following Generation Z.
List of (Cultural) Generations
YouTube Video: Cam Marston a Humorous Look at the Four Generations in Today's Workplace
Pictured: comparison of generations by generational marker
Provides a list by generation such as Baby Boomers, Gen X, Millennials, Generation Z and immigrant generations.
The Continuing "Culture War" in the United States
YouTube Video A Church Divided: Christians Debate Homosexuality
In the United States and the United Kingdom, culture war refers to a conflict between traditionalist or conservative values and progressive or liberal values.
Beginning in the 1990s, culture wars have influenced the debate over public school history and science curricula in the United States, along with many other issues.
The expression culture war entered the vocabulary of United States politics with the publication of Culture Wars: The Struggle to Define America by James Davison Hunter in 1991.
Hunter perceived a dramatic realignment and polarization that had transformed United States politics and culture, including the following issues:
- abortion,
- federal and state gun laws,
- global warming,
- immigration,
- separation of church and state,
- privacy,
- recreational drug use,
- homosexuality,
- and censorship.
For Further Amplification, click on any of the following (blue) hyperlinks:
- Origins
- United States of America
- Battleground issues in the "culture wars"
- References
- Further reading
- Primary sources
- External links
Monterey Pop Festival (1967)
YouTube Video Janis Joplin - Ball and Chain (Monterey pop festival) 1967 Ca.
The Monterey International Pop Music Festival was a three-day concert event held June 16 to June 18, 1967 at the Monterey County Fairgrounds in Monterey, California.
Crowd estimates for the festival have ranged from 25,000 to 90,000 people, who congregated in and around the festival grounds. The fairgrounds’ enclosed performance arena, where the music took place, had an approved festival capacity of 7,000, but it was estimated that 8,500 jammed into it for Saturday night’s show.
Festival-goers who wanted to see the musical performances were required to have either an 'all-festival' ticket or a separate ticket for each of the five scheduled concert events they wanted to attend in the arena: Friday night, Saturday afternoon and night, and Sunday afternoon and night. Ticket prices varied by seating area, and ranged from $3 to $6.50 ($21–46, adjusted for inflation).
The festival is remembered for the first major American appearances by The Jimi Hendrix Experience, The Who and Ravi Shankar, the first large-scale public performance of Janis Joplin and the introduction of Otis Redding.
The Monterey Pop Festival embodied the theme of California as a focal point for the counterculture and is generally regarded as one of the beginnings of the "Summer of Love" in 1967; the first rock festival had been held just one week earlier at Mount Tamalpais in Marin County, the KFRC Fantasy Fair and Magic Mountain Music Festival.
Because Monterey was widely promoted and heavily attended, featured historic performances, and was the subject of a popular theatrical documentary film, it became an inspiration and a template for future music festivals, including the Woodstock Festival two years later.
Baby Boomers Generation
YouTube Video: Tribute to the Baby Boomer Generation
Baby boomers are the demographic group born during the post–World War II baby boom, approximately between the years 1946 and 1964. This includes people who are between 52 and 70 years old in 2016, according to the U.S. Census Bureau.
The term "baby boomer" is also used in a cultural context, so it is difficult to achieve broad consensus on a precise date definition. Different people, organizations, and scholars have varying opinions on who is a baby boomer, both technically and culturally. Ascribing universal attributes to such a generation is difficult, and some believe it is inherently impossible, but many have attempted to determine their cultural similarities and historical impact, and the term has thus gained widespread popular usage.
Baby boomers are associated with a rejection or redefinition of traditional values. Many commentators, however, have disputed the extent of that rejection, noting the widespread continuity of values with older and younger generations.
In Europe and North America, boomers are widely associated with privilege, as many grew up in a time of widespread government subsidies in post-war housing and education, and increasing affluence.
As a group, baby boomers were the wealthiest, most active, and most physically fit generation up to the era in which they arrived, and were among the first to grow up genuinely expecting the world to improve with time. They were also the generation that received peak levels of income; they could therefore reap the benefits of abundant levels of food, apparel, retirement programs, and sometimes even "midlife crisis" products.
The increased consumerism for this generation has been regularly criticized as excessive.
One feature of the boomers is that they have tended to think of themselves as a special generation, very different from those that came before. In the 1960s, as the relatively large numbers of young people became teenagers and young adults, they, and those around them, created a very specific rhetoric around their statistical cohort and the changes they were bringing about.
This rhetoric had an important impact on the self-perceptions of the boomers, as well as their tendency to define the world in terms of generations, which was a relatively new phenomenon. The baby boom has been described variously as a "shockwave" and as "the pig in the python."
The term "Generation Jones" has sometimes been used to distinguish those born from 1954 to 1964 from the earlier baby boomers.
The term "baby boomer" is also used in a cultural context, so it is difficult to achieve broad consensus of a precise date definition. Different people, organizations, and scholars have varying opinions on who is a baby boomer, both technically and culturally. Ascribing universal attributes to such a generation is difficult, and some believe it is inherently impossible, but many have attempted to determine their cultural similarities and historical impact, and the term has thus gained widespread popular usage.
Baby boomers are associated with a rejection or redefinition of traditional values. Many commentators, however, have disputed the extent of that rejection, noting the widespread continuity of values with older and younger generations.
In Europe and North America, boomers are widely associated with privilege, as many grew up in a time of widespread government subsidies in post-war housing and education, and increasing affluence.
As a group, baby boomers were the wealthiest, most active, and most physically fit generation up to the era in which they arrived, and were among the first to grow up genuinely expecting the world to improve with time. They were also the generation that received peak levels of income; they could therefore reap the benefits of abundant levels of food, apparel, retirement programs, and sometimes even "midlife crisis" products.
The increased consumerism for this generation has been regularly criticized as excessive.
African-Americans including Their Historical Timeline
YouTube Video of a documentary honoring the World War II Tuskegee Airmen*
* -- African-American military pilots (fighter and bomber) who fought in World War II
Pictured: Great African-Americans who have made a difference include (Top Left) Martin Luther King, Jr.; (Top Right) Oprah Winfrey and (Bottom) President Barack Obama
Click here for a timeline of African-American History.
African Americans (also referred to as Black Americans or Afro-Americans) are an ethnic group of Americans with total or partial ancestry from any of the Black racial groups of Africa.
The term may also be used to include only those individuals who are descended from enslaved Africans. As a compound adjective the term is usually hyphenated as African-American.
Black and African Americans constitute the third largest racial and ethnic group in the United States (after White Americans and Hispanic and Latino Americans).
Most African Americans are of West and Central African descent and are descendants of enslaved peoples within the boundaries of the present United States. On average, African Americans are of 73.2–80.9% West African, 18–24% European, and 0.8–0.9% Native American genetic heritage, with large variation between individuals.
According to US Census Bureau data, African immigrants generally do not self-identify as African American. The overwhelming majority of African immigrants identify instead with their own respective ethnicities (~95%). Immigrants from some Caribbean, Central American and South American nations and their descendants may or may not also self-identify with the term.
African-American history starts in the 16th century, with peoples from West Africa forcibly taken as slaves to Spanish America, and in the 17th century with West African slaves taken to English colonies in North America. After the founding of the United States, black people continued to be enslaved, with four million denied freedom from bondage prior to the Civil War.
Believed to be inferior to white people, they were treated as second-class citizens. The Naturalization Act of 1790 limited U.S. citizenship to whites only, and only white men of property could vote.
These circumstances were changed by Reconstruction, development of the black community, participation in the great military conflicts of the United States, the elimination of racial segregation, and the Civil Rights Movement which sought political and social freedom. In 2008, Barack Obama became the first African American to be elected President of the United States.
Click on any of the following blue hyperlinks for more about African-Americans:
- History
- Demographics
- Religion
- Business
- Language
- Genetics
- Traditional names
- Contemporary issues
- Politics and social issues including Political legacy
- News media and coverage
- Culture in the United States
- Terminology
- See also:
- African American art
- African-American business history
- African-American Civil Rights Movement (1954–68)
- African-American Civil Rights Movement (1865–95)
- African-American Civil Rights Movement (1896–1954)
- Timeline of the African-American Civil Rights Movement (1954–68)
- African-American literature
- African-American music
- African American National Biography Project
- African-American neighborhood
- African American Vernacular English
- African-American upper class
- African-American middle class
- Afrophobia
- Anglo-African term
- Back-to-Africa movement
- Black feminism
- Black History Month
- Black Lives Matter
- Black Loyalist
- Military history of African Americans
- National Museum of African American History and Culture
- African-American names
- Scientific racism
- Stereotypes of African Americans
- Diaspora:
- Lists:
- Index of articles related to African Americans
- Lists of African Americans
- List of historically black colleges and universities
- List of topics related to the African diaspora
- List of populated places in the United States with African-American plurality populations
- List of U.S. states by African-American population
- List of U.S. counties with African-American majority populations in 2000
- List of U.S. metropolitan areas with large African-American populations
- List of U.S. cities with large African-American populations
- List of U.S. communities with African-American majority populations in 2010
- List of African-American neighborhoods
- List of black college football classics
- Terminology:
The Culture of Surfing
YouTube Video from the surfing movie "Endless Summer" (1966)
YouTube Video of One of the Largest Waves Ever Surfed!
Pictured: "Teenager captures incredible photos of pro-surfers riding huge waves from INSIDE the barrel as he tags along behind them"
Surf culture is the culture that includes the people, language, fashion, and lifestyle surrounding the sport of surfing. The history of surfing began with the ancient Polynesians.
That initial culture directly influenced modern surfing, which began to flourish and evolve in the early 20th century, with popularity spiking greatly during the 1950s and 1960s (principally in Hawaii, Australia, and California). It continues to progress and spread throughout the world. It has at times affected popular fashion, music, literature, films, art, jargon, and more.
The fickle nature of weather and the ocean, plus the great desire for the best possible types of waves for surfing, make surfers dependent on weather conditions that may change rapidly.
The staff of Surfer Magazine, founded in the 1960s when surfing had gained popularity among teenagers, used to say that if they were hard at work and someone yelled "Surf's up!" the office would suddenly be empty. Also, since surfing is geographically restricted to the coast, the culture of beach life often influenced surfers and vice versa. Location or territory is part of the development of surf culture, in which individuals or groups of surfers designate certain key surfing spots as their own.
Aspects of 1960s surf culture in Southern California, where it was first popularized, include the woodie, bikinis and other beach wear, such as board shorts or baggies, and surf music.
Surfers developed the skateboard, as well as a number of other boardsports, to be able to "surf" on land.
Big Wave Culture:
A non-competitive adventure activity involving riding the biggest waves possible (known as "rhino hunting") is also popular with some surfers. A practice popularized in the 1990s has seen big wave surfing revolutionized, as surfers use personal watercraft to tow them out to a position where they can catch previously unrideable waves (see tow-in surfing). These waves were previously unrideable due to the speed at which they travel.
Some waves reach speeds of over 60 km/h; personal watercraft enable surfers to catch up to the speed of the wave, thereby making them rideable. Personal watercraft also allow surfers to survive wipeouts. In many instances surfers would not survive the battering of the "sets" (groups of waves together). This spectacular activity is extremely popular with television crews, but because such waves rarely occur in heavily populated regions, and usually only a very long way out to sea on outer reefs, few spectators see such events directly.
Though surfers come from all walks of life, the basis of the beach bum stereotype comes from that great enthusiasm that surfers can have for their sport. Dedication and perfectionism are also qualities that surfers bring to what many have traditionally regarded as a commitment to a lifestyle as well as a sport.
For specific surf spots, the state of the ocean tide can play a significant role in the quality of the waves or the hazards of surfing there. Tides vary greatly among the global surfing regions, and the effect the tide has on a given spot can differ greatly even among spots within the same area. Locations such as Bali, Panama, and Ireland experience 2–3 meter tide fluctuations, whereas in Hawaii the difference between high and low tide is typically less than one meter.
Each surf break is different, since the underwater topography of one place is unlike any other. At beach breaks, the sandbanks can change shape from week to week, so it takes commitment to get good waves.
The saying "You should have been here yesterday" became a common refrain for bad conditions. Nowadays, however, surf forecasting is aided by advances in information technology, whereby mathematical modeling graphically depicts the size and direction of swells moving around the globe.
The quest for perfect surf has given rise to a field of tourism based on the surfing adventure. Yacht charters and surf camps offer surfers access to the high quality surf found in remote, tropical locations, where tradewinds ensure offshore conditions.
Along with the rarity of what surfers consider truly perfect surf conditions (due to changing weather and surf condition) and the inevitable hunt for great waves, surfers often become dedicated to their sport in a way that precludes a more traditional life. Surfing, instead, becomes their lifestyle.
The goals of those who practice the sport vary, but throughout its history, many have seen surfing as more than a sport, as an opportunity to harness the waves and to relax and forget about their daily routines.
Surfers have veered from even this beaten path, and foregone the traditional goals of first world culture in the hunt for a continual 'stoke', harmony with life, their surfing, and the ocean. These "Soul Surfers" are a vibrant and long-standing sub-group.
Competitive surf culture, centered on surf contests and endorsement deals, along with localism's disturbance of the peace, is often seen in opposition to this.
Location:
Even though waves break everywhere along a coast, good surf spots are rare. A surf break that forms great surfable waves may easily become a coveted commodity, especially if the wave only breaks there rarely. If this break is near a large population center with many surfers, territorialism often arises.
Regular surfers who live around a desirable surf break may often guard it jealously, hence the expression "locals only." The expression "locals only" is common among beach towns, especially those that are seasonally encroached upon by vacationers who live outside the area. Localism is expressed when surfers are involved in verbal or physical threats or abuse to deter people from surfing at certain surf spots. It is based in part on the belief that fewer people mean more waves per surfer.
Some locals have been known to form loose gangs that surf a certain break or beach and fiercely protect their "territory" from outsiders. These surfers are often referred to as "surf punks" or "surf nazis." The local surfer gangs in Malibu and in Hawaii, known as da hui, have been known to threaten tourists with physical violence for invading their territory.
In Southern California, at the Venice and Santa Monica beaches, local surfers are especially hostile to the surfers from the San Fernando Valley whom they dub "vallies" or "valley kooks".
The expression "Surf Nazi" arose in the 1960s to describe territorial and authoritarian surfers, often involved in surf gangs or surf clubs. The term "Nazi" was originally used simply to denote the strict territorialism, violence and hostility to outsiders, and absolute obsession with surfing that was characteristic in the so-called "surf nazis."
However, some surfers reclaimed and accepted the term, and a few actually embraced Nazism and Nazi symbolism. Some surf clubs in the 1960s, particularly at Windansea in La Jolla, used the swastika symbol on their boards and identified with Nazism as a counterculture (though this may have just been an effort to keep out or scare non-locals).
The "locals only" attitude and protectionism of the Santa Monica surf spots in the early 1970s was depicted in the movie Lords of Dogtown, which was based on the documentary Dogtown and Z-Boys.
Localism often exists due to socioeconomic factors as well. Until relatively recently, surfers were looked down upon as lazy people on the fringe of society (hence the term "beach bum.") Many who surfed were locals of beach towns who lived there year-round, and were from a lower economic class.
For that reason as much as any other, these groups were resentful of outsiders, particularly those who were well-to-do and came to their beaches to surf recreationally rather than as a way of life.
Australia has its own history of surfers being treated with open hostility by local governments in the sport's early days, and the tension never really went away, despite the sport's enormous increase in popularity. Maroubra Beach in Australia became infamous for localism and other violence chronicled in the documentary film Bra Boys about the eponymous group, although the surfers in the film maintain they are not a "gang."
Click on any of the following blue hyperlinks for more about Surf Culture:
- Surf terminology
- Issues affecting surfers
- Surf Environmentalism
- Surf tourism
- Spirituality
- Surfing art
- Fashion
- Events
- Surfing organizations
- Spin-offs & influences
- Surfing in multimedia
- Television shows
- Print media
- Graphic art
- See also:
- Surfing
- History of surfing
- World surfing champion
- List of surfing topics
- List of surfers
- Surf forecasting
- Surfing Heritage Foundation in San Clemente, California
- International Surfing Museum in Huntington Beach, California
- Santa Cruz Surfing Museum in Santa Cruz, California, with museum panoramas at the City of Santa Cruz's website.
- History of surfing and surf culture
Generation Gap
YouTube Video: Young vs. Old in a Game of Generation Gap (Jimmy Kimmel Live)
A generation gap, or generational gap, is a difference of opinions between one generation and another regarding beliefs, politics, or values. In today's usage, "generation gap" often refers to a perceived gap between younger people and their parents or grandparents.
Early sociologists such as Karl Mannheim noted differences across generations in how youth transitions into adulthood, and studied the ways in which generations separate themselves from one another, in the home and in social situations and areas (such as churches, clubs, senior centers, and youth centers).
The sociological theory of a generation gap first came to light in the 1960s, when the younger generation (later known as Baby Boomers) seemed to go against everything their parents had previously believed in terms of music, values, and governmental and political views. Sociologists now refer to the "generation gap" as "institutional age segregation."
Usually, when any of these age groups is engaged in its primary activity, the individual members are physically isolated from people of other generations, with little interaction across age barriers except at the nuclear family level.
Distinguishing generation gaps:
There are several ways to make distinctions between generations. For example, names are given to major groups (Baby boomers, Gen X, etc.) and each generation sets its own trends and has its own cultural impact.
Language use:
Generation gaps can be distinguished by differences in language use. The generation gap has created a parallel gap in language that can be difficult to communicate across. This issue is visible throughout society, creating complications in day-to-day communication at home, in the workplace, and within schools.
As new generations seek to define themselves as something apart from the old, they adopt new lingo and slang, allowing a generation to create a sense of division from the previous one. This is a visible gap between generations we see every day. "Man's most important symbol is his language and through this language he defines his reality."
Slang:
Slang is an ever-changing set of colloquial words and phrases that speakers use to establish or reinforce social identity or cohesiveness within a group or with a trend in society at large.
As each successive generation of society struggles to establish its own unique identity among its predecessors, generational gaps exert a large influence over the continual change and adaptation of slang.
As slang is often regarded as an ephemeral dialect, a constant supply of new words is required to meet the demands of the rapid change in characteristics. And while most slang terms maintain a fairly brief duration of popularity, slang provides a quick and readily available vernacular screen to establish and maintain generational gaps in a societal context.
Technological influences:
Every generation develops new slang, but with the development of technology, understanding gaps have widened between the older and younger generations. "The term 'communication skills,' for example, might mean formal writing and speaking abilities to an older worker. But it might mean e-mail and instant-messenger savvy to a twenty something."
In today's age of mobile phones and text messaging, people often carry on private conversations in secret, even in a crowded room. Among "texters," a form of slang or texting lingo has developed, often keeping those who are less tech savvy out of the loop:
"Children increasingly rely on personal technological devices like cell phones to define themselves and create social circles apart from their families, changing the way they communicate with their parents. Cell phones, instant messaging, e-mail and the like have encouraged younger users to create their own inventive, quirky and very private written language. That has given them the opportunity to essentially hide in plain sight. They are more connected than ever, but also far more independent. Text messaging, in particular, has perhaps become this generation's version of pig Latin."
Language skills such as shorthand, a system of stenography popular during the twentieth century, have been made obsolete by technological innovations occurring between generations. Older generations used shorthand to take notes and write faster using abbreviated symbols, rather than having to write out each word. With new technology and keyboards, however, newer generations no longer need older communication skills like Gregg shorthand.
Although shorthand classes were taught in many high schools just over 20 years ago, students today have rarely seen or even heard of forms like shorthand.
The transitions from each level of lifespan development have remained the same throughout history. They have all shared the same basic milestones in their travel from childhood, through midlife and into retirement.
However, while the pathways remain the same—i.e. attending school, marriage, raising families, retiring—the actual journey varies not only with each individual, but with each new generation. For instance, as time goes on, technology is being introduced to individuals at younger and younger ages.
While the Baby Boomers had to introduce Atari and VCRs to their parents, Generation Y’ers had to teach their parents how to maneuver such things as DVRs, cell phones and social media. There is a vast difference between Generation Y’ers and the Baby Boomers when it comes to technology.
In 2011, the National Sleep Foundation conducted a poll that focused on sleep and the use of technology; 95% of those polled admitted to using some form of technology within the last hour before going to bed at night. The study compared the difference in sleep patterns in those who watched TV or listened to music prior to bedtime compared to those who used cell phones, video games and the Internet.
The study looked at Baby Boomers (born 1946-1964), Generation X’ers (born 1965-1980), Generation Y’ers (born 1981-2000) and Generation Z’ers (born mid 1990s or 2000 to present). The research, as expected, showed generational gaps between the different forms of technology used.
The largest gap was between texting and talking on the phone: 56% of Gen Z’ers and 42% of Gen Y’ers admitted to sending, receiving, or reading text messages every night within one hour prior to bedtime, compared to only 15% of Gen X’ers and 5% of Baby Boomers.
Baby Boomers were more likely to watch TV within the last hour prior to bedtime (67%), compared to Gen Y’ers at 49%. When asked about computer/Internet use within the last hour prior to bedtime, 70% of those polled admitted to using a computer "a few times a week," and of those, 55% of Gen Z’ers said they "surf the web" every night before bed.
Language brokering:
Another phenomenon within language that works to define a generation gap occurs within families in which different generations speak different primary languages. In order to find a means to communicate within the household environment, many have taken up the practice of language brokering, which refers to the "interpretation and translation performed in everyday situations by bilinguals who have had no special training".
In immigrant families where the first generation speaks primarily in their native tongue, the second generation primarily in the language of the country in which they now live while still retaining fluency in their parent's dominant language, and the third generation primarily in the language of the country they were born in while retaining little to no conversational language in their grandparent's native tongue, the second generation family members serve as interpreters not only to outside persons, but within the household, further propelling generational differences and divisions by means of linguistic communication.
Furthermore, in some immigrant families and communities, language brokering is also used to integrate children into family endeavors and into civil society. Child integration has become very important to form linkages between new immigrant communities and the predominant culture and new forms of bureaucratic systems. In addition, it also serves towards child development by learning and pitching in.
Workplace Attitudes:
USA Today reported that younger generations are "entering the workplace in the face of demographic change and an increasingly multi-generational workplace". Multiple engagement studies show that the interests shared across the generation gap by members of this increasingly multi-generational workplace can differ substantially.
A popular belief held by older generations is that the characteristics of Millennials can potentially complicate professional interactions. To some managers, this generation is a group of coddled, lazy, disloyal, and narcissistic young people, who are incapable of handling the simplest task without guidance. For this reason, when millennials first enter a new organization, they are often greeted with wary coworkers.
Career was an essential component of the identities of Baby boomers; they made many sacrifices, working 55 to 60 hour weeks, patiently waiting for promotions.
Millennials, on the other hand, are not workaholics and do not place such a strong emphasis on their careers. Even so, they expect all the perks, in terms of good pay and benefits, rapid advancement, work-life balance, stimulating work, and giving back to their community.
Studies have found that millennials are usually exceptionally confident in their abilities and, as a result, fail to prove themselves by working hard, seeking key roles in significant projects early on in their careers, which frustrates their older coworkers.
Most of these inflated expectations are direct results of the generation's upbringing. During the Great Recession, millennials watched first-hand as their parents worked long hours, only to fall victim to downsizing and layoffs.
Many families could not withstand these challenges, leading to high divorce rates and broken families. Millennials do not want to be put in the same position as their parents, so they have made their personal lives a main priority. In fact, fifty-nine percent of Millennials say the Great Recession negatively impacted their career plans, while only 35% of mature workers feel the same way.
For these reasons, millennials are more likely to negotiate the terms of their work. Though some boomers view this as lazy behavior, others have actually been able to learn from millennials, reflecting on whether the sacrifices that they had made in their lives provided them with the happiness that they had hoped for.
Growing up, millennials looked to parents, teachers, and coaches as a source of praise and support. They were a part of an educational system with inflated grades and Standardized tests, in which they were skilled at performing well. They were brought up believing they could be anything and everything they dreamed of. As a result, millennials developed a strong need for frequent, positive feedback from supervisors.
Today, managers find themselves assessing their subordinates’ productivity quite frequently, despite the fact that they often find it burdensome. Additionally, millennials’ salaries and Employee benefits give this generation an idea of how well they are performing. Millennials crave success, and good paying jobs have been proven to make them feel more successful.
Additionally, studies show that promotions are very important to millennials, and when they do not see opportunities for rapid advancement at one organization, they are quick to quit in an effort to find better opportunities. They have an unrealistic timeline for these promotions, however, which frustrates older generations.
They also have a low tolerance for unchallenging work; when work is not stimulating, they often perform poorly out of boredom. As a result, managers must constantly provide millennials with greater responsibility so that they feel more involved and needed in the organization.
Because group projects and presentations were commonplace during the schooling of millennials, this generation enjoys collaborating and even developing close friendships with colleagues. While working as part of a team enhances innovation, enhances productivity, and lowers personnel costs, downsides still exist.
Supervisors find that millennials avoid risk and independent responsibility by relying on team members when making decisions, which prevents them from showcasing their own abilities.
Perhaps the most commonly cited difference between older and younger generations is technological proficiency. Studies have shown that their reliance on technology has made millennials less comfortable with face-to-face interaction and deciphering verbal cues.
However, technological proficiency also has its benefits; millennials are far more effective in multitasking, responding to visual stimulation, and filtering information than older generations.
However, according to the engagement studies, mature workers and the new generations of workers share similar thoughts on a number of topics across the generation gap. Their opinions overlap on flexible working hours/arrangements, promotions/bonuses, the importance of computer proficiency, and leadership. Additionally, the majority of Millennials and mature workers enjoy going to work every day, and feel inspired to do their best.
Generational consciousness:
Generational consciousness is another way of distinguishing among generations that was worked on by social scientist Karl Mannheim. Generational consciousness is when a group of people become mindful of their place in a distinct group identifiable by their shared interests and values.
Social, economic, or political changes can bring awareness to these shared interests and values for similarly-aged people who experience these events together, and thereby form a generational consciousness. These types of experiences can impact individuals' development at a young age and enable them to begin making their own interpretations of the world based on personal encounters that set them apart from other generations.
Inter-generational Living:
"Both social isolation and loneliness in older men and women are associated with increased mortality, according to a 2012 Report by the National Academy of Sciences of the United States of America".
Inter-generational living is one method being used currently worldwide as a means of combating such feelings. A nursing home in Deventer, The Netherlands, developed a program wherein students from a local university are provided small, rent-free apartments within the nursing home facility. In exchange, the students volunteer a minimum of 30 hours per month to spend time with the seniors.
The students will watch sports with the seniors, celebrate birthdays, and simply keep them company during illnesses and times of distress. Programs similar to the Netherlands’ program were developed as far back as the mid-1990s in Barcelona, Spain.
In Spain's program, students were placed in seniors’ homes, with a similar goal of free/cheap housing in exchange for companionship for the elderly. That program quickly spread to 27 other cities throughout Spain, and similar programs can be found in Lyons, France, and Cleveland, Ohio.
Demographics:
In order for sociologists to understand the transition into adulthood of children in different generation gaps, they compare the current generation to both older and earlier generations at the same time.
Not only does each generation experience their own ways of mental and physical maturation, but they also create new aspects of attending school, forming new households, starting families and even creating new demographics. The difference in demographics regarding values, attitudes and behaviors between the two generations are used to create a profile for the emerging generation of young adults.
Following the thriving economic success that was a product of the Second World War, America's population skyrocketed between the years 1940-1959, to which the new American generation was called the Baby Boomers.
Today, as of 2017, many of these Baby Boomers have celebrated their 60th birthdays and in the next few years America's senior citizen population will boost exponentially due to the population of people who were born during the years 1940 and 1959. The generation gap, however, between the Baby Boomers and earlier generations is growing due to the Boomers population post-war.
There is a large demographic difference between the Baby Boomer generation and earlier generations, where earlier generations are less racially and ethnically diverse than the Baby Boomers’ population.
Where this drastic racial demographic difference occurs also holds to a continually growing cultural gap as well; baby boomers have had generally higher education, with a higher percentage of women in the labor force and more often occupying professional and managerial positions. These drastic culture and generation gaps create issues of community preferences as well as spending.
Click on any of the following blue hyperlinks for more about the Generation Gap:
Early sociologists such as Karl Mannheim noted differences across generations in how youth transition into adulthood, and studied the ways in which generations separate themselves from one another, at home and in social settings and areas (such as churches, clubs, senior centers, and youth centers).
The sociological theory of a generation gap first came to light in the 1960s, when the younger generation (later known as the Baby Boomers) seemed to go against everything their parents had previously believed in terms of music, values, and governmental and political views. Sociologists now refer to the "generation gap" as "institutional age segregation".
Usually, when any of these age groups is engaged in its primary activity, the individual members are physically isolated from people of other generations, with little interaction across age barriers except at the nuclear family level.
Distinguishing generation gaps:
There are several ways to make distinctions between generations. For example, names are given to major groups (Baby boomers, Gen X, etc.) and each generation sets its own trends and has its own cultural impact.
Language use:
Generation gaps can be distinguished by differences in language use. The generation gap has created a parallel gap in language that can be difficult to communicate across, a problem visible throughout society that complicates day-to-day communication at home, in the workplace, and within schools.
As new generations seek to define themselves as something apart from the old, they adopt new lingo and slang, allowing a generation to create a sense of division from the previous one. This is a visible gap between generations we see every day. "Man's most important symbol is his language and through this language he defines his reality."
Slang:
Slang is an ever-changing set of colloquial words and phrases that speakers use to establish or reinforce social identity or cohesiveness within a group or with a trend in society at large.
As each successive generation struggles to establish its own unique identity apart from its predecessors, generational gaps exert a large influence over the continual change and adaptation of slang.
Because slang is often regarded as an ephemeral dialect, a constant supply of new words is required to keep pace with its rapid turnover. And while most slang terms enjoy only a brief period of popularity, slang provides a quick and readily available vernacular screen for establishing and maintaining generational gaps in a societal context.
Technological influences:
Every generation develops new slang, but with the development of technology, understanding gaps have widened between the older and younger generations. "The term 'communication skills,' for example, might mean formal writing and speaking abilities to an older worker. But it might mean e-mail and instant-messenger savvy to a twenty something."
People often have private conversations in secret in a crowded room in today's age due to the advances of mobile phones and text messaging. Among "texters" a form of slang or texting lingo has developed, often keeping those not as tech savvy out of the loop:
"Children increasingly rely on personal technological devices like cell phones to define themselves and create social circles apart from their families, changing the way they communicate with their parents. Cell phones, instant messaging, e-mail and the like have encouraged younger users to create their own inventive, quirky and very private written language. That has given them the opportunity to essentially hide in plain sight. They are more connected than ever, but also far more independent. Text messaging, in particular, has perhaps become this generation's version of pig Latin."
Language skills such as shorthand, a system of stenography popular during the twentieth century, illustrate how technological innovations between generations can make older skills obsolete. Earlier generations used shorthand to take notes and write faster, using abbreviated symbols rather than writing out each word. With keyboards and newer technology, however, later generations no longer need older communication skills like Gregg shorthand.
Although shorthand classes were taught in many high schools as recently as 20 years ago, students today have rarely seen or even heard of such systems.
The transitions through each stage of lifespan development have remained the same throughout history: every generation has shared the same basic milestones in its travel from childhood, through midlife, and into retirement.
However, while the pathways remain the same—i.e. attending school, marriage, raising families, retiring—the actual journey varies not only with each individual, but with each new generation. For instance, as time goes on, technology is being introduced to individuals at younger and younger ages.
While the Baby Boomers had to introduce Atari and VCRs to their parents, Generation Y’ers had to teach their parents how to maneuver such things as DVRs, cell phones and social media. There is a vast difference in Generation Y’ers and the Baby Boomers when it comes to technology.
In 2011, the National Sleep Foundation conducted a poll that focused on sleep and the use of technology; 95% of those polled admitted to using some form of technology within the last hour before going to bed at night. The study compared the sleep patterns of those who watched TV or listened to music before bedtime with those of people who used cell phones, video games, and the Internet.
The study looked at Baby Boomers (born 1946-1964), Generation X’ers (born 1965-1980), Generation Y’ers (born 1981-2000) and Generation Z’ers (born mid 1990s or 2000 to present). The research, as expected, showed generational gaps between the different forms of technology used.
The largest gap appeared between texting and talking on the phone: 56% of Gen Z’ers and 42% of Gen Y’ers admitted to sending, receiving, or reading text messages every night within one hour of bedtime, compared to only 15% of Gen X’ers and 5% of Baby Boomers.
Baby Boomers were more likely to watch TV within the last hour before bedtime (67%), compared to Gen Y’ers at 49%. When asked about computer/internet use within the last hour before bedtime, 70% of those polled admitted to using a computer "a few times a week", and of those, 55% of Gen Z’ers said they "surf the web" every night before bed.
Language brokering:
Another phenomenon within language that works to define a generation gap occurs within families in which different generations speak different primary languages. In order to find a means to communicate within the household environment, many have taken up the practice of language brokering, which refers to the "interpretation and translation performed in everyday situations by bilinguals who have had no special training".
In immigrant families, the first generation typically speaks primarily in its native tongue; the second generation speaks primarily in the language of the country in which they now live, while still retaining fluency in their parents' dominant language; and the third generation speaks primarily in the language of the country in which they were born, retaining little to no conversational ability in their grandparents' native tongue. In such families, second-generation members serve as interpreters not only for outside persons but within the household, further propelling generational differences and divisions through linguistic communication.
Furthermore, in some immigrant families and communities, language brokering is also used to integrate children into family endeavors and into civil society. Such integration has become very important for forming linkages between new immigrant communities, the predominant culture, and new bureaucratic systems. It also serves child development, as children learn by pitching in.
Workplace Attitudes:
USA Today reported that younger generations are "entering the workplace in the face of demographic change and an increasingly multi-generational workplace". Multiple engagement studies show that the interests of members of this increasingly multi-generational workplace can differ substantially across the generation gap.
A popular belief held by older generations is that the characteristics of Millennials can complicate professional interactions. To some managers, this generation is a group of coddled, lazy, disloyal, and narcissistic young people who are incapable of handling the simplest task without guidance. For this reason, when millennials first enter a new organization, they are often greeted by wary coworkers.
Career was an essential component of Baby Boomers' identities; they made many sacrifices, working 55-to-60-hour weeks and patiently waiting for promotions.
Millennials, on the other hand, are not workaholics and do not place such a strong emphasis on their careers. Even so, they expect all the perks, in terms of good pay and benefits, rapid advancement, work-life balance, stimulating work, and giving back to their community.
Studies have found that millennials are usually exceptionally confident in their abilities and, as a result, feel little need to prove themselves through hard work; instead, they seek key roles in significant projects early in their careers, which frustrates their older coworkers.
Most of these inflated expectations are direct results of the generation's upbringing. During the Great Recession, millennials watched first-hand as their parents worked long hours, only to fall victim to downsizing and layoffs.
Many families could not withstand these challenges, leading to high divorce rates and broken families. Millennials do not want to be put in the same position as their parents, so they have made their personal lives a main priority. In fact, 59% of Millennials say the Great Recession negatively impacted their career plans, while only 35% of mature workers feel the same way.
For these reasons, millennials are more likely to negotiate the terms of their work. Though some boomers view this as lazy behavior, others have actually been able to learn from millennials, reflecting on whether the sacrifices that they had made in their lives provided them with the happiness that they had hoped for.
Growing up, millennials looked to parents, teachers, and coaches as sources of praise and support. They were part of an educational system with inflated grades and standardized tests, at which they were skilled at performing well. They were brought up believing they could be anything and everything they dreamed of. As a result, millennials developed a strong need for frequent, positive feedback from supervisors.
Today, managers find themselves assessing their subordinates' productivity quite frequently, even though they often find doing so burdensome. Additionally, millennials' salaries and employee benefits give this generation an idea of how well they are performing. Millennials crave success, and well-paying jobs make them feel more successful.
Additionally, studies show that promotions are very important to millennials, and when they do not see opportunities for rapid advancement at one organization, they are quick to quit in an effort to find better opportunities. They have an unrealistic timeline for these promotions, however, which frustrates older generations.
They also have a low tolerance for unchallenging work; when work is not stimulating, they often perform poorly out of boredom. As a result, managers must constantly provide millennials with greater responsibility so that they feel more involved and needed in the organization.
Because group projects and presentations were commonplace during the schooling of millennials, this generation enjoys collaborating and even developing close friendships with colleagues. While working as part of a team enhances innovation and productivity and lowers personnel costs, downsides still exist.
Supervisors find that millennials avoid risk and independent responsibility by relying on team members when making decisions, which prevents them from showcasing their own abilities.
Perhaps the most commonly cited difference between older and younger generations is technological proficiency. Studies have shown that their reliance on technology has made millennials less comfortable with face-to-face interaction and deciphering verbal cues.
However, technological proficiency also has its benefits; millennials are far more effective in multitasking, responding to visual stimulation, and filtering information than older generations.
However, according to the engagement studies, mature workers and the new generations of workers share similar thoughts on a number of topics across the generation gap. Their opinions overlap on flexible working hours/arrangements, promotions/bonuses, the importance of computer proficiency, and leadership. Additionally, the majority of Millennials and mature workers enjoy going to work every day, and feel inspired to do their best.
Generational consciousness:
Generational consciousness, a concept developed by social scientist Karl Mannheim, is another way of distinguishing among generations. Generational consciousness arises when a group of people becomes mindful of its place in a distinct group identifiable by shared interests and values.
Social, economic, or political changes can bring awareness to these shared interests and values for similarly-aged people who experience these events together, and thereby form a generational consciousness. These types of experiences can impact individuals' development at a young age and enable them to begin making their own interpretations of the world based on personal encounters that set them apart from other generations.
Inter-generational Living:
"Both social isolation and loneliness in older men and women are associated with increased mortality," according to a 2012 report by the National Academy of Sciences of the United States of America.
Inter-generational living is one method currently being used worldwide to combat such feelings. A nursing home in Deventer, the Netherlands, developed a program wherein students from a local university are provided small, rent-free apartments within the nursing home facility. In exchange, the students volunteer a minimum of 30 hours per month to spend time with the seniors.
The students watch sports with the seniors, celebrate birthdays with them, and simply keep them company during illnesses and times of distress. Programs similar to the Netherlands’ program were developed as far back as the mid-1990s in Barcelona, Spain.
In Spain's program, students were placed in seniors' homes with a similar goal: free or cheap housing in exchange for companionship for the elderly. That program quickly spread to 27 other cities throughout Spain, and similar programs can be found in Lyon, France, and Cleveland, Ohio.
Demographics:
To understand how children in different generations transition into adulthood, sociologists compare the current generation to earlier ones.
Not only does each generation experience its own course of mental and physical maturation, but each also creates new patterns of attending school, forming households, starting families, and even new demographics. Differences in values, attitudes, and behaviors between two generations are used to create a profile for the emerging generation of young adults.
Following the economic prosperity that was a product of the Second World War, America's birth rate skyrocketed between 1946 and 1964, and the new American generation came to be called the Baby Boomers.
As of 2017, many of these Baby Boomers have celebrated their 60th birthdays, and in the next few years America's senior-citizen population will grow dramatically as this cohort ages. The generation gap between the Baby Boomers and earlier generations continues to widen because of the Boomers' sheer post-war population size.
There is a large demographic difference between the Baby Boomer generation and earlier generations: earlier generations are less racially and ethnically diverse than the Baby Boomer population.
This drastic racial demographic difference is accompanied by a continually growing cultural gap as well; Baby Boomers have had generally higher levels of education, a higher percentage of women in the labor force, and more frequent occupation of professional and managerial positions. These drastic cultural and generational gaps create issues of community preferences as well as spending.
Click on any of the following blue hyperlinks for more about the Generation Gap:
- Achievement gap
- Ageism
- Digital divide
- Income gap
- Inter-generational contract
- Intergenerational equity
- List of Generations
- Marriage gap
- Moral panic
- Student activism
- Student voice
- Transgenerational design
- Youth activism
- Youth voice
- Slang
- Technology
Drug Culture
YouTube Video about the Current Opioid Crisis by PBS NewsHour
Pictured (L-R): Marijuana, Cocaine and Heroin
Drug culture is an example of a counterculture, one primarily defined by recreational drug use.
Drug subcultures are groups of people united by a common understanding of the meaning and value (good or otherwise) of the incorporation into one's life of the drug in question.
Such unity can take many forms, from friends who take the drug together, possibly obeying certain rules of etiquette, to groups banding together to help each other obtain drugs and avoid arrest, to full-scale political movements for the reform of drug laws. The sum of these parts can be considered an individual drug's "culture".
Many artists have used various drugs and explored their influence on human life in general and particularly on the creative process. Hunter S. Thompson's Fear and Loathing in Las Vegas employs drug use as a major theme and provides an example of the drug culture of the 1960s.
Drinking culture:
Main article: Drinking culture
Alcoholic beverages contain ethanol (commonly called simply alcohol), a psychoactive drug. Alcohol is one of the most commonly abused drugs in the world (Metropol, 1996), often used for self-medication and as a recreational drug.
Cannabis culture:
Main article: Cannabis culture
Cannabis culture has been responsible for the genre of films known as stoner films which has come to be accepted as a mainstream cinema movement.
In the United States the culture has also spawned its own celebrities (such as Tommy Chong and Terence McKenna), magazines (Cannabis Culture and High Times), and, in North America, its own distinct holiday: April 20, which is marked as a day for celebrating cannabis and calling for its legalization.
Click on any of the following blue hyperlinks for more about the Drug Culture:
Culture Icons
YouTube Video: Top 10 Symbols of America by WatchMojo
YouTube Video: Top 10 Most Influential Counterculture Icons by WatchMojo
A cultural icon is an artifact that is identified by members of a culture as representative of that culture. The process of identification is subjective, and "icons" are judged by the extent to which they can be seen as an authentic proxy of that culture.
When individuals perceive a cultural icon, they relate it to their general perceptions of the cultural identity represented. Cultural icons can also be identified as an authentic representation of the practices of one culture by another.
In the media, many items and persons of popular culture have been called "iconic" despite their lack of durability; and the term "pop icon" is often now used. Some commentators believe that the word is overused or misused.
Types of cultural icons:
National icons are a subset of cultural icons.
A web-based survey set up in 2006 allowed the public to nominate their ideas for national icons of England, and the results reflect the range of different types of icon associated with an English view of English culture.
- Big Ben (the nickname for the bell, but widely recognized as Elizabeth Tower of the Houses of Parliament in London);
- Cup of tea (for the British tea drinking habit);
- Red telephone box;
- Red AEC Routemaster London double decker bus;
- Spitfire, a World War II fighter aircraft.
Matryoshka dolls are seen internationally as cultural icons of Russia. In the former Soviet Union, the hammer and sickle symbol and statues of Vladimir Lenin instead represented the country's most prominent cultural icons.
The values, norms and ideals represented by a cultural icon vary both among people who subscribe to it, and more widely among other people who may interpret cultural icons as symbolizing quite different values. Thus an apple pie is a cultural icon of the United States, but its significance varies among Americans.
National icons can become targets for those opposing or criticizing a regime, for example, crowds destroying statues of Lenin in Eastern Europe after the fall of communism or burning the Stars and Stripes flag to protest about US actions abroad.
Religious icons can also become cultural icons in societies where religion and culture are deeply entwined, such as representations of the Madonna in societies with a strong Catholic tradition.
Use in Popular Media:
Describing something as iconic or as an icon has become very common in the popular media. This has drawn criticism from some: a writer in the Liverpool Daily Post calls "iconic" "a word that makes my flesh creep", a word "pressed into service to describe almost anything." The Christian Examiner nominates "iconic" in its list of overused words, finding over 18,000 "iconic" references in news stories alone, with another 30,000 for "icon", including its use for SpongeBob SquarePants.
Click on any of the following blue hyperlinks for more about Cultural Icons:
- Popular culture
- Horror icon
- Our New Icons by The Daily Telegraph
- Nothing and no one are Off Limits in an Age of Iconomania by The Age
- Culture24: Icons of England
New Age
YouTube Video: This is comedy in the age of Trump (HBO)
Pictured below: American politics in the new age of disbelief (Japan Times)
New Age is a term applied to a range of spiritual or religious beliefs and practices that developed in Western nations during the 1970s.
Precise scholarly definitions of the New Age differ in their emphasis, largely as a result of its highly eclectic structure. Although analytically often considered to be religious, those involved in it typically prefer the designation of spiritual or Mind, Body, Spirit and rarely use the term "New Age" themselves. Many scholars of the subject refer to it as the New Age movement, although others contest this term and suggest that it is better seen as a milieu or zeitgeist.
As a form of Western esotericism, the New Age drew heavily upon a number of older esoteric traditions, in particular those that emerged from the occultist current that developed in the eighteenth century.
Such prominent occult influences include the work of Emanuel Swedenborg and Franz Mesmer, as well as the ideas of Spiritualism, New Thought, and Theosophy. A number of mid-twentieth century influences, such as the UFO religions of the 1950s, the Counterculture of the 1960s, and the Human Potential Movement, also exerted a strong influence on the early development of the New Age.
The exact origins of the phenomenon remain contested, but there is general agreement that it developed in the 1970s, at which time it was centred largely in the United Kingdom. It expanded and grew largely in the 1980s and 1990s, in particular within the United States. By the start of the 21st century, the term "New Age" was increasingly rejected within this milieu, with some scholars arguing that the New Age phenomenon had ended.
Despite its highly eclectic nature, a number of beliefs commonly found within the New Age have been identified. Theologically, the New Age typically adopts a belief in a holistic form of divinity that imbues all of the universe, including human beings themselves.
There is thus a strong emphasis on the spiritual authority of the self. This is accompanied by a common belief in a wide variety of semi-divine non-human entities, such as angels and masters, with whom humans can communicate, particularly through the form of channeling.
Typically viewing human history as being divided into a series of distinct ages, a common New Age belief is that whereas once humanity lived in an age of great technological advancement and spiritual wisdom, it has entered a period of spiritual degeneracy, which will be remedied through the establishment of a coming Age of Aquarius, from which the milieu gets its name. There is also a strong focus on healing, particularly using forms of alternative medicine, and an emphasis on a New Age "science" that seeks to unite science and spirituality.
Centered primarily in Western countries, those involved in the New Age have come primarily from middle and upper-middle-class backgrounds. The degree to which New Agers are involved in the milieu varies considerably, from those who adopt a number of New Age ideas and practices to those who fully embrace and dedicate their lives to it.
The New Age has generated criticism from established Christian organisations as well as modern Pagan and indigenous communities. From the 1990s onward, the New Age became the subject of research by academic scholars of religious studies.
Click on any of the following blue hyperlinks for more about the Culture of the New Age:
- Definitions
- History
- Beliefs and practices
- Demographics
- Commercial aspects
- Politics
- Reception
- See also:
- Higher consciousness
- Hippies
- Hypnosis
- Mantras
- New Age communities
- New religious movement
- Paradigm shift
- Peace movement
- Reincarnation
- Philosophy of happiness
- Spiritual evolution
- New Age at Curlie (based on DMOZ)
- Center for Visionary Leadership. Organization co-founded and directed by Corinne McLaughlin, co-author of Spiritual Politics, cited above.
- Lorian Association. Organization co-founded and co-directed by David Spangler, author of Revelation: The Birth of a New Age, cited above.
- "The New Age 40 Years Later". Huffington Post interview of Mark Satin, author of New Age Politics, cited above
The Culture of Cannabis including its Consumption and the Effects of Cannabis
- YouTube Video of the Benefits of Cannabis to One's Health
- YouTube Video: Surprising truths about legalizing cannabis | Ben Cort | TEDxMileHigh
- YouTube Video: Top 10 Celebrity Potheads (WatchMojo)
Cannabis culture describes a social atmosphere or series of associated social behaviors that depends heavily upon cannabis consumption, particularly as an entheogen, recreational drug and medicine.
Historically, cannabis has been used as an entheogen to induce spiritual experiences, most notably in the Indian subcontinent since the Vedic period dating back to approximately 1500 BCE, but perhaps as far back as 2000 BCE.
Its entheogenic use was also recorded in Ancient China, among the Germanic peoples and the Celts, and in Ancient Central Asia and Africa.
In modern times, spiritual use of the drug is mostly associated with the Rastafari movement of Jamaica. Several Western subcultures have had marijuana consumption as an idiosyncratic feature, such as hippies, beatniks, hipsters (both the 1940s subculture and the contemporary subculture), ravers and hip hop.
Cannabis has now "evolved its own language, humour, etiquette, art, literature and music." Nick Brownlee writes: "Perhaps because of its ancient mystical and spiritual roots, because of the psychotherapeutic effects of the drug and because it is illegal, even the very act of smoking a joint has deep symbolism."
However, the culture of cannabis as "the manifestation of introspection and bodily passivity" — which has generated a negative "slacker" stereotype around its consumers — is a relatively modern concept, as cannabis has been consumed in various forms for almost 5,000 years.
The counterculture of the 1960s has been identified as the era that "sums up the glory years of modern cannabis culture," with the Woodstock Festival serving as "the pinnacle of the hippie revolution in the USA, and in many people's opinion the ultimate example of cannabis culture at work".
The influence of cannabis has encompassed holidays (most notably 4/20), cinema (such as the exploitation and stoner film genres), music (particularly jazz, reggae, psychedelia and rap music), and magazines including High Times and Cannabis Culture.
Social Custom:
Main article: Recreational drug use
Cannabis was once sold in clubs known as "Teapads" during Prohibition in the United States; jazz was usually played at these clubs. Cannabis was often viewed as a lower-class drug and was disliked by many.
After the outlawing of cannabis, its consumption continued in secret. Years later, cannabis has once again become legally tolerated in some regions. Holidays have formed around the consumption of cannabis, such as 420, named after the popular time of day to consume cannabis (4:20 pm) and celebrated on April 20 (4/20). If cannabis is consumed in a social setting, sharing it with others is encouraged.
Click here for more about the Culture of Cannabis.
___________________________________________________________________________
The Consumption of Cannabis:
Cannabis consumption refers to the variety of ways cannabis is consumed, among which inhalation (smoking and vaporizing) and ingestion are most common. Salves and absorption through the skin (transdermal delivery) are increasingly common in medical uses of CBD, THC, and other cannabinoids.
Each method leads to subtly different psychoactive effects due to the THC and other chemicals being activated, and then consumed through different administration routes. It is generally considered that smoking, which includes combustion toxins, comes on quickly but lasts for a short period of time, while eating delays the onset of effect but the duration of effect is typically longer.
In a 2007 Science Daily report of research conducted at the University of California–San Francisco, researchers reported that vaporizer users experience the same biological effects, but without the toxins associated with smoking.
Click on any of the following blue hyperlinks for more about Cannabis Consumption:
___________________________________________________________________________
The Effects of Consuming Cannabis:
The effects of cannabis are caused by the chemical compounds in the plant, including cannabinoids, such as tetrahydrocannabinol (THC), which is only one of more than 100 different cannabinoids present in the plant.
Cannabis has various psychological and physiological effects on the human body.
Different plants of the genus Cannabis contain different and often unpredictable concentrations of THC and other cannabinoids and hundreds of other molecules that have a pharmacological effect, so that the final net effect cannot reliably be foreseen.
Acute effects while under the influence can include euphoria and anxiety. Cannabidiol (CBD), another cannabinoid found in cannabis in varying amounts, has been shown to alleviate the adverse effects of THC that some consumers experience. When ingested orally, THC can produce stronger psychotropic effects than when inhaled. At doses exceeding the psychotropic threshold, users may experience adverse side effects such as anxiety and panic attacks that can result in increased heart rate and changes in blood pressure.
Research about medical benefits of cannabis has been hindered by United States federal law.
Smoking any substance could possibly carry similar risks as smoking tobacco due to carcinogens in all smoke, and the ultimate conclusions on these factors are disputed.
Cannabis use disorder is defined as a medical diagnosis in the fifth revision of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5).
Efficacies:
Cannabinoids and cannabinoid receptors
The most prevalent psychoactive substances in cannabis are cannabinoids, most notably THC. Some varieties, having undergone careful selection and growing techniques, can yield as much as 34% THC. Another psychoactive cannabinoid present in Cannabis sativa is tetrahydrocannabivarin (THCV), but it is only found in small amounts and is a cannabinoid antagonist.
There are also similar compounds contained in cannabis that do not exhibit any psychoactive response but are obligatory for functionality.
How these other compounds interact with THC is not fully understood. Some clinical studies have proposed that CBD acts as a balancing force to regulate the strength of the psychoactive agent THC.
CBD is also believed to regulate the body's metabolism of THC by inactivating cytochrome P450, an important class of enzymes that metabolize drugs. Experiments in which mice were treated with CBD followed by THC showed that CBD treatment was associated with a substantial increase in brain concentrations of THC and its major metabolites, most likely because it decreased the rate of clearance of THC from the body.
Cannabis cofactor compounds have also been linked to lowering body temperature, modulating immune functioning, and cell protection.
The essential oil of cannabis contains many fragrant terpenoids which may synergize with the cannabinoids to produce their unique effects. THC is converted rapidly to 11-hydroxy-THC, which is also pharmacologically active, so the drug effect outlasts measurable THC levels in blood.
THC and cannabidiol are also neuroprotective antioxidants. Research in rats has indicated that THC prevented hydroperoxide-induced oxidative damage as well as or better than other antioxidants in a chemical (Fenton reaction) system and neuronal cultures.
Cannabidiol was significantly more protective than either vitamin E or vitamin C.
The cannabinoid receptor is a typical member of the largest known family of receptors, the G protein-coupled receptors, whose signature is a distinct pattern in which the receptor molecule spans the cell membrane seven times. Cannabinoid receptors are located on the cell membrane, with portions both outside (extracellular) and inside (intracellular) the cell.
CB1 receptors, the bigger of the two, are extraordinarily abundant in the brain: 10 times more plentiful than μ-opioid receptors, the receptors responsible for the effects of morphine. CB2 receptors are structurally different (the sequence similarity between the two receptor subtypes is 44%), are found only on cells of the immune system, and seem to function similarly to their CB1 counterparts.
CB2 receptors are most commonly prevalent on B-cells, natural killer cells, and monocytes, but can also be found on polymorphonuclear neutrophil cells, T8 cells, and T4 cells. In the tonsils the CB2 receptors appear to be restricted to B-lymphocyte-enriched areas.
THC and its endogenous equivalent anandamide additionally interact with glycine receptors.
Biochemical mechanisms in the brain:
See also: Cannabis (drug) § Mechanism of action
Cannabinoids usually contain a 1,1'-di-methyl-pyran ring, a variedly derivatized aromatic ring and a variedly unsaturated cyclohexyl ring and their immediate chemical precursors, constituting a family of about 60 bi-cyclic and tri-cyclic compounds.
Like most other neurological processes, the effects of cannabis on the brain follow the standard protocol of signal transduction, the electrochemical system of sending signals through neurons for a biological response.
It is now understood that cannabinoid receptors appear in similar forms in most vertebrates and invertebrates and have a long evolutionary history of 500 million years.
The binding of cannabinoids to cannabinoid receptors decreases adenylyl cyclase activity, inhibits N-type calcium channels, and disinhibits K+A channels. There are at least two types of cannabinoid receptors (CB1 and CB2).
The CB1 receptor is found primarily in the brain and mediates the psychological effects of THC. The CB2 receptor is most abundantly found on cells of the immune system.
Cannabinoids act as immunomodulators at CB2 receptors, meaning they increase some immune responses and decrease others.
For example, nonpsychotropic cannabinoids can be used as a very effective anti-inflammatory. The affinity of cannabinoids to bind to either receptor is about the same, with only a slight increase observed with the plant-derived compound CBD binding to CB2 receptors more frequently.
Cannabinoids likely have a role in the brain’s control of movement and memory, as well as natural pain modulation. It is clear that cannabinoids can affect pain transmission and, specifically, that cannabinoids interact with the brain's endogenous opioid system and may affect dopamine transmission.
Sustainability in the body:
Main article: Cannabis drug testing
Most cannabinoids are lipophilic (fat soluble) compounds that are easily stored in fat, thus yielding a long elimination half-life relative to other recreational drugs. The THC molecule, and related compounds, are usually detectable in drug tests from 3 days up to 10 days according to Redwood Laboratories; long-term users can produce positive tests for two to three months after ceasing cannabis use (see drug test).
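The long detection window described above follows from first-order elimination: a fat-stored compound's concentration halves once per half-life, so many half-lives must pass before it falls below a test's detection threshold. A minimal sketch of that decay, where the 5-day half-life is an illustrative assumption rather than a clinical figure (real values vary widely with body fat and frequency of use):

```python
# Illustrative first-order elimination model. The half-life used below is an
# assumed, hypothetical value for illustration only; it is not a clinical
# figure for THC or its metabolites.
def remaining_fraction(days_elapsed, half_life_days):
    """Fraction of the initial stored concentration left after first-order decay."""
    return 0.5 ** (days_elapsed / half_life_days)

# With an assumed 5-day half-life, a quarter of the stored compound
# remains after 10 days, and an eighth after 15 days:
print(remaining_fraction(10, 5))  # 0.25
print(remaining_fraction(15, 5))  # 0.125
```

This exponential tail is why long-term users can test positive for two to three months after ceasing use, while occasional users clear below detection thresholds within days.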
Toxicities:
Related to cannabinoids:
No fatal overdoses with cannabis use have been reported as of 2006. A review published in the British Journal of Psychiatry in February 2008 said that "no deaths directly due to acute cannabis use have ever been reported".
THC, the principal psychoactive constituent of the cannabis plant, has an extremely low toxicity and the amount that can enter the body through the consumption of cannabis plants poses no threat of death. In dogs, the minimum lethal dose of THC is over 3 g/kg.
According to the Merck Index, the LD50 of THC (the dose which causes the death of 50% of individuals) is 1270 mg/kg for male rats and 730 mg/kg for female rats from oral consumption in sesame oil, and 42 mg/kg for rats from inhalation.
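Because an LD50 is stated per kilogram of body weight, the corresponding total dose scales linearly with body mass. A minimal sketch of that conversion, where the LD50 figures are the Merck Index values quoted above but the 0.5 kg rat mass is a hypothetical example, not a figure from the source:

```python
# Convert a mass-normalized LD50 (mg per kg of body weight) into a total dose.
# The LD50 value used here is the Merck Index oral figure for male rats quoted
# above; the body mass is a hypothetical example for illustration only.
def total_dose_mg(ld50_mg_per_kg, body_mass_kg):
    """Total dose in mg equivalent to an LD50 for a given body mass."""
    return ld50_mg_per_kg * body_mass_kg

# Oral LD50 of 1270 mg/kg applied to a hypothetical 0.5 kg male rat:
print(total_dose_mg(1270, 0.5))  # 635.0
```

The same scaling is what makes the dog figure above (over 3 g/kg) so striking: even a small animal would need to absorb several grams of pure THC to approach a lethal dose.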
It is important to note, though, that cannabinoids and other molecules present in cannabis can alter the metabolism of other drugs, especially through competition for shared metabolic clearance pathways such as the cytochrome P450 enzymes, leading to toxicities from medications that the person consuming cannabis may also be taking.
Related to smoking:
A 2007 study found that while tobacco and cannabis smoke are quite similar, cannabis smoke contained higher amounts of ammonia, hydrogen cyanide, and nitrogen oxides, but lower levels of carcinogenic polycyclic aromatic hydrocarbons (PAHs). This study found that directly inhaled cannabis smoke contained as much as 20 times as much ammonia and 5 times as much hydrogen cyanide as tobacco smoke and compared the properties of both mainstream and sidestream (smoke emitted from a smouldering 'joint' or 'cone') smoke.
Mainstream cannabis smoke was found to contain higher concentrations of selected polycyclic aromatic hydrocarbons (PAHs) than sidestream tobacco smoke. However, other studies have found much lower disparities in ammonia and hydrogen cyanide between cannabis and tobacco, and that some other constituents (such as polonium-210, lead, arsenic, nicotine, and tobacco-specific nitrosamines) are either lower or non-existent in cannabis smoke.
Cannabis smoke contains thousands of organic and inorganic chemical compounds, and the resulting tar is chemically similar to that found in tobacco or cigar smoke.
Over fifty known carcinogens have been identified in cannabis smoke. These include nitrosamines, reactive aldehydes, and polycyclic hydrocarbons, including benz[a]pyrene. Marijuana smoke was listed as a cancer agent in California in 2009.
A study by the British Lung Foundation published in 2012 identifies cannabis smoke as a carcinogen and also finds that awareness of the danger is low compared with the high awareness of the dangers of smoking tobacco, particularly among younger users.
Other observations include possible increased risk from each cigarette; lack of research on the effect of cannabis smoke alone; low rate of addiction compared to tobacco; and episodic nature of cannabis use compared to steady frequent smoking of tobacco.
Professor David Nutt, a UK drug expert, points out that the study cited by the British Lung Foundation has been accused of both "false reasoning" and "incorrect methodology". Further, he notes that other studies have failed to connect cannabis with lung cancer, and accuses the BLF of "scaremongering over cannabis".
Short Term Effects:
When smoked, the short-term effects of cannabis manifest within seconds and are fully apparent within a few minutes, typically lasting for 1–3 hours, varying by the person and the strain of cannabis.
After oral ingestion of cannabis, the onset of effect is delayed relative to smoking, taking 30 minutes to 2 hours, but the duration is prolonged due to continued slow absorption. The duration of noticeable effects has been observed to diminish due to prolonged, repeated use and the development of a tolerance to cannabinoids.
Psychological effects:
The psychoactive effects of cannabis, known as a "high", are subjective and can vary based on the person and the method of use.
When THC enters the blood stream and reaches the brain, it binds to cannabinoid receptors. The endogenous ligand of these receptors is anandamide, the effects of which THC emulates. This agonism of the cannabinoid receptors results in changes in the levels of various neurotransmitters, especially dopamine and norepinephrine; neurotransmitters which are closely associated with the acute effects of cannabis ingestion, such as euphoria and anxiety.
Some effects may include a general alteration of conscious perception, euphoria, feelings of well-being, relaxation or stress reduction, increased appreciation of the arts, including humor and music (especially discerning its various components/instruments), joviality, metacognition and introspection, enhanced recollection (episodic memory), increased sensuality, increased awareness of sensation, increased libido, and creativity.
Abstract or philosophical thinking, disruption of linear memory and paranoia or anxiety are also typical. Anxiety is the most commonly reported side effect of smoking marijuana.
Between 20 and 30 percent of recreational users experience intense anxiety and/or panic attacks after smoking cannabis; however, some report anxiety only after not smoking cannabis for a prolonged period of time. Inexperience and use in an unfamiliar environment are major contributing factors to this anxiety.
Cannabidiol (CBD), another cannabinoid found in cannabis in varying amounts, has been shown to ameliorate the adverse effects of THC, including anxiety, that some consumers experience.
Cannabis also produces many other subjective and highly tangible effects, such as greater enjoyment of food taste and aroma, and marked distortions in the perception of time and space (where experiencing a "rush" of ideas from the bank of long-term memory can create the subjective impression of long elapsed time, while in reality only a short time has passed).
At higher doses, effects can include altered body image, auditory and/or visual illusions, pseudohallucinations, and ataxia from selective impairment of polysynaptic reflexes. In some cases, cannabis can lead to dissociative states such as depersonalization and derealization.
Any episode of acute psychosis that accompanies cannabis use usually abates after 6 hours, but in rare instances, heavy users may find the symptoms continuing for many days. If the episode is accompanied by aggression or sedation, physical restraint may be necessary.
While many psychoactive drugs clearly fall into the category of either stimulant, depressant, or hallucinogen, cannabis exhibits a mix of all properties, perhaps leaning the most towards hallucinogenic or psychedelic properties, though with other effects quite pronounced as well. THC is typically considered the primary active component of the cannabis plant; various scientific studies have suggested that certain other cannabinoids like CBD may also play a significant role in its psychoactive effects.
Somatic Effects:
Some of the short-term physical effects of cannabis use include increased heart rate, dry mouth, reddening of the eyes (congestion of the conjunctival blood vessels), a reduction in intra-ocular pressure, muscle relaxation, a sensation of cold or hot hands and feet, and/or a flushed face.
Electroencephalography (EEG) shows somewhat more persistent alpha waves of slightly lower frequency than usual. Cannabinoids produce a "marked depression of motor activity" via activation of neuronal cannabinoid receptors belonging to the CB1 subtype.
Duration:
Peak levels of cannabis-associated intoxication occur approximately 30 minutes after smoking it and last for several hours.
Smoked:
The total short-term duration of cannabis use when smoked is based on the potency, method of smoking – e.g. whether pure or in conjunction with tobacco – and how much is smoked. Peak levels of intoxication typically last an average of three to four hours.
Oral:
When taken orally (in the form of capsules, food or drink), the psychoactive effects take longer to manifest and generally last longer, typically for an average of four to ten hours after consumption. Very high doses may last even longer. Oral ingestion also eliminates the need to inhale the toxic combustion products created by smoking, and therefore negates the risk of respiratory harm associated with cannabis smoking.
Neurological effects:
The areas of the brain where cannabinoid receptors are most prevalently located are consistent with the behavioral effects produced by cannabinoids.
Brain regions in which cannabinoid receptors are very abundant are:
Other regions where cannabinoid receptors are moderately concentrated are:
Experiments on animal and human tissue have demonstrated a disruption of short-term memory formation, which is consistent with the abundance of CB1 receptors in the hippocampus, the region of the brain most closely associated with memory.
Cannabinoids inhibit the release of several neurotransmitters in the hippocampus such as acetylcholine, norepinephrine, and glutamate, resulting in a major decrease in neuronal activity in that region. This decrease in activity resembles a "temporary hippocampal lesion."
In in-vitro experiments, THC at extremely high concentrations, which could not be reached with commonly consumed doses, caused competitive inhibition of the AChE enzyme and inhibition of β-amyloid peptide aggregation, which is implicated in the development of Alzheimer's disease.
Compared to currently approved drugs prescribed for the treatment of Alzheimer's disease, THC is a considerably superior inhibitor of β-amyloid aggregation, and this study provides a previously unrecognized molecular mechanism through which cannabinoid molecules may impact the progression of this debilitating disease.
Effects on driving:
Main article: Cannabis and impaired driving
While several studies have shown increased risk associated with cannabis use by drivers, other studies have not found increased risk.
Cannabis usage has been shown in some studies to have a negative effect on driving ability. The British Medical Journal indicated that "drivers who consume cannabis within three hours of driving are nearly twice as likely to cause a vehicle collision as those who are not under the influence of drugs or alcohol".
In Cannabis and driving: a review of the literature and commentary, the United Kingdom's Department for Transport reviewed data on cannabis and driving, finding that, although impaired, "subjects under cannabis treatment appear to perceive that they are indeed impaired. Where they can compensate, they do...".
In a review of driving simulator studies, researchers note that "even in those who learn to compensate for a drug's impairing effects, substantial impairment in performance can still be observed under conditions of general task performance (i.e. when no contingencies are present to maintain compensated performance)."
A 2012 meta-analysis found that acute cannabis use increased the risk of an automobile crash. An extensive 2013 review of 66 studies regarding crash risk and drug use found that cannabis was associated with a minor, but not statistically significant, increase in the odds of injury or fatal accident.
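To make the "increased odds" language in these reviews concrete, the odds ratio such studies report can be sketched as a simple calculation. The 2×2 counts below are purely hypothetical and are not taken from any of the cited studies:

```python
# Odds ratio from a hypothetical 2x2 table:
# rows = cannabis exposure (yes/no), columns = outcome (crash / no crash).
def odds_ratio(a, b, c, d):
    """OR = (a/b) / (c/d) = (a*d) / (b*c).

    a: exposed with outcome,   b: exposed without,
    c: unexposed with outcome, d: unexposed without.
    """
    return (a * d) / (b * c)

# Illustrative counts only: 30 of 130 cannabis-positive drivers crashed,
# versus 200 of 1100 control drivers.
print(round(odds_ratio(30, 100, 200, 900), 2))  # 1.35
```

An odds ratio near 1.0 indicates no association, which is why the reviews above treat small, statistically non-significant elevations cautiously.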
In the largest and most precisely controlled study of its kind carried out by the U.S. Department of Transportation’s National Highway Traffic Safety Administration, it was found that other "studies that measure the presence of THC in the drivers' blood or oral fluid, rather than relying on self-report tend to have much lower (or no) elevated crash risk estimates.
Likewise better controlled studies have found lower (or no) elevated crash risk estimates". The study found that "after adjusting for age, gender, race and alcohol use, drivers who tested positive for marijuana were no more likely to crash than those who had not used any drugs or alcohol prior to driving".
On the other hand, a recent study in the Journal of Transport & Health indicated that the number of fatal crashes involving marijuana has significantly increased in Colorado, Washington, and Massachusetts since recreational marijuana was legalized or decriminalized there.
Cardiovascular effects:
Short-term (one to two hours) effects on the cardiovascular system can include increased heart rate, dilation of blood vessels, and fluctuations in blood pressure.
There are medical reports of occasional heart attacks or myocardial infarction, stroke and other cardiovascular side effects.
Marijuana's cardiovascular effects are not associated with serious health problems for most young, healthy users.
Researchers reported in the International Journal of Cardiology, "Marijuana use by older people, particularly those with some degree of coronary artery or cerebrovascular disease, poses greater risks due to the resulting increase in catecholamines, cardiac workload, and carboxyhemoglobin levels, and concurrent episodes of profound postural hypotension.
Indeed, marijuana may be a much more common cause of myocardial infarction than is generally recognized. In day-to-day practice, a history of marijuana use is often not sought by many practitioners, and even when sought, the patient's response is not always truthful".
A 2013 analysis of 3,886 myocardial infarction survivors over an 18-year period showed "no statistically significant association between marijuana use and mortality".
A 2008 study by the National Institutes of Health Biomedical Research Centre in Baltimore found that heavy, chronic smoking of marijuana (138 joints per week) changed blood proteins associated with heart disease and stroke.
A 2000 study by researchers at Boston's Beth Israel Deaconess Medical Center, Massachusetts General Hospital and Harvard School of Public Health found that a middle-age person's risk of heart attack rises nearly fivefold in the first hour after smoking marijuana, "roughly the same risk seen within an hour of sexual activity".
Cannabis arteritis is a very rare peripheral vascular disease similar to Buerger's disease. There were about 50 confirmed cases from 1960 to 2008, all of which occurred in Europe.
Combination with other drugs:
A confounding factor in cannabis research is the prevalent usage of other recreational drugs, especially alcohol and nicotine. Such complications demonstrate the need for studies on cannabis that have stronger controls, and investigations into alleged symptoms of cannabis use that may also be caused by tobacco.
Some critics question whether agencies doing the research make an honest effort to present an accurate, unbiased summary of the evidence, or whether they "cherry-pick" their data to please funding sources which may include the tobacco industry or governments dependent on cigarette tax revenue; others caution that the raw data, and not the final conclusions, are what should be examined.
The Australian National Household Survey of 2001 showed that cannabis in Australia is rarely used without other drugs. 95% of cannabis users also drank alcohol; 26% took amphetamines; 19% took ecstasy and only 2.7% reported not having used any other drug with cannabis.
While research has been undertaken on the combined effects of alcohol and cannabis on performing certain tasks, little research has been conducted on the reasons why this combination is so popular.
Evidence from a controlled experimental study undertaken by Lukas and Orozco suggests that alcohol causes THC to be absorbed more rapidly into the blood plasma of the user. Data from the Australian National Survey of Mental Health and Wellbeing found that three-quarters of recent cannabis users reported using alcohol when cannabis was not available, suggesting that the two are substitutes.
Memory and learning:
Main article: Cannabis and memory
Studies on cannabis and memory are hindered by small sample sizes, confounding drug use, and other factors. The strongest evidence regarding cannabis and memory focuses on its temporary negative effects on short-term and working memory.
In a 2001 study looking at neuropsychological performance in long-term cannabis users, researchers found "some cognitive deficits appear detectable at least 7 days after heavy cannabis use but appear reversible and related to recent cannabis exposure rather than irreversible and related to cumulative lifetime use".
Regarding his studies of cannabis use, lead researcher and Harvard professor Harrison Pope said he found that marijuana is not dangerous over the long term, but that there are short-term effects.
From neuropsychological tests, Pope found that chronic cannabis users showed difficulties, with verbal memory in particular, for "at least a week or two" after they stopped smoking. Within 28 days, memory problems vanished and the subjects "were no longer distinguishable from the comparison group".
Researchers from the University of California, San Diego School of Medicine failed to show substantial, systemic neurological effects from long-term recreational use of cannabis. Their findings were published in the July 2003 issue of the Journal of the International Neuropsychological Society.
The research team, headed by Dr Igor Grant, found that cannabis use did affect perception, but did not cause permanent brain damage. Researchers looked at data from 15 previously published controlled studies involving 704 long-term cannabis users and 484 nonusers. The results showed long-term cannabis use was only marginally harmful to memory and learning.
Other functions such as reaction time, attention, language, reasoning ability, perceptual and motor skills were unaffected. The observed effects on memory and learning, they said, showed long-term cannabis use caused "selective memory defects", but that the impact was "of a very small magnitude".
A study at Johns Hopkins University School of Medicine showed that very heavy use of marijuana is associated with decrements in neurocognitive performance even after 28 days of abstinence.
Appetite:
The feeling of increased appetite following the use of cannabis has been documented for hundreds of years, and is known colloquially as "the munchies" in the English-speaking world.
Clinical studies and survey data have found that cannabis increases food enjoyment and interest in food. A 2015 study suggests that cannabis triggers uncharacteristic behaviour in POMC neurons, which are usually associated with decreasing hunger. Rarely, chronic users experience a severe vomiting disorder, cannabinoid hyperemesis syndrome, after smoking and find relief by taking hot baths.
Endogenous cannabinoids ("endocannabinoids") were discovered in cow's milk and soft cheeses. Endocannabinoids are also found in human breast milk. It is widely accepted that the neonatal survival of many species "is largely dependent upon their suckling behavior, or appetite for breast milk" and recent research has identified the endogenous cannabinoid system to be the first neural system to display complete control over milk ingestion and neonatal survival. It is possible that "cannabinoid receptors in our body interact with the cannabinoids in milk to stimulate a suckling response in newborns so as to prevent growth failure".
Pathogens and microtoxins:
Most microorganisms found in cannabis only affect plants and not humans, but some microorganisms, especially those that proliferate when the herb is not correctly dried and stored, can be harmful to humans. Some users may store marijuana in an airtight bag or jar in a refrigerator to prevent fungal and bacterial growth.
Fungi:
A variety of fungi have been found in moldy cannabis.
Aspergillus mold species can infect the lungs via smoking or handling of infected cannabis and cause opportunistic and sometimes deadly aspergillosis. Some of the microorganisms found create aflatoxins, which are toxic and carcinogenic. Researchers suggest that moldy cannabis should thus be discarded to avoid these serious risks.
Mold is also found in smoke from mold-infected cannabis, and the lungs and nasal passages are a major means of contracting fungal infections. Levitz and Diamond (1991) suggested baking marijuana in home ovens at 150 °C (302 °F) for five minutes before smoking. Oven treatment killed conidia of A. fumigatus, A. flavus and A. niger, and did not degrade the active component of marijuana, tetrahydrocannabinol (THC).
Long-term Effects:
Main articles: Long-term effects of cannabis and Cannabis dependence
Exposure to marijuana may have biologically-based physical, mental, behavioral and social health consequences and is "associated with diseases of the liver (particularly with co-existing hepatitis C), lungs, heart, eyesight and vasculature" according to a 2013 literature review by Gordon and colleagues.
The association with these diseases has only been reported in cases where people have smoked cannabis. The authors cautioned that "evidence is needed, and further research should be considered, to prove causal associations of marijuana with many physical health conditions".
Cannabis use disorder is defined in the fifth revision of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) as a condition requiring treatment. Several drugs have been investigated in an attempt to ameliorate the symptoms of stopping cannabis use. Such drugs include bupropion, divalproex, nefazodone, lofexidine, and dronabinol. Of these, dronabinol has proven the most effective.
Effects in pregnancy:
Main article: Cannabis in pregnancy
Cannabis consumption in pregnancy might be associated with restrictions in growth of the fetus, miscarriage, and cognitive deficits in offspring based on animal studies, although there is limited evidence for this in humans at this time.
A 2012 systematic review found although it was difficult to draw firm conclusions, there was some evidence that prenatal exposure to cannabis was associated with "deficits in language, attention, areas of cognitive performance, and delinquent behavior in adolescence".
A report prepared for the Australian National Council on Drugs concluded cannabis and other cannabinoids are contraindicated in pregnancy as it may interact with the endocannabinoid system.
Historically, cannabis has been used as an entheogen to induce spiritual experiences, most notably in the Indian subcontinent since the Vedic period, dating back to approximately 1500 BCE but perhaps as far back as 2000 BCE.
Its entheogenic use was also recorded in Ancient China, among the Germanic peoples and the Celts, and in Ancient Central Asia and Africa.
In modern times, spiritual use of the drug is mostly associated with the Rastafari movement of Jamaica. Several Western subcultures have had marijuana consumption as an idiosyncratic feature, such as hippies, beatniks, hipsters (both the 1940s subculture and the contemporary subculture), ravers and hip hop.
Cannabis has now "evolved its own language, humour, etiquette, art, literature and music." Nick Brownlee writes: "Perhaps because of its ancient mystical and spiritual roots, because of the psychotherapeutic effects of the drug and because it is illegal, even the very act of smoking a joint has deep symbolism."
However, the culture of cannabis as "the manifestation of introspection and bodily passivity" — which has generated a negative "slacker" stereotype around its consumers — is a relatively modern concept, as cannabis has been consumed in various forms for almost 5,000 years.
The counterculture of the 1960s has been identified as the era that "sums up the glory years of modern cannabis culture," with the Woodstock Festival serving as "the pinnacle of the hippie revolution in the USA, and in many people's opinion the ultimate example of cannabis culture at work".
The influence of cannabis has encompassed holidays (most notably 4/20), cinema (such as the exploitation and stoner film genres), music (particularly jazz, reggae, psychedelia and rap music), and magazines including High Times and Cannabis Culture.
Social Custom:
Main article: Recreational drug use
Cannabis was once sold in clubs known as "Teapads" during Prohibition in the United States; jazz was usually played at these clubs. Cannabis was often viewed as a drug of the lower classes and was disliked by many.
After cannabis was outlawed, it was consumed in secret. Years later, it has once again become legally tolerated in some regions. Holidays have formed around the consumption of cannabis, such as 420, named after the popular time of day to consume it (4:20 pm) and celebrated on April 20 (4/20). When consumed in a social setting, sharing one's cannabis with others is encouraged.
Click here for more about the Culture of Cannabis.
___________________________________________________________________________
The Consumption of Cannabis:
Cannabis consumption refers to the variety of ways cannabis is consumed, among which inhalation (smoking and vaporizing) and ingestion are most common. Salves and absorption through the skin (transdermal application) are increasingly common in medical uses of CBD, THC, and other cannabinoids.
Each method leads to subtly different psychoactive effects due to the THC and other chemicals being activated, and then consumed through different administration routes. It is generally considered that smoking, which includes combustion toxins, comes on quickly but lasts for a short period of time, while eating delays the onset of effect but the duration of effect is typically longer.
In a 2007 Science Daily report of research conducted at the University of California–San Francisco, researchers reported that vaporizer users experience the same biological effects, but without the toxins associated with smoking.
___________________________________________________________________________
The Effects of Consuming Cannabis:
The effects of cannabis are caused by the chemical compounds in the plant, including cannabinoids, such as tetrahydrocannabinol (THC), which is only one of more than 100 different cannabinoids present in the plant.
Cannabis has various psychological and physiological effects on the human body.
Different plants of the genus Cannabis contain different and often unpredictable concentrations of THC and other cannabinoids and hundreds of other molecules that have a pharmacological effect, so that the final net effect cannot reliably be foreseen.
Acute effects while under the influence can include euphoria and anxiety. Cannabidiol (CBD), another cannabinoid found in cannabis in varying amounts, has been shown to alleviate the adverse effects of THC that some consumers experience. When ingested orally, THC can produce stronger psychotropic effects than when inhaled. At doses exceeding the psychotropic threshold, users may experience adverse side effects such as anxiety and panic attacks that can result in increased heart rate and changes in blood pressure.
Research about medical benefits of cannabis has been hindered by United States federal law.
Smoking any substance could possibly carry similar risks as smoking tobacco due to carcinogens in all smoke, and the ultimate conclusions on these factors are disputed.
Cannabis use disorder is defined as a medical diagnosis in the fifth revision of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5).
Efficacies:
Cannabinoids and cannabinoid receptors
The most prevalent psychoactive substances in cannabis are cannabinoids, most notably THC. Some varieties, having undergone careful selection and growing techniques, can yield as much as 34% THC. Another psychoactive cannabinoid present in Cannabis sativa is tetrahydrocannabivarin (THCV), but it is only found in small amounts and is a cannabinoid antagonist.
There are also similar compounds contained in cannabis that do not exhibit any psychoactive response but are obligatory for functionality:
- cannabidiol (CBD), an isomer of THC;
- cannabivarin (CBV), an analog of cannabinol (CBN) with a different side chain,
- cannabidivarin (CBDV), an analog of CBD with a different side chain,
- and cannabinolic acid.
How these other compounds interact with THC is not fully understood. Some clinical studies have proposed that CBD acts as a balancing force to regulate the strength of the psychoactive agent THC.
CBD is also believed to regulate the body's metabolism of THC by inactivating cytochrome P450, an important class of enzymes that metabolize drugs. Experiments in which mice were treated with CBD followed by THC showed that CBD treatment was associated with a substantial increase in brain concentrations of THC and its major metabolites, most likely because it decreased the rate of clearance of THC from the body.
Cannabis cofactor compounds have also been linked to lowering body temperature, modulating immune functioning, and cell protection.
The essential oil of cannabis contains many fragrant terpenoids which may synergize with the cannabinoids to produce their unique effects. THC is converted rapidly to 11-hydroxy-THC, which is also pharmacologically active, so the drug effect outlasts measurable THC levels in blood.
THC and cannabidiol are also neuroprotective antioxidants. Research in rats has indicated that THC prevented hydroperoxide-induced oxidative damage as well as or better than other antioxidants in a chemical (Fenton reaction) system and neuronal cultures.
Cannabidiol was significantly more protective than either vitamin E or vitamin C.
The cannabinoid receptor is a typical member of the largest known family of receptors, the G protein-coupled receptors. A signature of this receptor type is the distinct pattern in which the receptor molecule spans the cell membrane seven times. Cannabinoid receptors are located in the cell membrane, with portions exposed both outside (extracellularly) and inside (intracellularly) the cell.
CB1 receptors, the bigger of the two, are extraordinarily abundant in the brain: 10 times more plentiful than μ-opioid receptors, the receptors responsible for the effects of morphine. CB2 receptors are structurally different (the sequence similarity between the two receptor subtypes is 44%), are found only on cells of the immune system, and seem to function similarly to their CB1 counterparts.
CB2 receptors are most commonly prevalent on B-cells, natural killer cells, and monocytes, but can also be found on polymorphonuclear neutrophil cells, T8 cells, and T4 cells. In the tonsils the CB2 receptors appear to be restricted to B-lymphocyte-enriched areas.
THC and its endogenous equivalent anandamide additionally interact with glycine receptors.
Biochemical mechanisms in the brain:
See also: Cannabis (drug) § Mechanism of action
Cannabinoids usually contain a 1,1'-di-methyl-pyran ring, a variedly derivatized aromatic ring and a variedly unsaturated cyclohexyl ring and their immediate chemical precursors, constituting a family of about 60 bi-cyclic and tri-cyclic compounds.
Like most other neurological processes, the effects of cannabis on the brain follow the standard protocol of signal transduction, the electrochemical system of sending signals through neurons for a biological response.
It is now understood that cannabinoid receptors appear in similar forms in most vertebrates and invertebrates and have a long evolutionary history of 500 million years.
The binding of cannabinoids to cannabinoid receptors decreases adenylyl cyclase activity, inhibits N-type calcium channels, and disinhibits K+A channels. There are at least two types of cannabinoid receptors (CB1 and CB2).
The CB1 receptor is found primarily in the brain and mediates the psychological effects of THC. The CB2 receptor is most abundantly found on cells of the immune system.
Cannabinoids act as immunomodulators at CB2 receptors, meaning they increase some immune responses and decrease others.
For example, nonpsychotropic cannabinoids can be used as a very effective anti-inflammatory. The affinity of cannabinoids to bind to either receptor is about the same, with only a slight increase observed with the plant-derived compound CBD binding to CB2 receptors more frequently.
Cannabinoids likely have a role in the brain’s control of movement and memory, as well as natural pain modulation. It is clear that cannabinoids can affect pain transmission and, specifically, that cannabinoids interact with the brain's endogenous opioid system and may affect dopamine transmission.
Persistence in the body:
Main article: Cannabis drug testing
Most cannabinoids are lipophilic (fat soluble) compounds that are easily stored in fat, thus yielding a long elimination half-life relative to other recreational drugs. The THC molecule, and related compounds, are usually detectable in drug tests from 3 days up to 10 days according to Redwood Laboratories; long-term users can produce positive tests for two to three months after ceasing cannabis use (see drug test).
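The detection windows quoted above follow from the slow, roughly exponential elimination of fat-stored cannabinoids. The sketch below is illustrative only: the starting levels, the 15 ng/mL cutoff, and especially the half-life values are assumptions, and real THC kinetics are multi-phase and vary widely between users.

```python
import math

# First-order (exponential) elimination: C(t) = C0 * 0.5 ** (t / t_half).
def days_until_below(c0_ng_ml, cutoff_ng_ml, half_life_days):
    """Days for blood concentration to decay from c0 to the test cutoff."""
    return half_life_days * math.log(c0_ng_ml / cutoff_ng_ml, 2)

# Occasional user: modest level, short apparent half-life (assumed values).
print(round(days_until_below(40, 15, 1.3), 1))   # ~2 days
# Heavy user: fat stores release THC slowly -> long apparent half-life.
print(round(days_until_below(80, 15, 10.0), 1))  # ~24 days
```

The logarithmic dependence on the cutoff means that a long apparent half-life, far more than a high starting level, is what stretches heavy users' detection windows toward the months reported above.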
Toxicities:
Related to cannabinoids:
No fatal overdoses with cannabis use have been reported as of 2006. A review published in the British Journal of Psychiatry in February 2008 said that "no deaths directly due to acute cannabis use have ever been reported".
THC, the principal psychoactive constituent of the cannabis plant, has an extremely low toxicity and the amount that can enter the body through the consumption of cannabis plants poses no threat of death. In dogs, the minimum lethal dose of THC is over 3 g/kg.
According to the Merck Index, the LD50 of THC (the dose which causes the death of 50% of individuals) is 1270 mg/kg for male rats and 730 mg/kg for female rats from oral consumption in sesame oil, and 42 mg/kg for rats from inhalation.
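Since these LD50 figures are expressed per kilogram of body weight, converting them to an absolute dose is simple multiplication. The rat body mass used below is an illustrative assumption, not a value from the text:

```python
# Converting the per-kilogram LD50 figures quoted above into absolute doses.
# LD50 values are taken from the text; the rat mass is an assumption.
LD50_ORAL_MALE_RAT_MG_PER_KG = 1270   # oral, in sesame oil
LD50_ORAL_FEMALE_RAT_MG_PER_KG = 730  # oral, in sesame oil
LD50_INHALED_RAT_MG_PER_KG = 42       # inhalation

def absolute_dose_mg(ld50_mg_per_kg, body_mass_kg):
    """Total dose in mg corresponding to an LD50 for a given body mass."""
    return ld50_mg_per_kg * body_mass_kg

rat_mass_kg = 0.25  # illustrative adult rat
print(absolute_dose_mg(LD50_ORAL_MALE_RAT_MG_PER_KG, rat_mass_kg))  # 317.5
```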
It is important though to note that cannabinoids and other molecules present in cannabis can alter the metabolism of other drugs, especially due to competition for clearing metabolic pathways such as cytochromes CYP450, thus leading to drug toxicities by medications that the person consuming cannabis may be taking.
Related to smoking:
A 2007 study found that while tobacco and cannabis smoke are quite similar, cannabis smoke contained higher amounts of ammonia, hydrogen cyanide, and nitrogen oxides, but lower levels of carcinogenic polycyclic aromatic hydrocarbons (PAHs). This study found that directly inhaled cannabis smoke contained as much as 20 times as much ammonia and 5 times as much hydrogen cyanide as tobacco smoke and compared the properties of both mainstream and sidestream (smoke emitted from a smouldering 'joint' or 'cone') smoke.
Mainstream cannabis smoke was found to contain higher concentrations of selected polycyclic aromatic hydrocarbons (PAHs) than sidestream tobacco smoke. However, other studies have found much lower disparities in ammonia and hydrogen cyanide between cannabis and tobacco, and that some other constituents (such as polonium-210, lead, arsenic, nicotine, and tobacco-specific nitrosamines) are either lower or non-existent in cannabis smoke.
Cannabis smoke contains thousands of organic and inorganic chemical compounds, including tar that is chemically similar to the tar found in tobacco or cigar smoke.
Over fifty known carcinogens have been identified in cannabis smoke. These include nitrosamines, reactive aldehydes, and polycyclic aromatic hydrocarbons, including benzo[a]pyrene. Marijuana smoke was listed as a cancer agent in California in 2009.
A study by the British Lung Foundation published in 2012 identifies cannabis smoke as a carcinogen and also finds that awareness of the danger is low compared with the high awareness of the dangers of smoking tobacco, particularly among younger users.
Other observations include possible increased risk from each cigarette; lack of research on the effect of cannabis smoke alone; low rate of addiction compared to tobacco; and episodic nature of cannabis use compared to steady frequent smoking of tobacco.
Professor David Nutt, a UK drug expert, points out that the study cited by the British Lung Foundation has been accused of both "false reasoning" and "incorrect methodology". Further, he notes that other studies have failed to connect cannabis with lung cancer, and accuses the BLF of "scaremongering over cannabis".
Short Term Effects:
When smoked, the short-term effects of cannabis manifest within seconds and are fully apparent within a few minutes, typically lasting for 1–3 hours, varying by the person and the strain of cannabis.
After oral ingestion of cannabis, the onset of effect is delayed relative to smoking, taking 30 minutes to 2 hours, but the duration is prolonged due to continued slow absorption. The duration of noticeable effects has been observed to diminish due to prolonged, repeated use and the development of a tolerance to cannabinoids.
Psychological effects:
The psychoactive effects of cannabis, known as a "high", are subjective and can vary based on the person and the method of use.
When THC enters the blood stream and reaches the brain, it binds to cannabinoid receptors. The endogenous ligand of these receptors is anandamide, the effects of which THC emulates. This agonism of the cannabinoid receptors results in changes in the levels of various neurotransmitters, especially dopamine and norepinephrine; neurotransmitters which are closely associated with the acute effects of cannabis ingestion, such as euphoria and anxiety.
Some effects may include a general alteration of conscious perception, euphoria, feelings of well-being, relaxation or stress reduction, increased appreciation of the arts, including humor and music (especially discerning its various components/instruments), joviality, metacognition and introspection, enhanced recollection (episodic memory), increased sensuality, increased awareness of sensation, increased libido, and creativity.
Abstract or philosophical thinking, disruption of linear memory and paranoia or anxiety are also typical. Anxiety is the most commonly reported side effect of smoking marijuana.
Between 20 and 30 percent of recreational users experience intense anxiety and/or panic attacks after smoking cannabis; however, some report anxiety only after not smoking cannabis for a prolonged period of time. Inexperience and use in an unfamiliar environment are major contributing factors to this anxiety.
Cannabidiol (CBD), another cannabinoid found in cannabis in varying amounts, has been shown to ameliorate the adverse effects of THC, including anxiety, that some consumers experience.
Cannabis also produces many other subjective and highly tangible effects, such as greater enjoyment of food taste and aroma, and marked distortions in the perception of time and space (where experiencing a "rush" of ideas from the bank of long-term memory can create the subjective impression of long elapsed time, while in reality only a short time has passed).
At higher doses, effects can include altered body image, auditory and/or visual illusions, pseudohallucinations, and ataxia from selective impairment of polysynaptic reflexes. In some cases, cannabis can lead to dissociative states such as depersonalization and derealization.
Any episode of acute psychosis that accompanies cannabis use usually abates after 6 hours, but in rare instances, heavy users may find the symptoms continuing for many days. If the episode is accompanied by aggression or sedation, physical restraint may be necessary.
While many psychoactive drugs clearly fall into the category of either stimulant, depressant, or hallucinogen, cannabis exhibits a mix of all properties, perhaps leaning the most towards hallucinogenic or psychedelic properties, though with other effects quite pronounced as well. THC is typically considered the primary active component of the cannabis plant; various scientific studies have suggested that certain other cannabinoids like CBD may also play a significant role in its psychoactive effects.
Somatic Effects:
Some of the short-term physical effects of cannabis use include increased heart rate, dry mouth, reddening of the eyes (congestion of the conjunctival blood vessels), a reduction in intra-ocular pressure, muscle relaxation, a sensation of cold or hot hands and feet, and/or a flushed face.
Electroencephalography or EEG shows somewhat more persistent alpha waves of slightly lower frequency than usual. Cannabinoids produce a "marked depression of motor activity" via activation of neuronal cannabinoid receptors belonging to the CB1 subtype.
Duration:
Peak levels of cannabis-associated intoxication occur approximately 30 minutes after smoking it and last for several hours.
Smoked:
The total short-term duration of cannabis's effects when smoked depends on the potency, the method of smoking – e.g. whether pure or in conjunction with tobacco – and how much is smoked. Peak levels of intoxication typically last an average of three to four hours.
Oral:
When taken orally (in the form of capsules, food or drink), the psychoactive effects take longer to manifest and generally last longer, typically for an average of four to ten hours after consumption. Very high doses may last even longer. Oral ingestion also eliminates the need to inhale the toxic combustion products created by smoking, and therefore avoids the respiratory harm associated with cannabis smoking.
Neurological effects:
The areas of the brain where cannabinoid receptors are most prevalently located are consistent with the behavioral effects produced by cannabinoids.
Brain regions in which cannabinoid receptors are very abundant are:
- the basal ganglia, associated with movement control;
- the cerebellum, associated with body movement coordination;
- the hippocampus, associated with learning, memory, and stress control;
- the cerebral cortex, associated with higher cognitive functions;
- and the nucleus accumbens, regarded as the reward center of the brain.
Other regions where cannabinoid receptors are moderately concentrated are:
- the hypothalamus, which regulates homeostatic functions;
- the amygdala, associated with emotional responses and fears;
- the spinal cord, associated with peripheral sensations like pain;
- the brain stem, associated with sleep, arousal, and motor control;
- and the nucleus of the solitary tract, associated with visceral sensations like nausea and vomiting.
Experiments on animal and human tissue have demonstrated a disruption of short-term memory formation, which is consistent with the abundance of CB1 receptors in the hippocampus, the region of the brain most closely associated with memory.
Cannabinoids inhibit the release of several neurotransmitters in the hippocampus such as acetylcholine, norepinephrine, and glutamate, resulting in a major decrease in neuronal activity in that region. This decrease in activity resembles a "temporary hippocampal lesion."
In in-vitro experiments, THC at extremely high concentrations, which could not be reached with commonly consumed doses, caused competitive inhibition of the AChE enzyme and inhibition of β-amyloid peptide aggregation, which is implicated in the development of Alzheimer's disease.
Compared to currently approved drugs prescribed for the treatment of Alzheimer's disease, THC is a considerably superior inhibitor of β-amyloid aggregation, and this study provides a previously unrecognized molecular mechanism through which cannabinoid molecules may impact the progression of this debilitating disease.
Effects on driving:
Main article: Cannabis and impaired driving
While several studies have shown increased risk associated with cannabis use by drivers, other studies have not found increased risk.
Cannabis usage has been shown in some studies to have a negative effect on driving ability. The British Medical Journal indicated that "drivers who consume cannabis within three hours of driving are nearly twice as likely to cause a vehicle collision as those who are not under the influence of drugs or alcohol".
In Cannabis and driving: a review of the literature and commentary, the United Kingdom's Department for Transport reviewed data on cannabis and driving, finding that, although impaired, "subjects under cannabis treatment appear to perceive that they are indeed impaired. Where they can compensate, they do...".
In a review of driving simulator studies, researchers note that "even in those who learn to compensate for a drug's impairing effects, substantial impairment in performance can still be observed under conditions of general task performance (i.e. when no contingencies are present to maintain compensated performance)."
A 2012 meta-analysis found that acute cannabis use increased the risk of an automobile crash. An extensive 2013 review of 66 studies regarding crash risk and drug use found that cannabis was associated with minor, but not statistically significant increased odds of injury or fatal accident.
In the largest and most precisely controlled study of its kind, carried out by the U.S. Department of Transportation's National Highway Traffic Safety Administration, it was found that other "studies that measure the presence of THC in the drivers' blood or oral fluid, rather than relying on self-report tend to have much lower (or no) elevated crash risk estimates. Likewise better controlled studies have found lower (or no) elevated crash risk estimates."
The study found that "after adjusting for age, gender, race and alcohol use, drivers who tested positive for marijuana were no more likely to crash than those who had not used any drugs or alcohol prior to driving".
On the other hand, a recent study in the Journal of Transport & Health indicated that the number of fatal crashes involving marijuana has significantly increased in Colorado, Washington, and Massachusetts since recreational marijuana was legalized or decriminalized there.
Cardiovascular effects:
Short-term (one to two hours) effects on the cardiovascular system can include increased heart rate, dilation of blood vessels, and fluctuations in blood pressure.
There are medical reports of occasional heart attacks or myocardial infarction, stroke and other cardiovascular side effects.
Marijuana's cardiovascular effects are not associated with serious health problems for most young, healthy users.
Researchers reported in the International Journal of Cardiology, "Marijuana use by older people, particularly those with some degree of coronary artery or cerebrovascular disease, poses greater risks due to the resulting increase in catecholamines, cardiac workload, and carboxyhemoglobin levels, and concurrent episodes of profound postural hypotension.
Indeed, marijuana may be a much more common cause of myocardial infarction than is generally recognized. In day-to-day practice, a history of marijuana use is often not sought by many practitioners, and even when sought, the patient's response is not always truthful".
A 2013 analysis of 3,886 myocardial infarction survivors over an 18-year period showed "no statistically significant association between marijuana use and mortality".
A 2008 study by the National Institutes of Health Biomedical Research Centre in Baltimore found that heavy, chronic smoking of marijuana (138 joints per week) changed blood proteins associated with heart disease and stroke.
A 2000 study by researchers at Boston's Beth Israel Deaconess Medical Center, Massachusetts General Hospital and Harvard School of Public Health found that a middle-age person's risk of heart attack rises nearly fivefold in the first hour after smoking marijuana, "roughly the same risk seen within an hour of sexual activity".
Cannabis arteritis is a very rare peripheral vascular disease similar to Buerger's disease. There were about 50 confirmed cases from 1960 to 2008, all of which occurred in Europe.
Combination with other drugs:
A confounding factor in cannabis research is the prevalent usage of other recreational drugs, especially alcohol and nicotine. Such complications demonstrate the need for studies on cannabis that have stronger controls, and investigations into alleged symptoms of cannabis use that may also be caused by tobacco.
Some critics question whether agencies doing the research make an honest effort to present an accurate, unbiased summary of the evidence, or whether they "cherry-pick" their data to please funding sources which may include the tobacco industry or governments dependent on cigarette tax revenue; others caution that the raw data, and not the final conclusions, are what should be examined.
The Australian National Household Survey of 2001 showed that cannabis in Australia is rarely used without other drugs. 95% of cannabis users also drank alcohol; 26% took amphetamines; 19% took ecstasy and only 2.7% reported not having used any other drug with cannabis.
While research has been undertaken on the combined effects of alcohol and cannabis on performing certain tasks, little research has been conducted on the reasons why this combination is so popular.
Evidence from a controlled experimental study undertaken by Lukas and Orozco suggests that alcohol causes THC to be absorbed more rapidly into the blood plasma of the user. Data from the Australian National Survey of Mental Health and Wellbeing found that three-quarters of recent cannabis users reported using alcohol when cannabis was not available, suggesting that the two are substitutes.
Memory and learning:
Main article: Cannabis and memory
Studies on cannabis and memory are hindered by small sample sizes, confounding drug use, and other factors. The strongest evidence regarding cannabis and memory focuses on its temporary negative effects on short-term and working memory.
In a 2001 study looking at neuropsychological performance in long-term cannabis users, researchers found "some cognitive deficits appear detectable at least 7 days after heavy cannabis use but appear reversible and related to recent cannabis exposure rather than irreversible and related to cumulative lifetime use".
Speaking about his studies of cannabis use, lead researcher and Harvard professor Harrison Pope said he found that marijuana is not dangerous over the long term, but that there are short-term effects.
From neuropsychological tests, Pope found that chronic cannabis users showed difficulties, with verbal memory in particular, for "at least a week or two" after they stopped smoking. Within 28 days, memory problems vanished and the subjects "were no longer distinguishable from the comparison group".
Researchers from the University of California, San Diego School of Medicine failed to show substantial, systemic neurological effects from long-term recreational use of cannabis. Their findings were published in the July 2003 issue of the Journal of the International Neuropsychological Society.
The research team, headed by Dr Igor Grant, found that cannabis use did affect perception, but did not cause permanent brain damage. Researchers looked at data from 15 previously published controlled studies involving 704 long-term cannabis users and 484 nonusers. The results showed long-term cannabis use was only marginally harmful to memory and learning.
Other functions such as reaction time, attention, language, reasoning ability, perceptual and motor skills were unaffected. The observed effects on memory and learning, they said, showed long-term cannabis use caused "selective memory defects", but that the impact was "of a very small magnitude".
A study at Johns Hopkins University School of Medicine showed that very heavy use of marijuana is associated with decrements in neurocognitive performance even after 28 days of abstinence.
Appetite:
The feeling of increased appetite following the use of cannabis has been documented for hundreds of years, and is known colloquially as "the munchies" in the English-speaking world.
Clinical studies and survey data have found that cannabis increases food enjoyment and interest in food. A 2015 study suggests that cannabis triggers uncharacteristic behaviour in POMC neurons, which are usually associated with decreasing hunger. Rarely, chronic users experience a severe vomiting disorder, cannabinoid hyperemesis syndrome, after smoking and find relief by taking hot baths.
Endogenous cannabinoids ("endocannabinoids") were discovered in cow's milk and soft cheeses. Endocannabinoids are also found in human breast milk. It is widely accepted that the neonatal survival of many species "is largely dependent upon their suckling behavior, or appetite for breast milk" and recent research has identified the endogenous cannabinoid system to be the first neural system to display complete control over milk ingestion and neonatal survival. It is possible that "cannabinoid receptors in our body interact with the cannabinoids in milk to stimulate a suckling response in newborns so as to prevent growth failure".
Pathogens and mycotoxins:
Most microorganisms found in cannabis only affect plants and not humans, but some microorganisms, especially those that proliferate when the herb is not correctly dried and stored, can be harmful to humans. Some users may store marijuana in an airtight bag or jar in a refrigerator to prevent fungal and bacterial growth.
Fungi:
The following fungi have been found in moldy cannabis:
- Aspergillus flavus,
- Aspergillus fumigatus,
- Aspergillus niger,
- Aspergillus parasiticus,
- Aspergillus tamarii,
- Aspergillus sulphureus,
- Aspergillus repens,
- Mucor hiemalis (not a human pathogen),
- Penicillium chrysogenum,
- Penicillium italicum
- and Rhizopus nigricans
Aspergillus mold species can infect the lungs via smoking or handling of infected cannabis and cause opportunistic and sometimes deadly aspergillosis. Some of the microorganisms found create aflatoxins, which are toxic and carcinogenic. Researchers suggest that moldy cannabis should thus be discarded to avoid these serious risks.
Mold is also found in smoke from mold-infected cannabis, and the lungs and nasal passages are a major means of contracting fungal infections. Levitz and Diamond (1991) suggested baking marijuana in home ovens at 150 °C (302 °F) for five minutes before smoking. Oven treatment killed conidia of A. fumigatus, A. flavus and A. niger, and did not degrade the active component of marijuana, tetrahydrocannabinol (THC).
Long-term Effects:
Main articles: Long-term effects of cannabis and Cannabis dependence
Exposure to marijuana may have biologically-based physical, mental, behavioral and social health consequences and is "associated with diseases of the liver (particularly with co-existing hepatitis C), lungs, heart, eyesight and vasculature" according to a 2013 literature review by Gordon and colleagues.
The association with these diseases has only been reported in cases where people have smoked cannabis. The authors cautioned that "evidence is needed, and further research should be considered, to prove causal associations of marijuana with many physical health conditions".
Cannabis use disorder is defined in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) as a condition requiring treatment. Several drugs have been investigated in an attempt to ameliorate the symptoms of stopping cannabis use. Such drugs include bupropion, divalproex, nefazodone, lofexidine, and dronabinol. Of these, dronabinol has proven the most effective.
Effects in pregnancy:
Main article: Cannabis in pregnancy
Cannabis consumption in pregnancy might be associated with restrictions in growth of the fetus, miscarriage, and cognitive deficits in offspring based on animal studies, although there is limited evidence for this in humans at this time.
A 2012 systematic review found although it was difficult to draw firm conclusions, there was some evidence that prenatal exposure to cannabis was associated with "deficits in language, attention, areas of cognitive performance, and delinquent behavior in adolescence".
A report prepared for the Australian National Council on Drugs concluded that cannabis and other cannabinoids are contraindicated in pregnancy, as they may interact with the endocannabinoid system.
See also:
- Cannabis smoking
- Psychoactive drug
- Cannabis Use and Psychosis from National Drug and Alcohol Research Centre, Australia
- Provision of Marijuana and Other Compounds For Scientific Research recommendations of The National Institute on Drug Abuse National Advisory Council
- Ramström, J. (2003), Adverse Health Consequences of Cannabis Use, A Survey of Scientific Studies Published up to and including the Autumn of 2003, National institute of public health, Sweden, Stockholm.
- Hall, W., Solowij, N., Lemon, J., The Health and Psychological Consequences of Cannabis Use. Canberra: Australian Government Publishing Service; 1994.
- World Health Organisation, Programme on Substance Abuse, Cannabis: a health perspective and research agenda; 1997.
- EU Research paper on the potency of Cannabis (2004)
Video Game Culture
- YouTube Video: Today's Gaming Culture Vs The Past
- YouTube Video: Top 10 YouTube Gamers of 2017 by WatchMojo
Video game culture is a worldwide new media subculture formed by video games. As computer and video games have exponentially increased in popularity over time, they have had a significant influence on popular culture.
Video game culture has also evolved over time hand in hand with internet culture as well as the increasing popularity of mobile games. Many people who play video games identify as gamers, which can mean anything from someone who enjoys games to someone who is passionate about them.
As video games become more social with multiplayer and online capability, gamers find themselves in growing social networks. Gaming can both be entertainment as well as competition, as a new trend known as electronic sports is becoming more widely accepted.
Today, video games can be seen in social media, politics, television, film, music and YouTube.
Demographics:
See also: Women and video games
As of 2016, the average age for a video game player is 31, a number slowly increasing as people who were children playing the first arcade, console and home computer games continue playing now on current systems.
The gender distribution of gamers is reaching equilibrium, according to a 2016 study showing that 59% of gamers are male and 41% female; however, research has also shown that women are less likely to self-identify as gamers out of fear of stigmatization.
As of 2011, the ESA reported that 71% of people aged six to forty-nine in the U.S. played video games, with 55% of gamers playing on their phones or mobile devices. The average age of players across the globe is mid to late 20s, and is increasing as older players grow in numbers.
One possible reason for the increase in players could be attributed to the growing number of genres that require less of a specific audience. For example, the Wii console has widened its audience with games such as Wii Sports and Wii Fit. Both require more activity from the user and provide more reasons to play including family competition or exercise. It could also be because people who played video games when they were young are now growing older and still have that interest in video games.
Currently, the largest entertainment industry for children is gaming. According to a 2008 telephone survey with a sample size of 1,102 respondents, 97% of children living in the United States and between the ages of 12 and 17 play video games.
As displayed by the recent release of certain games, video game developers have started to create gaming content that appeals to alternative audiences, beyond those of "Player 1." The idea of "Player 1" refers to the stereotypical straight male gamer as the sole individual that video games are created for.
On the other hand, "Player 2" may refer to populations of gamers who divert from this demographic, such as women or LGBTQ+ communities. Games designed to appeal to "Player 2" offer gamers an alternative gaming experience, thus allowing for the further demographic expansion of the video game subculture.
LAN Gaming:
Main article: LAN party
Video games are played in a variety of social settings, from domestic gatherings to public events. A popular way of accomplishing this is a LAN (Local Area Network) party, which, if hosted at a home, involves family and friends and creates a social event for people friendly with each other. LAN parties are also held as large-scale events in public spaces, drawing a great number of participants who might not usually socialize.
The Everquest Fan Faires for instance, provide weekends of socializing and playing, at a large gathering (an event of several thousands) of dedicated game fans. Terry Flew in his book Games: Technology, Industry, Culture also emphasises the Online Gaming Communities – "where players aren't physically located in the same space, but still socializing together".
This raises the notion of McLuhan's "Global Village", as people are able to transcend their physical limitations and communicate with people, possessing a similar interest, from all around the world. Shapiro also stresses the possibility of "Using technology to enhance one's social life", as friendships no longer have to be structured by physical proximity (e.g. neighbours, colleagues).
Shapiro states that "the net (Online Gaming Communities) gives individuals the opportunity to extend their social network in a novel way, to communicate and share life experiences with people regardless of where they live and form online relationships". Thus, such online communities satisfy a genuine need for affiliation with like-minded others.
Online Gaming:
Main article: Online game
See also: Voice chat in online gaming
Online gaming has drastically increased the scope and size of gaming culture. Online gaming grew out of games on bulletin board systems and on college mainframes from the 1970s and 1980s. MUDs offered multiplayer competition and cooperation but on a scope more geographically limited than on the internet.
The internet allowed gamers from all over the world – not just within one country or state – to play games together with ease. With the advent of Cloud Gaming high-performance games can now be played from low-end client systems and even TVs.
One of the most groundbreaking titles in the history of online gaming is Quake, which offered the ability to play with sixteen, and eventually up to thirty-two players simultaneously in a 3D world. Gamers quickly began to establish their own organized groups, called clans.
Clans established their own identities, their own marketing, their own form of internal organization, and even their own looks. Some clans had friendly or hostile rivalries, and there were often clans who were allied with other clans. Clan interaction took place on both professionally set competition events, and during normal casual playing where several members of one clan would play on a public server.
Clans would often do their recruiting this way; by noticing the best players on a particular server, they would send invitations for that player to either try out or accept membership in the clan.
'Clan'- or 'guild'-based play has since become an accepted (and expected) aspect of multiplayer gaming, with several games offering cash-prize tournament-style competition to their players. Many clans and guilds also have active fan-bases, which, when combined with the 'tournament' aspect, contribute in turning clan-based gaming into a semi-professional sport.
Clans also allow players to assist each other in simulated combat and quests in game advancement, as well as providing an online family for friendly socializing.
From Quake, online gaming grew beyond first-person shooters and has impacted every genre. Real-time strategy, racing games, card games, and sports games can all be played online. Online gaming has also spread from its initial computer roots to console gaming. Today, every major video game console offers some degree of online gaming, some limited to particular titles, while some even offer entire virtual communities.
Competition:
Main article: eSports
Slang and terminology:
As in other cultures, the community has developed a gamut of slang words and phrases that can be used for communication in or outside of games. Due to their growing online nature, modern video game slang overlaps heavily with internet slang, as well as Leetspeak, with many words, such as "pwn" and "noob", being direct carry-overs from Leetspeak.
There are terms to describe gaming events, games genres, gamer demographics, strategies, specific events, situations, and more. It is especially common among online games to encourage the use of neologisms for convenience in communication.
While most games have specific terms that the dedicated player bases use, some of the most prevalent phrases across all communities include abbreviations such as "lol," meaning "laughing out loud" as well as "noob," which is a derogatory term for a new or unskillful player.
Another popular term that stemmed from the gaming community is the abbreviation "AFK," meaning "away from keyboard," referring to people who are not at the computer or not paying attention. Other common abbreviations include "GL HF," which stands for "good luck, have fun," often said at the beginning of a match to show good sportsmanship.
Likewise, at the end of a game, "GG" or "GG WP" may be said to congratulate the opponent, win or lose, on a "good game, well played."
While much of gaming lingo uses abbreviations for convenience, a lot of jargon is used for cyberbullying. In online games with text or voice chat channels, it is not uncommon for players to blame or insult one another using such jargon. An additional example of slang is "rekt" (slang for 'wrecked'), often used to point out the obvious when a player or entity is destroyed. Less commonly, "own", "ownage", "owned" or "pwned" is used in a similar fashion.
Gaming networks:
The shift from console-based or "shrink-wrap" gaming to online games has allowed online games and massively-multiplayer online gaming today to develop highly advanced and comprehensive communication networks. With the freedom of the Internet's architecture, users can become producers of the technology and shapers of the growing networks.
Compared to past eras where consumers had little means of communication with game developers and other communities beyond their geographical location, the internet has created many methods of communication such as through the online bulletin board website, Reddit.
Gamers often develop sub-communities in game clans and may use third-party VoIP programs such as Skype, Ventrilo, TeamSpeak or Discord to communicate while playing games. These gaming communities may have nothing in common, may be designed for dedicated, skilled players, or may be clans formed around shared commonalities such as personality, ethnicity, heritage, language or gender.
Another key component of many gaming networks is the connection between the player base and the game developers. Many game developers have outlets either through official website forums or social media where gamers can communicate with and provide feedback to the game developers. Likewise, these same places become key locations for game developers to communicate with their fans, where often dedicated employees act as liaisons as a bridge between the company and the community.
Some of the most advanced networks take place with massively-multiplayer online gaming where servers of tens of thousands can be present simultaneously in the same instance or environment. In major titles such as World of Warcraft and Dota 2, the player base is in the millions.
With so many people, many of these communities develop virtual economies that may use a barter system or a currency system. In some games, interest in the virtual economy may be so great that players will spend real money through auction sites like eBay for virtual property and items, a practice commonly known as RMT (real-money trading). Some game developers ban RMT in their games, especially when it interferes with the equity of the game.
That being said, other game developers embrace it; one game, Second Life, focuses entirely on the use of real-life currency for everything in the game world.
Since smartphones became commonplace around 2007, mobile gaming has seen rapid increases in popularity. The widespread appeal of simple, "time-killing" games, reminiscent of "social games" such as those found on Facebook, has set the stage for mobile gaming to account for almost 35% of gaming's total market share by 2017.
Because games such as Clash of Clans offer in-game bonuses for referring new players to the game, mobile gamers have turned to social media sites to recruit their friends and family. Some games even offer integrated social media support to provide players with in-game chat or "friends" features for communicating and competing with other players. The large number of mobile game players has led to the creation of devoted forums, blogs, and tip sites similar to those committed to console gaming.
Popular gaming publications, like Ars Technica and TouchArcade, are even beginning to give significant coverage to mobile games.
Social or anti-social technology?:
There has been much debate among media theorists as to whether video gaming is an inherently social or anti-social activity. Terry Flew argues that digital games are "increasingly social, a trend that works against the mainstream media's portrayal of players as isolated, usually adolescent boys hidden away in darkened bedrooms, failing to engage with the social world".
Flew asserts that games are played in very social and public settings; for example, computers and consoles are often played in the living areas of domestic homes, where people play with family or friends.
David Marshall argues against the rich body of "effects"-based research that finds games to be "debilitating and anti-social forms of behavior", suggesting rather that "the reality of most games is that they are dynamically social – the elaborate social conversations that emerge from playing games online in massive multi-player formats" (MMOG).
Citing 'The Sims Online' as an example, he states that it "has built up entire political and social structures in given communities that provide an elaborate game life for participants". Gamers in these online worlds participate in many-to-many forms of communication as well as one-to-one correspondence. The games are not only massive; they are also "intimate and social".
Gosney argues that Alternate Reality Gaming is also inherently social, drawing upon Pierre Levy's (Levy 1998) notion of Collective Intelligence. He states that the game relied upon an "unprecedented level of corroboration and collective intelligence to solve the game". Collective and collaborative team play is essential to ARGs, which are thus anything but a solitary activity.
Hans Geser further rejects the mainstream media view that video gaming is an anti-social activity, asserting that "there is considerable empirical evidence that Second Life serves mainly to widen the life experience of individuals who have already a rich 'First Life', not as a compensating device for marginal loners." He thus highlights the "fantastic social possibilities of Second Life", as the intangible reward of belonging socially is of paramount importance.
Bray and Konsynski also point to the ability of the technology "to enrich their lives", as most Millennials report "no difference between friendships developed in the real world vs. friendships developed online", and most use the internet to maintain their social networks and plan their social activities.
Social implications of video games:
See also: Video game controversy
The advent of video games introduced an innovative media technology that allowed consumers to archive, annotate, appropriate and recirculate media content. Consumers can use this media source as an alternative tool to gain access to information of interest to them. The community aspect of video gaming has also had implications for the social interactions and collective behaviors of consumers involved in the activity.
Rise of subcultures:
Contemporary investigations have found that there is a prevailing social framework in place during gatherings of gaming enthusiasts or 'gamers'. Mäyrä (2008, p. 25) suggests that gamers who gather together to play possess a shared language, engage in collective rituals and are often interested in cultural artifacts such as gaming paraphernalia.
Cronin and McCarthy (2011) have also found a liminal, hedonic food culture to be present among these socially connected actors. The commensal consumption of energy-dense, low-nutrient foods during long stretches of gameplay is considered to contribute to the communal and hedonistic aspects of social gaming.
In response to the central importance that food plays in the collective enjoyment of social gaming, various websites have been created which allow gamers to rate their favorite foods to accompany play.
The presence of rituals, shared discourse, collective action and even a liminal food culture among gaming communities gives credence to the concept of these cohorts existing as self-defining sub-units within mainstream culture. However, due to the ephemeral and transient nature of their rituals, and also the possibility of virtual interaction through online participation, these cohorts should be considered 'postmodern subcultures'.
Gaming communities have social elements beyond physical interaction and have come to a stage where online and offline spaces can be seen as 'merged' rather than separate.
MMORPG and identity tourism:
Terry Flew (2005, p. 264) suggests that the appeal of the "Massively Multiplayer Online Role Playing Game" lies in the idea of escapism and the ability to assume the role of someone or something that is a fantasy in real life. He notes that '...for some women, [they] enjoy adopting what they feel to be an image of femininity more acceptable or desirable than their real world body...'
This is what he calls "identity tourism", a form of hopping from one person to another, for which there usually is a stereotypical discourse associated with the protagonist. This is seen in the case of males who assume the personas of the female gender, and the character's representation of her gender being overly sexualized and/or passive, '...this tends to perpetuate and accentuate existing stereotypes of... women...' (Nakamura).
Ownership:
Ownership of video game entities is a major issue in video game culture. On one side, players, especially those who have played with the same avatars for several years, treat the avatars as their own property. On the other hand, publishers claim ownership of all in-game items and characters through End User License Agreements (EULAs).
Terry Flew recognized this problem: "Intellectual property is much better suited to conventional 'texts' that are fixed or finished, rather than ongoing collaborative creations like games". He also notes that these issues will only worsen: as more interactive games emerge, issues of regulation, ownership, and service will only become more problematic.
Violence narrative:
Violent content in video games is often a source of criticism, which according to Terry Flew relates to the subject of 'moral panic'. Flew writes that the 'effects-based' research which gives rise to the 'computer games cause violence' discourse is mostly psychology-based, influenced particularly by horrific events such as the shooting of schoolchildren at Columbine High School in Littleton, Colorado in 1999.
Flew says that the assumption behind such research, cause-effect behaviorist models of communication, is a flawed one. Several studies show a correlation between violent content conveyed through media (including videogames) and violent or aggressive behavior, while others (Vastag 2004) consider that the evidence for such conclusions is thin and highly contestable.
Fox News reported that a shooting in Montreal, Canada was carried out by Kimveer Gill, a player of Super Columbine Massacre, a game with a strongly violent narrative.
On the other hand, proponents of social determinism assert that technology is neutral, and that it is the way humans use technology that brings about its social impact.
Issues of Gender and Sexuality:
Main article: Women and video games
In conjunction with the changing demographics of video game creators and players, issues related to women and video games, including sexism in video gaming and gender representation in video games, have received increased attention by academia, the media, the games industry and by gamers themselves.
The Gamergate controversy of 2014, which involves issues of sexism, misogyny, and journalistic ethics, is an example of this.
Benjamin Paaßen has argued that because video game culture has long been a space dominated by heterosexual men, the video game industry tends to cater to this particularly lucrative audience, producing video games that reflect the desires of the heterosexual male gaze. He further argues that this lack of representation of alternate identities in video games means that gamers who diverge from the dominant demographic are often relegated to the margins of the culture.
This process is thus seen to perpetuate the stereotypical image of the geeky, heterosexual male gamer as the ruler of the gaming world. Contrary to popular belief, there are a multitude of communities within video game culture that do not fit the typical gamer stereotype; the problem is that they lack visibility.
One reason for this is that many people do not want to reveal their association with gaming culture out of fear of stigmatization. Past research has shown this to be the case for female gamers. Because women in gaming are often ostracized by their male counterparts, female gamers are frequently forced to conceal their gender, participating only when they can remain anonymous. When concealing their identities, female gamers may alter their voices when talking online and play male characters under masculine names.
Doing this, however, can make video games less fun and exciting, and may cause a player simply to quit the game. The situation is different for male gamers: just as female gamers may choose to play male characters, male gamers sometimes choose to play female characters, and doing so is very common in the culture.
According to Bosson, Prewitt-Freilino, and Taylor, male gamers who play female characters are not harassed as much as female gamers, since male gamers can simply undo the change or reveal their true identities as male, which reduces the harassment.
Women also make up a small minority of the gaming industry workforce: 3% of programmers, 11% of game designers, 13% of artists and animators, 13% of QA testers, and 16% of producers. A commonly cited reason is the lack of encouragement that results from the negativity and harassment directed at women in gaming culture.
As a minority, women in the gaming industry also face stereotyping and threats for working in a male-dominated field. The Twitter hashtag #1ReasonWhy gathered reasons for the lack of women in the video game industry; video game designer Kim Swift contributed, "Because I get mistaken for the receptionist or day-hire marketing at trade shows."
Additionally, dominant perceptions of gamers as asocial, straight, white men are also challenged by the presence of gamers who do not identify as heterosexual. For instance, it has been shown by past research that the LGBTQ+ community maintains a notable presence within video game culture.
For LGBTQ+ gamers, video games provide an alternate reality in which there is the opportunity for sexual expression, identity formation, and community building. Such communities indicate the development of diverse subcultures within the culture of gaming as a whole.
Gaming and popular culture:
See also: Advertising in video games
Games are also advertised on different TV channels, depending on the age demographic they are targeting. Games targeted toward kids and young teenagers are advertised on Cartoon Network and Nickelodeon, while games targeted toward older teenagers and adults are advertised on MTV, G4, Comedy Central and NFL Network.
Gaming as portrayed by the media:
Main article: Video game journalism
See also: Video game controversy
From the 1970s through even the 1990s, video game playing was mostly seen as a subculture hobby and as a substitute for physical sports. However, in its early history video gaming occasionally caught the attention of mainstream news outlets.
In 1972, Pong became the first video game pop-culture phenomenon. This was followed by Pac-Man in 1980.
Other video games have also been labeled pop-culture phenomena.
As games became more realistic, issues of questionable content arose. The most notable early example is NARC, which through its use of digitized graphics and sound and its adult-oriented theme quickly became a target of the press.
These same issues arose again when Mortal Kombat debuted, particularly with its home video game console release on the Genesis and Super NES platforms; due to Nintendo's strict content-control guidelines, that system's version of Mortal Kombat was substantially re-worked to remove any 'extreme' violence.
In response to these issues (and in parallel to similar demands made upon the music and movie industries), the ESRB was established to help guide parents in their purchasing decisions. 1993's Doom caused quite a stir, with its detailed 3D graphics and copious amounts of blood and gore.
The 1996 game, Duke Nukem 3D, was accused of promoting pornography and violence; as a result of the criticism, censored versions of the game were released in certain countries. In the 1999 Columbine shootings, violent video games were for a time directly blamed by some for the incident, and labeled as "murder simulators".
In 2001, Grand Theft Auto III was released, which started the controversy over again.
Television channels:
The first video game TV show was GamePro TV.
The first television channel dedicated to video gaming and culture, G4, was launched in 2002. Over the years, however, the channel moved away from video game shows and toward male-oriented programs. X-Play, one of the channel's most popular shows and the highest rated video game review show, was produced at G4 until the channel was bought by Esquire Magazine, which cancelled X-Play to focus less on G4's video game oriented audience and more on the traditional, more general male audience of the magazine.
Ginx TV is an international multi-language gaming television channel, managed by the former MTV Networks Europe Managing Director Michiel Bakker.
There are also video game shows that appear on other channels, such as Spike TV, Fuel TV and MTV.
In Korea, there are two cable TV channels fully dedicated to video games, Ongamenet and MBCGame, broadcasting professional game leagues that are held in Korea.
In Germany most of the shows and channels dedicated to gaming were canceled, although the content was highly appreciated by the gaming audience. There was one digital cable and satellite channel with focus on video games, which was closed in 2009: GIGA Television.
Some of the hosts also did their own show, Game One, dedicated to games on the German MTV channel until it was canceled in 2014. The show was well known for its sketches on games and gaming culture in Germany.
The unofficial successor is the YouTube show Game Two, financed by the public-service broadcasting program funk and produced by the 24/7 online channel Rocket Beans TV, which is dedicated to gaming, nerd and pop culture. A similar show was Reload, produced for the public-service channel EinsPlus until the channel's closure was announced in 2014.
The Franco-German TV network arte has a show dedicated to gaming culture: Art of Gaming.
In Australia, there is one TV show based on gaming and games: Good Game, broadcast by the ABC (Australian Broadcasting Corporation) on channel ABC2. The show is also available as a podcast on iTunes.
In Russia, there are two gaming TV channels: one satellite, Perviy Igrovoy (Gaming First), and one cable, Gameplay TV. Both channels also have internet streams.
Internet shows:
AVGN (The Angry Video Game Nerd) is a show about a fictional character created by James Rolfe. The character is portrayed as a foul-mouthed, short-tempered retro gamer who reviews old video games, usually in a sarcastic and negative manner with frequent use of profanity for comic effect.
Pure Pwnage, was a fictional series chronicling the life and adventures of Jeremy, a self-proclaimed "pro gamer".
Red vs. Blue (made by Rooster Teeth) is a machinima (machine-cinema) series filmed with many different video games. The series consists of hundreds of short episodes in which characters act out comedic sections of their lives in their own video game universes.
Consolevania, a game review/sketch show produced in Glasgow, Scotland, was developed into a broadcast series, videoGaiden on BBC Scotland.
Button Mashers, an original gaming news show for website Gamezombie.tv, was shot at a $1 million HDTV studio and has millions of viewers all across the internet.
The Guild is a web series, created by Felicia Day, in which the cast are members of a guild that plays an MMORPG similar to World of Warcraft.
Game Grumps, a show on YouTube in which the cast plays games sent in by viewers. It has a related show called Steam Train where the cast plays games either on Steam or sent in by independent developers.
Influences on music:
Main article: Video game music
Video game music has been utilized by popular musicians in many ways. The earliest example was the electronic music band Yellow Magic Orchestra's self-titled album, released in 1978, which utilized Space Invaders samples as instrumentation. In turn, the band would have a major influence on much of the video game music produced during the 8-bit and 16-bit eras.
During the golden age of arcade video games in the early 1980s, it became common for arcade game sounds and bleeps to be utilized, particularly in early hip hop music, synthpop, and electro music. Buckner & Garcia's Pac-Man Fever, released in 1982, featured songs that were both about famous arcade games like Pac-Man, Donkey Kong and Berzerk, and also used the sound samples from the games themselves as instrumentation.
In 1984, former Yellow Magic Orchestra member Harry Hosono produced an album entirely from Namco arcade game samples, entitled Video Game Music.
Aphex Twin, an experimental electronic artist, under the name "PowerPill" released the Pacman EP in 1992 that featured a heavy use of Pac-Man sound effects.
An entire music genre called chiptunes, or sometimes gamewave, has artists dedicated to using the synthesizer sets that came with past video game consoles and computers, particularly the Commodore 64 and the Nintendo Entertainment System. These bands include Mr. Pacman, 8 Bit Weapon, Goto 80, 50 Hertz and Puss.
The influence of retro video games on contemporary music can also be seen in the work of less purist "Bitpop" artists, such as Solemn Camel Crew and Anamanaguchi. Moreover, many gamers collect and listen to video game music, ripped from the games themselves.
This music is known by its file extension and includes such formats as: SID (Commodore 64), NSF (NES) and SPC (SNES).
Cover bands like Minibosses perform their own instrumentals, and groups like The Protomen have written rock operas inspired by the Mega Man video games, while communities like OverClocked ReMix have released thousands of game music arrangements in a variety of genres and have influenced the careers of several game composers.
A comedy subgenre has developed increasing the popularity of several musicians including Jonathan Coulton, famous for the song Still Alive featured in the credits of Valve Software's Portal, and Jonathan Lewis, songwriter and composer credited with the Half-Life-themed parody album Combine Road.
Full orchestras, such as the Symphonic Game Music Concert, tour North America and Asia performing symphonic versions of video game songs, particularly from the Final Fantasy series, the Metal Gear series, and Nintendo-themed music such as the Mario & Zelda Big Band Live Concert. In Japan, Dragon Quest symphonic concerts have been performed yearly since their debut in 1987.
Video game and film crossovers:
Films based on video games:
Numerous films have been based on video games.
Movies about video games:
Hollywood has also created films that are about video games themselves. In the 1982 film WarGames, a computer mistakes a fictional computer game called Global ThermoNuclear War for reality.
Also in 1982, Tron featured a programmer who was transported into a computer and had to directly take part in the games he had created.
In the 1984 film, The Last Starfighter, a stand-up arcade video game is used as a test to find those "with the gift", who are recruited to pilot actual Starfighter spacecraft in the conflict between the Rylan Star League and the Ko-Dan Empire.
1989's The Wizard, starring Fred Savage, was the first film about a real video game. The plot revolves around adolescents who compete at games for the Nintendo Entertainment System, and the film offered a first look at the mega-hit Super Mario Bros. 3.
The main character from 2006's Grandma's Boy was a game tester who developed his own game on the side. The film made multiple references to video game culture and featured the game Demonik, which was cancelled by its publisher shortly after the film's release.
A more recent example of a film of this type is 2006's Stay Alive, a horror film about a next-generation video game that is so realistic that it kills its players in the same way their avatars were killed. Released in 2012, Wreck-It Ralph is also about the gaming world inside an arcade.
Interactive movies:
Main article: Interactive movie
Interactive movies as a computer and video game genre were the result of the multimedia expansion of computers and video game consoles in the mid-1990s, primarily because of the increased capacity offered by the laserdisc format.
Interactive movies started out on arcade machines in 1983, but quickly expanded to computers and video game consoles such as the Sega CD, the Philips CD-i and the 3DO Interactive Multiplayer.
The games are characterized by more emphasis on cinematic sequences, using full-motion video and voice acting. Interactive movie games have been made in a number of genres, including adventure games, rail shooters, and role-playing games.
The first interactive movie game was Dragon's Lair, originally released in the arcades in 1983 and the first game to use a laserdisc. It featured animation by Don Bluth, who had worked for Disney on features like Robin Hood, The Rescuers, and Pete's Dragon, and later for other film companies such as United Artists (All Dogs Go to Heaven) and Universal Studios (The Land Before Time).
In Dragon's Lair, the player controls the actions of a daring knight named Dirk, who must save a princess from an evil dragon, hence the name of the game. Following this game, more and more companies were influenced by the technology and made their own interactive movie games for arcades and consoles.
The birth of the 'interactive movie' genre was studded with unimpressive flops, though the genre later came into its own; at the time, video-capture technology was still in its infancy, with short (and often grainy and low-quality) video segments being the norm for games of any length.
Video game and traditional media forms:
With the rapid convergence of all media types into a digital form, video games are also beginning to affect, and be affected by, traditional media forms.
Television engineer Ralph Baer, who conceived the idea of an interactive television while building a television set from scratch, created the first video game. Video games are now also exploited by pay-TV companies, which let users attach a computer or console to the television cable system and simply download the latest games.
Games act within television, with the player choosing to enter the artificial world. The constructed meanings in video games are more influential than those of traditional media forms because 'games interact with the audience in a dialogue of emotion, action, and reaction'. This interactivity reaches a depth that is not possible in traditional media forms.
Computer games have developed in parallel to both the video game and the arcade video game. The personal computer and console machines such as the Dreamcast, Nintendo GameCube, PlayStation 2 and Xbox offered a new dimension to game playing. Those consoles have since largely been replaced by the Xbox 360, Wii and PlayStation 4, while the personal computer remains a leading gaming machine.
Games are the first new computer-based media form to socialize a generation of youth in a way that traditional media forms have in the past. Therefore, the 'MTV generation' has been overtaken by the 'Nintendo generation'; however, some refer to the current generation as the 'iPod Generation'.
Because they straddle the technologies of television and computers, electronic games are a channel through which we can investigate the various impacts of new media and the technologies of convergence.
Interactive engagement between players and digital games:
Digital games are a new form of media with which users interact and engage deeply. Terry Flew noted that unlike "lean-back" types of media such as television, film and books, digital games place users in a productive relationship: a player, in effect, creates their own text each time they engage.
Digital games often lack narrative. Rather than character development or plot, the setting is usually the important aspect of the narrative.
Despite the fact that games such as Tetris and Pong lack a plot or characters, they manage to keep players engaged for hours. Furthermore, digital games place players into a position where they have power to control.
Players have power because they are the actor in the game. Again, Flew wrote that "the engagement comes because the player is the performer, and the game evaluates the performance and adapts to it."
Emergent games are becoming a popular type of video game, as they offer environments and sets of rules in which players' stories can branch in various unexpected directions depending on the players' own decisions. As users of these new forms of media, players not only ingest narrative but can also interact and engage more freely, in ways that allow them to create their own text (Flew 2005).
The time spent playing games varies by age group and by country, with estimates putting the average between 6 and 22 hours per week.
Video game culture has also evolved over time hand in hand with internet culture and the increasing popularity of mobile games. Many people who play video games identify as gamers, which can mean anything from someone who enjoys games to someone who is passionate about them.
As video games become more social with multiplayer and online capability, gamers find themselves in growing social networks. Gaming can both be entertainment as well as competition, as a new trend known as electronic sports is becoming more widely accepted.
Today, video games can be seen in social media, politics, television, film, music and YouTube.
Demographics:
See also: Women and video games
As of 2016, the average age for a video game player is 31, a number slowly increasing as people who were children playing the first arcade, console and home computer games continue playing now on current systems.
The gender distribution of gamers is reaching equilibrium, according to a 2016 study showing that 59% of gamers are male and 41% female; however, research has also shown that women are less likely to self-identify as gamers out of fear of stigmatization.
In 2011, the ESA reported that 71% of people age six to forty-nine in the U.S. played video games, with 55% of gamers playing on their phones or mobile devices. The average age of players across the globe is mid-to-late 20s, and is increasing as older players grow in numbers.
One possible reason for the increase in players could be attributed to the growing number of genres that require less of a specific audience. For example, the Wii console has widened its audience with games such as Wii Sports and Wii Fit. Both require more activity from the user and provide more reasons to play including family competition or exercise. It could also be because people who played video games when they were young are now growing older and still have that interest in video games.
Currently, the largest entertainment industry for children is gaming. According to a 2008 telephone survey with a sample size of 1,102 respondents, 97% of children living in the United States and between the ages of 12 and 17 play video games.
As displayed by the recent release of certain games, video game developers have started to create gaming content that appeals to alternative audiences, beyond those of "Player 1." The idea of "Player 1" refers to the stereotypical straight male gamer as the sole individual that video games are created for.
On the other hand, "Player 2" may refer to populations of gamers who divert from this demographic, such as women or LGBTQ+ communities. Games designed to appeal to "Player 2" offer gamers an alternative gaming experience, thus allowing for the further demographic expansion of the video game subculture.
LAN Gaming:
Main article: LAN party
Video games are played in a variety of social settings, from domestic gatherings to public places. A popular way of accomplishing this is a LAN (Local Area Network) party, which if hosted at a home involves family and friends, creating a social event for people friendly with each other. LAN parties are also held as large-scale events in public spaces, with great numbers of participants who might not usually socialize.
The Everquest Fan Faires for instance, provide weekends of socializing and playing, at a large gathering (an event of several thousands) of dedicated game fans. Terry Flew in his book Games: Technology, Industry, Culture also emphasises the Online Gaming Communities – "where players aren't physically located in the same space, but still socializing together".
This raises the notion of McLuhan's "Global Village", as people are able to transcend their physical limitations and communicate with people, possessing a similar interest, from all around the world. Shapiro also stresses the possibility of "Using technology to enhance one's social life", as friendships no longer have to be structured by physical proximity (e.g. neighbours, colleagues).
Shapiro states that "the net (Online Gaming Communities) gives individuals the opportunity to extend their social network in a novel way, to communicate and share life experiences with people regardless of where they live and form online relationships". Thus, such online communities satisfy a genuine need for affiliation with like-minded others.
Online Gaming:
Main article: Online game
See also: Voice chat in online gaming
Online gaming has drastically increased the scope and size of gaming culture. Online gaming grew out of games on bulletin board systems and on college mainframes from the 1970s and 1980s. MUDs offered multiplayer competition and cooperation but on a scope more geographically limited than on the internet.
The internet allowed gamers from all over the world – not just within one country or state – to play games together with ease. With the advent of Cloud Gaming high-performance games can now be played from low-end client systems and even TVs.
One of the most groundbreaking titles in the history of online gaming is Quake, which offered the ability to play with sixteen, and eventually up to thirty-two players simultaneously in a 3D world. Gamers quickly began to establish their own organized groups, called clans.
Clans established their own identities, their own marketing, their own form of internal organization, and even their own looks. Some clans had friendly or hostile rivalries, and there were often clans who were allied with other clans. Clan interaction took place on both professionally set competition events, and during normal casual playing where several members of one clan would play on a public server.
Clans would often do their recruiting this way; by noticing the best players on a particular server, they would send invitations for that player to either try out or accept membership in the clan.
Gamers of all ages play online games, with the average age being 31 years old.
'Clan'- or 'guild'-based play has since become an accepted (and expected) aspect of multiplayer gaming, with several games offering cash-prize tournament-style competition to their players. Many clans and guilds also have active fan-bases, which, when combined with the 'tournament' aspect, contribute in turning clan-based gaming into a semi-professional sport.
Clans also allow players to assist each other in simulated combat and quests in game advancement, as well as providing an online family for friendly socializing.
From Quake, gaming grew beyond first-person shooters and has impacted every genre. Real-time strategy, racing games, card games, sports games can all be played online. Online gaming has spread from its initial computer roots to console gaming as well. Today, every major video game console available offers degrees of online gaming, some limited by particular titles, some even offer up entire virtual communities.
Competition:
Main article: eSports
Slang and terminology:
As in other cultures, the community has developed a gamut of slang words and phrases that can be used for communication in or outside of games. Because of their growing online nature, modern video game slang overlaps heavily with internet slang as well as Leetspeak, with many words such as "pwn" and "noob" being direct carry-overs from Leetspeak.
There are terms to describe gaming events, games genres, gamer demographics, strategies, specific events, situations, and more. It is especially common among online games to encourage the use of neologisms for convenience in communication.
While most games have specific terms that the dedicated player bases use, some of the most prevalent phrases across all communities include abbreviations such as "lol," meaning "laughing out loud" as well as "noob," which is a derogatory term for a new or unskillful player.
Another popular term that stemmed from the gaming community is the abbreviation "AFK," meaning "away from keyboard," referring to people who are not at the computer or not paying attention. Other common abbreviations include "GL HF," which stands for "good luck, have fun" and is often said at the beginning of a match to show good sportsmanship.
Likewise, at the end of a game, "GG" or "GG WP" may be said to congratulate the opponent, win or loss, on a "good game, well played."
While much of gaming lingo uses abbreviations for convenience, a lot of jargon is used for cyberbullying. In online games with text or voice chat channels, it is not uncommon for players to blame or insult one another using such jargon. For example, "rekt" (slang for 'wrecked') is often used to point out the obvious when a player or entity is destroyed. Less commonly, "own", "ownage", "owned" or "pwned" is used in a similar fashion.
Gaming networks:
The shift from console-based or "shrink-wrap" gaming to online games has allowed online games and massively-multiplayer online gaming today to develop highly advanced and comprehensive communication networks. With the freedom of the Internet's architecture, users can become producers of the technology and shapers of the growing networks.
Compared to past eras where consumers had little means of communication with game developers and other communities beyond their geographical location, the internet has created many methods of communication such as through the online bulletin board website, Reddit.
Gamers often develop sub-communities in game clans and may use third-party VoIP programs such as Skype, Ventrilo, TeamSpeak or Discord to communicate while playing. These gaming communities may be open to anyone, may be designed for dedicated, skilled players, or may be clans formed around shared commonalities such as personality, ethnicity, heritage, language or gender.
Another key component of many gaming networks is the connection between the player base and the game developers. Many game developers have outlets, either official website forums or social media, where gamers can communicate with and provide feedback to them. Likewise, these same places become key locations for game developers to communicate with their fans, often through dedicated employees who act as liaisons between the company and the community.
Some of the most advanced networks take place with massively-multiplayer online gaming where servers of tens of thousands can be present simultaneously in the same instance or environment. In major titles such as World of Warcraft and Dota 2, the player base is in the millions.
With so many people, many of these communities may develop virtual economies that use a barter or currency system. In some games, interest in the virtual economy may be so great that players will spend real money through auction sites like eBay for virtual property and items, a practice commonly known as RMT (real-money trading). Some game developers ban RMT in their games, especially when it interferes with the equity of the game.
Other game developers embrace it, however; one game, Second Life, focuses its entire game world on the use of real-life currency.
Since smartphones became commonplace around 2007, mobile gaming has seen rapid increases in popularity. The widespread appeal of simple, "time-killing" games, reminiscent of "social games" such as those found on Facebook, has set the stage for mobile gaming to account for almost 35% of gaming's total market share by 2017.
Because games such as Clash of Clans offer in-game bonuses for referring new players to the game, mobile gamers have turned to social media sites to recruit their friends and family. Some games even offer integrated social media support to provide players with in-game chat or "friends" features for communicating and competing with other players. The large number of mobile game players has led to the creation of devoted forums, blogs, and tip sites similar to those committed to console gaming.
Popular gaming publications, like Ars Technica and TouchArcade, are even beginning to give significant coverage to mobile games.
Social or anti-social technology?:
There has been much debate among media theorists as to whether video gaming is an inherently social or anti-social activity. Terry Flew argues that digital games are "increasingly social, a trend that works against the mainstream media's portrayal of players as isolated, usually adolescent boys hidden away in darkened bedrooms, failing to engage with the social world".
Flew asserts that games are played in very social and public settings; for example, computers and consoles are often played in the living areas of homes, where people play with family or friends.
David Marshall argues against the large body of "effects"-based research that finds games to be "debilitating and anti-social forms of behavior", suggesting instead that "the reality of most games is that they are dynamically social – the elaborate social conversations that emerge from playing games online in massive multi-player formats" (MMOG).
Citing The Sims Online as an example, he states that the game "has built up entire political and social structures in given communities that provide an elaborate game life for participants". Gamers in these online worlds participate in many-to-many forms of communication and one-to-one correspondence. The games are not only massive; they are also "intimate and social".
Gosney argues that alternate reality gaming is also inherently social, drawing upon Pierre Levy's notion of collective intelligence (Levy 1998). He states that the game relied upon an "unprecedented level of collaboration and collective intelligence to solve the game". Collective and collaborative team play is essential to ARGs, which are thus anything but a solitary activity.
Hans Geser further rejects the mainstream media view that video gaming is an anti-social activity, asserting that "there is considerable empirical evidence that Second Life serves mainly to widen the life experience of individuals who have already a rich 'First Life', not as a compensating device for marginal loners." He thus highlights the "fantastic social possibilities of Second Life", as the intangible reward of belonging socially is of paramount importance.
Bray and Konsynski also point to the ability of the technology "to enrich their lives", as most Millennials report "no difference between friendships developed in the real world vs. friendships developed online, and most use the internet to maintain their social networks and plan their social activities".
Social implications of video games:
See also: Video game controversy
The advent of video games introduced an innovative media technology that allowed consumers to archive, annotate, appropriate and recirculate media content. Consumers can use this media source as an alternative tool to access information of interest to them. The community aspect of video gaming has also had implications for the social interactions and collective behaviors of the consumers involved.
Rise of subcultures:
Contemporary investigations have found that there is a prevailing social framework in place during gatherings of gaming enthusiasts or 'gamers'. Mäyrä (2008, p. 25) suggests that gamers who gather together to play possess a shared language, engage in collective rituals and are often interested in cultural artifacts such as gaming paraphernalia.
Cronin and McCarthy (2011) have also found a liminal, hedonic food culture to be present among these socially connected actors. The commensal consumption of energy-dense, low-nutrient foods during long stretches of gameplay is seen as contributing to the community and hedonistic aspects of social gaming.
In response to the central importance that food plays in the collective enjoyment of social gaming, various websites have been created which allow gamers to rate their favorite foods to accompany play.
The presence of rituals, shared discourse, collective action and even a liminal food culture among gaming communities gives credence to the concept of these cohorts existing as self-defining sub-units within mainstream culture. However, due to the ephemeral and transient nature of their rituals, and the possibility of virtual interaction through online participation, these cohorts are better considered 'postmodern subcultures'.
Gaming communities have social elements beyond physical interaction and have come to a stage where online and offline spaces can be seen as 'merged' rather than separate.
MMORPG and identity tourism:
Terry Flew (2005, p. 264) suggests that the appeal of the massively multiplayer online role-playing game lies in the idea of escapism and the ability to assume the role of someone or something that is a fantasy in real life. He notes that '...for some women, [they] enjoy adopting what they feel to be an image of femininity more acceptable or desirable than their real world body...'
This is what he calls "identity tourism": a form of hopping from one persona to another, usually accompanied by a stereotypical discourse around the adopted character. This is seen in the case of males who assume female personas, with the character's representation of her gender being overly sexualized and/or passive; '...this tends to perpetuate and accentuate existing stereotypes of... women...' (Nakamura).
Ownership:
Ownership of video game entities is a major issue in video game culture. On one side, players, especially those who have played with their avatars for several years, treat the avatars as their own property. On the other, publishers claim ownership of all in-game items and characters through EULAs (End User License Agreements).
Terry Flew recognized this problem: "Intellectual property is much better suited to conventional 'texts' that are fixed or finished, rather than ongoing collaborative creations like games". He also highlights that these issues will only worsen: as more interactive games emerge, issues of regulation, ownership, and service will only become more problematic.
Violence narrative:
Violent content in video games is often a source of criticism, which according to Terry Flew is related to the subject of 'moral panic'. Flew writes that the 'effects-based' research which gives rise to the 'computer games cause violence' discourse is mostly psychology-based research, influenced particularly by horrific events such as the shooting of schoolchildren at Columbine High School in Littleton, Colorado in 1999.
Flew says that the assumption behind such research, a cause-and-effect behaviorist model of communication, is a flawed one. Several studies show a correlation between violent content conveyed through media (including video games) and violent or aggressive behavior, while others (Vastag 2004) consider the evidence for such conclusions thin and highly contestable.
Fox News reported that a shooting in Montreal, Canada was carried out by Kimveer Gill, a player of Super Columbine Massacre, a game with a strongly violent narrative.
On the other hand, some adherents of social determinism assert that technology itself is neutral, and that it is the way humans use technology that brings about its social impact.
Issues of Gender and Sexuality:
Main article: Women and video games
In conjunction with the changing demographics of video game creators and players, issues related to women and video games, including sexism in video gaming and gender representation in video games, have received increased attention by academia, the media, the games industry and by gamers themselves.
The Gamergate controversy of 2014, which involves issues of sexism, misogyny, and journalistic ethics, is an example of this.
Benjamin Paaßen has argued that because video game culture has long been a space dominated by heterosexual men, the video game industry tends to cater to this particularly lucrative audience, producing video games that reflect the desires of the heterosexual male gaze. He further argues that this lack of representation of alternate identities in video games has meant that gamers who diverge from the dominant demographic are often relegated to the margins of the culture.
This process is thus seen to perpetuate the stereotypical image of the geeky, heterosexual male gamer as the ruler of the gaming world. Contrary to popular belief, there are a multitude of communities within video game culture that do not fit the typical gamer stereotype; the problem is that they lack visibility.
One reason for this is that many people do not want to reveal their association with gaming culture for fear of stigmatization. Past research has shown this to be the case for female gamers. Because women in gaming are often ostracized by their male counterparts, female gamers are frequently forced to conceal their gender, participating only when they can remain anonymous. To conceal their identities, female gamers may alter their voice when talking online and play as a male character under a masculine name.
Doing this, however, can make video games less fun and exciting and may cause the player to simply quit. The situation is different for male gamers: just as female gamers may choose a male character, male gamers sometimes choose a female character, and for a man to pick a female character is very common in the culture.
According to Bosson, Prewitt-Freilino, and Taylor, male gamers who play as female characters are not harassed as much as female gamers, since male gamers can simply undo the change or reveal their true identity as male, which reduces the harassment.
Women also remain a small minority within the gaming industry itself, making up only 3% of programmers, 11% of game designers, 13% of artists and animators, 13% of QA testers, and 16% of producers. One reason cited for these low numbers is a lack of encouragement, stemming from the negativity and harassment directed at women in gaming culture.
As a minority, women in the gaming industry face stereotyping and threats for working in a male-dominated career. The Twitter hashtag #1ReasonWhy collected reasons for the lack of women in the video game industry; video game designer Kim Swift contributed, "Because I get mistaken for the receptionist or day-hire marketing at trade shows."
Additionally, dominant perceptions of gamers as asocial, straight, white men are also challenged by the presence of gamers who do not identify as heterosexual. For instance, it has been shown by past research that the LGBTQ+ community maintains a notable presence within video game culture.
For LGBTQ+ gamers, video games provide an alternate reality in which there is the opportunity for sexual expression, identity formation, and community building. Such communities indicate the development of diverse subcultures within the culture of gaming as a whole.
Gaming and popular culture:
See also: Advertising in video games
Games are also advertised on different TV channels, depending on the age demographic they are targeting. Games targeted toward kids and young teenagers are advertised on Cartoon Network and Nickelodeon, while games targeted toward older teenagers and adults are advertised on MTV, G4, Comedy Central and NFL Network.
Gaming as portrayed by the media:
Main article: Video game journalism
See also: Video game controversy
From the 1970s through the 1990s, video game playing was mostly seen as a subcultural hobby and a substitute for physical sports. However, even in its early history, video gaming occasionally caught the attention of mainstream news outlets.
In 1972, Pong became the first video game pop-culture phenomenon. This was followed by Pac-Man in 1980.
Other video games labeled as pop-culture phenomena include the following:
- Final Fantasy,
- Halo,
- Metal Gear,
- The Legend of Zelda,
- Tomb Raider,
- Grand Theft Auto,
- Call of Duty,
- Street Fighter,
- Mortal Kombat,
- Pokémon,
- Guitar Hero,
- Sonic the Hedgehog,
- and the Mario games.
As games became more realistic, issues of questionable content arose. The most notable early example is NARC, which through its use of digitized graphics and sound and its adult-oriented theme quickly became a target of the press.
These same issues arose again when Mortal Kombat debuted, particularly with its home video game console release on the Genesis and Super NES platforms; due to Nintendo's strict content-control guidelines, that system's version of Mortal Kombat was substantially re-worked to remove any 'extreme' violence.
In response to these issues (and in parallel to similar demands made upon the music and movie industries), the ESRB was established to help guide parents in their purchasing decisions. 1993's Doom caused quite a stir, with its detailed 3D graphics and copious amounts of blood and gore.
The 1996 game, Duke Nukem 3D, was accused of promoting pornography and violence; as a result of the criticism, censored versions of the game were released in certain countries. In the 1999 Columbine shootings, violent video games were for a time directly blamed by some for the incident, and labeled as "murder simulators".
In 2001, Grand Theft Auto III was released, which started the controversy over again.
Television channels:
The first video game TV show was GamePro TV.
The first television channel dedicated to video gaming and culture, G4, was launched in 2002. Over the years, however, the channel moved away from video game shows and toward male-oriented programs. X-Play, one of the channel's most popular shows and the highest-rated video game review show, was produced at G4 until the channel was acquired and rebranded under Esquire Magazine, which cancelled X-Play and shifted focus away from G4's video game-oriented audience toward the magazine's traditional, more general male audience.
Ginx TV is an international multi-language gaming television channel, managed by the former MTV Networks Europe Managing Director Michiel Bakker.
There are also video game shows that appear on other channels, such as Spike TV, Fuel TV and MTV.
In Korea, there are two cable TV channels fully dedicated to video games, Ongamenet and MBCGame, broadcasting professional game leagues that are held in Korea.
In Germany most of the shows and channels dedicated to gaming were canceled, although the content was highly appreciated by the gaming audience. There was one digital cable and satellite channel with focus on video games, which was closed in 2009: GIGA Television.
Some of the hosts also produced their own show, Game One, dedicated to games on the German MTV channel until it was cancelled in 2014. The show was well known for its sketches on games and gaming culture in Germany.
The unofficial successor is the YouTube show Game Two, financed by the public-service broadcasting program funk and produced by the 24/7 online channel Rocket Beans TV, which is dedicated to gaming, nerd and pop culture. A similar show was Reload, produced for the public-service channel EinsPlus until the channel's closure was announced in 2014.
The Franco-German TV network arte has a show dedicated to gaming culture: Art of Gaming.
In Australia, there is one TV show based on gaming and games: Good Game, produced by the ABC (Australian Broadcasting Corporation) and broadcast on channel ABC2. The show is also available as a podcast on iTunes.
In Russia, there are two gaming TV channels: one satellite, Perviy Igrovoy ("Gaming First"), and one cable, Gameplay TV. Both channels also have internet streams.
Internet shows:
The Angry Video Game Nerd (AVGN) is a show about a fictional character created by James Rolfe. The character is portrayed as a foul-mouthed, short-tempered retro gamer who reviews old video games, usually in a sarcastic and negative manner with frequent use of profanity for comedic effect.
Pure Pwnage was a fictional series chronicling the life and adventures of Jeremy, a self-proclaimed "pro gamer".
Red vs. Blue (made by Rooster Teeth) is a machinima (machine cinema) series filmed with many different video games. The series consists of hundreds of short episodes in which characters act out comedic sections of their lives in their own video game universes.
Consolevania, a game review/sketch show produced in Glasgow, Scotland, was developed into a broadcast series, videoGaiden on BBC Scotland.
Button Mashers, an original gaming news show for website Gamezombie.tv, was shot at a $1 million HDTV studio and has millions of viewers all across the internet.
The Guild is a web series, created by Felicia Day, in which the cast are members of a guild that plays an MMORPG similar to World of Warcraft.
Game Grumps, a show on YouTube in which the cast plays games sent in by viewers. It has a related show called Steam Train where the cast plays games either on Steam or sent in by independent developers.
Influences on music:
Main article: Video game music
Video game music has been utilized by popular musicians in many ways. The earliest example was the electronic music band Yellow Magic Orchestra's self-titled album, released in 1978, which utilized Space Invaders samples as instrumentation. In turn, the band would have a major influence on much of the video game music produced during the 8-bit and 16-bit eras.
During the golden age of arcade video games in the early 1980s, it became common for arcade game sounds and bleeps to be utilized, particularly in early hip hop music, synthpop, and electro music. Buckner & Garcia's Pac-Man Fever, released in 1982, featured songs that were both about famous arcade games like Pac-Man, Donkey Kong and Berzerk, and also used the sound samples from the games themselves as instrumentation.
In 1984, former Yellow Magic Orchestra member Haruomi Hosono produced an album entirely from Namco arcade game samples, entitled Video Game Music.
Aphex Twin, an experimental electronic artist, under the name "PowerPill" released the Pacman EP in 1992 that featured a heavy use of Pac-Man sound effects.
An entire music genre called chiptunes, or sometimes gamewave, has artists dedicated to using the synthesizer sets that came with past video game consoles and computers, particularly the Commodore 64 and the Nintendo Entertainment System. These bands include Mr. Pacman, 8 Bit Weapon, Goto 80, 50 Hertz and Puss.
The influence of retro video games on contemporary music can also be seen in the work of less purist "Bitpop" artists, such as Solemn Camel Crew and Anamanaguchi. Moreover, many gamers collect and listen to video game music, ripped from the games themselves.
This music is known by its file extension and includes such formats as: SID (Commodore 64), NSF (NES) and SPC (SNES).
Cover bands like Minibosses perform their own instrumentals, and groups like The Protomen have written rock operas inspired by the Mega Man video games, while communities like OverClocked ReMix have released thousands of game music arrangements in a variety of genres and have influenced the careers of several game composers.
A comedy subgenre has also developed, increasing the popularity of several musicians, including Jonathan Coulton, famous for the song Still Alive featured in the credits of Valve Software's Portal, and Jonathan Lewis, songwriter and composer credited with the Half-Life-themed parody album Combine Road.
Full orchestras, such as the Symphonic Game Music Concert, tour North America and Asia performing symphonic versions of video game songs, particularly from the Final Fantasy and Metal Gear series, along with Nintendo-themed music such as the Mario & Zelda Big Band Live Concert. In Japan, Dragon Quest symphonic concerts have been performed yearly since their debut in 1987.
Video game and film crossovers:
Films based on video games:
Examples of films based on video games include:
- Street Fighter,
- Mortal Kombat,
- BloodRayne,
- Doom,
- House of the Dead,
- Alone in the Dark,
- Resident Evil,
- Silent Hill,
- Tomb Raider,
- and Assassin's Creed (film).
Movies about video games:
Hollywood has also created films that are about video games themselves. In the 1982 film WarGames, a computer mistakes a fictional computer game called Global ThermoNuclear War for reality.
Also in 1982, Tron featured a programmer who was transported into a computer and had to directly take part in the games he had created.
In the 1984 film, The Last Starfighter, a stand-up arcade video game is used as a test to find those "with the gift", who are recruited to pilot actual Starfighter spacecraft in the conflict between the Rylan Star League and the Ko-Dan Empire.
1989's The Wizard, starring Fred Savage, was the first film about a real video game. The plot revolves around adolescents who compete at games for the Nintendo Entertainment System. The film also offered a first look at the mega-hit Super Mario Bros. 3.
The main character from 2006's Grandma's Boy was a game tester who developed his own game on the side. The film made multiple references to video game culture and featured the game Demonik, which was cancelled by its publisher shortly after the film's release.
A more recent example of a film of this type is 2006's Stay Alive, a horror film about a next-generation video game that is so realistic that it kills its players in the same way their avatars were killed. Released in 2012, Wreck-It Ralph is also about the gaming world inside an arcade.
Interactive movies:
Main article: Interactive movie
Interactive movies as a computer and video game genre were the result of the multimedia expansion of computers and video game consoles in the mid-1990s, primarily because of the increased storage capacity offered by optical disc formats.
Interactive movies started out on arcade machines in 1983 but quickly expanded to computers and video game consoles such as the Sega CD, the Philips CD-i and the 3DO Interactive Multiplayer.
The games are characterized by more emphasis on cinematic sequences, using full-motion video and voice acting. Interactive movie games have been made in a number of genres, including adventure games, rail shooters, and role-playing games.
The first interactive movie game was Dragon's Lair, originally released in arcades in 1983 and the first game to use a laserdisc. It featured animation by Don Bluth, who had worked for Disney on features like Robin Hood, The Rescuers, and Pete's Dragon, and later for other film companies such as United Artists (All Dogs Go to Heaven) and Universal Studios (The Land Before Time).
In Dragon's Lair, the player controls the actions of a daring knight named Dirk to save a princess from an evil dragon, hence the name of the game. Following this game's success, more and more companies were influenced by the technology and made their own interactive movie games for arcades and consoles.
The birth of the 'interactive movie' genre was studded with unimpressive flops, though the genre later came into its own; at the time, video-capture technology was still in its infancy, with short (and often grainy and low-quality) video segments being the norm for games of any length.
Video game and traditional media forms:
With the rapid convergence of all media types into a digital form, video games are also beginning to affect, and be affected by traditional media forms.
Historically, the television engineer Ralph Baer, who conceived the idea of interactive television while building a television set from scratch, created the first video game. Video games are now also being exploited by pay-TV companies, which allow users to attach a computer or console to the television cable system and download the latest games.
Games act within television, with the player choosing to enter the artificial world. The meanings constructed in video games are more influential than those of traditional media forms because "games interact with the audience in a dialogue of emotion, action, and reaction". This interactivity occurs to a depth that is not possible in traditional media forms.
Computer games have developed in parallel with both the video game and the arcade video game. The personal computer and console machines such as the Dreamcast, Nintendo GameCube, PlayStation 2 and Xbox offered a new dimension to game playing. Those consoles have since largely been replaced by the Xbox 360, Wii and PlayStation 4, while the personal computer remains a leading gaming machine.
Games are the first new computer-based media form to socialize a generation of youth in a way that traditional media forms have in the past. Therefore, the 'MTV generation' has been overtaken by the 'Nintendo generation'; however, some refer to the current generation as the 'iPod Generation'.
Because they straddle the technologies of television and computers, electronic games are a channel through which we can investigate the various impacts of new media and the technologies of convergence.
Interactive engagement between players and digital games:
Digital games are a new form of media with which users interact and engage deeply. Terry Flew notes that unlike "lean-back" types of media such as television, film and books, digital games place users in a productive relationship: a player effectively creates their own text every time they engage.
Digital games often lack a strong narrative. Rather than character development or plot, the setting is usually the most important aspect of the narrative.
Despite the fact that games such as Tetris and Pong lack a plot or characters, they manage to keep players engaged for hours. Furthermore, digital games place players into a position where they have power to control.
Players have power because they are the actor in the game. Again, Flew wrote that "the engagement comes because the player is the performer, and the game evaluates the performance and adapts to it."
Emergent games are becoming a popular type of video game, as they offer environments and sets of rules in which a player's story can branch in various unexpected directions depending on the player's own decisions. Players, as users of these new forms of media, not only ingest narrative but can also interact and engage more freely in ways where they actually create their own text (Flew 2005).
The time spent playing games varies by age group and by country, with estimates putting the average between 6 and 22 hours per week.
See also:
- Gamer
- Gamers Outreach Foundation
- Geek culture
- List of books about video games
- List of novels based on video games
- PC Master Race
- Sexism in video gaming
- Social interaction and first-person shooters
Motorcycle Club and its Outlaw Motorcycle Club Culture, including a List of Motorcycle Sub-cultures
TOP: Easy Rider (1969): from left to right: Dennis Hopper (back), Peter Fonda and Jack Nicholson (behind Fonda)
BOTTOM: The Wild One (1953): Marlon Brando (on front motorcycle)
- YouTube Video: Riding with an outlaw motorcycle club (CNN)
- YouTube Video: Outlaws leader tells his side of the story
- YouTube Video: Christian Bikers
A motorcycle club is a group of individuals whose primary interest and activities involve motorcycles. Motorcycle clubs range from groups of riders on different bikes to clubs whose members own the same model, such as the Harley Owners Group.
Types of clubs, groups, and organizations:
Most clubs are either organized around a brand or make, or around a type of riding (e.g. touring).
Motorcycle clubs vary a great deal in their objectives and organizations.
Mainstream motorcycle clubs or associations typically have elected officers and directors, annual dues, and a regular publication. They may also sponsor sports events and annual or more frequent motorcycle rallies where members can socialize.
There are a great many brand clubs, i.e. clubs dedicated to a particular marque, including those sponsored by various manufacturers, modeled on the original brand club, the Harley Owners Group.
There are also large national independent motorcycle clubs, for example the BMW Motorcycle Owners of America, and the Dominar Owners Club (DOC), an exclusive motorcycle group for Bajaj Dominar bikes only.
There are also specific clubs for women, such as Women's International Motorcycle Association, and clubs for lesbians and gays, such as Dykes on Bikes.
Clubs catering for those interested in vintage machines such as the Vintage Motor Cycle Club are also popular as well as those centered on particular venues. Clubs catering for riders' rights such as the Motorcycle Action Group, and charities like The Royal British Legion Riders Branch are also popular.
Many affiliate with an umbrella organization, such as the British Motorcyclists Federation in the UK, or FEMA in Europe. Producing national and local branch club magazines and events are typical activities of such clubs.
Other organizations whose activities primarily involve motorcycles exist for specific purposes or social causes such as the Patriot Guard Riders, who provide funeral escorts for military veterans, and Rolling Thunder, which advocates for troops missing in action and prisoners of war. While neither of the latter two groups require a motorcycle for membership, they are motorcycling-oriented and much of their activity involves rides.
There are numerous religiously oriented clubs such as the Christian Motorcyclists Association, a biker ministry, charities such as Freewheelers EVS, which use motorcycles to provide an out-of-hours emergency medical courier service, and clubs which attract membership from specific groups, such as the Blue Knights Law Enforcement Motorcycle Club, for law enforcement personnel.
History:
One of the first motorcycle clubs was the New York Motorcycle Club, which in 1903 merged with the Alpha Motorcycle Club of Brooklyn to become the Federation of American Motorcyclists. Later, the Motorcycle and Allied Trades Association (M&ATA) formed a Rider Division which spun off into the American Motorcyclist Association.
AMA:
The American Motorcyclist Association (AMA) is the largest American motorcyclist organization. It serves as an umbrella organization for local clubs and sporting events. As of 2015, the AMA counts over 200,000 active members and over 1,200 chartered clubs.
MCs and MCCs:
Main article: Outlaw motorcycle club
The abbreviations MC and MCC are both used to mean "motorcycle club" but have a special social meaning from the point of view of the outlaw or one percenter motorcycling subculture. MC is generally reserved for those clubs that are mutually recognized by other MC or outlaw motorcycle clubs. This is indicated by a motorcyclist wearing an MC patch, or a three-piece patch called colors, on the back of their jacket or riding vest.
Outlaw or one percenter can mean merely that the club is not chartered under the auspices of the American Motorcyclist Association, implying a radical rejection of authority and embracing of the "biker" lifestyle as defined and popularized since the 1950s and represented by such media as Easyriders magazine, the work of painter David Mann and others.
In many contexts the terms overlap with the usual meaning of "outlaw" because some of these clubs, or some of their members, are recognized by law enforcement agencies as taking part in organized crime.
Outside of the outlaw motorcyclist subculture, the words "motorcycle club" carry no pejorative meaning beyond the everyday English definition of the words – a club involving motorcycles, whose members come from every walk of life.
Thus, there are clubs that are culturally and stylistically nothing like outlaw or one percenter clubs, and whose activities and goals are not similar to theirs at all, but that still use three-part patches or the initials MC in their name or insignia.
Outlaw Motorcycle Club:
An outlaw motorcycle club is part of a motorcycle subculture that has its roots in the immediate post-World War II era of North American society. It is generally centered on the use of cruiser motorcycles, particularly Harley-Davidsons and choppers, and on a set of ideals that celebrate freedom, nonconformity to mainstream culture, and loyalty to the biker group.
In the United States, such motorcycle clubs (MCs) are considered "outlaw" not necessarily because they engage in criminal activity, but because they are not sanctioned by the American Motorcyclist Association (AMA) and do not adhere to the AMA's rules. Instead the clubs have their own set of bylaws reflecting the outlaw biker culture.
The U.S. Department of Justice defines "outlaw motorcycle gangs" (OMG) as "organizations whose members use their motorcycle clubs as conduits for criminal enterprises".
Organization and leadership:
While organizations may vary, the typical internal organization of a motorcycle club consists of a president, vice president, treasurer, secretary, road captain, and sergeant-at-arms (sometimes known as enforcer).
Localized groups of a single, large MC are called charters and the first charter established for an MC is referred to as the mother charter. The president of the mother charter serves as the president of the entire MC, and sets club policy on a variety of issues.
Larger motorcycle clubs often acquire real estate for use as a clubhouse or private compound.
Membership:
Some "biker" clubs employ a process whereby members must pass several stages such as "friend of the club", "hang-around", and "prospect", on their way to becoming full-patch (see explanation of 'patching' below) members. The actual stages and membership process can and often does vary widely from club to club.
Often, an individual must pass a vote of the membership and swear some level of allegiance to the club. Some clubs have a unique club patch (cut or top rocker) adorned with the term MC that is worn on the rider's vest, known as a kutte.
In these clubs, some amount of hazing may occur during the early stages (i.e. hang-around, prospecting) ranging from the mandatory performance of menial labor tasks for full patch members to sophomoric pranks, and, in rare cases with some outlaw motorcycle clubs, acts of violence.
During this time, the prospect may wear the club name on the back of their vest, but not the full logo, though this practice may vary from club to club.
To become a full member, the prospect or probate must be voted on by the rest of the full club members. Successful admission usually requires more than a simple majority, and some clubs may reject a prospect or a probate for a single dissenting vote.
A formal induction follows, in which the new member affirms his loyalty to the club and its members. The final logo patch is then awarded. Full members are often referred to as "full patch members" or "patchholders" and the step of attaining full membership can be referred to as "being patched".
Biker Culture:
Members of outlaw motorcycle clubs who identify with this subculture are not necessarily criminals; some express their outlaw status on a social level, equating the word "outlaw" with disregard for the rules of groups like the American Motorcyclist Association rather than for the laws of government.
There are also non-outlaw motorcycle clubs, such as women's motorcycle clubs, who adopt similar insignia, colors, organizational structure and trappings, such as leather outfits typical of outlaw clubs, and, in the case of men, beards, making it difficult for outsiders (especially police) to tell the difference between the two.
Some believe that these other groups are attracted by the mystique of the outlaw image while objecting to the suggestion that they are outlaws.
Charity events:
Outlaw clubs are often prominent at charity events, such as toy runs. Charitable giving is frequently cited as evidence that these clubs do not deserve their negative media image. Outlaw clubs have been accused of using charity rides to mask their criminal nature.
The American Motorcyclist Association has frequently complained of the bad publicity for motorcycling in general caused by outlaw clubs, and they have said that the presence of outlaw clubs at charity events has actually harmed the needy by driving down public participation and reducing donations.
Events such as a 2005 shootout between rival outlaw clubs in the midst of a charity toy drive in California have raised fears about the participation of outlaw biker clubs in charity events.
Authorities have attempted to ban outlaw clubs from charity events, or to restrict the wearing of colors at events in order to avert the sort of inter-club violence that has happened at previous charity runs. In 2002, the Warlocks MC of Pennsylvania sued over their exclusion from a charity event.
Identification:
Main article: Colors (motorcycling)
The primary visual identification of a member of an outlaw motorcycle club is the vest adorned with a large club-specific patch or patches, predominantly located in the middle of the back. The patch(es) will contain a club logo, the name of the club, the letters MC, and possibly a state, province, or other chapter identification.
This garment and the patches themselves are referred to as the colors or cut (a term taken from the early practice of cutting the collars and/or sleeves from a denim or leather jacket).
Many non-outlaw motorcycle riding clubs such as the Harley Owners Group also wear patches on the back of their vests, without including the letters MC.
The club patches always remain property of the club itself, not the member, and only members are allowed to wear the club's patches. Hang-arounds and/or support clubs wear support patches with the club's colors. A member must closely guard their colors, for allowing one's colors to fall into the hands of an outsider is an act of disgrace and may result in loss of membership in a club, or some other punishment.
One-, two-, and three-piece patches:
The colors worn by members of some motorcycle clubs will sometimes follow a convention of using a one-piece patch for nonconformist social clubs, a two-piece patch for clubs paying dues, a three-piece patch for outlaw clubs, or side patches. The three-piece patch consists of the club logo and the top and bottom patches, usually crescent shaped, which are referred to as rockers.
The number and arrangement of patches is somewhat indicative of the nature of the club. Though many motorcycle clubs wear the three-piece patch arrangement, this is not necessarily an indication that a club is an outlaw motorcycle club.
Law enforcement agencies have confiscated colors and other club paraphernalia of these types of clubs when they raid a clubhouse or the home of an MC member, and they often display these items at press conferences. These items are then used at trial to support prosecution assertions that MC members perform criminal acts on behalf of their club.
Courts have found that the probative value of such items is far outweighed by their unfairly prejudicial effects on the defense.
One Percenter:
Some outlaw motorcycle clubs can be distinguished by a "1%" patch worn on the colors. This is said to refer to a comment by the American Motorcyclist Association (AMA) that 99% of motorcyclists were law-abiding citizens, implying the last one percent were outlaws.
The alleged AMA comment, supposedly a response to the Hollister riot in 1947, is denied by the AMA, which claims to have no record of such a statement to the press and says the story is a misquote.
Other patches:
Other patches may be worn by members, including phrases and symbols. The style or meaning of these other patches can vary between clubs. Some, such as a skull and crossbones patch, or the motto "Respect Few, Fear None", are worn in some clubs by members who commit murder or other acts of violence on behalf of the club.
There are also wings or biker's wings, which are earned in a manner similar to jump wings or pilot's wings, but with various color-coded meanings; e.g., in some clubs, it is said that a member who has had sex with a woman with a venereal disease can wear green wings.
It has also been suggested that these definitions are a hoax, intended to make fools of those outside the outlaw biker world, and also to serve the purpose of provoking outrage among conservative public and authorities.
Frequently, additional patches may involve symbols, such as the use of Nazi swastikas or the SS Totenkopf. These may not indicate Nazi sympathies, but serve to express the outlaw biker's total rejection of social constraints, and desire for the shock value among those who fail to understand the biker way.
Gender and Race:
Most outlaw motorcycle clubs do not allow women to become full-patch members. Rather, in some 1%er clubs, women have in the past been portrayed as submissive to or victims of the men, treated as property, forced into prostitution or street-level drug trafficking, and often physically and sexually abused, their roles being those of obedient followers and their status that of objects.
These women are said to hand over any pay they receive to their partners or sometimes to the entire club, making these groups appear extremely gender-segregated. This has not always been the case: during the 1950s and 1960s, some Hells Angels chapters had women members.
Academic research has criticized the methodology of such previous studies as being "vague and hazy", and lacking in participant demographics.
Such reports may have made clear statements and authoritative analyses about the role of women associated with outlaw motorcycle clubs, but few state how they reached such conclusions. One author admitted that "[his] interviews with biker women were limited lest [his] intentions were misinterpreted" by their male companions. Critics argue that such views of women are mythic, amounting to "sexist research" built on deeply flawed methodologies, and that they serve two highly political purposes: maintaining a myth of male dominance over women and amplifying the deviance of the male club members.
These myths about the women are: that they are subservient working class women, used as objects for club sexual rituals; are hard-bitten, unattractive, and politically conservative; and that they are 'money makers' for the biker men and clubs, i.e., prostitutes, topless barmaids or strippers who are forced to hand over their money to the club.
A 1990 paper noted the changing role of women within outlaw motorcycle clubs and a 2000 paper stated that they now have agency and political savvy, and have reframed the narratives of their lives. "We did it. We showed them we are real women dealing with real men. I'd much prefer to be living with an OMC member than some dork who is a pawn in the system", said one woman who felt she and her peers had "set the record straight".
One woman in 2001 dismissed the previous work done by men about women in the outlaw motorcycle club world, saying "the men that wrote that must be meatheads". In this view, such women are part of the scene because they want to be and enjoy it; they have broken from society's stereotypically defined roles and find freedom in the biker world.
Outlaw motorcycle clubs reflect their social roots and the demographics of motorcyclists in general. High-profile outlaw bikers have historically been white, and their clubs are typically racially homogeneous. Other sources state outright that "With few exceptions, blacks are excluded from membership or riding with one-percenter biker clubs."
In one club studied, the average age of members was 34. There are black clubs, white clubs, and Mexican and other Spanish-speaking clubs. Although race does not appear to be important as a creed or philosophical orientation to them, virtually all of the clubs are racially unmixed.
In American prisons, bikers, like prisoners generally, band together along racial lines. It is claimed that racial discrimination within clubs has led to the creation of rival clubs in the past, such as the Mongols Motorcycle Club, founded after its members were rejected by the local Hells Angels chapter. Some clubs or individual chapters are now multi-racial, but, as one source states, "white supremacist biker clubs are growing nationwide".
Outlaw motorcycle clubs and crime:
Some members of outlaw motorcycle clubs engage in criminal activities and organized crime.
Despite their connection with motorcycles and the "one percenter" subculture, law enforcement agencies perceive such individuals and motorcycle clubs as being unique among criminal groups because they maintain websites and businesses, identify themselves through patches and tattoos, write and obey constitutions and bylaws, trademark their club names and logos, and even hold publicity campaigns aimed at improving their public image.
Outlaw motorcycle clubs as criminal enterprises:
The U.S. Federal Bureau of Investigation (FBI) and Criminal Intelligence Service Canada have designated four MCs as "outlaw motorcycle gangs": the Hells Angels, the Pagans, the Outlaws, and the Bandidos, known as the "Big Four".
These four have a large enough national impact to be prosecuted under the U.S. Federal Racketeer Influenced and Corrupt Organizations (RICO) statute. The California Attorney General also lists the Mongols and the Vagos Motorcycle Club as outlaw motorcycle gangs.
The FBI asserts that OMGs support themselves primarily through drug dealing, trafficking in stolen goods, and extortion, and that they fight over territory and the illegal drug trade and collect $1 billion in illegal income annually.
In 1985 a three-year, eleven-state FBI operation named Roughrider culminated in the largest OMG bust in history, with the confiscation of $2 million worth of illegal drugs, as well as an illegal arsenal of weapons, ranging from Uzi submachine guns to antitank weapons.
In October 2008, the FBI announced the end of a six-month undercover operation by agents into narcotics trafficking by the Mongols Motorcycle Club. The operation culminated in the execution of 160 search warrants and 110 arrest warrants.
Canada, especially, has in the late 20th century experienced a significant upsurge in crime involving outlaw motorcycle clubs, most notably in what has been dubbed the Quebec Biker War, which has involved more than 150 murders (plus a young bystander killed by an exploding car bomb), 84 bombings, and 130 cases of arson.
The increased violence in Canada has been attributed to turf wars over the illegal drug trafficking business, specifically relating to access to the Port of Montreal, but also as the Hells Angels have sought to obtain control of the street level trade from other rival and/or independent gangs in various regions of Canada.
The Royal Canadian Mounted Police Gazette, quoting from the Provincial Court of Manitoba, defines these groups as: "Any group of motorcycle enthusiasts who have voluntarily made a commitment to band together and abide by their organizations' rigorous rules enforced by violence, who engage in activities that bring them and their club into serious conflict with society and the law".
Members and supporters of these clubs insist that illegal activities are isolated occurrences and that they, as a whole, are not criminal organizations. They often compare themselves to police departments, arguing that the occasional "bad cop" does not make a police department a criminal organization. The Hells Angels also sponsor charitable events for Toys for Tots in an attempt to legitimize themselves with public opinion.
Unlike other criminal organizations, OMGs operate on an individual basis instead of top-down, which is how supporters can claim that only some members are committing crimes. Membership guarantees each member the option of running criminal activity, using other members as support; the main characteristic of OMGs is "amoral individualism", in contrast to the hierarchical orders and bonds of "amoral familism" of other criminal organizations such as the Mafia.
U.S. Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF) agent William Queen, who infiltrated the Mongols, wrote that what makes a group like them different from the Mafia is that crime and violence are not used as expedients in pursuit of profit, but that the priorities are reversed. Mayhem and lawlessness are inherent in living "The Life" and the money they obtain by illegal means is only wanted as a way to perpetuate that lifestyle.
More recently, authorities have tried tactics aimed at undermining the gang identity and breaking up the membership. In June 2011, however, the High Court of Australia overturned a law that outlawed crime-focused motorcycle clubs and required members to avoid contact with one another.
In the U.S., a Federal judge rejected a prosecutor's request to seize ownership of the Mongols Motorcycle Club logo and name, saying the government had no right to the trademarks. Federal prosecutors had requested, as part of a larger criminal indictment, a court order giving the government ownership of the logo in order to prevent members from wearing the club's colors.
Relationships between outlaw motorcycle clubs:
Certain large one-percent MCs have rivalries with one another and will fight over territory and other issues. Sometimes smaller clubs are forced into, or willingly accept, supportive roles for a larger one-percent club and are sometimes required to wear a "support patch" on their vests that shows their affiliation with the dominant regional club.
Smaller clubs are often allowed to form only with the permission of the dominant regional club. Clubs that resist have been forcibly disbanded, being told to hand over their colors under threat of aggression.
In Australia and the United States, many MCs have established statewide MC coalitions. These coalitions are composed of MCs who have chapters in the state, and the occasional interested third party organization, and hold periodic meetings on neutral ground where representatives from each club meet in closed session to resolve disputes between clubs and discuss issues of common interest.
Local coalitions or confederations of clubs have eliminated some of the inter-club rivalry and together they have acted to hire legal and PR representation.
Cultural Influence:
Outlaw motorcyclists and their clubs have been frequently portrayed and parodied in movies and the media generally, giving rise to an "outlaw biker film" genre.
It generally exists as a negative stereotype in the public's subconscious and yet has inspired fashion trends for both males and, as "biker babes", for females.
The look has even been exploited by the fashion industry, bringing it into legal conflict with some clubs and simultaneously encouraging a culturally specific fetishistic look that conveys sex, danger, rebelliousness, masculinity, and working-class values.
The biker style has influenced the look of other sub-cultures such as punk, heavy metal, leather subculture and cybergoth fashion, and, initially an American subculture, has had an international influence.
Bikers, their clothing, and their motorcycles have become cultural icons of mythic status, their portrayal generally exaggerating a criminal or deviant association exploited by the media for its own, often financial, interests.
In popular culture
Main article: Outlaw biker film
Types of clubs, groups, and organizations:
Most clubs are either organized around a brand or make, or around a type of riding (e.g. touring).
Motorcycle clubs vary a great deal in their objectives and organizations.
Mainstream motorcycle clubs or associations typically have elected officers and directors, annual dues, and a regular publication. They may also sponsor sports events and annual or more frequent motorcycle rallies where members can socialize.
There are a great many brand clubs, i.e. clubs dedicated to a particular marque, including those sponsored by various manufacturers, modeled on the original brand club, the Harley Owners Group.
There are also large national independent motorcycle clubs, for example the BMW Motorcycle Owners of America, and the Dominar Owners Club (DOC), an exclusive motorcycle group for Bajaj Dominar bikes only.
There are also specific clubs for women, such as Women's International Motorcycle Association, and clubs for lesbians and gays, such as Dykes on Bikes.
Clubs catering for those interested in vintage machines such as the Vintage Motor Cycle Club are also popular as well as those centered on particular venues. Clubs catering for riders' rights such as the Motorcycle Action Group, and charities like The Royal British Legion Riders Branch are also popular.
Many affiliate with an umbrella organization, such as the British Motorcyclists Federation in the UK, or FEMA in Europe. Producing national and local branch club magazines and events are typical activities of such clubs.
Other organizations whose activities primarily involve motorcycles exist for specific purposes or social causes such as the Patriot Guard Riders, who provide funeral escorts for military veterans, and Rolling Thunder, which advocates for troops missing in action and prisoners of war. While neither of the latter two groups require a motorcycle for membership, they are motorcycling-oriented and much of their activity involves rides.
There are numerous religiously oriented clubs such as the Christian Motorcyclists Association, a biker ministry, charities such as Freewheelers EVS, which use motorcycles to provide an out-of-hours emergency medical courier service, and clubs which attract membership from specific groups, such as the Blue Knights Law Enforcement Motorcycle Club, for law enforcement personnel.
History:
One of the first motorcycle clubs was the New York Motorcycle Club, which in 1903 merged with the Alpha Motorcycle Club of Brooklyn to become the Federation of American Motorcyclists. Later, the Motorcycle and Allied Trades Association (M&ATA) formed a Rider Division which spun off into the American Motorcyclist Association.
AMA:
The American Motorcyclist Association (AMA) is the largest American motorcyclist organization. It serves as an umbrella organization for local clubs and sporting events. As of 2015, the AMA counts over 200,000 active members and over 1,200 chartered clubs.
MCs and MCCs:
Main article: Outlaw motorcycle club
The abbreviations MC and MCC are both used to mean "motorcycle club" but have a special social meaning from the point of view of the outlaw or one percenter motorcycling subculture. MC is generally reserved for those clubs that are mutually recognized by other MC or outlaw motorcycle clubs. This is indicated by a motorcyclist wearing an MC patch, or a three-piece patch called colors, on the back of their jacket or riding vest.
Outlaw or one percenter can mean merely that the club is not chartered under the auspices of the American Motorcyclist Association, implying a radical rejection of authority and embracing of the "biker" lifestyle as defined and popularized since the 1950s and represented by such media as Easyriders magazine, the work of painter David Mann and others.
In many contexts the terms overlap with the usual meaning of "outlaw" because some of these clubs, or some of their members, are recognized by law enforcement agencies as taking part in organized crime.
Outside of the outlaw motorcyclist subculture, the words "motorcycle club" carry no pejorative meaning beyond the everyday English definition of the words – a club involving motorcycles, whose members come from every walk of life.
Thus, there are clubs that are culturally and stylistically nothing like outlaw or one percenter clubs, and whose activities and goals not similar to them at all, but still use three-part patches or the initials MC in their name or insignia.
See also: ___________________________________________________________________________
An outlaw motorcycle club, is a motorcycle subculture that has its roots in the immediate post-World War II era of North American society. It is generally centered on the use of cruiser motorcycles, particularly Harley-Davidsons and choppers, and a set of ideals that celebrate freedom, nonconformity to mainstream culture, and loyalty to the biker group.
In the United States, such motorcycle clubs (MCs) are considered "outlaw" not necessarily because they engage in criminal activity, but because they are not sanctioned by the American Motorcyclist Association (AMA) and do not adhere to the AMA's rules. Instead the clubs have their own set of bylaws reflecting the outlaw biker culture.
The U.S. Department of Justice defines "outlaw motorcycle gangs" (OMG) as "organizations whose members use their motorcycle clubs as conduits for criminal enterprises".
Organization and leadership:
While organizations may vary, the typical internal organization of a motorcycle club consists of a president, vice president, treasurer, secretary, road captain, and sergeant-at-arms (sometimes known as enforcer).
Localized groups of a single, large MC are called charters and the first charter established for an MC is referred to as the mother charter. The president of the mother charter serves as the president of the entire MC, and sets club policy on a variety of issues.
Larger motorcycle clubs often acquire real estate for use as a clubhouse or private compound.
Membership:
Some "biker" clubs employ a process whereby members must pass several stages such as "friend of the club", "hang-around", and "prospect", on their way to becoming full-patch (see explanation of 'patching' below) members. The actual stages and membership process can and often does vary widely from club to club.
Often, an individual must pass a vote of the membership and swear some level of allegiance to the club. Some clubs have a unique club patch (cut or top rocker) adorned with the term MC that are worn on the rider's vest, known as a kutte.
In these clubs, some amount of hazing may occur during the early stages (i.e. hang-around, prospecting) ranging from the mandatory performance of menial labor tasks for full patch members to sophomoric pranks, and, in rare cases with some outlaw motorcycle clubs, acts of violence.
During this time, the prospect may wear the club name on the back of their vest, but not the full logo, though this practice may vary from club to club.
To become a full member, the prospect or probate must be voted on by the rest of the full club members. Successful admission usually requires more than a simple majority, and some clubs may reject a prospect or a probate for a single dissenting vote.
A formal induction follows, in which the new member affirms his loyalty to the club and its members. The final logo patch is then awarded. Full members are often referred to as "full patch members" or "patchholders" and the step of attaining full membership can be referred to as "being patched".
Biker Culture:
Outlaw motorcycle clubs who identify with this subculture may not necessarily be criminals, with some members expressing their outlaw status on a social level, and equating the word "outlaw" with disregard for the law of groups like the American Motorcyclist Association, not the laws of government.
There are also non-outlaw motorcycle clubs, such as women's motorcycle clubs, who adopt similar insignia, colors, organizational structure and trappings, such as leather outfits typical of outlaw clubs, and, in the case of men, beards, making it difficult for outsiders (especially police) to tell the difference between the two.
Some believe that these other groups are attracted by the mystique of the outlaw image while objecting to the suggestion that they are outlaws.
Charity events:
Outlaw clubs are often prominent at charity events, such as toy runs. Charitable giving is frequently cited as evidence that these clubs do not deserve their negative media image. Outlaw clubs have been accused of using charity rides to mask their criminal nature.
The American Motorcyclist Association has frequently complained of the bad publicity for motorcycling in general caused by outlaw clubs, and they have said that the presence of outlaw clubs at charity events has actually harmed the needy by driving down public participation and reducing donations.
Events such as a 2005 shootout between rival outlaw clubs in the midst of a charity toy drive in California have raised fears about the participation of outlaw biker clubs in charity events.
Authorities have attempted to ban outlaw clubs from charity events, or to restrict the wearing of colors at events in order to avert the sort of inter-club violence that has happened at previous charity runs. In 2002, the Warlocks MC of Pennsylvania sued over their exclusion from a charity event.
Identification:
Main article: Colors (motorcycling)
The primary visual identification of a member of an outlaw motorcycle club is the vest adorned with a large club-specific patch or patches, predominantly located in the middle of the back. The patch(es) will contain a club logo, the name of the club, and the letters MC, and a possible state, province, or other chapter identification.
This garment and the patches themselves are referred to as the colors or cut (a term taken from the early practice of cutting the collars and/or sleeves from a denim or leather jacket).
Many non-outlaw motorcycle riding clubs such as the Harley Owners Group also wear patches on the back of their vests, without including the letters MC.
The club patches always remain property of the club itself, not the member, and only members are allowed to wear the club's patches. Hang-arounds and/or support clubs wear support patches with the club's colors. A member must closely guard their colors, for allowing one's colors to fall into the hands of an outsider is an act of disgrace and may result in loss of membership in a club, or some other punishment.
One-, two-, and three-piece patches:
The colors worn by members of some motorcycle clubs sometimes follow a convention: a one-piece patch for nonconformist social clubs, a two-piece patch for clubs whose members pay dues, and a three-piece patch or side patches for outlaw clubs. The three-piece patch consists of the club logo and the top and bottom patches, usually crescent shaped, which are referred to as rockers.
The number and arrangement of patches is somewhat indicative of the nature of the club. Though many motorcycle clubs wear the three-piece patch arrangement, this is not necessarily an indication that a club is an outlaw motorcycle club.
Law enforcement agencies have confiscated colors and other club paraphernalia of these types of clubs when they raid a clubhouse or the home of an MC member, and they often display these items at press conferences. These items are then used at trial to support prosecution assertions that MC members perform criminal acts on behalf of their club.
Courts have found that the probative value of such items is far outweighed by their unfairly prejudicial effects on the defense.
One Percenter:
Some outlaw motorcycle clubs can be distinguished by a "1%" patch worn on the colors. This is said to refer to a comment by the American Motorcyclist Association (AMA) that 99% of motorcyclists were law-abiding citizens, implying the last one percent were outlaws.
The alleged AMA comment, supposedly a response to the Hollister riot in 1947, is denied by the AMA, which claims to have no record of such a statement to the press and maintains that the story is a misquote.
Other patches:
Other patches may be worn by members, including phrases and symbols. The style or meaning of these other patches can vary between clubs. Some, such as a skull and crossbones patch, or the motto "Respect Few, Fear None", are worn in some clubs by members who commit murder or other acts of violence on behalf of the club.
There are also wings or biker's wings, which are earned in a manner similar to jump wings or pilot's wings, but with various color-coded meanings; in some clubs, for example, it is said that a member who has had sex with a woman with a venereal disease can wear green wings.
It has also been suggested that these definitions are a hoax, intended to make fools of those outside the outlaw biker world, and also to serve the purpose of provoking outrage among conservative public and authorities.
Additional patches frequently involve symbols, such as the Nazi swastika or the SS Totenkopf. These may not indicate Nazi sympathies; rather, they serve to express the outlaw biker's total rejection of social constraints and are worn for their shock value among those who fail to understand the biker way.
Gender and Race:
Most outlaw motorcycle clubs do not allow women to become full-patch members. In some 1%er clubs, women have in the past been portrayed as submissive to, or victims of, the men: treated as property, forced into prostitution or street-level drug trafficking, and often physically and sexually abused, their roles being those of obedient followers and their status that of objects.
These women are claimed to pass any pay they receive over to their partners, or sometimes to the entire club, making these groups appear extremely gender-segregated. This has not always been the case: during the 1950s and 1960s, some Hells Angels chapters had women members.
Academic research has criticized the methodology of such previous studies as being "vague and hazy", and lacking in participant demographics.
Such reports may have made clear statements and authoritative analyses about the role of women associated with outlaw motorcycle clubs, but few state how they came to their conclusions. One author admitted that "[his] interviews with biker women were limited lest [his] intentions were misinterpreted" by their male companions. Critics contend that such views of women are mythic and amount to "sexist research" in themselves, built on deeply flawed methodologies and serving two highly political purposes: maintaining a myth of male dominance over women and amplifying the deviance of the male club members.
These myths about the women are: that they are subservient working class women, used as objects for club sexual rituals; are hard-bitten, unattractive, and politically conservative; and that they are 'money makers' for the biker men and clubs, i.e., prostitutes, topless barmaids or strippers who are forced to hand over their money to the club.
A 1990 paper noted the changing role of women within outlaw motorcycle clubs and a 2000 paper stated that they now have agency and political savvy, and have reframed the narratives of their lives. "We did it. We showed them we are real women dealing with real men. I'd much prefer to be living with an OMC member than some dork who is a pawn in the system", said one woman who felt she and her peers had "set the record straight".
One woman in 2001 described the previous work done by men about women in the outlaw motorcycle club world by saying "the men that wrote that must be meatheads". In her account, such women are part of the scene because they want to be and enjoy it; they have broken from society's stereotypically defined roles and find freedom in the biker world.
Outlaw motorcycle clubs reflect their social roots and the demographics of motorcyclists in general. High-profile outlaw bikers have historically been white and their clubs are typically exclusively racially homogeneous. Other sources state outright, that "With few exceptions, blacks are excluded from membership or riding with one-percenter biker clubs."
The average age of members in one club studied was 34. There are black clubs, white clubs, and Mexican and other Spanish-speaking clubs. Although race does not appear to be important as a creed or philosophical orientation to them, virtually all of the clubs are racially unmixed.
Bikers in American prisons, as prisoners generally do, band together along racial lines. Racial discrimination within clubs is claimed to have led to the creation of rival clubs in the past, such as the Mongols Motorcycle Club, formed after its founders were rejected by the local Hells Angels chapter. Some clubs or individual chapters are now multi-racial, but the number of white supremacist biker clubs is reportedly "growing nationwide".
Outlaw motorcycle clubs and crime:
Some members of outlaw motorcycle clubs engage in criminal activities and organized crime.
Despite their connection with motorcycles and the "one percenter" subculture, law enforcement agencies perceive such individuals and motorcycle clubs as being unique among criminal groups because they maintain websites and businesses, identify themselves through patches and tattoos, write and obey constitutions and bylaws, trademark their club names and logos, and even hold publicity campaigns aimed at improving their public image.
Outlaw motorcycle clubs as criminal enterprises:
The U.S. Federal Bureau of Investigation (FBI) and Criminal Intelligence Service Canada have designated four MCs as "outlaw motorcycle gangs": the Hells Angels, the Pagans, the Outlaws, and the Bandidos, known as the "Big Four".
These four have a large enough national impact to be prosecuted under the U.S. Federal Racketeer Influenced and Corrupt Organizations (RICO) statute. The California Attorney General also lists the Mongols and the Vagos Motorcycle Club as outlaw motorcycle gangs.
The FBI asserts that OMGs support themselves primarily through drug dealing, trafficking in stolen goods, and extortion, and that they fight over territory and the illegal drug trade and collect $1 billion in illegal income annually.
In 1985 a three-year, eleven-state FBI operation named Roughrider culminated in the largest OMG bust in history, with the confiscation of $2 million worth of illegal drugs, as well as an illegal arsenal of weapons, ranging from Uzi submachine guns to antitank weapons.
In October 2008, the FBI announced the end of a six-month undercover operation by agents into narcotics trafficking by the Mongols Motorcycle Club. The operation culminated in the execution of 160 search warrants and 110 arrest warrants.
Canada especially experienced a significant upsurge in crime involving outlaw motorcycle clubs in the late 20th century, most notably in what has been dubbed the Quebec Biker War, which involved more than 150 murders (plus a young bystander killed by an exploding car bomb), 84 bombings, and 130 cases of arson.
The increased violence in Canada has been attributed to turf wars over the illegal drug trafficking business, specifically relating to access to the Port of Montreal, but also as the Hells Angels have sought to obtain control of the street level trade from other rival and/or independent gangs in various regions of Canada.
The Royal Canadian Mounted Police Gazette, quoting from the Provincial Court of Manitoba, defines these groups as: "Any group of motorcycle enthusiasts who have voluntarily made a commitment to band together and abide by their organizations' rigorous rules enforced by violence, who engage in activities that bring them and their club into serious conflict with society and the law".
Members and supporters of these clubs insist that illegal activities are isolated occurrences and that the clubs, as a whole, are not criminal organizations. They often compare themselves to police departments, arguing that the occasional "bad cop" does not make a police department a criminal organization; the Hells Angels, for example, sponsor charitable events such as Toys for Tots in an attempt to legitimize themselves in public opinion.
Contrary to other criminal organizations, OMGs operate on an individual basis instead of top-down, which is how supporters can claim that only some members commit crimes. Membership guarantees each member the option of running criminal activity, using other members as support; the main characteristic of OMGs is thus "amoral individualism", in contrast to the hierarchical orders and bonds of "amoral familism" of other criminal organizations such as the Mafia.
U.S. Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF) agent William Queen, who infiltrated the Mongols, wrote that what makes a group like them different from the Mafia is that crime and violence are not used as expedients in pursuit of profit, but that the priorities are reversed. Mayhem and lawlessness are inherent in living "The Life" and the money they obtain by illegal means is only wanted as a way to perpetuate that lifestyle.
Recently, authorities have tried tactics aimed at undermining the gang identity and breaking up the membership. But in June 2011 the High Court of Australia overturned a law that outlawed crime-focused motorcycle clubs and required members to avoid contact with one another.
In the U.S., a Federal judge rejected a prosecutor's request to seize ownership of the Mongols Motorcycle Club logo and name, saying the government had no right to the trademarks. Federal prosecutors had requested, as part of a larger criminal indictment, a court order giving the government ownership of the logo in order to prevent members from wearing the club's colors.
Relationships between outlaw motorcycle clubs:
Certain large one-percent MCs have rivalries between each other and will fight over territory and other issues. Sometimes smaller clubs are forced into or willingly accept supportive roles for a larger one-percent club and are sometimes required to wear a "support patch" on their vests that shows their affiliation with the dominant regional club.
Smaller clubs are often allowed to form only with the permission of the dominant regional club. Clubs that resist have been forcibly disbanded, being told to hand over their colors on threat of aggression.
In Australia and the United States, many MCs have established statewide MC coalitions. These coalitions are composed of MCs who have chapters in the state, and the occasional interested third party organization, and hold periodic meetings on neutral ground where representatives from each club meet in closed session to resolve disputes between clubs and discuss issues of common interest.
Local coalitions or confederations of clubs have eliminated some of the inter-club rivalry and together they have acted to hire legal and PR representation.
Cultural Influence:
Outlaw motorcyclists and their clubs have been frequently portrayed and parodied in movies and the media generally, giving rise to an "outlaw biker film" genre.
The outlaw biker image generally exists as a negative stereotype in the public consciousness, yet it has inspired fashion trends for both men and, as "biker babes", women.
The look has even been exploited by the fashion industry, bringing it into legal conflict with some clubs while simultaneously encouraging a culturally specific, fetishistic look that conveys sex, danger, rebelliousness, masculinity, and working-class values.
The biker style has influenced the look of other sub-cultures such as punk, heavy metal, leather subculture and cybergoth fashion, and, initially an American subculture, has had an international influence.
Bikers, their clothing, and motorcycles have become cultural icons of mythic status, their portrayal generally exaggerating a criminal or deviant association that the media exploit for their own, often financial, interests.
In popular culture
Main article: Outlaw biker film
Literature:
- The outlaw biker film genre really took off in the mid-1960s, after the Hells Angels club became prominent in the media, in particular, after Hunter S. Thompson's book Hell's Angels: The Strange and Terrible Saga of the Outlaw Motorcycle Gangs (1966) was published.
Television:
- The mini-series The Last Chapter (2002) was set in Toronto and Montreal, and portrayed a fictional feud reminiscent of the Quebec Biker War in which The Triple Sixers MC attempted to establish a chapter in the province of Ontario. This show predated Sons of Anarchy by six years.
- Sons of Anarchy portrays a fictional outlaw motorcycle club, founded mainly by Vietnam War veterans, which is involved in various criminal activity and associated with underworld gangs. The show's creator thought it was too obvious to have them be methamphetamine dealers, and so instead they traffic illegal guns.
- Bikie Wars: Brothers in Arms: The six-episode series dramatises the story of the Milperra massacre, when the Bandidos and the Comanchero motorcycle clubs went to war on Father's Day, Sunday 2 September 1984. The massacre had its beginnings after a group of Comancheros broke away and formed the first Bandidos Motorcycle Club chapter in Australia.
- This resulted in intense rivalry between the two chapters. At a public swap meet at the Viking Tavern at Milperra, New South Wales, a brief but violent battle ensued with seven people shot dead, including a 14-year-old innocent female bystander.
- A further 28 people were wounded, with 20 requiring hospitalisation. Each episode starts with a quote from Justice Adrian Roden, made when the clubs went before the New South Wales Supreme Court: "As patriotism can lead to jingoism and mateship can lead to cronyism, so bikie club loyalty can lead to bikie club war."
- Gangland Undercover is an American dramatized series inspired by the true story of police informant Charles Falco, who infiltrated several bike clubs in the United States in the early 2000s.
- Mayans M.C. is a spin-off of Sons of Anarchy centered on the Sons' rivals turned allies, the all-Hispanic Mayans Motorcycle Club.
Video games:
- Grand Theft Auto: The Lost and Damned is the first of two episodic expansion packs developed for the Xbox 360, PlayStation 3 and PC versions of Grand Theft Auto IV, developed by Rockstar North and published by Rockstar Games. It was first released for the Xbox 360 on 17 February 2009 and on PlayStation 3 and Microsoft Windows on 13 April 2010. The protagonist of The Lost and Damned is Johnny Klebitz, a member of Liberty City's biker club The Lost. Other fictional motorcycle clubs also appear in the DLC: the Angels of Death and the Uptown Riders.
- In Grand Theft Auto V, characters from The Lost and Damned appear after the Lost MC expands into San Andreas following the events of the expansion pack. They are an antagonistic force towards Trevor Philips, one of the three protagonists.
- Grand Theft Auto Online received an outlaw-biker themed update known as GTA Online: Bikers, released on October 4, 2016. It brought back several vehicles and the ability to knock people off their bikes from The Lost and Damned, and introduced new weapons, clothing, tattoos, vehicles, properties, missions, and other mechanics, along with various bug fixes.
- Days Gone is a 2019 post-apocalyptic survival game set in Oregon, where the protagonist, Deacon St. John, is a former member of an outlaw motorcycle club known as the Mongrels and still wears his colors.
See also:
- List of outlaw motorcycle clubs
- Outlaw biker film
- One percenter motorcycle clubs at Curlie
- Outlaw biker gangs at Curlie
Gun Culture in the United States
- YouTube Video: Chilling animation: Parkland shooter's movements in school
- YouTube Video: El Paso shooting survivor recounts massacre at Walmart
- YouTube Video: Video and 911 call released in deadly Dayton, Ohio shooting
In the United States, gun culture encompasses the behaviors, attitudes, and beliefs about firearms and their usage by civilians.
Gun ownership in the United States is constitutionally protected by the United States Bill of Rights.
Firearms are widely used in the United States of America for self-defense, hunting, and recreational uses, such as target shooting. Gun politics in the United States tends to be polarized between advocates of gun rights, often conservative, and those who support stricter gun control, often liberal.
The gun culture of the United States can be considered unique among developed countries in terms of the large number of firearms owned by civilians, generally permissive regulations, and high levels of gun violence.
For more about the Gun Culture in the United States, click on any of the following blue hyperlinks:
- History
- Popular culture
- Political and cultural theories
- Terms applied to opponents
- Foreign perspective
- See also:
- Arming America: The Origins of a National Gun Culture, a discredited 2000 book by historian Michael A. Bellesiles
- Global gun cultures
- Gun control
- Gun ownership
- Gun politics
- Gun show loophole
- Gun violence in the United States
- Index of gun politics articles
- "Gun Culture". TIME. March 18, 2009.
- Pilkington, Ed (10 January 2011). "US gun crime: death for sale". The Guardian. London.
- DeBrabander, Firmin (2015). Do Guns Make Us Free?: Democracy and the Armed Society. Yale University Press. ASIN: B07CGH7R79
Western Culture
- YouTube Video about Western Culture
- YouTube Video: Foundations of Western Civilization II I The Great Courses
- YouTube Video: The Basis of Western Civilization
Western culture, sometimes equated with Western civilization, Occidental culture, the Western world, Western society, and European civilization, is the heritage of social norms, ethical values, traditional customs, belief systems, political systems, artifacts and technologies that originated in or are associated with Europe.
The term also applies beyond Europe to countries and cultures whose histories are strongly connected to Europe by immigration, colonization, or influence. For example, Western culture includes countries in the Americas and Australasia, whose language and demographic ethnicity majorities are of European descent. Western culture has its roots in Greco-Roman culture from classical antiquity (see Western canon).
Ancient Greece is considered the birthplace of many elements of Western culture, including the development of a democratic system of government and major advances in philosophy, science and mathematics.
The expansion of Greek culture into the Hellenistic world of the eastern Mediterranean led to a synthesis between Greek and Near-Eastern cultures, and major advances in literature, engineering, and science, and provided the culture for the expansion of early Christianity and the Greek New Testament. This period overlapped with and was followed by Rome, which made key contributions in law, government, engineering and political organization.
The concept of a "West" dates back to the Roman Empire, where there was a cultural divide between the Greek East and Latin West, a divide that later continued in Medieval Europe between the Catholic Latin Church west and the "Greek" Eastern Orthodox east.
Western culture is characterized by a host of artistic, philosophic, literary and legal themes and traditions. Christianity, including the Roman Catholic Church, Protestantism, the Eastern Orthodox Church, and Oriental Orthodoxy, has played a prominent role in the shaping of Western civilization since at least the 4th century, as has Judaism.
Before the Cold War era, the traditional English viewpoint identified Western civilization with the Western Christian (Catholic–Protestant) countries and culture. A cornerstone of Western thought, beginning in ancient Greece and continuing through the Middle Ages and Renaissance, is the idea of rationalism in various spheres of life, especially religion, developed by Hellenistic philosophy, scholasticism and humanism.
The Catholic Church was for centuries at the center of the development of the values, ideas, science, laws and institutions which constitute Western civilization. Empiricism later gave rise to the scientific method, the scientific revolution, and the Age of Enlightenment.
Western culture continued to develop with the Christianisation of Europe during the Middle Ages, the reforms triggered by the Renaissance of the 12th century and 13th century under the influence of the Islamic world via Spain and Sicily (including the transfer of technology from the East, and Latin translations of Arabic texts on science and philosophy), and the Italian Renaissance as Greek scholars fleeing the fall of the Byzantine Empire brought classical traditions and philosophy.
Medieval Christianity is credited with creating the modern university, the modern hospital system, scientific economics, and natural law (which would later influence the creation of international law). Christianity played a role in ending practices common among pagan societies, such as human sacrifice, slavery, infanticide and polygamy.
Globalization by successive European colonial empires spread European ways of life and European educational methods around the world between the 16th and 20th centuries. European culture developed with a complex range of philosophy, medieval scholasticism, mysticism and Christian and secular humanism. Rational thinking developed through a long age of change and formation, with the experiments of the Enlightenment and breakthroughs in the sciences.
Tendencies that have come to define modern Western societies include the concept of political pluralism, individualism, prominent subcultures or countercultures (such as New Age movements) and increasing cultural syncretism resulting from globalization and human migration.
Click on any of the following blue hyperlinks for more about Western Culture:
- Terminology
- History
- Arts and humanities
- Scientific and technological inventions and discoveries
- Media
- Religion
- Sport
- Themes and traditions
- See also:
LGBT Community including LGBT Demographics of the United States
- YouTube Video about LGBT Rights
- YouTube Video: Top 10 Most Inspirational LGBTQ+ Celebrities
- YouTube Video: LGBTQ+ Documentary | Are People Born Gay? | Survival of the Fabulous | Only Human
Click here for LGBT Demographics of the United States by Year: 1990 - Present
The following demonstrates the difference in demographics between the starting year of 1990 up to February, 2021:
The LGBT community (also known as the LGBTQ community, GLBT community, or the gay community) is a loosely defined grouping of lesbian, gay, bisexual, and transgender individuals, organizations, and subcultures, united by a common culture and social movements.
These communities generally celebrate pride, diversity, individuality, and sexuality. LGBT activists and sociologists see LGBT community-building as a counterweight to heterosexism, homophobia, biphobia, transphobia, sexualism, and conformist pressures that exist in the larger society.
The term pride or sometimes gay pride expresses the LGBT community's identity and collective strength; pride parades provide both a prime example of the use and a demonstration of the general meaning of the term. The LGBT community is diverse in political affiliation. Not all people who are lesbian, gay, bisexual, or transgender consider themselves part of the LGBT community.
Groups that may be considered part of the LGBT community include gay villages, LGBT rights organizations, LGBT employee groups at companies, LGBT student groups in schools and universities, and LGBT-affirming religious groups.
LGBT communities may organize themselves into, or support, movements for civil rights promoting LGBT rights in various places around the world.
Terminology:
Main article: LGBT
LGBT, or GLBT, is an initialism that stands for lesbian, gay, bisexual, and transgender. In use since the 1990s, the term is an adaptation of the initialism LGB, which was used to replace the term gay - when referring to the community as a whole - beginning in about the mid-to-late 1980s.
While the movement had always included all LGBT people, the one-word unifying term in the 1950s through the early 1980s was gay (see Gay liberation). Later, this was expanded by many groups to lesbian and gay, to be more representative. In the late eighties and early nineties, queer was also reclaimed as a one-word alternative to the ever-lengthening string of initials, especially when used by radical political groups.
The initialism, along with common variants such as LGBTQ, was adopted into the mainstream in the 1990s as an umbrella term for use when labeling topics about sexuality and gender identity. For example, the LGBT Movement Advancement Project termed community centers that have services specific to members of the LGBT community "LGBT community centers" in comprehensive studies of such centers around the United States.
The initialism LGBT is intended to emphasize a diversity of sexuality and gender identity-based cultures. It may refer to anyone who is non-heterosexual or non-cisgender, instead of exclusively to people who are lesbian, gay, bisexual, or transgender.
To recognize this inclusion, a popular variant adds the letter Q for those who identify as queer or are questioning their sexual identity; LGBTQ has been recorded since 1996.
Symbols:
Main article: LGBT symbols
The gay community is frequently associated with certain symbols, especially the rainbow or rainbow flags. The Greek lambda symbol ("L" for liberation), triangles, ribbons, and gender symbols are also used as "gay acceptance" symbols. There are many types of flags to represent subdivisions in the gay community, but the most commonly recognized one is the rainbow flag.
According to Gilbert Baker, creator of the commonly known rainbow flag, each color represents a value in the community:
Later, pink and indigo were removed from the flag, resulting in the present-day flag which was first presented at the 1979 Pride Parade. Other flags include the Victory over AIDS flag, the Leather Pride flag, and the Bear Pride flag.
The lambda symbol was originally adopted by Gay Activists Alliance of New York in 1970 after they broke away from the larger Gay Liberation Front. Lambda was chosen because people might confuse it for a college symbol and not recognize it as a gay community symbol unless one was actually involved in the community. "Back in December of 1974, the lambda was officially declared the international symbol for gay and lesbian rights by the International Gay Rights Congress in Edinburgh, Scotland."
The triangle became a symbol for the gay community after the Holocaust. Not only did it represent Jews, but homosexuals who were killed because of German law. During the Holocaust, homosexuals were labeled with pink triangles to distinguish between them, Jews, regular prisoners, and political prisoners. The black triangle is similarly a symbol for females only to represent lesbian sisterhood.
The pink and yellow triangle was used to label Jewish homosexuals. Gender symbols have a much longer list of variations of homosexual or bisexual relationships which are clearly recognizable but may not be as popularly seen as the other symbols. Other symbols that relate to the gay community or gay pride include the gay-teen suicide awareness ribbon, AIDS awareness ribbon, labrys, and purple rhinoceros.
In the fall of 1995, the Human Rights Campaign adopted a logo (yellow equal sign on deep blue square) that has become one of the most recognizable symbols of the lesbian, gay, bisexual and transgender community. The logo can be spotted the world over and has become synonymous with the fight for equal rights for LGBT people.
One of the most notable recent changes was made in Philadelphia, Pennsylvania, on June 8, 2017. They added two new stripes to the traditional LGBT+ flag, one black and one brown. These were intended to highlight members of color within the LGBTQIA community.
Human and legal rights:
Main articles: LGBT rights by country or territory and LGBT social movements
The LGBT community represents a social component of the global community that many, including heterosexual allies, believe is underrepresented in the area of civil rights.
The current struggle of the gay community has been largely brought about by globalization.
In the United States, World War II brought together many closeted rural men from around the nation and exposed them to more progressive attitudes in parts of Europe. Upon returning home after the war, many of these men decided to band together in cities rather than return to their small towns.
Fledgling communities would soon become political in the beginning of the gay rights movement, including monumental incidents at places like Stonewall. Today, many large cities have gay and lesbian community centers. Many universities and colleges across the world have support centers for LGBT students. The Human Rights Campaign, Lambda Legal, the Empowering Spirits Foundation, and GLAAD advocate for LGBT people on a wide range of issues in the United States.
There is also an International Lesbian and Gay Association. In 1948, when the United Nations adopted the Universal Declaration of Human Rights (UDHR), LGBT activists clung to its concept of equal, inalienable rights for all people, regardless of their race, gender, or sexual orientation. The declaration does not specifically mention gay rights, but discusses equality and freedom from discrimination.
In 1962, Clark Polak joined the Janus Society in Philadelphia, Pennsylvania. Only a year later, he became its president. In 1968, he announced that the Society would change its name to the Homosexual Law Reform Society; "Homosexuals are now willing to fly under their own colors" (Stewart, 1968).
Same-sex marriage:
Main article: Same-sex marriage
In some parts of the world, partnership rights or marriage have been extended to same-sex couples. Advocates of same-sex marriage cite a range of benefits that are denied to people who cannot marry, including immigration, health care, inheritance and property rights, and other family obligations and protections, as reasons why marriage should be extended to same-sex couples.
Opponents of same-sex marriage within the gay community argue that fighting to achieve these benefits by means of extending marriage rights to same-sex couples privatizes benefits (e.g., health care) that should be made available to people regardless of their relationship status. They further argue that the same-sex marriage movement within the gay community discriminates against families that are composed of three or more intimate partners.
Opposition to the same-sex marriage movement from within the gay community should not be confused with opposition from outside that community.
Media:
The contemporary lesbian and gay community has a growing and complex place in the American and Western European media. Lesbians and gay men are often portrayed inaccurately in television, films, and other media. The gay community is often reduced to stereotypes, such as gay men being portrayed as flamboyant and bold. As with other minority groups, these caricatures are intended to ridicule the marginalized group.
References to homosexuality are currently widely banned from child-related entertainment, and when such references do occur, they almost invariably generate controversy. In 1997, when American comedian Ellen DeGeneres came out of the closet on her popular sitcom, many sponsors, such as the Wendy's fast food chain, pulled their advertising.
A portion of the media has also attempted to portray the gay community as included and publicly accepted, with television shows such as Will & Grace and Queer Eye for the Straight Guy. This increased publicity reflects the coming-out movement of the LGBT community. As more celebrities came out, more shows developed, such as the 2004 show The L Word.
These depictions of the LGBT community have been controversial, but beneficial for the community. The increase in visibility of LGBT people allowed for the LGBT community to unite to organize and demand change, and it has also inspired many LGBT people to come out.
In the United States, gay people are frequently used as a symbol of social decadence by celebrity evangelists and by organizations such as Focus on the Family. Many LGBT organizations exist to represent and defend the gay community. For example, the Gay and Lesbian Alliance Against Defamation in the United States and Stonewall in the UK work with the media to help portray fair and accurate images of the gay community.
As companies are advertising more and more to the gay community, LGBT activists are using ad slogans to promote gay community views. Subaru marketed its Forester and Outback with the slogan "It's Not a Choice. It's the Way We're Built", which was later used in eight U.S. cities on streets or in gay rights events.
Social Media:
Social media is often used as a platform for the LGBT community to congregate and share resources. Search engines and social networking sites provide numerous opportunities for LGBT people to connect with one another; additionally, they play a key role in identity creation and self-presentation.
Social networking sites allow for community building as well as anonymity, letting people engage as much or as little as they would like. The variety of social media platforms, including Facebook, TikTok, Tumblr, Twitter, and YouTube, have differing associated audiences, affordances, and norms.
These varying platforms allow for more inclusivity as members of the LGBT community have the agency to decide where to engage and how to self-present themselves. The existence of the LGBT community and discourse on social media platforms is essential to disrupt the reproduction of hegemonic cis-heteronormativity and represent the wide variety of identities that exist.
Before its ban on adult content in 2018, Tumblr was a platform uniquely suited for sharing trans stories and building community. Mainstream social media platforms like TikTok have also benefited the trans community by creating spaces for people to share resources and transition stories, normalizing trans identity. Access to LGBT content, peers, and community on search engines and social networking sites has been found to foster identity acceptance and pride among LGBT individuals.
Algorithms and evaluative criteria control what content is recommended to users on search engines and social networking sites. These can reproduce stigmatizing discourses that are dominant within society, negatively impacting LGBT self-perception.
Social media algorithms have a significant impact on the formation of the LGBT community and culture. Algorithmic exclusion occurs when exclusionary practices are reinforced by algorithms across technological landscapes, directly resulting in excluding marginalized identities.
The exclusion of these identity representations causes identity insecurity for LGBT people, while further perpetuating cis-heteronormative identity discourse. LGBT users and allies have found methods of subverting algorithms that may suppress content in order to continue to build these online communities.
Buying power:
Main article: Pink money
According to Witeck-Combs Communications, Inc. and Marketresearch.com, the 2006 buying power of United States gays and lesbians was approximately $660 billion and was then expected to exceed $835 billion by 2011. Gay consumers can be very loyal to specific brands, wishing to support companies that support the gay community and also provide equal rights for LGBT workers. In the UK, this buying power is sometimes referred to as "the pink pound."
According to an article by James Hipps, LGBT Americans are more likely to seek out companies that advertise to them and are willing to pay higher prices for premium products and services. This can be attributed to the higher median household incomes of same-sex couples compared to opposite-sex couples. "...studies show that GLBT Americans are twice as likely to have graduated from college, twice as likely to have an individual income over $60,000 and twice as likely to have a household income of $250,000 or more."
Consumerism:
Main article: Pink capitalism
Although many claim that the LGBT community is more affluent than heterosexual consumers, research has proved that false. The LGBT community is nonetheless an important consumer segment because of its spending power and brand loyalty.
Witeck-Combs Communications calculated the adult LGBT buying power at $830 billion for 2013. Same-sex partnered households spend slightly more than the average household on any given shopping trip, and they also make more shopping trips than non-LGBT households. On average, same-sex partnered households spend 25 percent more than the average United States household.
According to the University of Maryland, gay male partners earn $10,000 less per year on average than heterosexual men, while partnered lesbians earn about $7,000 more per year than heterosexual married women. Hence, same-sex partners and heterosexual partners are roughly equal in consumer affluence.
The LGBT community has been recognized as one of the largest consumer segments in travel, often taking one or more trips per year. Annually, the LGBT community spends around $65 billion on travel, totaling 10 percent of the United States travel market.
The usual travel factors play into LGBT travel decisions, but destinations especially tailored to the LGBT community are more likely to attract LGBT travelers.
Demographics:
Main article: LGBT demographics of the United States
A survey conducted in 2012 found that younger Americans are more likely to identify as gay. Identification rates decrease with age: adults between ages 18 and 29 are three times more likely to identify as LGBT than seniors older than 65. These statistics for the LGBT community are taken into account, just as with other demographics, to find trend patterns for specific products.
Consumers who identify as LGBT are more likely to regularly engage in various activities as opposed to those who identify as heterosexual. According to Community Marketing, Inc., 90 percent of lesbians and 88 percent of gay men will dine out with friends regularly.
Similarly, 31 percent of lesbians and 50 percent of gay men will visit a club or a bar.
At home, LGBT women are as likely as non-LGBT women to have children, but LGBT men are half as likely as non-LGBT men to have children at home.
Household incomes for 16 percent of LGBT Americans range above $90,000 per year, compared with 21 percent of the overall adult population. A key difference is that those who identify as LGBT have fewer children collectively than heterosexual partners. Another factor is that LGBT populations of color continue to face income barriers along with broader racial inequities, so they can be expected to earn less and be less affluent than predicted.
An analysis of a Gallup survey provides detailed estimates that, during the years 2012 through 2014, the metropolitan area with the highest percentage of LGBT residents was San Francisco, California. The next highest were Portland, Oregon, and Austin, Texas.
A 2019 survey of the Two-Spirit and LGBTQ+ population in the Canadian city of Hamilton, Ontario, called Mapping the Void: Two-Spirit and LGBTQ+ Experiences in Hamilton showed that out of 906 respondents, when it came to sexual orientation, 48.9% identified as bisexual/pansexual, 21.6% identified as gay, 18.3% identified as lesbian, 4.9% identified as queer, and 6.3% identified as other (a category consisting of those who indicated they were asexual, heterosexual, or questioning, and those who gave no response for their sexual orientation).
A 2019 survey of trans and non-binary people in Canada called Trans PULSE Canada showed that, out of 2,873 respondents, 13% identified as asexual, 28% as bisexual, 13% as gay, 15% as lesbian, 31% as pansexual, 8% as straight or heterosexual, 4% as two-spirit, and 9% as unsure or questioning.
Marketing:
Main article: LGBT marketing
Marketing towards the LGBT community was not always a strategy among advertisers. For the last three to four decades, Corporate America has created a market niche for the LGBT community. Three distinct phases define the marketing turnover: 1) shunning in the 1980s, 2) curiosity and fear in the 1990s, and 3) pursuit in the 2000s.
Only recently have marketers picked up on the LGBT demographic. With a spike in same-sex marriage in 2014, marketers are figuring out new ways to tie a person's sexual orientation to a product being sold. In efforts to attract members of the LGBT community to their products, market researchers are developing marketing methods that reach these new families.
Advertising history has shown that when marketing to the family, it was always the wife, the husband, and the children. But today, that is not necessarily the case. There could be families of two fathers or two mothers with one child or six children. Breaking away from the traditional family setting, marketing researchers notice the need to recognize these different family configurations.
One pitfall marketers can fall into is stereotyping the LGBT community. When marketing towards the community, they may corner their target audience into an "alternative" lifestyle category that ultimately "others" the LGBT community. Sensitivity is therefore important: when marketing towards the LGBT community, advertisers must respect these boundaries.
Marketers also sometimes treat being LGBT as a single characteristic that defines an individual. Other attributes, such as race, age, culture, and income level, can be targeted alongside the LGBT segment. Knowing the consumer gives these marketers power.
Along with attempts to engage with the LGBT community, researchers have found gender disagreements between products and their respective consumers. For instance, a gay man may want a more feminine product, whereas a lesbian may be interested in a more masculine product. This does not hold for the entire LGBT community, but the likelihood of such differences is far greater.
In the past, gender was seen as fixed and congruent with an individual's sex. It is now understood that sex and gender are separately fluid. Researchers have also noted that, when a person evaluates products, their biological sex is as strong a determinant as their self-concept.
As a customer response, gay men and women are more likely to take an interest in a product when its advertisement is directed towards them. This is an important factor and goal for marketers because it indicates future loyalty to the product or brand.
Health:
Main article: Healthcare and the LGBT community
Discrimination and mental health:
See also: Homophobia
In a 2001 study that examined possible root causes of mental disorders in lesbian, gay and bisexual people, Cochran and psychologist Vickie M. Mays, of the University of California, explored whether ongoing discrimination fuels anxiety, depression and other stress-related mental health problems among LGB people.
The authors found strong evidence of a relationship between the two. The team compared how 74 LGB and 2,844 heterosexual respondents rated lifetime and daily experiences with discrimination such as not being hired for a job or being denied a bank loan, as well as feelings of perceived discrimination. LGB respondents reported higher rates of perceived discrimination than heterosexuals in every category related to discrimination, the team found.
However, while gay youth are considered to be at higher risk for suicide, a literature review published in the journal Adolescence states, "Being gay in-and-of-itself is not the cause of the increase in suicide." Rather, the review notes that previous studies found suicide attempts to be significantly associated with psychosocial stressors, and that "[s]ome of these stressors are also experienced by heterosexual adolescents, but they have been shown to be more prevalent among gay adolescents."
Despite recent progress in LGBT rights, gay men continue to experience high rates of loneliness and depression after coming out.
LGBT multiculturalism:
General:
LGBT multiculturalism is the diversity within the LGBT (lesbian, gay, bisexual, transgender) community as a representation of different sexual orientations and gender identities, as well as the different ethnic, language, and religious groups within the LGBT community.
Beyond diversity within the LGBT community itself, one may also consider the inclusion of the LGBT community in a larger multicultural model: in universities, for example, such a model gives the LGBT community representation alongside, and equal to, other large minority groups such as African Americans in the United States.
The two movements have much in common politically. Both are concerned with tolerance for real differences, diversity, minority status, and the invalidity of value judgments applied to different ways of life.
Researchers have identified the emergence of gay and lesbian communities during several progressive time periods across the world including: the Renaissance, Enlightenment, and modern Westernization. Depending on geographic location, some of these communities experienced more opposition to their existence than others; nonetheless, they began to permeate society both socially and politically.
European cities past and present:
City spaces in Early Modern Europe were host to a wealth of gay activity; however, these scenes remained semi-secretive for a long period of time. Dating back to the 1500s, city conditions such as apprenticeship labor relations and living arrangements, abundant student and artist activity, and hegemonic norms surrounding female societal status were typical in Venice and Florence, Italy.
Under these circumstances, many open-minded young people were attracted to these city settings, and an abundance of same-sex interactions began to take place. Many of the connections formed often led to casual romantic and sexual relationships, whose prevalence increased quite rapidly until they became a subculture and community of their own. Literature and ballroom culture gradually made their way onto the scene and became integrated despite transgressive societal views.
Perhaps the best known of these were the balls of Magic-City in Paris. Amsterdam and London have also been recognized as leading locations for LGBT community establishment. By the 1950s, these urban spaces were booming with gay venues such as bars and public saunas where community members could come together.
Paris and London were particularly attractive to the lesbian population as platforms not only for socialization but for education as well. Other urban occasions important to the LGBT community include Carnival in Rio de Janeiro, Brazil, Mardi Gras in Sydney, Australia, and the various pride parades hosted in larger cities around the world.
Urban spaces in America:
In the same way in which LGBT people used the city backdrop to join together socially, they were able to join forces politically as well. This new sense of collectivity provided somewhat of a safety net for individuals when voicing their demands for equal rights.
In the United States specifically, several key political events have taken place in urban contexts. Some of these include, but are not limited to:
Independence Hall, Philadelphia - gay and lesbian protest movement in 1965
During and following these events, LGBT community subculture began to grow and stabilize into a nationwide phenomenon. Gay bars became more and more popular in large cities. For gays particularly, increasing numbers of cruising areas, public bath houses, and YMCAs in these urban spaces continued to welcome them to experience a more liberated way of living.
For lesbians, this led to the formation of literary societies, private social clubs, and same-sex housing. The core of this community-building took place in New York City and San Francisco, but cities like St. Louis, Washington, D.C. (Lafayette Park), and Chicago quickly followed suit.
City:
Cities afford a host of prime conditions that allow for better individual development as well as collective movement that are not otherwise available in rural spaces. First and foremost, urban landscapes offer LGBTs better prospects to meet other LGBTs and form networks and relationships.
One ideal platform within this framework was the free labor market of many capitalistic societies which enticed people to break away from their often damaging traditional nuclear families in order to pursue employment in bigger cities. Making the move to these spaces afforded them new liberty in the realms of sexuality, identity, and also kinship.
Some researchers describe this as a phase of resistance against the confining expectations of normativity. Urban LGBTs demonstrated this push back through various outlets including their style of dress, the way they talked and carried themselves, and how they chose to build community.
From a social science perspective, the relationship between the city and the LGBT community is not a one-way street. LGBT people give back as much, if not more, in terms of economic contributions ("pink money"), activism, and politics.
Intersections of race:
Compared to white LGBT individuals, LGBT people of color often experience prejudice, stereotyping, and discrimination on the basis of not only their sexual orientation and gender identity, but also on the basis of race.
Nadal and colleagues discuss LGBTQ people of Color and their experience of intersectional microaggressions which target various aspects of their social identities.
These negative experiences and microaggressions can come from cisgender and heterosexual white individuals, from cisgender and heterosexual individuals of their own race, and from the LGBT community itself, which is usually dominated by white people.
Some LGBT people of color do not feel comfortable and represented within LGBT spaces. A comprehensive and systematic review of the existing published research literature around the experiences of LGBT individuals of color finds a common theme of exclusion in largely white LGBT spaces.
These spaces are typically dominated by white LGBT individuals, promote White and Western values, and often leave LGBT individuals of color feeling as though they must choose between their racial community or their gender and sexual orientation community. In general, Western society will often subtly code “gay” as white; white LGBT folks are often seen as the face of LGBT culture and values.
The topic of coming out and revealing one’s sexual orientation and gender identity to the public is associated with white values and expectations in mainstream discussions. Where white Western culture places value on the ability to speak openly about one’s identity with family, one particular study found that LGBT participants of color viewed their family's silence about their identity as supportive and accepting. For example, collectivist cultures view the coming out process as a family affair rather than an individual one.
Furthermore, the annual National Coming Out Day centers white perspectives as an event meant to help an LGBT person feel liberated and comfortable in their own skin. However, for some LGBT people of color, National Coming Out Day is viewed in a negative light.
In communities of color, coming out publicly can have adverse consequences, risking a person's sense of safety as well as their familial and communal relationships. White LGBT people tend to collectively reject these differences in perspective on coming out, possibly further isolating their LGBTQ siblings of color.
Criticism of the term:
Eleanor Formby writes that the notion of "LGBT community" is problematic, because community belonging is not a given just because people share a gender or sexual identity.
Formby cites an interviewee who argued that "The idea doesn’t exist, it’s a kind of big myth – a bit like saying there’s a brown-eyed community or a blonde community." According to Formby, research shows that many LGBT individuals do not at all feel there is a real "LGBT community", as they keep experiencing discrimination from other LGBT people relating to their age, body, disability, ethnicity, faith, HIV status, or perceived social class.
Formby clarifies that she does not suggest abandoning the phrase altogether, but that using "LGBT people" would be more accurate in most instances, and would not risk alienation felt by an already (at times) marginalized group of people.
The following demonstrates the change in demographics from 1990 to February 2021:
- 1990: An extensive study on sexuality in general was conducted in the United States. A significant portion of the study was geared towards homosexuality. The results found that 8.6% of women and 10.1% of men had at some point in their lives experienced some form of homosexuality. Of this group, 87% of women and 76% of men reported current same-sex attractions, 41% of women and 52% of men had had sex with someone of the same gender, and 16% of women and 27% of men identified as LGBT.
- 2021: A February 2021 Gallup poll reported that 5.6% of US adults identify as lesbian, gay, bisexual, or transgender; 86.7% said that they were heterosexual or straight, and 7.6% declined to answer. More than half of all LGBT adults identify as bisexual (54.6%), while around a quarter (24.5%) identify as gay, 11.7% as lesbian, and 11.3% as transgender. Additionally, 3.3% of respondents chose another term to describe their orientation (e.g., queer). As a percentage of all US adults, 3.1% identify as bisexual, 1.4% as gay, 0.7% as lesbian, and 0.6% as transgender.
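The 2021 figures above are internally consistent: multiplying each group's share of the LGBT population by the overall 5.6% figure yields the quoted shares of all US adults. A quick arithmetic check (illustrative only, using the Gallup percentages quoted above):

```python
# Gallup February 2021 figures as quoted above.
lgbt_share_of_adults = 0.056  # 5.6% of US adults identify as LGBT

# Each group's share within the LGBT population.
within_lgbt = {"bisexual": 0.546, "gay": 0.245, "lesbian": 0.117, "transgender": 0.113}

# Multiply by the overall LGBT share to get each group's share of all US adults.
as_share_of_all_adults = {
    group: round(share * lgbt_share_of_adults * 100, 1)
    for group, share in within_lgbt.items()
}
print(as_share_of_all_adults)
# {'bisexual': 3.1, 'gay': 1.4, 'lesbian': 0.7, 'transgender': 0.6}
```

The results match the poll's own population-level breakdown (3.1%, 1.4%, 0.7%, and 0.6%), confirming the two sets of percentages describe the same data.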
The LGBT community (also known as the LGBTQ community, GLBT community, or the gay community) is a loosely defined grouping of lesbian, gay, bisexual, and transgender individuals, together with LGBT organizations and subcultures, united by a common culture and social movements.
These communities generally celebrate pride, diversity, individuality, and sexuality. LGBT activists and sociologists see LGBT community-building as a counterweight to heterosexism, homophobia, biphobia, transphobia, sexualism, and conformist pressures that exist in the larger society.
The term pride or sometimes gay pride expresses the LGBT community's identity and collective strength; pride parades provide both a prime example of the use and a demonstration of the general meaning of the term. The LGBT community is diverse in political affiliation. Not all people who are lesbian, gay, bisexual, or transgender consider themselves part of the LGBT community.
Groups that may be considered part of the LGBT community include the following:
- gay villages,
- LGBT rights organizations,
- LGBT employee groups at companies,
- LGBT student groups in schools and universities,
- and LGBT-affirming religious groups.
LGBT communities may organize themselves into, or support, movements for civil rights promoting LGBT rights in various places around the world.
Terminology:
Main article: LGBT
LGBT, or GLBT, is an initialism that stands for lesbian, gay, bisexual, and transgender. In use since the 1990s, the term is an adaptation of the initialism LGB, which began to replace the term gay as a label for the community as a whole in about the mid-to-late 1980s.
While the movement had always included all LGBT people, the one-word unifying term in the 1950s through the early 1980s was gay (see Gay liberation). Later, this was expanded by many groups to lesbian and gay, to be more representative. In the late eighties and early nineties, queer was also reclaimed as a one-word alternative to the ever-lengthening string of initials, especially when used by radical political groups.
The initialism, as well as common variants such as LGBTQ, was adopted into the mainstream in the 1990s as an umbrella term for labeling topics related to sexuality and gender identity. For example, the LGBT Movement Advancement Project termed community centers that offer services specific to members of the LGBT community "LGBT community centers" in comprehensive studies of such centers around the United States.
The initialism LGBT is intended to emphasize a diversity of sexuality and gender identity-based cultures. It may refer to anyone who is non-heterosexual or non-cisgender, instead of exclusively to people who are lesbian, gay, bisexual, or transgender. A popular variant adds the letter Q for those who identify as queer or are questioning their sexual identity; LGBTQ has been recorded since 1996.
Symbols:
Main article: LGBT symbols
The gay community is frequently associated with certain symbols, especially the rainbow and rainbow flags. The Greek lambda symbol ("L" for liberation), triangles, ribbons, and gender symbols are also used as "gay acceptance" symbols. There are many flags representing subdivisions of the gay community, but the most commonly recognized one is the rainbow flag.
According to Gilbert Baker, creator of the commonly known rainbow flag, each color represents a value in the community:
- pink = sexuality
- red = life
- orange = healing
- yellow = the sun
- green = nature
- blue = art
- indigo = harmony
- violet = spirit
Later, pink and indigo were removed from the flag, resulting in the present-day flag which was first presented at the 1979 Pride Parade. Other flags include the Victory over AIDS flag, the Leather Pride flag, and the Bear Pride flag.
The lambda symbol was originally adopted by Gay Activists Alliance of New York in 1970 after they broke away from the larger Gay Liberation Front. Lambda was chosen because people might confuse it for a college symbol and not recognize it as a gay community symbol unless one was actually involved in the community. "Back in December of 1974, the lambda was officially declared the international symbol for gay and lesbian rights by the International Gay Rights Congress in Edinburgh, Scotland."
The triangle became a symbol for the gay community after the Holocaust, during which homosexuals killed under German law were labeled with pink triangles to distinguish them from Jews, ordinary prisoners, and political prisoners. The black triangle is similarly a female-only symbol representing lesbian sisterhood.
The pink and yellow triangle was used to label Jewish homosexuals. Gender symbols have a much longer list of variations of homosexual or bisexual relationships which are clearly recognizable but may not be as popularly seen as the other symbols. Other symbols that relate to the gay community or gay pride include the gay-teen suicide awareness ribbon, AIDS awareness ribbon, labrys, and purple rhinoceros.
In the fall of 1995, the Human Rights Campaign adopted a logo (yellow equal sign on deep blue square) that has become one of the most recognizable symbols of the lesbian, gay, bisexual and transgender community. The logo can be spotted the world over and has become synonymous with the fight for equal rights for LGBT people.
One of the most notable recent changes was made in Philadelphia, Pennsylvania, on June 8, 2017, when the city added two new stripes, one black and one brown, to the traditional LGBT flag. These were intended to highlight members of color within the LGBTQIA community.
Human and legal rights:
Main articles: LGBT rights by country or territory and LGBT social movements
The LGBT community represents a social component of the global community that is believed by many, including heterosexual allies, to be underrepresented in the area of civil rights.
The current struggle of the gay community has been largely brought about by globalization.
In the United States, World War II brought together many closeted rural men from around the nation and exposed them to more progressive attitudes in parts of Europe. Upon returning home after the war, many of these men decided to band together in cities rather than return to their small towns.
Fledgling communities would soon become political in the beginning of the gay rights movement, including monumental incidents at places like Stonewall. Today, many large cities have gay and lesbian community centers. Many universities and colleges across the world have support centers for LGBT students. The Human Rights Campaign, Lambda Legal, the Empowering Spirits Foundation, and GLAAD advocate for LGBT people on a wide range of issues in the United States.
There is also an International Lesbian and Gay Association. In 1948, when the United Nations adopted the Universal Declaration of Human Rights (UDHR), LGBT activists clung to its concept of equal, inalienable rights for all people, regardless of race, gender, or sexual orientation. The declaration does not specifically mention gay rights, but it discusses equality and freedom from discrimination.
In 1962, Clark Polak joined the Janus Society in Philadelphia, Pennsylvania, becoming its president only a year later. In 1968, he announced that the Society would change its name to the Homosexual Law Reform Society: “Homosexuals are now willing to fly under their own colors” (Stewart, 1968).
Same-sex marriage:
Main articles:
In some parts of the world, partnership rights or marriage have been extended to same-sex couples. Advocates of same-sex marriage cite a range of benefits that are denied to people who cannot marry, including immigration, health care, inheritance and property rights, and other family obligations and protections, as reasons why marriage should be extended to same-sex couples.
Opponents of same-sex marriage within the gay community argue that fighting to achieve these benefits by means of extending marriage rights to same-sex couples privatizes benefits (e.g., health care) that should be made available to people regardless of their relationship status. They further argue that the same-sex marriage movement within the gay community discriminates against families that are composed of three or more intimate partners.
Opposition to the same-sex marriage movement from within the gay community should not be confused with opposition from outside that community.
Media:
The contemporary lesbian and gay community has a growing and complex place in American and Western European media. Lesbians and gay men are often portrayed inaccurately in television, films, and other media. The gay community is often portrayed through stereotypes, such as gay men being flamboyant and bold. As with other minority groups, these caricatures ridicule the marginalized group.
There is currently a widespread ban on LGBT references in child-oriented entertainment, and when references do occur, they almost invariably generate controversy. In 1997, when American comedian Ellen DeGeneres came out of the closet on her popular sitcom, many sponsors, such as the Wendy's fast food chain, pulled their advertising.
Also, a portion of the media has attempted to make the gay community included and publicly accepted with television shows such as Will & Grace or Queer Eye for the Straight Guy. This increased publicity reflects the Coming out movement of the LGBT community. As more celebrities came out, more shows developed, such as the 2004 show The L Word.
These depictions of the LGBT community have been controversial, but beneficial for the community. The increase in visibility of LGBT people allowed for the LGBT community to unite to organize and demand change, and it has also inspired many LGBT people to come out.
In the United States, gay people are frequently used as a symbol of social decadence by celebrity evangelists and by organizations such as Focus on the Family. Many LGBT organizations exist to represent and defend the gay community. For example, the Gay and Lesbian Alliance Against Defamation in the United States and Stonewall in the UK work with the media to help portray fair and accurate images of the gay community.
As companies are advertising more and more to the gay community, LGBT activists are using ad slogans to promote gay community views. Subaru marketed its Forester and Outback with the slogan "It's Not a Choice. It's the Way We're Built", which was later used in eight U.S. cities on streets or in gay rights events.
Social Media:
Social media is often used as a platform for the LGBT community to congregate and share resources. Search engines and social networking sites provide numerous opportunities for LGBT people to connect with one another; additionally, they play a key role in identity creation and self-presentation.
Social networking sites allow for community building as well as anonymity, allowing people to engage as much or as little as they would like. Social media platforms, including Facebook, TikTok, Tumblr, Twitter, and YouTube, have differing associated audiences, affordances, and norms.
These varying platforms allow for more inclusivity as members of the LGBT community have the agency to decide where to engage and how to self-present themselves. The existence of the LGBT community and discourse on social media platforms is essential to disrupt the reproduction of hegemonic cis-heteronormativity and represent the wide variety of identities that exist.
Before its ban on adult content in 2018, Tumblr was a platform uniquely suited for sharing trans stories and building community. Mainstream social media platforms like TikTok have also been beneficial for the trans community by creating spaces to share resources and transition stories, normalizing trans identity. Access to LGBT content, peers, and community on search engines and social networking sites has been found to foster identity acceptance and pride among LGBT individuals.
Algorithms and evaluative criteria control what content is recommended to users on search engines and social networking sites. These can reproduce stigmatizing discourses that are dominant within society, negatively impacting LGBT self-perception.
Social media algorithms have a significant impact on the formation of the LGBT community and culture. Algorithmic exclusion occurs when exclusionary practices are reinforced by algorithms across technological landscapes, directly resulting in excluding marginalized identities.
The exclusion of these identity representations causes identity insecurity for LGBT people, while further perpetuating cis-heteronormative identity discourse. LGBT users and allies have found methods of subverting algorithms that may suppress content in order to continue to build these online communities.
Buying power:
Main article: Pink money
According to Witeck-Combs Communications, Inc. and Marketresearch.com, the 2006 buying power of United States gays and lesbians was approximately $660 billion and was then expected to exceed $835 billion by 2011. Gay consumers can be very loyal to specific brands, wishing to support companies that support the gay community and also provide equal rights for LGBT workers. In the UK, this buying power is sometimes abbreviated to "the pink pound."
According to an article by James Hipps, LGBT Americans are more likely to seek out companies that advertise to them and are willing to pay higher prices for premium products and services. This can be attributed to the higher median household income of same-sex couples compared to opposite-sex couples: "...studies show that GLBT Americans are twice as likely to have graduated from college, twice as likely to have an individual income over $60,000 and twice as likely to have a household income of $250,000 or more."
Consumerism:
Main article: Pink capitalism
Although many claim that the LGBT community is more affluent than heterosexual consumers, research has shown this to be false. Nevertheless, the LGBT community remains an important consumer segment because of its spending power and brand loyalty.
Witeck-Combs Communications calculated the adult LGBT buying power at $830 billion for 2013. Same-sex partnered households spend slightly more than the average home on any given shopping trip, and they also make more shopping trips than non-LGBT households. On average, same-sex partnered households spend 25 percent more than the average United States household.
According to the University of Maryland, gay male partners earn $10,000 less on average than heterosexual men, while partnered lesbians earn about $7,000 more a year than heterosexual married women. Hence, same-sex partners and heterosexual partners are about equal in consumer affluence.
The LGBT community has been recognized as one of the largest consumer groups in travel, often taking multiple trips per year. Annually, the LGBT community spends around $65 billion on travel, accounting for 10 percent of the United States travel market.
Many common travel factors play into LGBT travel decisions, but if there is a destination that is especially tailored to the LGBT community, then they are more likely to travel to those places.
Demographics:
Main article: LGBT demographics of the United States
A survey conducted in 2012 found that younger Americans are more likely to identify as gay, with identification decreasing with age: adults between 18 and 29 are three times more likely to identify as LGBT than seniors older than 65. These statistics for the LGBT community are taken into account, just as with other demographics, to find trend patterns for specific products.
Consumers who identify as LGBT are more likely to regularly engage in various activities as opposed to those who identify as heterosexual. According to Community Marketing, Inc., 90 percent of lesbians and 88 percent of gay men will dine out with friends regularly.
Similarly, 31 percent of lesbians and 50 percent of gay men will visit a club or a bar. At home, LGBT women are as likely as non-LGBT women to have children, while LGBT men are half as likely as non-LGBT men to have children at home.
Household incomes for sixteen percent of LGBT Americans range above $90,000 per year, compared with 21 percent of the overall adult population. A key difference is that those who identify as LGBT have fewer children collectively than heterosexual partners. Another factor is that LGBT populations of color continue to face income barriers alongside racial discrimination, so they can be expected to earn less and be less affluent than predicted.
An analysis of a Gallup survey provides detailed estimates that, during the years 2012 through 2014, the metropolitan area with the highest percentage of LGBT residents was San Francisco, California, followed by Portland, Oregon, and Austin, Texas.
A 2019 survey of the Two-Spirit and LGBTQ+ population in the Canadian city of Hamilton, Ontario, called Mapping the Void: Two-Spirit and LGBTQ+ Experiences in Hamilton showed that out of 906 respondents, when it came to sexual orientation, 48.9% identified as bisexual/pansexual, 21.6% identified as gay, 18.3% identified as lesbian, 4.9% identified as queer, and 6.3% identified as other (a category consisting of those who indicated they were asexual, heterosexual, or questioning, and those who gave no response for their sexual orientation).
A 2019 survey of trans and non-binary people in Canada, called Trans PULSE Canada, showed that out of 2,873 respondents, when it came to sexual orientation, 13% identified as asexual, 28% as bisexual, 13% as gay, 15% as lesbian, 31% as pansexual, 8% as straight or heterosexual, 4% as two-spirit, and 9% as unsure or questioning.
Marketing:
Main article: LGBT marketing
Marketing towards the LGBT community was not always a strategy among advertisers. Over the last three to four decades, corporate America has created a market niche for the LGBT community. Three distinct phases define this shift: 1) shunning in the 1980s, 2) curiosity and fear in the 1990s, and 3) pursuit in the 2000s.
Just recently, marketers have picked up the LGBT demographic. With a spike in same-sex marriage in 2014, marketers are figuring out new ways to tie in a person's sexual orientation to a product being sold. In efforts to attract members of the LGBT community to their products, market researchers are developing marketing methods that reach these new families.
Advertising history has shown that when marketing to the family, it was always the wife, the husband, and the children. But today, that is not necessarily the case. There could be families of two fathers or two mothers with one child or six children. Breaking away from the traditional family setting, marketing researchers notice the need to recognize these different family configurations.
One pitfall for marketers is stereotyping the LGBT community. When marketing towards the community, they may corner their target audience into an "alternative" lifestyle category that ultimately "others" the LGBT community. Sensitivity is therefore important, and advertisers are expected to respect the same boundaries when marketing towards the LGBT community.
Marketers can also mistakenly treat being LGBT as the single characteristic that defines an individual. Other attributes, such as race, age, culture, and income level, can be targeted along with the LGBT segment. Knowing the consumer gives these marketers power.
Along with attempts to engage with the LGBT community, researchers have found gender disagreements among products with respective consumers. For instance, a gay male may want a more feminine product, whereas a lesbian female may be interested in a more masculine product. This does not hold for the entire LGBT community, but the possibilities of these differences are far greater.
In the past, gender was seen as fixed and congruent with an individual's sex. It is now understood that sex and gender are each fluid. Researchers have also noted that, when evaluating products, a person's biological sex is as much a determinant as their self-concept.
As a customer response, when the advertisement is directed towards them, gay men and women are more likely to have an interest in the product. This is an important factor and goal for marketers because it indicates future loyalty to the product or brand.
Health:
Main article: Healthcare and the LGBT community
Discrimination and mental health:
See also: Homophobia
In a 2001 study that examined possible root causes of mental disorders in lesbian, gay and bisexual people, Cochran and psychologist Vickie M. Mays, of the University of California, explored whether ongoing discrimination fuels anxiety, depression and other stress-related mental health problems among LGB people.
The authors found strong evidence of a relationship between the two. The team compared how 74 LGB and 2,844 heterosexual respondents rated lifetime and daily experiences with discrimination such as not being hired for a job or being denied a bank loan, as well as feelings of perceived discrimination. LGB respondents reported higher rates of perceived discrimination than heterosexuals in every category related to discrimination, the team found.
However, while gay youth are considered to be at higher risk for suicide, a literature review published in the journal Adolescence states, "Being gay in-and-of-itself is not the cause of the increase in suicide." Rather, the review notes that findings of previous studies suggested that "...suicide attempts were significantly associated with psychosocial stressors, including:
- gender nonconformity,
- early awareness of being gay,
- victimization,
- lack of support,
- school dropout,
- family problems,
- acquaintances' suicide attempts,
- homelessness,
- substance abuse,
- and other psychiatric disorders.
Some of these stressors are also experienced by heterosexual adolescents, but they have been shown to be more prevalent among gay adolescents." Despite recent progress in LGBT rights, gay men continue to experience high rates of loneliness and depression after coming out.
LGBT multiculturalism:
General:
LGBT multiculturalism is the diversity within the LGBT (lesbian, gay, bisexual, transgender) community as a representation of different sexual orientations and gender identities, as well as different ethnic, language, and religious groups within the LGBT community.
Beyond this internal diversity, the LGBT community may also be considered part of a larger multicultural model, as in universities, where such a model includes the LGBT community alongside, and with equal representation to, other large minority groups such as African Americans in the United States.
The two movements have much in common politically. Both are concerned with tolerance for real differences, diversity, minority status, and the invalidity of value judgments applied to different ways of life.
Researchers have identified the emergence of gay and lesbian communities during several progressive time periods across the world including: the Renaissance, Enlightenment, and modern Westernization. Depending on geographic location, some of these communities experienced more opposition to their existence than others; nonetheless, they began to permeate society both socially and politically.
European cities past and present:
City spaces in Early Modern Europe were host to a wealth of gay activity; however, these scenes remained semi-secretive for a long period of time. Dating back to the 1500s, city conditions such as apprenticeship labor relations and living arrangements, abundant student and artist activity, and hegemonic norms surrounding female societal status were typical in Venice and Florence, Italy.
Under these circumstances, many open-minded young people were attracted to these city settings, and an abundance of same-sex interactions began to take place. Many of the connections formed often led to casual romantic and sexual relationships, whose prevalence increased quite rapidly over time until they became a subculture and community of their own. Literature and ballroom culture gradually made their way onto the scene and became integrated despite transgressive societal views.
Perhaps the most well-known of these are the balls of Magic-City. Amsterdam and London have also been recognized as leading locations for LGBT community establishment. By the 1950s, these urban spaces were booming with gay venues such as bars and public saunas where community members could come together.
Paris and London were particularly attractive to the lesbian population as platforms not only for socialization but for education as well. Other urban occasions important to the LGBT community include Carnival in Rio de Janeiro, Brazil, Mardi Gras in Sydney, Australia, and the various other pride parades hosted in bigger cities around the world.
Urban spaces in America:
In the same way in which LGBT people used the city backdrop to join together socially, they were able to join forces politically as well. This new sense of collectivity provided somewhat of a safety net for individuals when voicing their demands for equal rights.
In the United States specifically, several key political events have taken place in urban contexts. Some of these include, but are not limited to:
Independence Hall, Philadelphia - gay and lesbian protest movement in 1965
- Activists led by Barbara Gittings started some of the first picket lines here. These protests continued on and off until 1969. Gittings went on to run the Gay and Lesbian Task Force of the American Library Association for 15 years.
Stonewall Inn, New York City - riot of 1969
- For the first time, a group of gay men and drag queens fought back against police during a raid on this small bar in Greenwich Village. The site is now a national historic landmark.
The Castro, San Francisco
- Of almost equal importance to Christopher Street (site of the Stonewall riot) among historic landmarks, this urban spot was an oasis of hopefulness. Home to the first openly gay elected official, Harvey Milk, and the legendary Castro Theatre, this cityscape remains iconic to the LGBT community.
- In the years following this event, attempts by religious groups in the area to ban it have been stifled, and many more states have joined the Commonwealth.
- An office whose goal is to provide proper administrative components, direct assistance, and education on HIV/AIDS.
During and following these events, LGBT community subculture began to grow and stabilize into a nationwide phenomenon. Gay bars became more and more popular in large cities. For gays particularly, increasing numbers of cruising areas, public bath houses, and YMCAs in these urban spaces continued to welcome them to experience a more liberated way of living.
For lesbians, this led to the formation of literary societies, private social clubs, and same-sex housing. The core of this community-building took place in New York City and San Francisco, but cities like St. Louis and Chicago, and Lafayette Park in Washington, D.C., quickly followed suit.
City:
Cities afford a host of prime conditions that allow for better individual development as well as collective movement that are not otherwise available in rural spaces. First and foremost, urban landscapes offer LGBTs better prospects to meet other LGBTs and form networks and relationships.
One ideal platform within this framework was the free labor market of many capitalistic societies which enticed people to break away from their often damaging traditional nuclear families in order to pursue employment in bigger cities. Making the move to these spaces afforded them new liberty in the realms of sexuality, identity, and also kinship.
Some researchers describe this as a phase of resistance against the confining expectations of normativity. Urban LGBTs demonstrated this push back through various outlets including their style of dress, the way they talked and carried themselves, and how they chose to build community.
From a social science perspective, the relationship between the city and LGBT community is not a one-way street. LGBTs give back as much, if not more, in terms of economic contributions (i.e. "pink money"), activism and politics too.
Intersections of race:
Compared to white LGBT individuals, LGBT people of color often experience prejudice, stereotyping, and discrimination on the basis of not only their sexual orientation and gender identity, but also on the basis of race.
Nadal and colleagues discuss LGBTQ people of Color and their experience of intersectional microaggressions which target various aspects of their social identities.
These negative experiences and microaggressions can come from cisgender and heterosexual white individuals, cisgender and heterosexual individuals of their own race, and from the LGBT community itself, which is usually dominated by white people.
Some LGBT people of color do not feel comfortable and represented within LGBT spaces. A comprehensive and systematic review of the existing published research literature around the experiences of LGBT individuals of color finds a common theme of exclusion in largely white LGBT spaces.
These spaces are typically dominated by white LGBT individuals, promote White and Western values, and often leave LGBT individuals of color feeling as though they must choose between their racial community or their gender and sexual orientation community. In general, Western society will often subtly code “gay” as white; white LGBT folks are often seen as the face of LGBT culture and values.
The topic of coming out and revealing one’s sexual orientation and gender identity to the public is associated with white values and expectations in mainstream discussions. Where white Western culture places value on the ability to speak openly about one’s identity with family, one particular study found that LGBT participants of color viewed their family's silence about their identity as supportive and accepting. For example, collectivist cultures view the coming out process as a family affair rather than an individual one.
Furthermore, the annual National Coming Out Day centers white perspectives as an event meant to help an LGBT person feel liberated and comfortable in their own skin. However, for some LGBT people of color, National Coming Out Day is viewed in a negative light.
In communities of color, coming out publicly can have adverse consequences, risking their personal sense of safety as well as that of their familial and communal relationships. White LGBT people tend to collectively reject these differences in perspective on coming out, which may further isolate their LGBTQ siblings of color.
Criticism of the term:
Eleanor Formby writes that the notion of "LGBT community" is problematic, because community belonging is not a given just because people share a gender or sexual identity.
Formby cites an interviewee who argued that "The idea doesn’t exist, it’s a kind of big myth – a bit like saying there’s a brown-eyed community or a blonde community." According to Formby, research shows that many LGBT individuals do not at all feel there is a real "LGBT community", as they keep experiencing discrimination from other LGBT people relating to their age, body, disability, ethnicity, faith, HIV status, or perceived social class.
Formby clarifies that she does not suggest abandoning the phrase altogether, but that using "LGBT people" would be more accurate in most instances, and would not risk alienation felt by an already (at times) marginalized group of people.
See also:
- List of LGBT Periodicals
- Bisexual community
- Gay friendly
- Gay male culture
- Homosocialization
- Lesbian
- LGBT culture
- LGBT history
- LGBT symbols
- List of gay villages
- Sexuality and gender identity-based cultures
- Transgender
- Taimi LGBTQI+ community
27 Club: Popular Icons Who Died at the Age of 27
- YouTube Video: 10 Unforgettable Jimi Hendrix Moments
- YouTube Video: Best Songs Of Janis Joplin || Janis Joplin Collection
- YouTube Video: One of the greatest moments of Jim Morrison (The Doors)
The 27 Club is a list consisting mostly of popular musicians, artists, or actors who died at age 27. Although the claim of a "statistical spike" for the death of musicians at that age has been refuted by scientific research, it remains a cultural phenomenon, documenting the deaths of celebrities, many noted for their high-risk lifestyles. Because the club is entirely notional, there is no official membership.
Cultural phenomenon:
The 27 Club includes popular musicians, artists and actors who died at age 27, often as a result of drug and alcohol abuse or violent means such as homicide, suicide, or transportation-related accidents. The "club" has been repeatedly cited in music magazines, journals and the daily press. Several exhibitions have been devoted to the idea, as well as novels, films and stage plays.
The deaths of several 27-year-old popular musicians between 1969 and 1971 led to the belief that deaths are more common at this age. Music biographer Charles R. Cross wrote: "The number of musicians who died at 27 is truly remarkable by any standard. [Although] humans die regularly at all ages, there is a statistical spike for musicians who die at 27."
History:
Brian Jones, Jimi Hendrix, Janis Joplin, and Jim Morrison all died at the age of 27 between 1969 and 1971. At the time, the coincidence gave rise to some comment, but it was not until Kurt Cobain's 1994 death, at age 27, that the idea of a "27 Club" began to catch on in public perception. Blues musician Robert Johnson, who died in 1938, is one of the earliest popular musicians to be included in lists of 27 Club members.
According to Hendrix and Cobain's biographer Charles R. Cross, the growing importance of the media—Internet, magazines, and television—and the response to an interview of Cobain's mother were jointly responsible for such theories.
An excerpt from a statement that Cobain's mother, Wendy Fradenburg Cobain O'Connor, made in the Aberdeen, Washington, newspaper The Daily World—"Now he's gone and joined that stupid club. I told him not to join that stupid club."—referred to Hendrix, Joplin, and Morrison dying at the same age, according to Cross. Other authors share his view.
On the other hand, Eric Segalstad, writer of The 27s: The Greatest Myth of Rock & Roll, assumed that Cobain's mother referred to the death of his two uncles and his great-uncle, all of whom had also committed suicide. According to Cross, the events have led a "set of conspiracy theorists [to suggest] the absurd notion that Kurt Cobain intentionally timed his death so he could join the 27 Club".
In 2011, seventeen years after Cobain's death, Amy Winehouse died at the age of 27, prompting a renewed swell of media attention devoted to the club. Three years earlier, she had expressed a fear of dying at that age.
An individual does not necessarily have to be a musician to qualify as a "member" of the 27 Club. Rolling Stone included television actor Jonathan Brandis, who committed suicide in 2003, in a list of 27 Club members. Anton Yelchin, who had played in a punk rock band but was primarily known as a film actor, was also described as a member of the club upon his death in 2016.
Likewise, Jean-Michel Basquiat has been included in 27 Club lists, despite the relative brevity of his music career, and his prominence as a graffiti artist and painter.
Scientific studies:
A study by university academics published in the British Medical Journal in December 2011 concluded that there was no increase in the risk of death for musicians at the age of 27, stating that there were equally small increases at ages 25 and 32. The study noted that young adult musicians have a higher death rate than the general young adult population, surmising that "fame may increase the risk of death among musicians, but this risk is not limited to age 27".
The selection criteria for the musicians included in the study, based on having scored a UK No. 1 album between 1956 and 2007, excluded several notable members of the 27 Club, including Hendrix, Joplin, Morrison, Ham, and McKernan.
A 2014 article at The Conversation suggested that statistical evidence shows popular musicians are most likely to die at the age of 56 (2.2% compared to 1.3% at 27).
In popular culture:
Music
Video games:
Comics:
Identified members:
Because the 27 Club is entirely notional, there is no official membership. The following table lists people described as "members" of the club in reliable published sources, in the opinion of their respective authors.
See also:
Cultural phenomenon:
The 27 Club includes popular musicians, artists and actors who died at age 27, often as a result of drug and alcohol abuse or violent means such as homicide, suicide, or transportation-related accidents. The "club" has been repeatedly cited in music magazines, journals and the daily press. Several exhibitions have been devoted to the idea, as well as novels, films and stage plays.
The deaths of several 27-year-old popular musicians between 1969 and 1971 led to the belief that deaths are more common at this age. Music biographer Charles R. Cross wrote: "The number of musicians who died at 27 is truly remarkable by any standard. [Although] humans die regularly at all ages, there is a statistical spike for musicians who die at 27."
History:
Brian Jones, Jimi Hendrix, Janis Joplin, and Jim Morrison all died at the age of 27 between 1969 and 1971. At the time, the coincidence gave rise to some comment, but it was not until Kurt Cobain's 1994 death, at age 27, that the idea of a "27 Club" began to catch on in public perception. Blues musician Robert Johnson, who died in 1938, is one of the earliest popular musicians to be included in lists of 27 Club members.
According to Hendrix and Cobain's biographer Charles R. Cross, the growing importance of the media—Internet, magazines, and television—and the response to an interview with Cobain's mother were jointly responsible for such theories.
An excerpt from a statement that Cobain's mother, Wendy Fradenburg Cobain O'Connor, made in the Aberdeen, Washington, newspaper The Daily World—"Now he's gone and joined that stupid club. I told him not to join that stupid club."—referred to Hendrix, Joplin, and Morrison dying at the same age, according to Cross. Other authors share his view.
On the other hand, Eric Segalstad, author of The 27s: The Greatest Myth of Rock & Roll, suggested that Cobain's mother was referring to the deaths of his two uncles and his great-uncle, all of whom had also committed suicide. According to Cross, the events have led a "set of conspiracy theorists [to suggest] the absurd notion that Kurt Cobain intentionally timed his death so he could join the 27 Club".
In 2011, seventeen years after Cobain's death, Amy Winehouse died at the age of 27, prompting a renewed swell of media attention devoted to the club. Three years earlier, she had expressed a fear of dying at that age.
An individual does not necessarily have to be a musician to qualify as a "member" of the 27 Club. Rolling Stone included television actor Jonathan Brandis, who committed suicide in 2003, in a list of 27 Club members. Anton Yelchin, who had played in a punk rock band but was primarily known as a film actor, was also described as a member of the club upon his death in 2016.
Likewise, Jean-Michel Basquiat has been included in 27 Club lists, despite the relative brevity of his music career, and his prominence as a graffiti artist and painter.
Scientific studies:
A study by university academics published in the British Medical Journal in December 2011 concluded that there was no increase in the risk of death for musicians at the age of 27, stating that there were equally small increases at ages 25 and 32. The study noted that young adult musicians have a higher death rate than the general young adult population, surmising that "fame may increase the risk of death among musicians, but this risk is not limited to age 27".
The selection criteria for the musicians included in the study, based on having scored a UK No. 1 album between 1956 and 2007, excluded several notable members of the 27 Club, including Hendrix, Joplin, Morrison, Ham, and McKernan.
A 2014 article at The Conversation suggested that statistical evidence shows popular musicians are most likely to die at the age of 56 (2.2% compared to 1.3% at 27).
In popular culture:
Music
- The name of the song "27" by Fall Out Boy from their 2008 album Folie à Deux is a reference to the club. The lyrics explore the hedonistic lifestyles common in rock and roll. Pete Wentz, the primary lyricist of Fall Out Boy, wrote the song because he felt that he was living a similarly dangerous lifestyle.
- John Craigie's song "28", which appeared on his 2009 album Montana Tale, and 2018 live album Opening for Steinbeck, is written from the perspective of 27 Club members Jim Morrison, Janis Joplin, and Kurt Cobain, as each contemplates their respective mortality and imagines what they would do differently "if I could only make it to 28."
- The theme is referenced in the song "27 Forever" by Eric Burdon, on his 2013 album 'Til Your River Runs Dry.
- The band Letlive featured a song named "27 Club" on its 2013 album The Blackest Beautiful.
- Magenta's 2013 studio album The Twenty Seven Club directly references the club. Each track is a tribute to a member of the club.
- Daughtry's song "Long Live Rock & Roll" from their 2013 album Baptized references the club with the lyrics "they're forever 27 – Jimmy, Janis, Brian Jones".
- Rapper Watsky references the club on his 2014 song "All You Can Do" with the lyric, "I tried to join the 27 Club; they kicked me out." The song then goes on to reference some famous members of the club, namely Amy Winehouse, Janis Joplin, Jimi Hendrix, Kurt Cobain, Jim Morrison, and Brian Jones.
- Mac Miller's 2015 song "Brand Name" contains the lyric "To everyone who sell me drugs: Don't mix it with that bullshit, I'm hoping not to join the 27 Club". Miller died aged 26, after consuming counterfeit oxycodone pills that contained fentanyl.
- The song "27 Club" by Ivy Levan, released as a promotional single for her 2015 album No Good, refers to the club.
- JPEGMafia's 2016 album Black Ben Carson includes a song titled "The 27 Club", which refers to the club and explicitly references fallen members Jimi Hendrix, Janis Joplin, and Kurt Cobain.
- The Halsey song "Colors" includes the line "I hope you make it to the day you're 28 years old." The song was written about someone with a serious drug problem, and is widely rumored to be about rock singer Matty Healy of the band The 1975.
- Adore Delano released a song called "27 Club" on her 2017 studio album Whatever, with the repeated lyric "All of the legends die at twenty seven." Delano was aged 27 at the time of release.
- In 2017, the MonaLisa Twins released "Club 27", a song about the 27 Club, on their album Orange.
- Juice Wrld referenced the club on his 2018 song "Legends" where he says "What's the 27 Club? We ain't making it past 21." The song was dedicated to XXXTentacion, who was killed at 20, and Lil Peep, who died from an overdose at 21. Juice Wrld himself died at the age of 21 from an accidental overdose.
- The Pretty Reckless released a song titled "Rock and Roll Heaven" on its 2021 studio album Death by Rock and Roll. The song is about the club and explicitly mentions Jimi Hendrix, Janis Joplin and Jim Morrison in its lyrics. Frontwoman Taylor Momsen wrote the song after falling into a depressive state following the deaths of her producer Kato Khandwala and Chris Cornell, the latter of whom her band had opened for the night before his death.
- The Blind Channel song "Dark Side", the Finnish entry for the Eurovision Song Contest 2021, includes the lyrics "Like the 27 Club, headshot, we don't wanna grow up".
Video games:
- In the 2016 video game Hitman, one of the in-game missions, Club 27, involves killing an indie musician who is celebrating his 27th birthday.
Comics:
- Cartoonist Luke McGarry created The 27 Club comic series for MAD Magazine, debuting in its relaunch's first issue in 2018. The comics featured Jimi Hendrix, Janis Joplin, Brian Jones, Robert Johnson, Amy Winehouse, Jim Morrison, and Kurt Cobain as paranormal pop stars descending from Rock & Roll Heaven to save the planet with the aid of mortal medium Keith Richards. The series continued in subsequent issues until Potrzebie Comics (the section in which the comic appeared) was retired upon the magazine's 2019 switch to a format of reprinting classic articles for the majority of each new issue.
Identified members:
Because the 27 Club is entirely notional, there is no official membership. The following table lists people described as "members" of the club in reliable published sources, in the opinion of their respective authors.
See also:
- Biography portal
- Music portal
- 23 enigma
- Apophenia
- Curse of the ninth
- List of deaths in rock and roll
- List of murdered hip hop musicians
- Saturn return
- White lighter myth
Festivals, including both an Outline of Festivals and List of Festivals
TOP ROW (L-R): Top 20 Music Festivals in the USA 2022 & These are the most popular festivals in the US according to Google
BOTTOM ROW (L-R) 10 Best Cultural Festivals in America & Top 5 festivals in USA
- YouTube Video of California Festival Ocean Beach, San Diego
- YouTube Video: JAZZ FEST: A New Orleans Story | "In Margaritaville" Official Clip
- YouTube Video: Vegas @ Garden Music Festival 2022 - FULL VIDEO
Click Here for a List of Festivals Around the World and by Country.
A Festival is an event ordinarily celebrated by a community and centering on some characteristic aspect of that community and its religion or cultures. It is often marked as a local or national holiday, mela, or eid. A festival constitutes typical cases of glocalization, as well as the high culture-low culture interrelationship.
Next to religion and folklore, a significant origin is agricultural. Food is such a vital resource that many festivals are associated with harvest time. Religious commemoration and thanksgiving for good harvests are blended in events that take place in autumn, such as Halloween in the northern hemisphere and Easter in the southern.
Festivals often serve to fulfill specific communal purposes, especially in regard to commemoration of, or thanksgiving to, the gods, goddesses or saints: these are called patronal festivals. They may also provide entertainment, which was particularly important to local communities before the advent of mass-produced entertainment.
Festivals that focus on cultural or ethnic topics also seek to inform community members of their traditions; the involvement of elders sharing stories and experience provides a means for unity among families.
Attendees of festivals are often motivated by a desire for escapism, socialization and camaraderie; the practice has been seen as a means of creating geographical connection, belonging and adaptability. Scholarly literature notes that festivals disseminate political values and meaning, such as ownership of place. Festivals may be used to rehabilitate or elevate the image of a city. Albert Piette wrote that by deviating from social routine, festivals reinforce convention.
Etymology:
The word "festival" was originally used as an adjective from the late fourteenth century, deriving from Latin via Old French. In Middle English, a "festival dai" was a religious holiday. Its first recorded use as a noun was in 1589 (as "Festifall"). Feast first came into usage as a noun circa 1200, and its first recorded use as a verb was circa 1300. The term "feast" is also used in common secular parlance as a synonym for any large or elaborate meal.
When used in the sense of a festival, "feast" most often refers to a religious festival rather than a film or art festival. In the Philippines and many other former Spanish colonies, the Spanish word fiesta is used to denote a communal religious feast to honor a patron saint.
The word gala comes from the Arabic word khil'a, meaning robe of honor. It was initially used to describe "festive dress" but became a synonym of festival in the 18th century.
History:
Festivals have long been significant in human culture and are found in virtually all cultures. They remain important, to the present day, in both private and public life, secular and religious alike. Ancient Greek and Roman societies relied heavily upon festivals, both communal and administrative. Saturnalia likely influenced Christmas and Carnival. Celebrations of social occasions, religion and nature were common.
Specific festivals can have centuries-long histories, and festivals in general have developed over the last few centuries – some traditional festivals in Ghana, for example, predate the European colonization of the 15th century.
Festivals prospered following the Second World War. The Avignon Festival and the Edinburgh Festival Fringe, both established in 1947, have been notable in shaping the modern model of festivals. Art festivals became more prominent by the turn of the 21st century. In modern times, festivals have been commodified as global tourist attractions, although they are commonly public or not-for-profit.
Traditions:
Many festivals have religious origins and entwine cultural and religious significance in traditional activities. The most important religious festivals such as Christmas, Rosh Hashanah, Diwali, Eid al-Fitr and Eid al-Adha serve to mark out the year. Others, such as harvest festivals, celebrate seasonal change.
Events of historical significance, such as important military victories or other nation-building events also provide the impetus for a festival. An early example is the festival established by Ancient Egyptian Pharaoh Ramesses III celebrating his victory over the Libyans. In many countries, royal holidays commemorate dynastic events just as agricultural holidays are about harvests. Festivals are often commemorated annually.
There are numerous types of festivals in the world and most countries celebrate important events or traditions with traditional cultural events and activities. Most culminate in the consumption of specially prepared food (showing the connection to "feasting") and they bring people together. Festivals are also strongly associated with national holidays. Lists of national festivals are published to make participation easier.
Types of festivals:
The scale of festivals varies; in location and attendance, they may range from a local to a national level. Music festivals, for example, often bring together disparate groups of people, such that they are both localized and global. The sheer abundance of festivals makes their total number difficult to quantify.
Religious festivals:
Main article: Religious festival
Among many religions, a feast is a set of celebrations in honor of God or gods. A feast and a festival are historically interchangeable. Most religions have festivals that recur annually and some, such as Passover, Easter and Eid al-Adha are moveable feasts – that is, those that are determined either by lunar or agricultural cycles or the calendar in use at the time.
The Sed festival, for example, celebrated the thirtieth year of an Egyptian pharaoh's rule and then every three (or, in one case, four) years after that. Among the Ashantis, most traditional festivals are linked to gazetted sites believed to be sacred, with rich biological resources in their pristine forms. The annual commemoration of these festivals thus helps maintain the conserved natural sites, assisting in biodiversity conservation.
In the Christian liturgical calendar, there are two principal feasts, properly known as the Feast of the Nativity of our Lord (Christmas) and the Feast of the Resurrection (Easter), but minor festivals in honor of local patron saints are celebrated in almost all countries influenced by Christianity.
In the Catholic, Eastern Orthodox, and Anglican liturgical calendars there are a great number of lesser feasts throughout the year commemorating saints, sacred events or doctrines. In the Philippines, each day of the year has at least one specific religious festival, either from Catholic, Islamic, or indigenous origins.
Buddhist religious festivals, such as Esala Perahera are held in Sri Lanka and Thailand. Hindu festivals, such as Holi are very ancient. The Sikh community celebrates the Vaisakhi festival marking the new year and birth of the Khalsa.
Arts festivals:
Main article: Arts festival
Among the many offspring of general arts festivals are also more specific types of festivals, including ones that showcase intellectual or creative achievement such as:
- science festivals,
- literary festivals
- and music festivals.
- Sub-categories include
- comedy festivals,
- rock festivals,
- jazz festivals and buskers festivals;
- poetry festivals,
- theatre festivals,
- and storytelling festivals;
- and re-enactment festivals such as Renaissance fairs.
In the Philippines, aside from numerous art festivals scattered throughout the year, February is known as national arts month, the culmination of all art festivals in the entire archipelago. The modern model of music festivals emerged in the 1960s and 1970s and has become a lucrative global industry. Predecessors extend back to the 11th century, and some, such as the Three Choirs Festival, survive to this day.
Film festivals involve the screening of several different films and are usually held annually. Some of the most significant film festivals include the Berlin International Film Festival, the Venice Film Festival and the Cannes Film Festival. With the increasing importance of digitalization, online film festivals, such as the Best Istanbul Film Festival, have also emerged.
Food Festival:
Main article: Food festival
A food festival is an event celebrating food or drink. These often highlight the output of producers from a certain region. Some food festivals are focused on a particular item of food, such as the National Peanut Festival in the United States, or the Galway International Oyster Festival in Ireland. There are also specific beverage festivals, such as the famous Oktoberfest in Germany for beer.
Many countries hold festivals to celebrate wine. One example is the global celebration of the arrival of Beaujolais nouveau, which involves shipping the new wine around the world for its release date on the third Thursday of November each year. Both Beaujolais nouveau and the Japanese rice wine sake are associated with harvest time. In the Philippines, there are at least two hundred festivals dedicated to food and drinks.
Seasonal and harvest festivals:
Seasonal festivals, such as Beltane, are determined by the solar and lunar calendars and by the cycle of the seasons, especially its effect on the food supply; as a result, there is a wide range of ancient and modern harvest festivals.
Ancient Egyptians relied upon the seasonal inundation caused by the Nile River, a form of irrigation, which provided fertile land for crops.
In the Alps, in autumn the return of the cattle from the mountain pastures to the stables in the valley is celebrated as Almabtrieb. A recognized winter festival, the Chinese New Year, is set by the lunar calendar, and celebrated from the day of the second new moon after the winter solstice. Dree Festival of the Apatanis living in Lower Subansiri District of Arunachal Pradesh is celebrated every year from July 4 to 7 by praying for a bumper crop harvest.
Midsummer, or St John's Day, is an example of a seasonal festival, related to the feast day of a Christian saint as well as a celebration of the summer solstice in the northern hemisphere, where it is particularly important in Sweden. Winter carnivals also provide the opportunity to celebrate creative or sporting activities requiring snow and ice.
In the Philippines, each day of the year has at least one festival dedicated to the harvesting of crops, fish, crustaceans, milk, and other local goods.
Study of festivals:
- Festive ecology – explores the relationships between the symbolism and the ecology of the plants, fungi and animals associated with cultural events such as festivals, processions and special occasions.
- Heortology – the study of religious festivals. It was originally only used in respect of Christian festivals, but it now covers all religions, in particular those of Ancient Greece. See list of foods with religious symbolism for some topical overlap.
See also:
- All pages with titles containing Festival
- Philippine fiestas
- Convention
- Event planning
- Fair
- Festive ecology
- Holiday
- Lists of festivals
- Patronal festival
- Procession
- Trade show
Outline of Festivals:
Festival – celebration that focuses upon a theme, and may run for hours to weeks. The theme of a festival might be an area of interest such as art, or an aspect of the community in which the festival is being held, such as the community's history or culture. Festivals are often periodical, for example, held annually.
Types of festivals:
- Beer festival –
- Comedy festival –
- Esala Perahera festival –
- Film festival –
- Fire festival (Beltane) –
- Fire festival (the Japanese festival) –
- Folk festival – celebrates traditional folk crafts and folk music.
- Food festival –
- Harvest festival –
- Language festivals –
- Literary festival –
- Japanese Cultural Festival –
- Mela Festival –
- Music festival –
- Peanut Festival –
- Religious festival –
- Calendar of saints (Feast days) –
- Hindu festivals –
- List of Sikh festivals –
- Renaissance festival –
- Rock festival – a large-scale rock music concert, featuring multiple acts. Also called a "rock fest".
- Science festival –
- Sindhi festivals –
- Storytelling festival –
- Theatre festival –
- Vegetarian festivals and vegan food fests –
- Video gaming festival –
- Winter festivals –
Festival activities:
The activities or events of a festival may be primarily of the spectator or participatory variety, or a mixture of these. A festival may include spectator or participatory variations of one or more of the following types of events or activities, among others.
- Ceremonies –
- Concerts, music –
- Competitions –
- Contests –
- Dancing events –
- Meals, eating, and drinking –
- Parades –
- Parties –
- Performances –
- Races –
- Singing –
- Speeches –
- Sports –
History of festivals:
Specific festivals:
The following are festivals that are events (similar to a fair). They are typically hosted and held at a specific location.
Specific festivals by theme:
- List of Celtic festivals
- List of dogwood festivals
- List of film festivals
- Animation festivals
- List of music festivals
- List of opera festivals
Specific festivals by region:
- List of festivals in Australia
- List of festivals in Canada
- List of festivals in Colombia
- List of festivals in Costa Rica
- List of festivals in Fiji
- List of festivals in Iran
- List of festivals in Japan
- List of festivals in Laos
- List of festivals in Macedonia
- List of festivals in Morocco
- List of festivals in Nepal
- List of festivals in the Philippines, also known as Philippine Fiestas
- List of festivals in Romania
- List of festivals in Turkey
- List of festivals in the United Kingdom
- List of festivals in Vietnam
- List of festivals in the United States
- Festivals in California
- Festivals in Florida
- Festivals in Illinois
- List of festivals in Louisiana
- List of festivals in New Jersey
- List of festivals in Pennsylvania